Math 110 Linear Algebra
Chapter 2 – Linear Transformations and Matrices
Per-Olof Persson
[email protected]
Department of Mathematics
University of California, Berkeley
Linear Transformations
Definition
We call a function T : V → W a linear transformation from V to W
if, for all x, y ∈ V and c ∈ F , we have
(a) T(x + y) = T(x) + T(y) and
(b) T(cx) = cT(x)
1. If T is linear, then T(0) = 0
2. T is linear ⇐⇒ T(cx + y) = cT(x) + T(y) ∀x, y ∈ V, c ∈ F
3. If T is linear, then T(x − y) = T(x) − T(y) ∀x, y ∈ V
4. T is linear ⇐⇒ for x1 , . . . , xn ∈ V and a1 , . . . , an ∈ F ,
   T(∑_{i=1}^n ai xi ) = ∑_{i=1}^n ai T(xi )
Special linear transformations
The identity transformation IV : V → V: IV (x) = x, ∀x ∈ V
The zero transformation T0 : V → W: T0 (x) = 0 ∀x ∈ V
Null Space and Range
Definition
For linear T : V → W, the null space (or kernel) N(T) of T is the
set of all x ∈ V such that T(x) = 0 : N(T) = {x ∈ V : T(x) = 0 }
The range (or image) R(T) of T is the subset of W consisting of
all images of vectors in V: R(T) = {T(x) : x ∈ V}
Theorem 2.1
For vector spaces V, W and linear T : V → W, N(T) and R(T) are
subspaces of V and W, respectively.
Theorem 2.2
For vector spaces V, W and linear T : V → W, if β = {v1 , . . . , vn }
is a basis for V, then
R(T) = span(T(β)) = span({T(v1 ), . . . , T(vn )})
Nullity and Rank
Definition
For vector spaces V, W and linear T : V → W, if N(T) and R(T)
are finite-dimensional, the nullity and the rank of T are the
dimensions of N(T) and R(T), respectively.
Theorem 2.3 (Dimension Theorem)
For vector spaces V, W and linear T : V → W, if V is
finite-dimensional then
nullity(T) + rank(T) = dim(V)
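The Dimension Theorem can be checked numerically. A minimal NumPy sketch (the matrix A is a made-up example, not from the slides; the nullity is read off from the singular values):

```python
import numpy as np

# Hypothetical matrix of a linear map T : R^4 -> R^3; the last two
# columns are combinations of the first two, so rank(T) = 2.
A = np.array([[1., 0., 2., 1.],
              [0., 1., 1., 1.],
              [1., 1., 3., 2.]])

rank = np.linalg.matrix_rank(A)                      # dim R(T)
s = np.linalg.svd(A, compute_uv=False)               # singular values
nullity = A.shape[1] - np.count_nonzero(s > 1e-10)   # dim N(T)

print(rank, nullity, rank + nullity)  # rank + nullity = dim(V) = 4
```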
Properties of Linear Transformations
Theorem 2.4
For vector spaces V, W and linear T : V → W, T is one-to-one if
and only if N(T) = {0 }.
Theorem 2.5
For vector spaces V, W of equal (finite) dimension and linear
T : V → W, the following are equivalent:
(a) T is one-to-one
(b) T is onto
(c) rank(T) = dim(V)
Linear Transformations and Bases
Theorem 2.6
Let V, W be vector spaces over F and {v1 , . . . , vn } a basis for V.
For w1 , . . . , wn in W, there exists exactly one linear transformation
T : V → W such that T(vi ) = wi for i = 1, . . . , n.
Corollary
Suppose {v1 , . . . , vn } is a finite basis for V. If U, T : V → W
are linear and U(vi ) = T(vi ) for i = 1, . . . , n, then U = T.
Coordinate Vectors
Definition
For a finite-dimensional vector space V, an ordered basis for V is a
basis for V with a specific order. In other words, it is a finite
sequence of linearly independent vectors in V that generates V.
Definition
Let β = {u1 , . . . , un } be an ordered basis for V, and for x ∈ V let
a1 , . . . , an be the unique scalars such that
x = ∑_{i=1}^n ai ui .
The coordinate vector of x relative to β is
[x]β = (a1 , . . . , an )^T
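Finding [x]β amounts to solving a linear system in the basis vectors. A small NumPy sketch with a hypothetical basis of R^2 (not taken from the slides):

```python
import numpy as np

# [x]_β for a hypothetical ordered basis β = {(1, 1), (1, -1)} of R^2:
# solve B a = x, where the columns of B are the basis vectors, so that
# x = a1*u1 + a2*u2.
B = np.array([[1., 1.],
              [1., -1.]])
x = np.array([3., 1.])

a = np.linalg.solve(B, x)
print(a)  # (2, 1): x = 2*(1,1) + 1*(1,-1)
```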
Addition and Scalar Multiplication
Definition
Let T, U : V → W be arbitrary functions of vector spaces V, W over
F . Then T + U, aT : V → W are defined by
(T + U)(x) = T(x) + U(x) and (aT)(x) = aT(x), respectively, for
all x ∈ V and a ∈ F .
Theorem 2.7
With the operations defined above, for vector spaces V, W over F
and linear T, U : V → W:
(a) aT + U is linear for all a ∈ F
(b) The collection of all linear transformations from V to W is a
vector space over F
Definition
For vector spaces V, W over F , the vector space of all linear
transformations from V into W is denoted by L(V, W), or just
L(V) if V = W.
Matrix Representations
Definition
Suppose V, W are finite-dimensional vector spaces with ordered
bases β = {v1 , . . . , vn }, γ = {w1 , . . . , wm }. For linear T : V → W,
there are unique scalars aij ∈ F such that
T(vj ) = ∑_{i=1}^m aij wi for 1 ≤ j ≤ n.
The m × n matrix A defined by Aij = aij is the matrix
representation of T in the ordered bases β and γ, written
A = [T]γβ . If V = W and β = γ, then A = [T]β .
Note that the jth column of A is [T(vj )]γ , and if [U]γβ = [T]γβ for
linear U : V → W, then U = T.
Theorem 2.8
For finite-dimensional vector spaces V, W with ordered bases β, γ,
and linear transformations T, U : V → W:
(a) [T + U]γβ = [T]γβ + [U]γβ
(b) [aT]γβ = a[T]γβ for all scalars a
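A concrete matrix representation can be built column by column from the images of the basis vectors. A NumPy sketch using the derivative map on polynomials (a standard illustration, not taken from the slides):

```python
import numpy as np

# Matrix representation of the derivative T : P3(R) -> P2(R), with
# ordered bases β = {1, x, x^2, x^3} and γ = {1, x, x^2}.
# Column j holds [T(v_j)]γ, and T(x^j) = j x^(j-1).
A = np.array([[0., 1., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.]])

# Differentiate p(x) = 5 + 4x + 3x^2 + 2x^3 through its β-coordinates.
p = np.array([5., 4., 3., 2.])
dp = A @ p
print(dp)  # (4, 6, 6): p'(x) = 4 + 6x + 6x^2
```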
Composition of Linear Transformations
Theorem 2.9
Let V, W, Z be vector spaces over a field F , and T : V → W,
U : W → Z be linear. Then UT : V → Z is linear.
Theorem 2.10
Let V be a vector space and T, U1 , U2 ∈ L(V). Then
(a) T(U1 + U2 ) = TU1 + TU2 and (U1 + U2 )T = U1 T + U2 T
(b) T(U1 U2 ) = (TU1 )U2
(c) TI = IT = T
(d) a(U1 U2 ) = (aU1 )U2 = U1 (aU2 ) for all scalars a
Matrix Multiplication
Let T : V → W, U : W → Z be linear, α = {v1 , . . . , vn },
β = {w1 , . . . , wm }, γ = {z1 , . . . , zp } ordered bases for V, W, Z,
and A = [U]γβ , B = [T]βα . Consider [UT]γα :
(UT)(vj ) = U(T(vj )) = U(∑_{k=1}^m Bkj wk ) = ∑_{k=1}^m Bkj U(wk )
= ∑_{k=1}^m Bkj ∑_{i=1}^p Aik zi = ∑_{i=1}^p (∑_{k=1}^m Aik Bkj ) zi
Definition
Let A, B be m × n, n × p matrices. The product AB is the m × p
matrix with
(AB)ij = ∑_{k=1}^n Aik Bkj , for 1 ≤ i ≤ m, 1 ≤ j ≤ p
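The defining sum translates directly into code. A sketch (the matrices are made-up examples) computing each entry (AB)ij from the definition and comparing with NumPy's built-in product:

```python
import numpy as np

# Hypothetical example matrices: A is 2 x 3, B is 3 x 2.
A = np.array([[1., 2., 0.],
              [0., 1., 3.]])
B = np.array([[1., 0.],
              [2., 1.],
              [0., 4.]])

# (AB)_ij = sum_k A_ik B_kj, straight from the definition.
m, n = A.shape
p = B.shape[1]
AB = np.zeros((m, p))
for i in range(m):
    for j in range(p):
        AB[i, j] = sum(A[i, k] * B[k, j] for k in range(n))

print(np.array_equal(AB, A @ B))  # the loop agrees with NumPy's @
```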
Matrix Multiplication
Properties
Theorem 2.11
Let V, W, Z be finite-dimensional vector spaces with ordered bases
α, β, γ, and T : V → W, U : W → Z be linear. Then
[UT]γα =
[U]γβ [T]βα
Corollary
Let V be a finite-dimensional vector space with ordered basis β,
and T, U ∈ L(V). Then [UT]β = [U]β [T]β .
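Theorem 2.11 can be illustrated numerically with rotations of R^2 (the angles are arbitrary made-up values): composing rotations adds the angles, and the matrix of the composition is the product of the matrices.

```python
import numpy as np

def rot(theta):
    """Standard-basis matrix of the rotation of R^2 by angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s, c]])

# [UT] = [U][T]: rotating by 0.4 and then by 0.3 is rotating by 0.7.
composed = rot(0.3) @ rot(0.4)
print(np.allclose(composed, rot(0.7)))
```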
Definition
The Kronecker delta is defined by δij = 1 if i = j and δij = 0 if
i ≠ j. The n × n identity matrix In is defined by (In )ij = δij .
Theorem 2.12
Let A be an m × n matrix, B, C be n × p matrices, and D, E be
q × m matrices. Then
(a) A(B + C) = AB + AC and (D + E)A = DA + EA
(b) a(AB) = (aA)B = A(aB) for any scalar a
(c) Im A = A = AIn
(d) If V is an n-dimensional vector space with ordered basis β,
then [IV ]β = In
Corollary
Let A be an m × n matrix, B1 , . . . , Bk be n × p matrices, C1 , . . . , Ck
be q × m matrices, and a1 , . . . , ak be scalars. Then
A(∑_{i=1}^k ai Bi ) = ∑_{i=1}^k ai ABi and (∑_{i=1}^k ai Ci )A = ∑_{i=1}^k ai Ci A
Theorem 2.13
Let A be an m × n matrix and B an n × p matrix, and uj , vj the jth
columns of AB and B, respectively. Then
(a) uj = Avj
(b) vj = Bej
Theorem 2.14
Let V, W be finite-dimensional vector spaces with ordered bases
β, γ, and T : V → W be linear. Then for u ∈ V:
[T(u)]γ = [T]γβ [u]β
Left-multiplication Transformations
Definition
Let A be an m × n matrix. The left-multiplication transformation LA
is the mapping LA : Fn → Fm defined by LA (x) = Ax for each
column vector x ∈ Fn .
Theorem 2.15
Let A be an m × n matrix. Then LA : Fn → Fm is linear, and if B is an
m × n matrix and β, γ are the standard ordered bases for Fn , Fm , then:
(a) [LA ]γβ = A
(b) LA = LB if and only if A = B
(c) LA+B = LA + LB and LaA = aLA for all a ∈ F
(d) For linear T : Fn → Fm , there exists a unique m × n matrix C
such that T = LC , and C = [T]γβ
(e) If E is an n × p matrix, then LAE = LA LE
(f) If m = n then LIn = IFn
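Theorem 2.15(a) says the columns of [LA ]γβ are the images of the standard basis vectors. A quick NumPy sketch with a made-up 3 × 2 matrix:

```python
import numpy as np

# Hypothetical example matrix; L_A : R^2 -> R^3 is x -> Ax.
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

def L_A(x):
    return A @ x

# Column j of [L_A] in the standard bases is L_A(e_j) = A e_j,
# i.e. column j of A itself.
e1, e2 = np.eye(2)
M = np.column_stack([L_A(e1), L_A(e2)])
print(np.array_equal(M, A))
```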
Associativity of Matrix Multiplication
Theorem 2.16
Let A, B, C be matrices such that A(BC) is defined. Then
(AB)C is also defined and A(BC) = (AB)C.
Inverse of Linear Transformations
Definition
Let V, W be vector spaces and T : V → W be linear. A function
U : W → V is an inverse of T if TU = IW and UT = IV . If T has
an inverse, it is invertible and the inverse T−1 is unique.
For invertible T, U:
1. (TU)−1 = U−1 T−1
2. (T−1 )−1 = T (so T−1 is invertible)
3. If V, W have equal (finite) dimensions, linear T : V → W is
   invertible if and only if rank(T) = dim(V)
Theorem 2.17
For vector spaces V, W and linear and invertible T : V → W,
T−1 : W → V is linear.
Inverses
Definition
An n × n matrix A is invertible if there exists an n × n matrix B
such that AB = BA = I.
Lemma
For invertible and linear T from V to W, V is finite-dimensional if
and only if W is finite-dimensional. Then dim(V) = dim(W).
Theorem 2.18
Let V,W be finite-dimensional vector spaces with ordered bases
β, γ, and T : V → W be linear. Then T is invertible if and only if
[T]γβ is invertible, and [T−1 ]βγ = ([T]γβ )−1 .
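In coordinates, Theorem 2.18 says that inverting the transformation is inverting its matrix. A sketch for T = LA with a made-up invertible matrix:

```python
import numpy as np

# Hypothetical invertible matrix; in the standard bases [T] = A, and
# Theorem 2.18 gives [T^{-1}] = A^{-1}.
A = np.array([[2., 1.],
              [1., 1.]])
Ainv = np.linalg.inv(A)

x = np.array([3., -2.])
y = A @ x            # y = T(x)
x_back = Ainv @ y    # applying T^{-1} recovers x
print(x_back)
```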
Isomorphisms
Definition
Let V,W be vector spaces. V is isomorphic to W if there exists a
linear transformation T : V → W that is invertible. Such a T is an
isomorphism from V onto W.
Theorem 2.19
For finite-dimensional vector spaces V,W, V is isomorphic to W if
and only if dim(V) = dim(W).
Corollary
A vector space V over F is isomorphic to Fn if and only if
dim(V) = n.
The Standard Representation
Definition
Let β be an ordered basis for an n-dimensional vector space V over
the field F . The standard representation of V with respect to β is
the function φβ : V → Fn defined by φβ (x) = [x]β for each x ∈ V.
Theorem 2.21
For any finite-dimensional vector space V with ordered basis β, φβ
is an isomorphism.
Inverses
Corollary 1
For finite-dimensional vector space V with ordered basis β and
linear T : V → V, T is invertible if and only if [T]β is invertible,
and [T−1 ]β = ([T]β )−1 .
Corollary 2
An n × n matrix A is invertible if and only if LA is invertible, and
(LA )−1 = LA−1 .
Linear Transformations and Matrices
Theorem 2.20
Let V,W be finite-dimensional vector spaces over F of dimensions
n, m with ordered bases β, γ. Then the function
Φ : L(V, W) → Mm×n (F ), defined by Φ(T) = [T]γβ for
T ∈ L(V, W), is an isomorphism.
Corollary
For finite-dimensional vector spaces V,W of dimensions n, m,
L(V, W) is finite-dimensional of dimension mn.
The Change of Coordinate Matrix
Theorem 2.22
Let β and β′ be ordered bases for a finite-dimensional vector space
V, and let Q = [IV ]ββ′ . Then
(a) Q is invertible
(b) For any v ∈ V, [v]β = Q[v]β′
Q = [IV ]ββ′ is called a change of coordinate matrix, and we say that
Q changes β′-coordinates into β-coordinates.
Note that if Q changes from β′ into β coordinates, then Q−1
changes from β into β′ coordinates.
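A numerical sketch of Theorem 2.22 in R^2, with β the standard basis and a hypothetical second basis (not from the slides):

```python
import numpy as np

# Hypothetical ordered bases of R^2: β the standard basis and
# β' = {(1, 1), (1, -1)}. Q = [I]ββ' has the β'-vectors as columns.
Q = np.array([[1., 1.],
              [1., -1.]])

v_bprime = np.array([2., 1.])   # [v]_β'
v_beta = Q @ v_bprime           # Theorem 2.22(b): [v]_β = Q [v]_β'
print(v_beta)                   # (3, 1), i.e. v = 2*(1,1) + 1*(1,-1)
```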
Linear Operators
A linear operator is a linear transformation from a vector space V
into itself.
Theorem
Let T be a linear operator on a finite-dimensional vector space V
with ordered bases β, β′. If Q is the change of coordinate matrix
from β′ into β-coordinates, then
[T]β′ = Q−1 [T]β Q
Corollary
Let A ∈ Mn×n (F ), and γ an ordered basis for Fn . Then
[LA ]γ = Q−1 AQ, where Q is the n × n matrix with the vectors in
γ as column vectors.
Definition
For A, B ∈ Mn×n (F ), B is similar to A if there exists an invertible
matrix Q such that B = Q−1 AQ.
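The corollary can be sketched numerically for LA on R^2, with a hypothetical basis γ (not from the slides) as the columns of Q:

```python
import numpy as np

# [L_A]_γ = Q^{-1} A Q, with γ = {(1, 1), (1, -1)} as the columns of Q.
A = np.array([[0., 1.],
              [1., 0.]])        # the coordinate-swap map
Q = np.array([[1., 1.],
              [1., -1.]])

B = np.linalg.inv(Q) @ A @ Q    # B is similar to A
print(B)                        # diag(1, -1): the γ-vectors are eigenvectors
```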
Linear Functionals
A linear functional on a vector space V is a linear transformation
from V into its field of scalars F .
Example
Let V be the continuous real-valued functions on [0, 2π]. For a fixed
g ∈ V, a linear functional h : V → R is given by
h(x) = (1/2π) ∫_0^{2π} x(t)g(t) dt
Example
Let V = Mn×n (F ), then f : V → F with f(A) = tr(A) is a linear
functional.
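Linearity of the trace functional can be checked directly. A NumPy sketch on made-up matrices and a made-up scalar:

```python
import numpy as np

# Linearity of the trace functional f(A) = tr(A) on M_{2x2}(R).
A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[0., 1.],
              [1., 0.]])
c = 5.0

lhs = np.trace(c * A + B)
rhs = c * np.trace(A) + np.trace(B)
print(np.isclose(lhs, rhs))
```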
Coordinate Functions
Example
Let β = {x1 , . . . , xn } be a basis for a finite-dimensional vector
space V. Define fi (x) = ai , where
[x]β = (a1 , . . . , an )^T
is the coordinate vector of x relative to β. Then fi is a linear
functional on V called the ith coordinate function with respect to
the basis β. Note that fi (xj ) = δij .
Dual Spaces
Definition
For a vector space V over F , the dual space of V is the vector
space V∗ = L(V, F ).
Note that for finite-dimensional V,
dim(V∗ ) = dim(L(V, F )) = dim(V) · dim(F ) = dim(V)
so V and V∗ are isomorphic. Also, the double dual V∗∗ of V is the
dual of V∗ .
Dual Bases
Theorem 2.24
Let β = {x1 , . . . , xn } be an ordered basis for finite-dimensional
vector space V, let fi be the ith coordinate function w.r.t. β, and
let β ∗ = {f1 , . . . , fn }. Then β ∗ is an ordered basis for V∗ , and for
any f ∈ V∗ ,
f = ∑_{i=1}^n f(xi )fi .
Definition
The ordered basis β ∗ = {f1 , . . . , fn } of V∗ that satisfies fi (xj ) = δij
is called the dual basis of β.
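In R^n, dual-basis computations reduce to a matrix inverse. A sketch with a hypothetical basis of R^2 (not from the slides), representing each functional by a row vector:

```python
import numpy as np

# Dual basis of β = {(1, 1), (1, -1)} in (R^2)^*. With the basis vectors
# as the columns of B, row i of B^{-1} represents the coordinate
# function f_i, since then f_i(x_j) = (B^{-1} B)_{ij} = δ_{ij}.
B = np.array([[1., 1.],
              [1., -1.]])
F = np.linalg.inv(B)                 # rows represent f_1, f_2

# Theorem 2.24: any functional f (here a row vector r) is sum f(x_i) f_i.
r = np.array([3., 5.])               # f(x) = 3*x_1 + 5*x_2
coeffs = r @ B                       # coeffs[i] = f(x_i)
print(np.allclose(coeffs @ F, r))
```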
Theorem 2.25
Let V, W be finite-dimensional vector spaces over F with ordered
bases β, γ. For any linear T : V → W, the mapping Tt : W∗ → V∗
defined by Tt (g) = gT for all g ∈ W∗ is linear with the property
[Tt ]β∗γ∗ = ([T]γβ )t .
Double Dual Isomorphism
For a vector x ∈ V, define x̂ : V∗ → F by x̂(f) = f(x) for every
f ∈ V∗ . Note that x̂ is a linear functional on V∗ , so x̂ ∈ V∗∗ .
Lemma
For finite-dimensional vector space V and x ∈ V, if x̂(f) = 0 for all
f ∈ V∗ , then x = 0.
Theorem 2.26
Let V be a finite-dimensional vector space, and define ψ : V → V∗∗
by ψ(x) = x̂. Then ψ is an isomorphism.
Corollary
For finite-dimensional V with dual space V∗ , every ordered basis for
V∗ is the dual basis for some basis for V.