matrix representation of a linear transformation∗

CWoo† 2013-03-21 23:43:28

∗ Canonical name: MatrixRepresentationOfALinearTransformation. Created 2013-03-21 by CWoo, version 39888. Privacy setting: 1. Type: Definition. Classification: 15A04.
† This text is available under the Creative Commons Attribution/Share-Alike License 3.0. You can reuse this document or portions thereof only if you do so under terms that are compatible with the CC-BY-SA license.

Linear transformations and matrices are the two most fundamental notions in the study of linear algebra, and the two concepts are intimately related. In this article, we will see how.

We assume that all vector spaces are finite dimensional and that all vectors are written as column vectors.

Linear transformations as matrices

Let $V$, $W$ be vector spaces (over a common field $k$) of dimension $n$ and $m$ respectively. Fix bases $A = \{v_1, \ldots, v_n\}$ and $B = \{w_1, \ldots, w_m\}$ for $V$ and $W$ respectively. We shall order these bases so that $v_i < v_j$ and $w_i < w_j$ whenever $i < j$. To distinguish an ordinary set from an ordered set, we adopt the notation $\langle v_1, \ldots, v_n \rangle$ to mean the set $\{v_1, \ldots, v_n\}$ with the ordering $v_i \le v_j$ whenever $i \le j$. The importance of ordering these bases will be apparent shortly.

For any linear transformation $T \colon V \to W$, we can write
\[
T(v_j) = \sum_{i=1}^{m} \alpha_{ij} w_i
\]
for each $j \in \{1, \ldots, n\}$, where $\alpha_{ij} \in k$. We define the matrix associated with the linear transformation $T$ and the ordered bases $A$, $B$ by
\[
[T]^A_B := (\alpha_{ij}), \qquad 1 \le i \le m, \quad 1 \le j \le n.
\]
Here $i$ indexes the rows and $j$ the columns, so $[T]^A_B$ is an $m \times n$ matrix whose entries are in $k$. When $A = B$, we often write $[T]_A := [T]^A_A$. In addition, when both ordered bases are the standard bases $E_n$, $E_m$ ordered in the obvious way, we write $[T] := [T]^{E_n}_{E_m}$.

Examples.

1. Let $T \colon \mathbb{R}^3 \to \mathbb{R}^4$ be given by
\[
T \begin{pmatrix} x \\ y \\ z \end{pmatrix}
=
\begin{pmatrix} x + 2y + z \\ z \\ -x + y - 5z \\ 3x + 2z \end{pmatrix}.
\]
We use the standard ordered bases
\[
E_3 = \left\langle
\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}
\right\rangle \text{ for } \mathbb{R}^3
\quad \text{and} \quad
E_4 = \left\langle
\begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix},
\begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix}
\right\rangle \text{ for } \mathbb{R}^4,
\]
each ordered in the obvious way.
Then
\[
T \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \\ -1 \\ 3 \end{pmatrix},
\qquad
T \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \\ 1 \\ 0 \end{pmatrix},
\qquad
T \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \\ -5 \\ 2 \end{pmatrix},
\]
so the matrix $[T]^{E_3}_{E_4}$ associated with $T$ and the standard ordered bases $E_3$ and $E_4$ is the $4 \times 3$ matrix
\[
\begin{pmatrix}
1 & 2 & 1 \\
0 & 0 & 1 \\
-1 & 1 & -5 \\
3 & 0 & 2
\end{pmatrix}.
\]

2. Let $T$ be the same linear transformation as above. However, let $E_3'$ be the same basis as $E_3$ except that the order is reversed: $e_3 < e_2 < e_1$. Then
\[
[T]^{E_3'}_{E_4} =
\begin{pmatrix}
1 & 2 & 1 \\
1 & 0 & 0 \\
-5 & 1 & -1 \\
2 & 0 & 3
\end{pmatrix}.
\]
Note that this matrix is just the matrix from the previous example with the first and last columns switched.

3. Again, let $T$ be the same as before. Now, let $E_4'$ be the ordered basis whose elements are those of $E_4$ but whose order is given by $e_2 < e_1 < e_4 < e_3$. Then
\[
[T]^{E_3'}_{E_4'} =
\begin{pmatrix}
1 & 0 & 0 \\
1 & 2 & 1 \\
2 & 0 & 3 \\
-5 & 1 & -1
\end{pmatrix}.
\]
Note that this matrix is just the matrix from the previous example with the first two rows interchanged with each other, and likewise the last two rows.

Remarks.

• From the examples above, we note several important features of a matrix representation of a linear transformation:

1. the matrix depends on the bases given to the vector spaces;
2. the ordering of each basis is important;
3. switching the order of a given basis amounts to permuting the columns (for the domain basis) or rows (for the codomain basis) of the matrix, which is essentially multiplying the matrix by a permutation matrix.

• Some basic properties of matrix representations of linear transformations are:

1. If $T \colon V \to W$ is a linear transformation, then $[rT]^A_B = r[T]^A_B$, where $A$, $B$ are ordered bases for $V$, $W$ respectively.
2. If $S, T \colon V \to W$ are linear transformations, then $[S + T]^A_B = [S]^A_B + [T]^A_B$, where $A$ and $B$ are ordered bases for $V$ and $W$ respectively.
3. If $S \colon U \to V$ and $T \colon V \to W$, then $[TS]^A_C = [T]^B_C \, [S]^A_B$, where $A$, $B$, $C$ are ordered bases for $U$, $V$, $W$ respectively.
4. As a result, $T$ is invertible iff $[T]^A_B$ is an invertible (hence square) matrix; in particular, invertibility forces $\dim(V) = \dim(W)$.

• We could have represented all vectors as row vectors.
However, doing so would mean that the matrix representation $M_1$ of a linear transformation $T$ would be the transpose of the matrix representation $M_2$ of $T$ obtained with column vectors, $M_1 = M_2^T$, and that matrices would be applied to vectors from the right:
\[
\begin{pmatrix} a & b & c \end{pmatrix}
\begin{pmatrix}
1 & 0 & -1 & 3 \\
2 & 0 & 1 & 0 \\
1 & 1 & -5 & 2
\end{pmatrix}
\quad \text{instead of} \quad
\begin{pmatrix}
1 & 2 & 1 \\
0 & 0 & 1 \\
-1 & 1 & -5 \\
3 & 0 & 2
\end{pmatrix}
\begin{pmatrix} a \\ b \\ c \end{pmatrix}.
\]

Matrices as linear transformations

Every $m \times n$ matrix $A$ over a field $k$ can be thought of as a linear transformation from $k^n$ to $k^m$ if we view each vector $v \in k^n$ as an $n \times 1$ matrix (a column); the mapping is given by the matrix multiplication $Av$, which is an $m \times 1$ matrix (a column vector in $k^m$). Specifically, we define $T_A \colon k^n \to k^m$ by $T_A(v) := Av$. It is easy to see that $T_A$ is indeed a linear transformation. Furthermore, $[T_A] = [T_A]^{E_n}_{E_m} = A$, since representing a vector as an $n$-tuple of elements of $k$ is the same as expressing it in the ordered standard basis of $k^n$. Below we list some of the basic properties:

1. $T_{rA} = rT_A$ for any $r \in k$;
2. $T_A + T_B = T_{A+B}$, where $A$, $B$ are $m \times n$ matrices over $k$;
3. $T_A \circ T_B = T_{AB}$, where $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix over $k$;
4. $T_A$ is invertible iff $A$ is an invertible matrix.

Remark. As we can see from the discussion above, once we fix ordered bases for vector spaces $V$ and $W$ of dimensions $n$ and $m$, there is a one-to-one correspondence between the set of $m \times n$ matrices over the underlying field $k$ and the set of linear transformations from $V$ to $W$.
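The construction described above, building $[T]$ column by column from the images $T(e_j)$ of the ordered domain basis, can be checked numerically. The following is a minimal Python sketch using plain nested lists (the helper names `matrix_of` and `mat_vec` are illustrative, not from the original text); it reproduces the matrix of Example 1 and verifies that reversing the domain basis order swaps the first and last columns, as in Example 2.

```python
def T(v):
    """The linear map T : R^3 -> R^4 from Example 1."""
    x, y, z = v
    return [x + 2*y + z, z, -x + y - 5*z, 3*x + 2*z]

def matrix_of(f, basis, m):
    """Matrix of f w.r.t. an ordered domain basis: column j is f(basis[j])."""
    cols = [f(v) for v in basis]
    return [[cols[j][i] for j in range(len(basis))] for i in range(m)]

def mat_vec(A, v):
    """Apply A to a column vector v: (Av)_i = sum_j A[i][j] * v[j]."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

E3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # standard ordered basis of R^3

M = matrix_of(T, E3, 4)                  # [T] w.r.t. E3 and E4
assert M == [[1, 2, 1], [0, 0, 1], [-1, 1, -5], [3, 0, 2]]

# Applying the matrix agrees with applying T directly:
assert mat_vec(M, [2, -1, 3]) == T([2, -1, 3])

# Reversing the order of the domain basis (Example 2) swaps the
# first and last columns of the matrix:
M_rev = matrix_of(T, list(reversed(E3)), 4)
assert M_rev == [[1, 2, 1], [1, 0, 0], [-5, 1, -1], [2, 0, 3]]
```

Since $[T_A] = A$, the function `mat_vec(M, ...)` is exactly the linear transformation $T_M$ determined by $M$, illustrating the one-to-one correspondence described in the remark above.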
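Property 3 of the maps $T_A$, namely $T_A \circ T_B = T_{AB}$, can likewise be verified concretely. Here is a short sketch under the same plain-list conventions as above (again, `mat_mul` and `T_of` are hypothetical helper names): composing the two linear maps gives the same result as applying the map of the matrix product.

```python
def mat_mul(A, B):
    """Product of an m x n matrix A and an n x p matrix B."""
    n, p = len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

def T_of(A):
    """The linear transformation T_A : k^n -> k^m, v |-> Av."""
    def T_A(v):
        return [sum(row[j] * v[j] for j in range(len(v))) for row in A]
    return T_A

A = [[1, 2, 1], [0, 0, 1]]       # a 2 x 3 matrix
B = [[1, 0], [2, 1], [0, 3]]     # a 3 x 2 matrix
v = [4, -1]

# Composing the maps agrees with multiplying the matrices:
assert T_of(A)(T_of(B)(v)) == T_of(mat_mul(A, B))(v)
```

This is the matrix-side counterpart of property 3 for linear transformations, $[TS]^A_C = [T]^B_C \, [S]^A_B$.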