Ch 6 Vector Spaces

Vector Space Axioms
• X, Y, Z are elements of a vector space V, and α, β are elements of a field F of scalars.
• Definition of vector addition
• Definition of multiplication of a scalar and a vector
• These definitions satisfy the 10 axioms (pg. 155):
• Addition
– Uniqueness, closure
– Commutativity
– Associativity
– Identity
– Inverse
• Scalar multiplication
– Uniqueness, closure
– Associativity
– Right distributivity
– Left distributivity
– Unit scalar multiplication

Subspaces
• A set S is a subspace of a vector space V if
– every element of S is in V, and
– S is itself a vector space.
• A line through the origin is a subspace of 2-D Euclidean space.
• A plane that contains the origin is a subspace of 3-D Euclidean space.

Linear Independence of Vectors
• The vectors A1, A2, …, An over a field F are linearly independent if, for any set {k1, k2, …, kn} of elements of F for which
A1k1 + A2k2 + … + Ankn = 0,
every ki is zero.
• Equivalently, the vectors are linearly dependent if such a relation holds with at least one ki nonzero.

Linear Combinations
• Consider a set of vectors and a set of scalars.
• A linear combination of the vectors is
A1k1 + A2k2 + … + Ankn, or equivalently k1A1 + k2A2 + … + knAn.

Linear Dependence and Rank
• Multiplication of a matrix by a column vector produces a linear combination of the columns of the matrix.
• Multiplication of a matrix by a matrix can be viewed as multiplication of the left matrix by each column of the right matrix.
• Full column rank implies the columns are linearly independent.
• Rank-deficient matrices yield infinitely many solutions.

Range or Image
• View a matrix as a collection of column vectors.
• Consider the set of vectors formed by an arbitrary linear combination of the column vectors (the result of multiplying the matrix by an arbitrary column vector).
• The range space, or image, of a matrix A is the set of vectors generated by multiplying the matrix by an arbitrary column vector:
R(A) = {Y : Y = AX for some X}.

Basis
• If the columns of A are linearly independent, they are called a basis for R(A).
• If A has full column rank n, then any vector X in R(A) is a unique linear combination of the basis vectors in A, i.e., X = AK has a unique solution for K.
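The rank, range, and basis facts above can be sketched numerically. A minimal NumPy example (the matrix and scalars below are illustrative choices, not taken from the text): multiplying A by a column vector produces a linear combination of A's columns, and when A has full column rank the coordinates K of a vector X in R(A) are unique.

```python
import numpy as np

# A 3x2 matrix with full column rank (rank 2), so its columns
# form a basis for its range R(A).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
k = np.array([2.0, 3.0])

# A @ k is the linear combination 2*A[:,0] + 3*A[:,1] of the columns.
x = A @ k
assert np.allclose(x, 2.0 * A[:, 0] + 3.0 * A[:, 1])

# rank(A) equals the number of columns, so the coordinates of x
# with respect to the basis A are unique; least squares recovers them.
assert np.linalg.matrix_rank(A) == A.shape[1]
k_recovered, *_ = np.linalg.lstsq(A, x, rcond=None)
assert np.allclose(k_recovered, k)
```

Note that for a rank-deficient A the recovered coordinates would not be unique, which is the point of the full-column-rank condition in the Basis slide.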
• The entries of K are the coordinates of X with respect to the basis A.

Dimension
• The dimension of a vector space is the number of (linearly independent) vectors in a basis for the space.

Standard Basis
• The column vectors of the identity matrix form the standard basis: [e1 e2 … en] = In.
• Remember i, j, k from physics.

Solvability of AX = B
• AX = B has a solution only if each and every column of B is in R(A), i.e.,
– R(B) is a subset of R(A).
• This can be tested by
– constructing the matrix [A B];
– computing an upper-row compression (Sec. 5.7) and noting that R(B) is a subset of R(A) if and only if P2B = 0.

Null Space or Kernel
• The null space, or kernel, of a matrix A is the set N(A) = {X : AX = 0}.
• It is the set of solutions of the homogeneous equation AX = 0.
– It contains nonzero vectors only if rank(A) < cols(A).
• If X is a solution of AX = B, then so is X′ = X + H for any H in N(A).
– Hence the solution is unique only if A has full column rank.

Basis for R(A) and N(A)

Orthogonal Basis

Change of Basis

Similarity Transformation
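The null-space facts above can also be checked numerically. A small NumPy sketch (the matrix is an illustrative choice): a rank-deficient A has nonzero vectors in N(A), a basis for N(A) can be read off the SVD, and adding any null-space vector to a solution of AX = B yields another solution.

```python
import numpy as np

# Rank-deficient A: the third column equals col0 + col1, so
# rank(A) = 2 < 3 columns and N(A) contains nonzero vectors.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
assert np.linalg.matrix_rank(A) < A.shape[1]

# A basis for N(A) from the SVD: right singular vectors whose
# singular values are (numerically) zero.
U, s, Vt = np.linalg.svd(A)
null_mask = np.zeros(Vt.shape[0], dtype=bool)
null_mask[len(s):] = True          # rows of Vt beyond the singular values
null_mask[:len(s)] |= s < 1e-10    # rows paired with ~zero singular values
H = Vt[null_mask].T                # columns of H span N(A)
assert np.allclose(A @ H, 0.0)

# If x solves A x = b, then x + h also solves it for any h in N(A),
# so the solution is not unique when A lacks full column rank.
x = np.array([1.0, 2.0, 0.0])
b = A @ x
h = H[:, 0]
assert np.allclose(A @ (x + h), b)
```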