MATH10212 • Linear Algebra • Brief lecture notes

Eigenvalues and Eigenvectors of n × n Matrices

Now that we have defined the determinant of an n × n matrix, we can continue our discussion of eigenvalues and eigenvectors in a general context.

Recall that a column vector ~v ∈ R^n is an eigenvector of an n × n matrix A for eigenvalue λ ∈ R if A~v = λ~v and ~v ≠ ~0. Thus, ~v is a non-trivial solution of A~x = λ~x, which can be rewritten as the homogeneous linear system (A − λI)~x = ~0. This system has non-trivial solutions if and only if A − λI is non-invertible. By Theorem 4.6, this is true if and only if det(A − λI) = 0. To summarize:

Theorem. The eigenvalues of a square matrix A are precisely the solutions λ of the equation det(A − λI) = 0.

When we expand det(A − λI), we get a polynomial in λ, called the characteristic polynomial of A. The equation det(A − λI) = 0 is called the characteristic equation of A. For example, if

    A = [ a  b ]
        [ c  d ],

its characteristic polynomial is

    det(A − λI) = | a−λ    b  |
                  |  c    d−λ |  = (a − λ)(d − λ) − bc = λ² − (a + d)λ + (ad − bc).

Let us summarize the procedure we will follow (for now) to find the eigenvalues and eigenvectors (eigenspaces) of a matrix. Let A be an n × n matrix.

1. Compute the characteristic polynomial det(A − λI) of A.
2. Find the eigenvalues of A by solving the characteristic equation det(A − λI) = 0 for λ.
3. For each eigenvalue λ, find the null space of the matrix A − λI. This is the eigenspace E_λ, the nonzero vectors of which are the eigenvectors of A corresponding to λ.
4. Find a basis for each eigenspace.

Example. For

    A = [ 1  2  2 ]
        [ 2  1  2 ]
        [ 2  2  1 ]

the characteristic polynomial is

    det(A − λI) = | 1−λ    2     2  |
                  |  2    1−λ    2  |
                  |  2     2    1−λ |

    = (1 − λ)³ + 8 + 8 − 4(1 − λ) − 4(1 − λ) − 4(1 − λ) = · · · = −(λ − 5)(λ + 1)².

(Of course, we are lucky to spot the factorization here; in general, the characteristic polynomial of an n × n matrix has degree n, and its roots may not be easy to find.)
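For a 2 × 2 matrix, the four-step procedure above collapses to solving the quadratic characteristic equation λ² − (a + d)λ + (ad − bc) = 0 with the quadratic formula. A minimal pure-Python sketch (the helper name eigen2x2 is ours, not part of the notes):

```python
import math

def eigen2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]], found as roots of the
    characteristic polynomial lambda^2 - (a+d)*lambda + (ad - bc)."""
    trace, det = a + d, a * d - b * c
    disc = trace * trace - 4 * det      # discriminant of the quadratic
    if disc < 0:
        return []                       # no real eigenvalues
    r = math.sqrt(disc)
    # A set removes the duplicate when the root is repeated.
    return sorted({(trace + r) / 2, (trace - r) / 2})

print(eigen2x2(3, 2, 0, 3))   # repeated root: [3.0]
print(eigen2x2(1, 2, 2, 1))   # [-1.0, 3.0]
```

This only finds the eigenvalues (step 2); finding the eigenspaces (steps 3–4) still requires solving (A − λI)~x = ~0 for each root.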
Thus, the roots, i.e. the eigenvalues, are 5 and −1.

Let us find the eigenspace E_{−1}: (A − (−1)I)~x = ~0, that is,

    [ 2  2  2 ] [ x1 ]   [ 0 ]
    [ 2  2  2 ] [ x2 ] = [ 0 ]
    [ 2  2  2 ] [ x3 ]   [ 0 ];

so x1 = −x2 − x3, where x2, x3 are free variables;

    E_{−1} = { (−s − t, s, t)^T | s, t ∈ R };

a basis of E_{−1}: (−1, 1, 0)^T, (−1, 0, 1)^T.

Find the eigenspace E_5: (A − 5I)~x = ~0, that is,

    [ −4   2   2 ] [ x1 ]   [ 0 ]
    [  2  −4   2 ] [ x2 ] = [ 0 ]
    [  2   2  −4 ] [ x3 ]   [ 0 ];

solving this system gives x1 = x2 = x3, where x3 is a free variable;

    E_5 = { (t, t, t)^T | t ∈ R };

a basis of E_5: (1, 1, 1)^T.

Definitions. The multiplicity of a given eigenvalue λ as a root of the characteristic polynomial det(A − λI) is called the algebraic multiplicity of λ. The dimension of the eigenspace E_λ is called the geometric multiplicity of λ.

In the above example, −1 was an eigenvalue of algebraic multiplicity 2, and its geometric multiplicity was also 2. It can be shown that the geometric multiplicity of λ is at most its algebraic multiplicity. And there are examples where the geometric multiplicity is strictly less than the algebraic multiplicity.

Example. For

    A = [ 3  2 ]
        [ 0  3 ]

the characteristic polynomial is

    det(A − λI) = | 3−λ    2  |
                  |  0    3−λ |  = (3 − λ)²,

so λ = 3 is a repeated root with multiplicity 2. So λ = 3 is an eigenvalue of A of algebraic multiplicity 2. Let us find the eigenspace E_3: (A − 3I)~x = ~0 is

    [ 0  2 ] [ x1 ]   [ 0 ]
    [ 0  0 ] [ x2 ] = [ 0 ],

so x2 = 0, while x1 is a free variable; thus,

    E_3 = { (t, 0)^T | t ∈ R }

has dimension 1, so the geometric multiplicity of the eigenvalue λ = 3 is 1.

Theorem 4.15. The eigenvalues of a triangular matrix are the entries on its main diagonal.

Theorem 4.16. A square matrix A is invertible if and only if 0 is not an eigenvalue of A.

Theorem 4.17 (The Fundamental Theorem of Invertible Matrices, Revisited). Let A be an n × n matrix. The following statements are equivalent:

a. A is invertible.
b. A~x = ~b has a unique solution for every ~b in R^n.
c. A~x = ~0 has only the trivial solution.
d. The reduced row echelon form of A is I_n.
e. A is a product of elementary matrices.
f.
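A quick sanity check for the eigenspaces computed in the 3 × 3 example: multiplying each basis vector by A should scale it by the corresponding eigenvalue, since A~v = λ~v. A minimal pure-Python sketch (the helper name matvec is ours):

```python
def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a column vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 2],
     [2, 1, 2],
     [2, 2, 1]]

# Basis vectors of E_{-1} and E_5 found above, paired with their eigenvalues.
pairs = [(-1, [-1, 1, 0]), (-1, [-1, 0, 1]), (5, [1, 1, 1])]
for lam, v in pairs:
    assert matvec(A, v) == [lam * x for x in v]   # checks A v = lambda v
print("all eigenvector checks passed")
```

The same check works for any candidate eigenpair, and is a cheap way to catch arithmetic slips when row-reducing A − λI by hand.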
rank(A) = n.
g. nullity(A) = 0.
h. The column vectors of A are linearly independent.
i. The column vectors of A span R^n.
j. The column vectors of A form a basis for R^n.
k. The row vectors of A are linearly independent.
l. The row vectors of A span R^n.
m. The row vectors of A form a basis for R^n.
n. det A ≠ 0.
o. 0 is not an eigenvalue of A.

Theorem 4.18. Let A be a square matrix with eigenvalue λ and corresponding eigenvector ~x.

a. For any positive integer n, λ^n is an eigenvalue of A^n with corresponding eigenvector ~x.
b. If A is invertible, then 1/λ is an eigenvalue of A^{−1} with corresponding eigenvector ~x.
c. If A is invertible, then for any integer n, λ^n is an eigenvalue of A^n with corresponding eigenvector ~x.

Theorem 4.19. Suppose the n × n matrix A has eigenvectors ~v1, ~v2, . . . , ~vm with corresponding eigenvalues λ1, λ2, . . . , λm. If ~x is a vector in R^n that can be expressed as a linear combination of these eigenvectors, say,

    ~x = c1~v1 + c2~v2 + · · · + cm~vm,

then for any integer k (negative powers requiring A to be invertible),

    A^k ~x = c1 λ1^k ~v1 + c2 λ2^k ~v2 + · · · + cm λm^k ~vm.

Theorem 4.20. Let A be an n × n matrix and let λ1, λ2, . . . , λm be distinct eigenvalues of A with corresponding eigenvectors ~v1, ~v2, . . . , ~vm. Then ~v1, ~v2, . . . , ~vm are linearly independent.
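Theorem 4.19 can be illustrated on the 3 × 3 example above. Take ~x = ~v1 + ~v3 with ~v1 = (−1, 1, 0)^T ∈ E_{−1} and ~v3 = (1, 1, 1)^T ∈ E_5; the theorem predicts A^k ~x = (−1)^k ~v1 + 5^k ~v3 without ever forming A^k. A pure-Python sketch comparing the two computations (matvec is our helper name):

```python
def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a column vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 2],
     [2, 1, 2],
     [2, 2, 1]]

v1, v3 = [-1, 1, 0], [1, 1, 1]        # eigenvectors for -1 and 5
x = [a + b for a, b in zip(v1, v3)]   # x = v1 + v3 = (0, 2, 1)

k = 4
# Direct computation: apply A to x, k times.
y = x
for _ in range(k):
    y = matvec(A, y)

# Theorem 4.19: A^k x = (-1)^k v1 + 5^k v3.
z = [(-1) ** k * a + 5 ** k * b for a, b in zip(v1, v3)]

print(y)   # [624, 626, 625]
print(z)   # [624, 626, 625] -- same result, no matrix powers needed
```

For large k the eigenvector form needs only two scalar powers, which is the point of the theorem: powers of A act on an eigenvector combination coordinate-wise on the eigenvalues.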