Selected Solutions
Math 420
Homework 6
2/17/12
3.4.10 We have seen that the linear operator T on R^2 defined by T(x_1, x_2) = (x_1, 0) is
represented in the standard ordered basis by the matrix

    A = [ 1  0 ]
        [ 0  0 ].

This operator satisfies T^2 = T. Prove that if S is a linear operator on R^2 such that S^2 = S,
then S = 0, or S = I, or there is an ordered basis B for R^2 such that [S]_B = A (above).
Since S is a linear operator on a 2-dimensional vector space, S has rank 0, 1, or 2.
If S has rank 0, then S = 0.
If S has rank 1, there exist nonzero vectors v, w ∈ R^2 such that Sv ≠ 0 and Sw = 0.
We prove that {Sv, w} is a basis for R^2. Since dim(R^2) = 2, it is enough to show Sv
and w are linearly independent. Suppose c_1 Sv + c_2 w = 0. Then, applying S, we have
S(c_1 Sv + c_2 w) = c_1 S^2 v + c_2 Sw = c_1 Sv + c_2 (0) = c_1 Sv = 0, so since Sv ≠ 0, c_1 = 0,
and then since w ≠ 0 we have c_2 = 0 as well. Hence, {Sv, w} is a basis for R^2. Since
S(Sv) = S^2 v = Sv and Sw = 0, the matrix of S with respect to the ordered basis {Sv, w}
is exactly A.
If S has rank 2, S is onto. Hence, for any v ∈ R^2, we have v = Sw for some w ∈ R^2,
and thus Sv = S(Sw) = S^2 w = Sw = v, so S = I.
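As a numerical illustration of the rank-1 case (not part of the original solution), the NumPy sketch below uses a hypothetical rank-1 idempotent S (projection onto span{(1,1)} along span{(1,−1)}) and hypothetical choices of v and w, and checks that the matrix of S in the basis {Sv, w} is exactly A:

```python
import numpy as np

# Hypothetical rank-1 idempotent on R^2: projection onto the line
# spanned by (1, 1) along the line spanned by (1, -1).
S = 0.5 * np.array([[1.0, 1.0],
                    [1.0, 1.0]])
assert np.allclose(S @ S, S)              # S^2 = S

v = np.array([1.0, 0.0])                  # any v with Sv != 0
w = np.array([1.0, -1.0])                 # any nonzero w with Sw = 0
Sv = S @ v
assert np.linalg.norm(Sv) > 0 and np.allclose(S @ w, 0)

# Change of basis: the columns of B are the basis vectors {Sv, w}.
B = np.column_stack([Sv, w])
A = np.linalg.inv(B) @ S @ B              # matrix of S in the basis {Sv, w}
assert np.allclose(A, [[1, 0], [0, 0]])
```

Any other rank-1 idempotent and any valid choices of v and w would give the same final matrix, since the argument above does not depend on the particular vectors.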
3.4.12 Let V be an n-dimensional vector space over the field F, and let B = {α_1, . . . , α_n}
be an ordered basis for V.
(a) According to Theorem 1, there is a unique linear operator T on V such that
    Tα_j = α_{j+1} for j = 1, . . . , n - 1,   Tα_n = 0.
What is the matrix A of T in the ordered basis B?
(b) Prove that T^n = 0 but T^{n-1} ≠ 0.
(c) Let S be any linear operator on V such that S^n = 0 but S^{n-1} ≠ 0. Prove that there
is an ordered basis B′ for V such that the matrix of S in the ordered basis B′ is the
matrix A of part (a).
(d) Prove that if M and N are n × n matrices over F such that M^n = N^n = 0 but
M^{n-1} ≠ 0 ≠ N^{n-1}, then M and N are similar.
(a) A is the n × n matrix with 1s directly below the main diagonal and 0s elsewhere:
A_{i,i-1} = 1 for 2 ≤ i ≤ n and A_{ij} = 0 for all other i, j.
(b) It is easy to see (and it may be formally proved by induction) that T^k α_j = α_{j+k} if
j + k ≤ n, and T^k α_j = 0 if j + k > n. Hence, T^n α_j = 0 for 1 ≤ j ≤ n, so since the α_j
form a basis, T^n = 0. Also, T^{n-1} α_1 = α_n, so T^{n-1} ≠ 0.
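The claims of part (b) can be checked numerically for a particular dimension; the choice n = 5 below is an arbitrary illustration:

```python
import numpy as np

n = 5
# A: 1s directly below the main diagonal, 0s elsewhere (part (a)).
A = np.diag(np.ones(n - 1), k=-1)

An1 = np.linalg.matrix_power(A, n - 1)
An = np.linalg.matrix_power(A, n)
assert not np.allclose(An1, 0)   # T^{n-1} != 0: it sends alpha_1 to alpha_n
assert np.allclose(An, 0)        # T^n = 0
```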
(c) Since S^{n-1} ≠ 0, there exists v ∈ V such that S^{n-1} v ≠ 0. Let B′ = {v, Sv, S^2 v, . . . , S^{n-1} v}.
If we can show that B′ is a basis, the definition of B′ makes it clear that [S]_{B′} = A.
Since there are n vectors in B′ and V is an n-dimensional vector space, it is enough
to show B′ is linearly independent. Suppose c_1 v + c_2 Sv + · · · + c_n S^{n-1} v = 0. Apply
S^{n-1} to both sides. Since S^n = 0, all the terms except the first one vanish, and we
have c_1 S^{n-1} v = 0, and hence c_1 = 0 because S^{n-1} v ≠ 0. Now we can similarly apply
S^{n-2} to show that c_2 = 0, and so on (this may again be formalized by induction if
desired), and we conclude that all the c_j are 0 and B′ is linearly independent, and
hence a basis.
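A concrete check of part (c): below, a hypothetical S is built by conjugating A by a fixed invertible P (so S^n = 0 and S^{n-1} ≠ 0, but S is not in the standard form), and the construction above recovers the matrix A:

```python
import numpy as np

n = 4
A = np.diag(np.ones(n - 1), k=-1)          # the matrix of part (a)

# A hypothetical S with S^n = 0 but S^{n-1} != 0: conjugate A by an
# invertible P so S is nilpotent of index n but not in standard form.
P = np.array([[1., 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]])
S = P @ A @ np.linalg.inv(P)

# Pick v with S^{n-1} v != 0, then form B' = {v, Sv, ..., S^{n-1} v}.
Sn1 = np.linalg.matrix_power(S, n - 1)
v = next(e for e in np.eye(n) if np.linalg.norm(Sn1 @ e) > 1e-8)
Bprime = np.column_stack([np.linalg.matrix_power(S, k) @ v
                          for k in range(n)])

# In the ordered basis B', the matrix of S is exactly A.
assert np.allclose(np.linalg.inv(Bprime) @ S @ Bprime, A)
```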
(d) Let U be the linear operator on V whose matrix with respect to the ordered basis B
is M. Then U satisfies the conditions of part (c), so there exists an ordered basis for
which the matrix of U is A. Hence, M is similar to A. Similarly, N is similar to A, so
since similarity is an equivalence relation, M and N are similar.
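Part (d) can likewise be illustrated numerically. The matrices M and N below are hypothetical index-3 nilpotents; conjugating each into the standard form A via part (c)'s basis and composing the two changes of basis yields an explicit similarity Q^{-1} M Q = N:

```python
import numpy as np

def standard_form_basis(M):
    """Columns {v, Mv, ..., M^{n-1} v} for a nilpotent M of index n."""
    n = M.shape[0]
    Mn1 = np.linalg.matrix_power(M, n - 1)
    v = next(e for e in np.eye(n) if np.linalg.norm(Mn1 @ e) > 1e-8)
    return np.column_stack([np.linalg.matrix_power(M, k) @ v
                            for k in range(n)])

# Two hypothetical 3x3 matrices with M^3 = N^3 = 0 but M^2, N^2 != 0.
M = np.array([[0., 1, 2],
              [0, 0, 3],
              [0, 0, 0]])
N = np.array([[0., 0, 0],
              [5, 0, 0],
              [0, 7, 0]])
BM, BN = standard_form_basis(M), standard_form_basis(N)

# Both are similar to the same A, hence to each other: Q^{-1} M Q = N.
Q = BM @ np.linalg.inv(BN)
assert np.allclose(np.linalg.inv(Q) @ M @ Q, N)
```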