MATH 108A HW 6 SOLUTIONS
RAHUL SHAH
Problem 1. [§3.15]
Solution.
‘⇒’ Let V be a finite dimensional vector space and let T ∈ L(V, W ). Assume that T is surjective. Let B =
{v1, . . . , vn} be a basis for V . Notice that even though a priori W need not be finite dimensional, the fact
that T is surjective implies that T (B) = {T (v1), . . . , T (vn)} is a spanning set for W and thus W is finite
dimensional. Since T (B) spans W (a finite dimensional vector space), we can reduce it to a basis B′ for
W . We will denote B′ = {w1, . . . , wk}, where for each wi ∈ B′, wi = T (vj) for some vj ∈ B. Since for each
wi ∈ B′ there exists at least one (there might be more, but that won’t matter) vj ∈ B such that T (vj) = wi,
we define S(wi) to be such a vj (i.e. vj is one of the elements of B such that T (vj) = wi). Since S has been
defined on B′, a basis of W, it extends to a linear map S ∈ L(W, V ). Thus T ◦ S ∈ L(W, W ) = L(W ). To show
that T ◦ S = IW, the identity linear map on W, it is enough to show that T ◦ S(wi) = wi for each wi ∈ B′.
So let wi ∈ B′. Then T ◦ S(wi) = T (vj), where vj is an element of B such that T (vj) = wi. Thus
T ◦ S(wi) = T (vj) = wi. Thus T ◦ S = IW and we are done.
‘⇐’ Let V be a finite dimensional vector space and let T ∈ L(V, W ). Assume that there exists S ∈ L(W, V )
such that T ◦ S = IW, the identity map on W . Assume for contradiction that T is not surjective. Thus
there exists w ∈ W such that w ∉ T (V ), the image of V . But T ◦ S(w) = IW(w) = w, so w ∈ T (S(W )).
However, since S(W ) ⊂ V (it is in fact a linear subspace of V ), we know that T (S(W )) ⊂ T (V ). But then
w ∈ T (S(W )) ⇒ w ∈ T (V ), which contradicts the assumption that w ∉ T (V ). Hence no such w exists:
for every w ∈ W there is some v ∈ V such that T (v) = w, and therefore T is surjective.
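For concreteness, here is a small instance of the construction in the ‘⇒’ direction; the choices V = F³, W = F², and the projection T below are only an illustrative example, not part of the assigned problem. With T (x, y, z) = (x, y) and the standard basis B = {e1, e2, e3} of F³, we get T (e1) = (1, 0), T (e2) = (0, 1), T (e3) = (0, 0), so B′ = {(1, 0), (0, 1)} and we may take S(1, 0) = e1, S(0, 1) = e2, i.e. S(x, y) = (x, y, 0). Then
\[
T \circ S(x, y) = T(x, y, 0) = (x, y) = I_W(x, y),
\qquad\text{while in general}\qquad
S \circ T(x, y, z) = (x, y, 0) \neq (x, y, z),
\]
so S is a right inverse of T but need not be a two-sided inverse.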
Problem 2. [§3.18]
Solution. Let A, B and C be matrices such that (A · B) · C and A · (B · C) make sense. Thus if A = (a_{i,j})_{m,n}, i.e. A is an m × n matrix whose i, j-th entry is a_{i,j}, then B = (b_{i,j})_{n,o} and C = (c_{i,j})_{o,p}. We will repeatedly
use the representation of matrix multiplication given at the bottom of page 51 of [Axler, §3]. Thus
\[
(A \cdot B) = \left( \sum_{r=1}^{n} a_{i,r} b_{r,j} \right)_{m,o}.
\]
We thus find that
\[
(A \cdot B) \cdot C
= \left( \sum_{s=1}^{o} \left( \sum_{r=1}^{n} a_{i,r} b_{r,s} \right) c_{s,j} \right)_{m,p}
= \left( \sum_{s=1}^{o} \sum_{r=1}^{n} a_{i,r} b_{r,s} c_{s,j} \right)_{m,p}.
\tag{2.1}
\]
And similarly,
\[
(B \cdot C) = \left( \sum_{r=1}^{o} b_{i,r} c_{r,j} \right)_{n,p}.
\]
Which thus gives us that
\[
\begin{aligned}
A \cdot (B \cdot C)
&= \left( \sum_{s=1}^{n} a_{i,s} \left( \sum_{r=1}^{o} b_{s,r} c_{r,j} \right) \right)_{m,p} \\
&= \left( \sum_{s=1}^{n} \sum_{r=1}^{o} a_{i,s} b_{s,r} c_{r,j} \right)_{m,p} \\
&= \left( \sum_{r=1}^{o} \sum_{s=1}^{n} a_{i,s} b_{s,r} c_{r,j} \right)_{m,p} \\
&= \left( \sum_{s=1}^{o} \sum_{r=1}^{n} a_{i,r} b_{r,s} c_{s,j} \right)_{m,p} \quad \text{by switching } r \text{ and } s \text{ as indices.}
\end{aligned}
\tag{2.2}
\]
But the i, j-th entry of A · (B · C), given by Equation 2.2, equals the i, j-th entry of (A · B) · C, given by Equation 2.1, and thus we find that matrix multiplication is associative.
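As a quick numerical sanity check of the computation above (the specific matrices are our own choice, purely illustrative), take
\[
A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad
B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
C = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}.
\]
Then
\[
(A \cdot B) \cdot C = \begin{pmatrix} 2 & 1 \\ 4 & 3 \end{pmatrix} \cdot \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} = \begin{pmatrix} 4 & 3 \\ 8 & 9 \end{pmatrix}
\qquad\text{and}\qquad
A \cdot (B \cdot C) = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \cdot \begin{pmatrix} 0 & 3 \\ 2 & 0 \end{pmatrix} = \begin{pmatrix} 4 & 3 \\ 8 & 9 \end{pmatrix},
\]
in agreement with Equations 2.1 and 2.2.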
Problem 3. [§3.20]
Solution. Let B = (v1, . . . , vn) be a basis for V . Thus dim V = n. For v ∈ V , define M(v) to be the matrix of
v with respect to the basis B of V , and let T be the map v ↦ M(v) from the problem statement. M(v) is well
defined since each v ∈ V can be uniquely represented as v = a1v1 + . . . + anvn with ai ∈ F, so
M(v) = (a1, . . . , an)T ∈ Mat(n, 1, F). For each vi ∈ B, we define T′(vi) = ei ∈ Mat(n, 1, F), where ei is the n × 1
matrix that has 0 in all entries except for the i, 1-th entry, which is 1. Since T′ is defined on the basis B of V ,
it extends to a linear map T′ ∈ L(V, Mat(n, 1, F)). We will show that T (v) = T′(v) for all v ∈ V ; thus T equals T′
and hence is a linear transformation from V to Mat(n, 1, F). For an arbitrary v ∈ V , v = a1v1 + . . . + anvn.
Thus T (v) = M(v) = (a1, . . . , an)T . However,
\[
\begin{aligned}
T'(v) &= T'(a_1 v_1 + \ldots + a_n v_n) \\
&= T'(a_1 v_1) + \ldots + T'(a_n v_n) \\
&= a_1 T'(v_1) + \ldots + a_n T'(v_n) \\
&= a_1 e_1 + \ldots + a_n e_n \\
&= (a_1, a_2, \ldots, a_n)^T \\
&= M(v) \\
&= T(v).
\end{aligned}
\]
For each ei, an element of the standard basis of Mat(n, 1, F), define S(ei) = vi ∈ B. Notice that since we have defined
S on a basis for Mat(n, 1, F), S ∈ L(Mat(n, 1, F), V ).
To show that T ◦ S = IMat(n,1,F), we note that T ◦ S(ei) = T (vi) = ei = IMat(n,1,F)(ei). Similarly, S ◦ T (vi) =
S(ei) = vi = IV (vi), so S ◦ T = IV . Thus T is invertible (and by [§3.13], it is onto Mat(n, 1, F)).
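As a concrete illustration (our own example, with one particular choice of V and basis), take V = P2(F) with basis B = (1, x, x²), so n = 3. For v = 5 − x + 2x²,
\[
T(v) = M(v) = \begin{pmatrix} 5 \\ -1 \\ 2 \end{pmatrix},
\qquad
S\begin{pmatrix} 5 \\ -1 \\ 2 \end{pmatrix} = 5 \cdot 1 + (-1) \cdot x + 2 \cdot x^2 = v,
\]
so S ◦ T (v) = v and T ◦ S(M(v)) = M(v) for this particular v, as the general argument guarantees.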
Problem 4. [§3.22]
Solution.
‘⇒’ Suppose that V is a finite dimensional vector space and S, T ∈ L(V ). Assume that S ◦ T is invertible.
By Theorem 3.21, S ◦ T is surjective. Thus S is surjective (if it were not, there would exist v ∈ V with
v ∉ S(V ); since S(V ) ⊃ S(T (V )) = S ◦ T (V ), such a v would not be in the image of S ◦ T , contradicting the
assumption that S ◦ T is surjective). Again by Theorem 3.21, since S ◦ T is invertible, it is injective. Thus
T is injective (if it were not, then ∃ v1, v2 ∈ V with v1 ≠ v2 such that T (v1) = T (v2), and thus
S(T (v1)) = S(T (v2)) ⇒ S ◦ T (v1) = S ◦ T (v2), which contradicts the assumption that S ◦ T is injective).
Since S, T ∈ L(V ) with S surjective and T injective, by Theorem 3.21 both S and T are invertible.
‘⇐’ Suppose that V is a finite dimensional vector space and S, T ∈ L(V ). Assume that both S and T are invertible.
Then by Theorem 3.21, both S and T are injective, so the composition S ◦ T is injective, and hence by
Theorem 3.21, S ◦ T is invertible.
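A small matrix instance of the statement (our own, with operators on F² written as 2 × 2 matrices in the standard basis, purely for illustration):
\[
S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad
T = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad\text{so}\qquad
S \cdot T = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix},
\]
which is invertible with
\[
(S \cdot T)^{-1} = \begin{pmatrix} 1 & -1 \\ -1 & 2 \end{pmatrix};
\qquad
S^{-1} = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}, \quad
T^{-1} = \begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix},
\]
so S and T are indeed each invertible, as the proof guarantees.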
Problem 5. [§3.23]
Solution.
‘⇒’ Suppose that V is a finite dimensional vector space and S, T ∈ L(V ). Assume that S ◦ T = I. Since I is
invertible, so is S ◦ T . By [§3.22], we then find that both S and T are invertible, and thus T is surjective
by Theorem 3.21. Thus for each wα ∈ V there exists vα ∈ V such that T (vα) = wα. Since S ◦ T = I,
S ◦ T (vα) = vα ⇒ S(wα) = vα. Now for an arbitrary wβ ∈ V , T ◦ S(wβ) = T (S(wβ)) = T (vβ) = wβ = I(wβ).
Thus T ◦ S = I.
‘⇐’ By the symmetry of S and T (i.e. S and T are interchangeable, so the ‘⇒’ argument applies with their roles swapped), the result follows.
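A minimal instance (our own choice, with operators on F² written as matrices) just to see the statement concretely:
\[
S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \qquad
T = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}, \qquad
S \cdot T = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I = T \cdot S,
\]
as the proposition predicts.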
Problem 6. [§4.3]
Solution. Assume that for p, q ∈ P(F) with p ≠ 0, there exist s1, r1, s2, r2 ∈ P(F) such that
\[
q = s_1 p + r_1 \qquad\text{and}\qquad q = s_2 p + r_2
\]
such that deg r1 < deg p and deg r2 < deg p. We thus find that s1p + r1 = q = s2p + r2 ⇒ (s1 − s2)p = r2 − r1. Thus
\[
\deg\bigl((s_1 - s_2)p\bigr) = \deg(r_2 - r_1).
\tag{6.3}
\]
But deg(r2 − r1) ≤ max{deg r1, deg r2} < deg p. Since deg(a(x)b(x)) = deg a(x) + deg b(x) for all a(x), b(x) ∈ P(F),
and the degree of a nonzero polynomial is non-negative, we find that if neither p nor s1 − s2 is the zero
polynomial (i.e. has degree −∞), then deg((s1 − s2)p) = deg(s1 − s2) + deg p ≥ deg p > deg(r2 − r1), which
contradicts Equation 6.3, since that equation says deg((s1 − s2)p) = deg(r2 − r1). We have assumed that p ≠ 0,
and thus s1 − s2 = 0, i.e. s1 = s2. We hence have that s1p + r1 = s1p + r2 ⇒ r1 = r2. Thus the division
algorithm gives a unique quotient and remainder.
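For a concrete instance of this setup (our own example, not required by the problem), take p = x² + 1 and q = x³ + x + 1 in P(F). Then
\[
q = x \cdot (x^2 + 1) + 1, \qquad s = x, \quad r = 1, \quad \deg r = 0 < 2 = \deg p,
\]
and if q = s′p + r′ with deg r′ < 2 were another such decomposition, then (s − s′)p = r′ − r would have degree at most 1 on the right-hand side but degree at least 2 on the left-hand side unless s′ = s, which forces s′ = x and r′ = 1, exactly as in the general argument above.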