Mathematics 223, January - April 2007. Instructor: Reichstein.
Problem set 3 solution outlines
In the problems below V will denote a finite-dimensional vector space and
v1 , . . . , vm will denote m vectors in V .
(1) Show that v1 , . . . , vm are linearly independent if and only if Span(v1 , . . . , vm )
has dimension m.
Solution: Let W = Span(v1 , . . . , vm ). Note that v1 , . . . , vm
span W .
If v1 , . . . , vm are linearly independent then they form a basis of
W and thus dim(W ) = m.
Conversely, suppose dim(W ) = m. Then any spanning set of m
vectors in W is a basis. In particular, v1 , . . . , vm form a basis of W
and hence are linearly independent.
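The criterion in (1) can be tested numerically for concrete vectors: in Rn, the dimension of Span(v1 , . . . , vm ) equals the rank of the matrix whose rows are v1 , . . . , vm , so independence is equivalent to rank m. A minimal sketch using sympy (assumed available; the example vectors are my own, not from the problem set):

```python
# Editorial sketch, not part of the original solutions: dim Span equals
# the rank of the matrix with rows v1, ..., vm.
from sympy import Matrix

v1, v2, v3 = [1, 0, 2], [0, 1, 1], [1, 1, 3]  # hypothetical examples; v3 = v1 + v2

A = Matrix([v1, v2])
print(A.rank())      # 2 == m, so v1, v2 are linearly independent

B = Matrix([v1, v2, v3])
print(B.rank())      # 2 < 3, so v1, v2, v3 are linearly dependent
```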
(2) Can the matrix B be obtained from the matrix A by a sequence of
row operations, if
            [1 2 3]             [1 2  3]
    (a) A = [4 5 6]   and   B = [1 4  9] ?
            [7 8 9]             [1 8 27]

            [0 2 1]             [0 3 1]
    (b) A = [2 1 0]   and   B = [3 1 0] ?
            [4 8 3]             [3 4 1]
Solution: (a) No. The Row Echelon Form of A has a zero row
and the Row Echelon Form of B does not (check!). This means that
the rows of A are linearly dependent and the rows of B are linearly
independent.
(b) Denote the span of the rows of A by U and the span of the
rows of B by W . I claim that U ≠ W , so the answer to the question
of the problem is again “no”.
Clearly (0, 3, 1), being a row of B, lies in W . To prove that U ≠ W ,
I will show that (0, 3, 1) does not lie in U . Assume the contrary.
Reducing A to the Row Echelon Form, we see that v1 = (2, 1, 0)
and v2 = (0, 2, 1) form a basis of U . By our assumption,
(0, 3, 1) = av1 + bv2 = (2a, a, 0) + (0, 2b, b) .
Equating the three components on both sides, we obtain 2a = 0,
a + 2b = 3 and b = 1. No a and b can satisfy all three of these
equations; this contradiction proves the claim.
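Both answers can be double-checked by machine. This is not part of the original solutions; it is a quick verification using sympy (assumed available):

```python
# Editorial check of problem (2), using sympy.
from sympy import Matrix

# Part (a): A has rank 2 while B has rank 3, so their row spaces have
# different dimensions and no sequence of row operations turns A into B.
A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = Matrix([[1, 2, 3], [1, 4, 9], [1, 8, 27]])
print(A.rank(), B.rank())  # 2 3

# Part (b): both ranks equal 2, but the row (0, 3, 1) of B is not in the
# row space of A: appending it to A as an extra row raises the rank.
A2 = Matrix([[0, 2, 1], [2, 1, 0], [4, 8, 3]])
B2 = Matrix([[0, 3, 1], [3, 1, 0], [3, 4, 1]])
C = A2.col_join(Matrix([[0, 3, 1]]))
print(A2.rank(), B2.rank(), C.rank())  # 2 2 3
```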
(3) Suppose v1 , . . . , vn and w1 , . . . , wn are two bases of V . Show that
v1 , . . . , vn of V can be transformed into w1 , . . . , wn by a sequence
of elementary operations. (In the case where V = Rn we proved this
in class.)
Solution: Since v1 , . . . , vn form a basis of V , we can write
w1 = a11 v1 + · · · + a1n vn
...
wn = an1 v1 + · · · + ann vn .
Let us now collect the coefficients aij into the n × n matrix
A = (aij ), whose ith row is (ai1 , . . . , ain ).
Note that elementary operations on w1 , . . . , wn correspond to
row operations on the rows of A, and conversely: row operations
on A translate into elementary operations on w1 , . . . , wn .
I now claim that the rows of A are linearly independent. Indeed,
assume the contrary. Then the Row Echelon Form of A would have a
zero row. This means that w1 , . . . , wn can be reduced, by a sequence
of row operations, to a collection of vectors containing 0. This contradicts our assumption that w1 , . . . , wn are linearly independent,
thus proving the claim.
Now we know that the rows of A are linearly independent. As we
showed in class, A can therefore be reduced, by a sequence of row
operations, to the n × n identity matrix. The same sequence of
elementary operations changes w1 , . . . , wn into v1 , . . . , vn . Since
every elementary operation is invertible, applying the inverse
operations in the reverse order transforms v1 , . . . , vn into
w1 , . . . , wn , as desired.
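The key computational fact used here, that an invertible coefficient matrix row-reduces to the identity, can be illustrated concretely. The matrix below is my own example, not from the problem set; sympy is assumed available:

```python
# Editorial sketch for problem (3): if w1 = v1 + 2 v2 and w2 = 3 v1 + 5 v2,
# the coefficient matrix A row-reduces to the identity, so a sequence of
# elementary operations carries w1, w2 back to v1, v2.
from sympy import Matrix, eye

A = Matrix([[1, 2], [3, 5]])  # hypothetical coefficient matrix
rref, pivots = A.rref()
print(rref == eye(2))         # True: A reduces to the identity
```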


(4) Let
        [a1  *  ...  * ]
    A = [0  a2  ...  * ]
        [ :      :   : ]
        [0  ...  0  an ]
be an upper triangular matrix (all entries below the diagonal are
zero). Show that if one of the diagonal entries a1 , . . . , an equals
zero then the rows of A are linearly dependent.
Solution: Suppose ad = 0 for some d between 1 and
n. Choose the smallest such d. Then the Row Echelon Form
B of A has no pivot in row d. In this case B has fewer than n pivots;
hence, B has a zero row. As we showed in class, this means that the
rows of A are linearly dependent.
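A small numerical instance of this claim (my own example, not part of the original solutions; sympy assumed available):

```python
# Editorial sketch for problem (4): an upper triangular matrix with a
# zero diagonal entry has rank < n, so its rows are linearly dependent.
from sympy import Matrix

A = Matrix([[1, 5, 7],
            [0, 0, 2],   # a2 = 0
            [0, 0, 4]])
print(A.rank())  # 2 < 3; indeed row 3 = 2 * row 2
```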
(5) Are the vectors v1 , v2 , v3 linearly independent in V ?
(a) v1 = (1, 2, 3), v2 = (2, −1, 1), v3 = (3, −4, −1). Here V = R3 .
(b) v1 = 1 − x + x3 − x7 , v2 = 1 + 3x + x3 − x7 , v3 = 2x. Here
V is the vector space of polynomials of degree ≤ 7.
Solution: (a) No. Reducing the matrix with rows v1 , v2 , v3 to
Row Echelon Form produces a zero row; indeed, v1 − 2v2 + v3 = 0.
(b) No. −v1 + v2 − 2v3 = 0.
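Both dependence relations can be verified by machine. This check is editorial, not part of the original solutions; sympy is assumed available, and in part (b) each polynomial is represented by its coefficient vector in the basis 1, x, . . . , x^7:

```python
# Editorial check of problem (5).
from sympy import Matrix

# Part (a): the three vectors in R^3 span only a plane.
A = Matrix([[1, 2, 3], [2, -1, 1], [3, -4, -1]])
print(A.rank())  # 2 < 3, so v1, v2, v3 are dependent

# Part (b): coefficient vectors of v1, v2, v3 in the basis 1, x, ..., x^7.
v1 = Matrix([1, -1, 0, 1, 0, 0, 0, -1])   # 1 - x + x^3 - x^7
v2 = Matrix([1, 3, 0, 1, 0, 0, 0, -1])    # 1 + 3x + x^3 - x^7
v3 = Matrix([0, 2, 0, 0, 0, 0, 0, 0])     # 2x
print((-v1 + v2 - 2*v3).is_zero_matrix)   # True: -v1 + v2 - 2 v3 = 0
```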
(6) Set Span(v1 ) = U1 , . . . , Span(vm ) = Um . Assume v1 , . . . , vm ≠ 0.
Show that
(a) v1 , . . . , vm span V if and only if V = U1 + · · · + Um .
(b) v1 , . . . , vm form a basis of V if and only if V = U1 ⊕ · · · ⊕ Um .
(c) v1 , . . . , vm are linearly independent if and only if
Span(v1 , . . . , vm ) = U1 ⊕ · · · ⊕ Um .
Solution: (a) Suppose v1 , . . . , vm span V . Then every element
v of V can be written as
v = a1 v1 + · · · + am vm .
Since ai vi ∈ Ui , we see that v ∈ U1 + · · · + Um . Thus every v ∈ V
lies in U1 + · · · + Um . In other words, V = U1 + · · · + Um .
Conversely, suppose V = U1 + · · · + Um . Then every v ∈ V can be
written as
v = u1 + · · · + um
for some u1 ∈ U1 , . . . , um ∈ Um . Writing ui = ai vi for a suitable
real number ai , we see that every v ∈ V is a linear combination of
v1 , . . . , vm . In other words, v1 , . . . , vm span V .
(c) Let W = Span(v1 , . . . , vm ). Part (a) tells us that W = U1 +
· · · + Um . We want to show that this sum is direct if and only if
v1 , . . . , vm are linearly independent.
Suppose v1 , . . . , vm are linearly independent. We want to show
that in this case the sum W = U1 + · · · + Um is direct. Suppose
u1 + · · · + um = 0 for some u1 ∈ U1 , . . . , um ∈ Um . Writing ui = ai vi ,
for i = 1, 2, . . . , m, we obtain a1 v1 + · · · + am vm = 0. Since v1 , . . . , vm
are linearly independent, this is only possible if a1 = · · · = am = 0.
Hence,
u1 = a1 v1 = 0,
...
um = am vm = 0 .
This proves that W = U1 ⊕ · · · ⊕ Um .
Conversely, assume that W = U1 ⊕ · · · ⊕ Um . We want to show
that the vectors v1 , . . . , vm are linearly independent. Suppose
a1 v1 + · · · + am vm = 0 .
The first term in this sum lies in U1 , the second in U2 , etc. Since
W = U1 ⊕ · · · ⊕ Um , this is only possible if
a1 v1 = 0,
...
am vm = 0 .
Since we are assuming that v1 , . . . , vm ≠ 0, we conclude that a1 =
· · · = am = 0. Thus v1 , . . . , vm are linearly independent, as claimed.
(b) is immediate from (a) and (c).
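Part (b) can be illustrated concretely in R3. The basis below is my own example, not from the problem set; sympy is assumed available. Numerically, the direct sum decomposition amounts to invertibility of the matrix with rows v1 , v2 , v3 :

```python
# Editorial sketch for problem (6)(b) in R^3: for a basis v1, v2, v3,
# the lines Ui = Span(vi) satisfy R^3 = U1 (+) U2 (+) U3; this is the
# statement that the matrix with rows v1, v2, v3 is invertible.
from sympy import Matrix

v1, v2, v3 = [1, 0, 0], [1, 1, 0], [1, 1, 1]  # hypothetical basis
M = Matrix([v1, v2, v3])
print(M.rank())      # 3: the vi are independent and span R^3
print(M.det() != 0)  # True: every vector decomposes uniquely
```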
(7) Chapter 2, Problem 14. Done in review session.
(8) Chapter 2, Problem 16. Done in review session.