Homework 6: July 11, 2011
Page 138, Ex. 17. Let x1 , . . . , xk be linearly independent vectors in Rn , and
let A be a nonsingular n × n matrix. Define yi = Axi for i = 1, . . . , k. Prove
that y1 , . . . , yk are linearly independent.
Note first that matrix multiplication by any matrix B preserves linear combinations; that is,
B(c1 v1 + · · · + ck vk ) = c1 Bv1 + · · · + ck Bvk .
Suppose now that there are scalars c1 , . . . , ck , not all zero, with
c1 y1 + · · · + ck yk = c1 Ax1 + · · · + ck Axk = A(c1 x1 + · · · + ck xk ) = 0.
Multiply on the left by A−1 (which exists because A is nonsingular) to get
c1 x1 + · · · + ck xk = 0, which contradicts the linear independence of the x’s.
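(As a quick numerical sanity check, not part of the proof: the statement is easy to test with NumPy. The vectors, the seed, and the sizes below are arbitrary choices; the check uses the fact that vectors are linearly independent exactly when the matrix having them as columns has full column rank.)

import numpy as np

# Columns of X are x1, x2, x3: three linearly independent vectors in R^4.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))     # a random 4 x 4 matrix, nonsingular with probability 1
assert abs(np.linalg.det(A)) > 1e-9

Y = A @ X                           # columns of Y are yi = A xi
print(np.linalg.matrix_rank(X))     # 3
print(np.linalg.matrix_rank(Y))     # 3, so the yi are linearly independent as well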
Page 143, Ex. 7. Find a basis for the subspace S of R4 consisting of all vectors
of the form (a + b, a − b + 2c, b, c)⊤ , where a, b, c are real numbers. What is the
dimension of S?
Remember that the superscript ⊤ means transpose; it is here because we
regard elements of Euclidean space as columns.
Just factor out a, b, c:
(a + b, a − b + 2c, b, c)⊤ = a(1, 1, 0, 0)⊤ + b(1, −1, 1, 0)⊤ + c(0, 2, 0, 1)⊤ .
These three vectors span S because every vector in S is a linear combination of
them (this is the definition of S!).
These vectors are linearly independent, because if A is the matrix having
these vectors as its columns, then the homogeneous linear system Ax = 0 has
no non-trivial solutions (remember that if x = (r, s, t)⊤ , then Ax is a linear
combination of the columns of A). A reduced row-echelon form for A is

    E = ⎡ 1 0 0 ⎤
        ⎢ 0 1 0 ⎥
        ⎢ 0 0 1 ⎥
        ⎣ 0 0 0 ⎦ .

Now the system Ex = 0 has the same solutions as Ax = 0, and if E(r, s, t)⊤ = 0,
then r = s = t = 0 (because E(r, s, t)⊤ = (r, s, t, 0)⊤ ). Hence the three vectors
above are a basis of S, and dim(S) = 3.
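(For readers who want to check the row reduction by machine, here is a small sketch using SymPy, one tool among many; Matrix.rref() returns the reduced row-echelon form together with the indices of the pivot columns.)

from sympy import Matrix

# Columns are the three spanning vectors (1, 1, 0, 0), (1, -1, 1, 0), (0, 2, 0, 1).
A = Matrix([[1,  1, 0],
            [1, -1, 2],
            [0,  1, 0],
            [0,  0, 1]])

E, pivots = A.rref()
print(E)       # the matrix E above: I3 sitting on top of a zero row
print(pivots)  # (0, 1, 2): a pivot in every column, so the columns are independent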
3. Show that if U and V are subspaces of Rn , then
dim(U ) + dim(V ) = dim(U + V ) + dim(U ∩ V ).
Here, U + V = {u + v : u ∈ U and v ∈ V }; you may assume the easily proved
facts that both U ∩ V and U + V are subspaces of Rn .
Let x1 , . . . , xk be a basis of U ∩ V , so that dim(U ∩ V ) = k. Since U ∩ V
is a subspace of U and x1 , . . . , xk is linearly independent, there are vectors
y1 , . . . , ym ∈ U with x1 , . . . , xk , y1 , . . . , ym a basis of U ; that is, dim(U ) = k +m.
Similarly, there are vectors z1 , . . . , zn ∈ V with x1 , . . . , xk , z1 , . . . , zn a basis of V ;
that is, dim(V ) = k + n. It suffices to show that
B = x1 , . . . , xk , y1 , . . . , ym , z1 , . . . , zn
is a basis of U + V , for then dim(U + V ) = k + m + n, as desired. Hence, we
show that the list B spans U + V and that it is linearly independent.
B spans: If w ∈ U + V , then w = u + v, where u ∈ U and v ∈ V . But u is
a linear combination of x’s and y’s, while v is a linear combination of x’s and
z’s. Therefore, w = u + v is a linear combination of x’s, y’s, and z’s; that is, B
spans U + V .
B is independent: Suppose there are scalars a1 , . . . , ak , b1 , . . . , bm , c1 , . . . , cn with

a1 x1 + · · · + ak xk + b1 y1 + · · · + bm ym + c1 z1 + · · · + cn zn = 0.    (1)
We show that all these scalars must be zero. Now c1 z1 + · · · + cn zn ∈ V ; by
Eq. (1) it equals −(a1 x1 + · · · + ak xk + b1 y1 + · · · + bm ym ), so it also lies in U
(since the x’s and y’s all lie in U ). Therefore, c1 z1 + · · · + cn zn ∈ U ∩ V ; as such,
it is a linear combination of the x’s: there are scalars d1 , . . . , dk with

c1 z1 + · · · + cn zn = d1 x1 + · · · + dk xk .
Substituting, we have

a1 x1 + · · · + ak xk + b1 y1 + · · · + bm ym + d1 x1 + · · · + dk xk = 0,

that is, (a1 + d1 )x1 + · · · + (ak + dk )xk + b1 y1 + · · · + bm ym = 0.
But the x’s and y’s are independent, and so the b’s are all zero [so are the
(ai + di )’s]. Thus, Eq. (1) reads

a1 x1 + · · · + ak xk + c1 z1 + · · · + cn zn = 0.
Since the x’s and z’s together form a basis of V , they are linearly independent,
and so all the a’s and c’s are zero. Therefore, B is linearly independent.
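(As a numerical illustration, not part of the proof: the dimension formula can be tested by realizing U and V as column spaces of random matrices, here with NumPy; the sizes and the seed are arbitrary choices. The rank of the two blocks of columns placed side by side gives dim(U + V ), and dim(U ∩ V ) is computed independently via the null space of [BU , −BV ], as explained in the comments.)

import numpy as np

rng = np.random.default_rng(1)
BU = rng.standard_normal((6, 3))   # U = column space of BU, a subspace of R^6
BV = rng.standard_normal((6, 4))   # V = column space of BV

rank = np.linalg.matrix_rank
dim_U, dim_V = rank(BU), rank(BV)
dim_sum = rank(np.hstack([BU, BV]))           # dim(U + V)

# Independent computation of dim(U ∩ V): the pairs (a, b) with BU a = BV b
# form the null space of N = [BU, -BV]; the map (a, b) -> BU a sends that
# null space onto U ∩ V, and its kernel is null(BU) x null(BV).
N = np.hstack([BU, -BV])

def nullity(M):
    return M.shape[1] - rank(M)

dim_int = nullity(N) - (nullity(BU) + nullity(BV))

print(dim_U + dim_V)       # 7
print(dim_sum + dim_int)   # 7: the two sides of the formula agree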