Let u1, u2, ..., uk ∈ Rn, and let v1, v2, ..., vm ∈ span(u1, u2, ..., uk). If m > k, then the set
{v1, v2, ..., vm} is linearly dependent.
Proof. We want to prove that there exist real numbers x1, x2, ..., xm, not all 0, such that

x1 v1 + ··· + xm vm = O.

Since v1, v2, ..., vm ∈ span(u1, u2, ..., uk), we know that for each i = 1, 2, ..., m there are real
numbers a_{ij}, j = 1, ..., k, such that v_i = Σ_{j=1}^{k} a_{ij} u_j. Then for any x1, x2, ..., xm ∈ R,

x1 v1 + ··· + xm vm = x1 (Σ_{j=1}^{k} a_{1j} u_j) + ··· + xm (Σ_{j=1}^{k} a_{mj} u_j)
                    = (Σ_{i=1}^{m} a_{i1} x_i) u1 + (Σ_{i=1}^{m} a_{i2} x_i) u2 + ··· + (Σ_{i=1}^{m} a_{ik} x_i) uk.
Thus if we can show that the homogeneous system of k linear equations in the m unknowns
x1, x2, ..., xm

a_{11} x1 + a_{21} x2 + ··· + a_{m1} xm = 0
a_{12} x1 + a_{22} x2 + ··· + a_{m2} xm = 0
        ⋮
a_{1k} x1 + a_{2k} x2 + ··· + a_{mk} xm = 0

has a nontrivial solution, then we have found x1, x2, ..., xm, not all 0, such that
x1 v1 + ··· + xm vm = O, and hence that {v1, v2, ..., vm} is linearly dependent. But every
homogeneous system of linear equations with more unknowns than equations has a nontrivial
solution, and since k < m, the result follows.
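The proof is constructive: writing each v_i in terms of the u_j and solving the resulting homogeneous system produces an explicit dependence relation. As an illustrative sketch (the example vectors and helper names here are my own, not part of the note), we can carry this out in Python with exact rational arithmetic:

```python
from fractions import Fraction

def rref(rows):
    """Gauss-Jordan elimination over the rationals.
    Returns the reduced matrix and the pivot-column indices."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(M[0])):
        # find a row at or below r with a nonzero entry in column c
        pr = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        piv = M[r][c]
        M[r] = [x / piv for x in M[r]]          # scale pivot row
        for i in range(len(M)):
            if i != r and M[i][c] != 0:          # clear column c elsewhere
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == len(M):
            break
    return M, pivots

def nontrivial_solution(A):
    """Nonzero x with Ax = 0; one exists whenever A has more columns than rows."""
    R, pivots = rref(A)
    free = next(c for c in range(len(A[0])) if c not in pivots)
    x = [Fraction(0)] * len(A[0])
    x[free] = Fraction(1)                        # set one free variable to 1
    for r, p in enumerate(pivots):               # back-substitute pivot variables
        x[p] = -R[r][free]
    return x

# Example: u1 = e1, u2 = e2 in R^3, and
#   v1 = u1 + u2, v2 = u1 - u2, v3 = 2 u1   (m = 3 > k = 2).
# Row j holds the coefficients of u_j in the system from the proof.
A = [[1, 1, 2],    # a_{11} x1 + a_{21} x2 + a_{31} x3 = 0
     [1, -1, 0]]   # a_{12} x1 + a_{22} x2 + a_{32} x3 = 0
x = nontrivial_solution(A)
print(x)  # a nontrivial solution: x1 = x2 = -1, x3 = 1, i.e. -v1 - v2 + v3 = O
```

One check worth doing by hand: with v1 = (1,1,0), v2 = (1,-1,0), v3 = (2,0,0), the combination -v1 - v2 + v3 is indeed the zero vector.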
As immediate consequences, we obtain the following facts.
Fact: Any subspace of Rn has a basis containing at most n elements.
Proof. This follows from the preceding result once we observe that if {u1, u2, ..., uk}
is a linearly independent subset of a subspace U of Rn but span(u1, u2, ..., uk) ≠ U, then there
exists v ∈ U such that v ∉ span(u1, u2, ..., uk). But in such a case, the set {v, u1, u2, ..., uk} is
linearly independent (a dependence relation would express v as a combination of the ui,
contradicting v ∉ span(u1, u2, ..., uk)). Thus every linearly independent subset of U that is not
a spanning set for U can be enlarged to a linearly independent subset of size one greater. By the
preceding result, since {e1, e2, ..., en} is a basis for Rn, any subset of U of size greater than n is
linearly dependent, so the process of enlarging a linearly independent subset of U must stop after
a finite number of steps. However, the stopping condition is that the linearly independent subset
spans U, and so U must have a basis, and no basis can exceed n in size.
Fact: Any two bases for a subspace U of Rn must contain the same number of elements (and this
number is then called the dimension of U).
Proof. If {u1, u2, ..., uk} and {v1, v2, ..., vm} are both bases for a subspace U of Rn, then by
the main result of this note, m ≤ k and k ≤ m, so m = k.
Fact: If U is a subspace of dimension k in Rn , then any spanning set for U of size k is linearly
independent, and any linearly independent subset of U of size k is a spanning set for U .
Proof. Suppose that {u1, u2, ..., uk} is a spanning set for a subspace U of dimension k, but
that {u1, u2, ..., uk} is not linearly independent. Then for some i, ui can be written as a linear
combination of the vectors uj, j ≠ i, in which case U = span(uj | j = 1, 2, ..., k, j ≠ i). But then
any subset of U of size k must be linearly dependent, since there is a spanning set for U with fewer
than k elements. Since U has a basis of size k, this is not the case, and so {u1, u2, ..., uk} must be
linearly independent.
Next, suppose that {u1, u2, ..., uk} is a linearly independent subset of U but does not span U.
Then there exists v ∈ U with v ∉ span(u1, u2, ..., uk), in which case {v, u1, u2, ..., uk} is a linearly
independent subset of U of size k + 1. But by the main result, every subset of U of size k + 1 is
linearly dependent. Thus {u1, u2, ..., uk} spans U.
Given a spanning set {v1, v2, ..., vk} for a subspace U, we can find a linearly independent subset
of this set that still spans U (that is, a basis for U contained in this set) by forming the matrix A
whose columns are the vectors v1, v2, ..., vk, so that U = col(A), and then row reducing A: the
columns of A in the pivot positions of the reduced form are a basis for col(A).
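This recipe can be carried out mechanically. Here is a small sketch in Python (the example vectors and the helper name pivot_columns are my own, not from the note), using exact fractions so the pivot decisions are not affected by rounding:

```python
from fractions import Fraction

def pivot_columns(rows):
    """Gauss-Jordan elimination over the rationals; returns the
    indices of the pivot columns of the matrix."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(M[0])):
        pr = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pr is None:
            continue                              # no pivot in this column
        M[r], M[pr] = M[pr], M[r]
        piv = M[r][c]
        M[r] = [x / piv for x in M[r]]            # scale pivot row
        for i in range(len(M)):
            if i != r and M[i][c] != 0:           # clear the rest of column c
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == len(M):
            break
    return pivots

# Spanning set for a plane U in R^3; v2 = 2*v1, so the set is dependent.
v1, v2, v3 = [1, 2, 3], [2, 4, 6], [1, 0, 1]
A = [list(row) for row in zip(v1, v2, v3)]   # the v's as columns of A
print(pivot_columns(A))   # [0, 2]: {v1, v3} is a basis for U = col(A)
```

Since column 1 is twice column 0, it acquires no pivot, and the basis contained in the spanning set is {v1, v3}.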
If {v1, v2, ..., vk} is a linearly independent subset of Rn and we wish to find a basis for Rn that
contains {v1, v2, ..., vk}, form the matrix A whose columns are, in order, v1, v2, ..., vk, followed by
e1, e2, ..., en. Then col(A) = Rn, and upon row reducing A we find the columns of A that form a
basis for Rn. Since row reduction reduces the first k columns first, the first k columns will reduce
to the k × k identity matrix above a block of 0's, and so we will choose the first k columns of A,
plus some n − k columns from In, to form our basis of Rn.
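The same row-reduction machinery handles this extension step. A minimal sketch in Python (example vectors and the helper name pivot_columns are my own assumptions, not from the note), again over exact fractions:

```python
from fractions import Fraction

def pivot_columns(rows):
    """Gauss-Jordan elimination over the rationals; returns the
    indices of the pivot columns of the matrix."""
    M = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(M[0])):
        pr = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pr is None:
            continue                              # no pivot in this column
        M[r], M[pr] = M[pr], M[r]
        piv = M[r][c]
        M[r] = [x / piv for x in M[r]]            # scale pivot row
        for i in range(len(M)):
            if i != r and M[i][c] != 0:           # clear the rest of column c
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == len(M):
            break
    return pivots

# Extend the independent set {v1, v2} to a basis of R^3.
v1, v2 = [1, 1, 0], [0, 1, 1]
e = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]            # standard basis e1, e2, e3
A = [list(row) for row in zip(*([v1, v2] + e))]  # columns: v1, v2, e1, e2, e3
print(pivot_columns(A))   # [0, 1, 2]: the basis {v1, v2, e1}
```

As the note predicts, the first k = 2 columns are always pivot columns (they are linearly independent), and the remaining n − k = 1 basis vector is drawn from the appended identity columns.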