Math1300:MainPage/LinearIndependence
Contents
• 1 Linear Combinations and Linearly Independent Sets of Vectors
♦ 1.1 Definition of Linear Independence
♦ 1.2 How to prove that a set of vectors is linearly independent
♦ 1.3 Examples of Linear Independence (Polynomials)
♦ 1.4 Examples of Linear Independence (Euclidean n-space)
♦ 1.5 Examples of Linear Independence (Matrices)
♦ 1.6 Small Linearly Independent Sets
◊ 1.6.1 Theorem (Independent Sets with Two Vectors)
◊ 1.6.2 Theorem (Sets containing 0 are dependent)
♦ 1.7 Properties of Linearly Independent Sets
◊ 1.7.1 Theorem (one vector as a combination of others)
◊ 1.7.2 Corollary (one vector as a combination of others)
◊ 1.7.3 Theorem (Large sets in n-space are linearly dependent)
◊ 1.7.4 Theorem (Uniqueness of the Linear Combination)
Linear Combinations and Linearly Independent Sets of Vectors
Definition of Linear Independence
Suppose S = {v1, v2, ..., vk} is a set of vectors in the vector space V, and consider the linear combination
r1v1 + r2v2 + ... + rkvk = 0.
The set S is linearly independent if r1 = r2 = ... = rk = 0 is the only way a linear combination can equal 0.

A set of vectors S = {v1, v2, ..., vk} in a vector space V is linearly independent if
r1v1 + r2v2 + ... + rkvk = 0
implies r1 = r2 = ... = rk = 0.
How to prove that a set of vectors is linearly independent
If S = {v1, v2, ..., vk} is the set of vectors, set
r1v1 + r2v2 + ... + rkvk = 0
and prove that r1 = r2 = ... = rk = 0. This will, by definition, make the set of vectors linearly independent.
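For coordinate vectors, this procedure can be carried out mechanically by row reduction. The following Python sketch is my own illustration (the function name and the use of exact `Fraction` arithmetic are assumptions, not part of the course page): it row-reduces the matrix whose columns are the given vectors and reports independence exactly when every column has a pivot.

```python
from fractions import Fraction

def is_linearly_independent(vectors):
    """Return True when r1*v1 + ... + rk*vk = 0 forces every ri = 0,
    i.e. when the matrix with the vectors as columns has rank k."""
    rows, cols = len(vectors[0]), len(vectors)
    # One row per coordinate; column j holds vector j.
    m = [[Fraction(vectors[j][i]) for j in range(cols)] for i in range(rows)]
    rank = 0
    for col in range(cols):
        # Look for a pivot in this column at or below the current row.
        pivot = next((r for r in range(rank, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue  # no pivot: this column corresponds to a free variable
        m[rank], m[pivot] = m[pivot], m[rank]
        pv = m[rank][col]
        m[rank] = [x / pv for x in m[rank]]
        # Clear the pivot column in every other row.
        for r in range(rows):
            if r != rank and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    # Independent exactly when every column produced a pivot.
    return rank == cols
```

For example, with the vectors from the Euclidean n-space section below, `is_linearly_independent([(1, 2, 3, 1), (3, 1, 0, 2)])` returns `True`, while adding the vector (−1, 3, 6, 0) makes it return `False`.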
Examples of Linear Independence (Polynomials)
• Let p(x) = x^2 − 3x + 2 and q(x) = 2x^2 − 1. To see if S = {p(x), q(x)} is linearly independent, we set a linear combination of the vectors equal to zero:
r1(x^2 − 3x + 2) + r2(2x^2 − 1) = 0.
This gives us a set of equations, one for each power of x:
r1 + 2r2 = 0
−3r1 = 0
2r1 − r2 = 0
and so r1 = r2 = 0. This means that S is linearly independent.
• Let p1(x) = x^2 − 3x + 2, p2(x) = 2x^2 − 1 and p3(x) = x^2 + 3x − 3. To see if S = {p1(x), p2(x), p3(x)} is linearly independent, we set a linear combination of the vectors equal to zero:
r1(x^2 − 3x + 2) + r2(2x^2 − 1) + r3(x^2 + 3x − 3) = 0.
This gives the system of linear equations
r1 + 2r2 + r3 = 0
−3r1 + 3r3 = 0
2r1 − r2 − 3r3 = 0
with augmented matrix
[  1   2   1 | 0 ]
[ −3   0   3 | 0 ]
[  2  −1  −3 | 0 ]
and reduced row echelon form
[  1   0  −1 | 0 ]
[  0   1   1 | 0 ]
[  0   0   0 | 0 ]
which means the solutions are (r1, r2, r3) = (t, −t, t) for any scalar t. Hence there are values of r1, r2, r3, not all zero, so that r1p1(x) + r2p2(x) + r3p3(x) = 0. This means that {p1(x), p2(x), p3(x)} is not linearly independent. Indeed, if t = 1, then r1 = 1, r2 = −1 and r3 = 1, which gives:
(x^2 − 3x + 2) − (2x^2 − 1) + (x^2 + 3x − 3) = 0.
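The dependence relation found above can be sanity-checked by identifying each polynomial with its coefficient tuple; the tuple representation is my own bookkeeping, not notation from the course page.

```python
# Represent each polynomial by its coefficient tuple (x^2, x, constant).
p1 = (1, -3, 2)   # x^2 - 3x + 2
p2 = (2, 0, -1)   # 2x^2 - 1
p3 = (1, 3, -3)   # x^2 + 3x - 3

# Apply the dependence relation found above: r1, r2, r3 = 1, -1, 1.
combo = tuple(a - b + c for a, b, c in zip(p1, p2, p3))
print(combo)  # (0, 0, 0): confirms p1(x) - p2(x) + p3(x) = 0
```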
Examples of Linear Independence (Euclidean n-space)
• Let v1 = (1,2,3,1) and v2 = (3,1,0,2). To test for linear independence, we use
(0,0,0,0) = r1(1,2,3,1) + r2(3,1,0,2) = (r1 + 3r2, 2r1 + r2, 3r1, r1 + 2r2).
Each coordinate contributes an equation to the system:
r1 + 3r2 = 0
2r1 + r2 = 0
3r1 = 0
r1 + 2r2 = 0
and so r1 = r2 = 0 and the set S = {v1, v2} is linearly independent.
• Let S = {(1,2,3,1),(3,1,0,2),(−1,3,6,0)}. To test for linear independence we use the equation
r1(1,2,3,1) + r2(3,1,0,2) + r3(−1,3,6,0) = (0,0,0,0).
As before, each coordinate contributes an equation to the system:
r1 + 3r2 − r3 = 0
2r1 + r2 + 3r3 = 0
3r1 + 6r3 = 0
r1 + 2r2 = 0
This system of equations has augmented matrix
[ 1  3  −1 | 0 ]
[ 2  1   3 | 0 ]
[ 3  0   6 | 0 ]
[ 1  2   0 | 0 ]
which has reduced row echelon form
[ 1  0   2 | 0 ]
[ 0  1  −1 | 0 ]
[ 0  0   0 | 0 ]
[ 0  0   0 | 0 ]
and hence we have (r1, r2, r3) = (−2t, t, t) for any scalar t. Since the equation holds for values of r1, r2, r3 that are not all zero, the set S is linearly dependent.
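As a check, substituting the solution (r1, r2, r3) = (−2t, t, t) back into the defining equation should produce the zero vector for any t; the short verification below is my own addition.

```python
v1 = (1, 2, 3, 1)
v2 = (3, 1, 0, 2)
v3 = (-1, 3, 6, 0)

t = 1  # any nonzero value of the parameter gives a nontrivial solution
r1, r2, r3 = -2 * t, t, t
combo = tuple(r1 * a + r2 * b + r3 * c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0, 0, 0): a nontrivial combination equal to zero
```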
Examples of Linear Independence (Matrices)
• Let S = {A1, A2, A3} be a set of 2×2 matrices. To see if S is linearly independent, we use the equation
r1A1 + r2A2 + r3A3 = 0 (the 2×2 zero matrix),
which corresponds to four equations, one for each entry of the matrix. These equations have only r1 = r2 = r3 = 0 as a solution. This means that S is linearly independent.
• Let S = {B1, B2, B3} be a set of 2×2 matrices. To see if S is linearly independent, we use the equation
r1B1 + r2B2 + r3B3 = 0 (the 2×2 zero matrix),
which corresponds to four equations, one for each entry of the matrix. The resulting system has an augmented matrix whose reduced row echelon form gives (r1, r2, r3) = (−t, −t, t) for any scalar t. In particular, there exist solutions to the system of linear equations where not all ri = 0, and hence S is linearly dependent. When t = 1 we have r1 = −1, r2 = −1 and r3 = 1, so that −B1 − B2 + B3 = 0, that is, B3 = B1 + B2.
Small Linearly Independent Sets
If S = {v} and v = 0, then, setting r1 = 1, we have r1v = 0 and so S is linearly dependent.

If S = {v} and v ≠ 0, then, as we saw here, rv = 0 implies r = 0. This means that S is linearly independent.

If S = {v1, v2} is linearly dependent, then r1v1 + r2v2 = 0 has a solution with at least one of r1, r2 nonzero. Say that r1 ≠ 0. Then v1 = (−r2/r1)v2, that is, v1 is a scalar multiple of v2. Conversely, if v1 is a scalar multiple of v2, say v1 = cv2, then v1 − cv2 = 0 and so S is linearly dependent.
Theorem (Independent Sets with Two Vectors)
A set S = {v1, v2} is linearly dependent if and only if one vector is a scalar multiple of the other.
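The theorem above reduces the two-vector case to a scalar-multiple check, which is easy to carry out coordinate by coordinate. The helper below is my own sketch (the function name and use of `Fraction` are assumptions): the first nonzero coordinate of u pins down the only possible scalar c, and the remaining coordinates either confirm or refute it.

```python
from fractions import Fraction

def is_scalar_multiple(u, v):
    """Check whether v = c*u for some scalar c; u must be nonzero."""
    # The first nonzero coordinate of u determines the only candidate c.
    i = next(j for j, x in enumerate(u) if x != 0)
    c = Fraction(v[i], u[i])
    return all(Fraction(y) == c * Fraction(x) for x, y in zip(u, v))
```

For instance, `is_scalar_multiple((1, 2, 3, 1), (2, 4, 6, 2))` returns `True` (c = 2), so {(1,2,3,1), (2,4,6,2)} is linearly dependent, while `is_scalar_multiple((1, 2, 3, 1), (3, 1, 0, 2))` returns `False`, matching the independent pair from the Euclidean example.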
Theorem (Sets containing 0 are dependent)
If a set of vectors S contains 0, then S is linearly dependent.

Proof Let S = {v1, v2, ..., vk} with v1 = 0. Then, using r1 = 1 and r2 = ... = rk = 0, we have r1v1 + r2v2 + ... + rkvk = 0 and so S is linearly dependent.
Properties of Linearly Independent Sets
Theorem (one vector as a combination of others)
A set S = {v1, v2, ..., vk} with k > 1 is linearly dependent if and only if some vector in S is a linear combination of the other vectors in S.

Proof If S is linearly dependent, then r1v1 + r2v2 + ... + rkvk = 0 with not all ri = 0. Say that rj ≠ 0. Then
vj = −(r1/rj)v1 − ... − (r(j−1)/rj)v(j−1) − (r(j+1)/rj)v(j+1) − ... − (rk/rj)vk,
and so vj is a linear combination of the other vectors of S.

Conversely, suppose that vj is a linear combination of the other vectors of S. This means
vj = r1v1 + ... + r(j−1)v(j−1) + r(j+1)v(j+1) + ... + rkvk,
which in turn implies
r1v1 + ... + r(j−1)v(j−1) − vj + r(j+1)v(j+1) + ... + rkvk = 0,
which makes S linearly dependent (at least one coefficient is nonzero, since rj = −1).
Corollary (one vector as a combination of others)
A set S = {v1, v2, ..., vk} with k > 1 is linearly independent if and only if no vector in S is a linear combination of the other vectors in S.
Theorem (Large sets in n-space are linearly dependent)
Let S = {v1, v2, ..., vk} be a set of vectors in R^n and assume that k > n. Then S is linearly dependent.

Proof The equation r1v1 + r2v2 + ... + rkvk = 0 gives a system of homogeneous linear equations, one for each coordinate. Hence there are n equations with r1, ..., rk as the k unknowns. Since k > n, there are more unknowns than equations, and, as we have proven, such a system has an infinite number of nontrivial solutions, and so S is linearly dependent.
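A small concrete instance of this theorem: the vectors below are my own hypothetical example (not from the course page) of k = 3 vectors in R^2, where k > n guarantees a nontrivial dependence relation.

```python
# Hypothetical example: 3 vectors in R^2, so a nontrivial dependence
# relation must exist by the theorem above.
v1, v2, v3 = (1, 0), (0, 1), (2, 3)

# Here v3 = 2*v1 + 3*v2, so the coefficients (2, 3, -1) give a
# nontrivial solution of r1*v1 + r2*v2 + r3*v3 = 0.
combo = tuple(2 * a + 3 * b - c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0)
```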
Theorem (Uniqueness of the Linear Combination)
Let S = {v1, v2, ..., vk} be a linearly independent set of vectors in V and let v be a vector in V. Then v can be written as a linear combination of v1, v2, ..., vk in at most one way.

Proof Suppose
v = r1v1 + r2v2 + ... + rkvk and v = s1v1 + s2v2 + ... + skvk.
Then
(r1 − s1)v1 + (r2 − s2)v2 + ... + (rk − sk)vk = 0.
Since S is linearly independent, we have ri − si = 0, and so ri = si, for each i.