Sec 1.7 - UBC Math
Math 221, Section 1.7
(1) Linear dependence/independence
(2) Examples with 1-3 vectors
(3) When linear dependence is automatic
Motivation
- Remember the quiz questions: given three non-zero vectors u, v, w in R^3,
  - Span{u} is a line (1-dimensional), but
  - is Span{u, v} a plane (2-dimensional) passing through the origin?
  - is Span{u, v, w} equal to R^3 (3-dimensional)?
- The answers are YES sometimes, but NO in general.
- The NO answers can be explained using linear combinations as in Sec 1.3.
- Today we will see that under a special condition on {u, v, w} (linear independence), the vectors span a subspace of the expected dimension, giving YES answers to the questions above.
Linear dependence: 1 vector
A set of one vector {v} is said to be
- linearly independent if v is a non-zero vector,
- linearly dependent if v is the zero vector 0.
Geometrically, this means:
- {v} is linearly independent if Span{v} = {xv | x is a scalar} is a line.
- {0} is linearly dependent because Span{0} = {0} is a point, not a line.
Arithmetically, this means:
- {v} is linearly independent if the equation xv = 0 has only the trivial solution x = 0.
- The zero vector 0 is linearly dependent because the equation x0 = 0 has many non-trivial solutions for x (any scalar is a solution).
The example of one vector is not very interesting.
Linear dependence: 2 vectors
A set of two vectors {u, v} is said to be
- linearly independent if neither is a multiple of the other,
- linearly dependent if one is a multiple of the other.
For example, in R^2,
- u = (3, 2) and v = (6, 2) are linearly independent.
- u' = (3, 1) and v' = (6, 2) are linearly dependent because v' = 2u'.
Geometrically, two vectors are linearly dependent if and only if they lie on the same line through the origin.
Linear dependence: 2 vectors
Arithmetically, there is another way to describe this.
- u, v are linearly independent if the equation xu + yv = 0 has only the trivial solution (x, y) = (0, 0).
  Check:
- u', v' are linearly dependent because the equation xu' + yv' = 0 has non-trivial solutions.
  Check:
This arithmetic definition will be used to describe whether 3 or more vectors are linearly independent or not.
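The arithmetic definition above can be tried out numerically. A minimal sketch, not from the slides: the vectors are the u, v, u', v' of the example, and the brute-force search only rules out small integer solutions (the algebra on the slide rules out all of them).

```python
# Check linear (in)dependence of pairs of vectors in R^2 by
# testing combinations x*u + y*v against the zero vector.

def combo(x, u, y, v):
    """Return the linear combination x*u + y*v componentwise."""
    return tuple(x * a + y * b for a, b in zip(u, v))

u, v = (3, 2), (6, 2)      # linearly independent pair from the slide
u2, v2 = (3, 1), (6, 2)    # linearly dependent pair: v2 = 2*u2

# For the dependent pair, (x, y) = (2, -1) is a non-trivial solution:
print(combo(2, u2, -1, v2))   # -> (0, 0)

# For the independent pair, brute-force a small grid of (x, y) pairs;
# this only checks integers in [-5, 5], a heuristic sanity check:
nontrivial = [(x, y)
              for x in range(-5, 6) for y in range(-5, 6)
              if (x, y) != (0, 0) and combo(x, u, y, v) == (0, 0)]
print(nontrivial)             # -> [] : no non-trivial solution found
```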
Linear dependence: 3 vectors
Three vectors {u, v, w} are said to be
- linearly independent if the homogeneous system
  xu + yv + zw = 0
  has only the trivial solution (x, y, z) = (0, 0, 0).
- linearly dependent if the above equation has non-trivial solutions.
Example 1 in the book: u = (1, 2, 3), v = (4, 5, 6), w = (2, 1, 0) (written as column vectors).
- The equation xu + yv + zw = 0 is a homogeneous system.
- Form the coefficient matrix A = [u v w] = [1 4 2; 2 5 1; 3 6 0].
- Apply Gaussian elimination:
  [1 4 2; 2 5 1; 3 6 0] → [1 4 2; 0 -3 -3; 0 -6 -6] → [1 4 2; 0 1 1; 0 0 0] (REF).
- Infinitely many solutions
  ⇒ a non-trivial solution exists
  ⇒ {u, v, w} are linearly dependent.
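The elimination in Example 1 can be replayed step by step in code. A sketch assuming exact rational arithmetic via Python's fractions module; the row operations mirror the arrows above.

```python
from fractions import Fraction

# Replay the Gaussian elimination for A = [u v w] from Example 1.
A = [[Fraction(n) for n in row] for row in ([1, 4, 2], [2, 5, 1], [3, 6, 0])]

def add_multiple(A, src, dst, c):
    """Row operation: R_dst <- R_dst + c * R_src."""
    A[dst] = [a + c * b for a, b in zip(A[dst], A[src])]

add_multiple(A, 0, 1, -2)        # R2 <- R2 - 2 R1  -> [0, -3, -3]
add_multiple(A, 0, 2, -3)        # R3 <- R3 - 3 R1  -> [0, -6, -6]
A[1] = [a / -3 for a in A[1]]    # scale R2         -> [0, 1, 1]
add_multiple(A, 1, 2, 6)         # R3 <- R3 + 6 R2  -> [0, 0, 0]

for row in A:
    print([int(a) for a in row])
# Pivots sit in columns 1 and 2 only; column 3 is non-pivotal, so the
# system has infinitely many solutions and {u, v, w} is dependent.
```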
If you proceed to find the RREF:
[1 4 2; 0 1 1; 0 0 0] (REF) → [1 0 -2; 0 1 1; 0 0 0] (RREF).
You can solve for the solution: (x, y, z) = z(2, -1, 1). Therefore,
2u - 1v + 1w = 0, or w = -2u + 1v.
(Think: what if I write a vertical line separating the last column from the others? Which equation is it?)
Geometrically, this means that w is in Span{u, v}, or that w lies in the plane generated by {u, v}.
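As a quick arithmetic check of the dependence relation above, a sketch using the vectors of Example 1 as plain tuples:

```python
# Verify the relation read off from the RREF: 2u - v + w = 0,
# i.e. w = -2u + v, for u, v, w from Example 1.
u, v, w = (1, 2, 3), (4, 5, 6), (2, 1, 0)

relation = tuple(2 * a - b + c for a, b, c in zip(u, v, w))
print(relation)                                  # -> (0, 0, 0)

w_from_uv = tuple(-2 * a + b for a, b in zip(u, v))
print(w_from_uv == w)                            # -> True: w lies in Span{u, v}
```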
Another example: u = (5, 0, 0), v = (7, 2, -6), w = (9, 4, -8).
- The equation xu + yv + zw = 0 is a homogeneous system.
- Form the coefficient matrix A = [u v w] = [5 7 9; 0 2 4; 0 -6 -8].
- Apply Gaussian elimination:
  [5 7 9; 0 2 4; 0 -6 -8] → [1 7/5 9/5; 0 1 2; 0 0 1] (REF).
- Unique solution
  ⇒ the solution must be trivial
  ⇒ {u, v, w} are linearly independent.
Summary
- To check whether {u, v, w} are linearly independent or not:
  (1) Form the matrix A = [u v w].
  (2) Apply Gaussian elimination to A.
  (3) In the REF of A, if
    - a non-pivotal column exists ⇒ linearly dependent;
    - all columns are pivotal ⇒ linearly independent.
- In the linearly dependent case, to further find the scalars s, t such that w = su + tv, read the entries from the non-pivotal column of the RREF.
- If a set of vectors is linearly independent, they span a subspace of dimension equal to the number of vectors in the set.
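The three-step recipe in the summary can be sketched as a small routine. This is an illustrative implementation, not the book's; the names `ref_pivot_columns` and `independent` are chosen here, and exact rational arithmetic avoids rounding issues.

```python
from fractions import Fraction

def ref_pivot_columns(rows):
    """Pivot-column indices of the REF of a matrix, found by
    Gaussian elimination with exact rational arithmetic."""
    A = [[Fraction(x) for x in row] for row in rows]
    m, n = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(n):
        # find a row at or below r with a non-zero entry in column c
        pivot_row = next((i for i in range(r, m) if A[i][c] != 0), None)
        if pivot_row is None:
            continue                      # column c is non-pivotal
        A[r], A[pivot_row] = A[pivot_row], A[r]
        for i in range(r + 1, m):         # clear entries below the pivot
            factor = A[i][c] / A[r][c]
            A[i] = [a - factor * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    return pivots

def independent(*vectors):
    """The columns of [v1 ... vk] are independent iff every column is pivotal."""
    A = [list(row) for row in zip(*vectors)]   # vectors become columns
    return len(ref_pivot_columns(A)) == len(vectors)

print(independent((1, 2, 3), (4, 5, 6), (2, 1, 0)))    # -> False (Example 1)
print(independent((5, 0, 0), (7, 2, -6), (9, 4, -8)))  # -> True
```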
Some automatic situations
The next two theorems describe special cases in which the linear dependence of a set is automatic.
- (Theorem 8) If a set contains more vectors than there are entries in each vector, then the set is linearly dependent.
- (Ex 15) (5, 1), (2, 8), (1, 3), (-1, 7) are linearly dependent:
  [5 2 1 -1; 1 8 3 7] → ... → [1 0 * *; 0 1 * *] (REF).
  The last two columns are non-pivotal, so non-trivial solutions exist, which means: linearly dependent.
- (Reason: if A has more columns than rows, then it must have non-pivotal columns.)
- (In fact, if you understand the reason, then Gaussian elimination is not even necessary.)
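Theorem 8 can also be illustrated by exhibiting one explicit relation for Ex 15. The coefficients below are not from the slides; they were found by solving the 2x4 homogeneous system and are easy to verify by hand.

```python
# Theorem 8 in action: four vectors in R^2 must be linearly dependent.
vs = [(5, 1), (2, 8), (1, 3), (-1, 7)]
coeffs = [-1, -7, 19, 0]    # one non-trivial solution (not all zero)

# Form the combination -1*(5,1) - 7*(2,8) + 19*(1,3) + 0*(-1,7):
combo = tuple(sum(c * v[i] for c, v in zip(coeffs, vs)) for i in range(2))
print(combo)   # -> (0, 0): a non-trivial relation, so the set is dependent
```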
Some automatic situations
- (Theorem 9) If a set of vectors contains the zero vector, then the set is linearly dependent.
- (Ex 17) (5, -3, -1), (0, 0, 0), (-7, 2, 4) are linearly dependent, because
  0(5, -3, -1) + y(0, 0, 0) + 0(-7, 2, 4) = (0, 0, 0)
  for every y, and we may take y to be non-zero.
- (Again, if you understand the reason, then Gaussian elimination is not even necessary.)
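Theorem 9's reasoning is a one-line computation. A minimal sketch with the vectors of Ex 17; the scalar y = 17 is an arbitrary non-zero choice.

```python
# Theorem 9 in action: any set containing the zero vector is dependent,
# because 0*v1 + y*0 + 0*v3 = 0 for every y (take y non-zero).
v1, zero, v3 = (5, -3, -1), (0, 0, 0), (-7, 2, 4)

y = 17    # any non-zero scalar works
combo = tuple(0 * a + y * b + 0 * c for a, b, c in zip(v1, zero, v3))
print(combo)   # -> (0, 0, 0): a non-trivial solution, so the set is dependent
```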
Appendix: General definition
A set of vectors {v1, v2, ..., vn} is said to be
- linearly independent if the homogeneous system
  x1 v1 + x2 v2 + · · · + xn vn = 0
  has only the trivial solution x1 = x2 = · · · = xn = 0.
- linearly dependent if the above equation has non-trivial solutions.