Your Next Test
Your next test will be on Monday, March 6. It will cover all material up to and including Chapter 10, with
a special focus on Chapters 7, 8, 9, 10. Note that Chapter 10 “Dimension Theorems” will be the main
topic of Tuesday’s lecture. It’s the last chapter of this abstract part of the textbook. We will then move on
to Chapter 14 “Matrix Multiplication”.
§1 – Linear Independence and Linear Dependence
VSF Chapter 7 (pp. 75 – 84)
If you want to justify that a set {~v1 , . . . , ~vm } is linearly independent, you must show that λ1 = . . . = λm = 0
is the only solution to the equation λ1~v1 + . . . + λm~vm = ~0. If you want to justify that a set {~v1 , . . . , ~vm } is
linearly dependent, you must either find a non-trivial solution such as “5 · ~v1 − 3 · ~v2 + 0 · ~v3 = ~0” or express
one of the vectors ~v1 , . . . , ~vm as a linear combination of the others, such as “~v1 = 3/5 · ~v2 + 0 · ~v3 ”.
Justification in Rn and Mmn
Are the following sets linearly independent (“LI”) or linearly dependent (“LD”):
S1 = { (1, 2, 3, 4), (5, 6, 7, 8), (9, 10, 11, 12) } in the vector space R4
S2 = { four given 2 × 2 matrices } in the vector space M22
HINT: Let ~v1 = (1, 2, 3, 4), ~v2 = (5, 6, 7, 8), ~v3 = (9, 10, 11, 12). In order to solve λ1~v1 + λ2~v2 + λ3~v3 = ~0,
merge the vectors on the left-hand side into a single vector. Each entry of the resulting vector involves
the variables λ1 , λ2 , λ3 . You have to ensure that all four entries are equal to zero, and thus solve a
homogeneous linear system with four equations in three variables. The latter can be done using Gauss
elimination. For S2 , use the same strategy.
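If you want to double-check your Gauss elimination at home (not something you can use on the test, of course), here is a small Python/sympy sketch; it assumes you have sympy installed and simply puts ~v1 , ~v2 , ~v3 into the columns of a matrix, so that the homogeneous system above becomes A · λ = ~0:

from sympy import Matrix

# The columns of A are v1, v2, v3, so solving A*(λ1, λ2, λ3) = 0 is exactly the
# homogeneous system described in the hint.
A = Matrix([[1, 5, 9],
            [2, 6, 10],
            [3, 7, 11],
            [4, 8, 12]])

print(A.rref())        # reduced row echelon form plus the pivot columns
print(A.nullspace())   # basis of the solution space; an empty list means LI

For S2 , flatten each 2 × 2 matrix into a vector with four entries and put those vectors into the columns of A in the same way.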
Justification in Pn and F(R)
Check if the following sets are LI or LD:
S3 = { 1 + x^2 , x + 2x^3 , x^2 − x^3 , 3 − 2x − x^3 } in the vector space P3 ⋆
S4 = { x^2 , sin(x), e^x } in the vector space F(R)
HINT: For S3 , recall that P3 denotes the space of polynomials of degree at most 3. Unlike before, there
are no entries to compare, but you can do something very similar. More precisely, let p1(x) = 1 + x^2 ,
p2(x) = x + 2x^3 , p3(x) = x^2 − x^3 , p4(x) = 3 − 2x − x^3 . In order to solve λ1 p1(x) + λ2 p2(x) + λ3 p3(x) + λ4 p4(x) = 0,
merge the polynomials on the left-hand side into a single polynomial. Each coefficient of the resulting
polynomial involves the variables λ1 , λ2 , λ3 , λ4 . Since the set { 1, x, x^2 , x^3 } is linearly independent, which
you may use as a fact, the only possible way to obtain zero on the right-hand side is that all four
coefficients are equal to zero. So, you have to solve a homogeneous linear system with four equations in
four variables. For S4 , try to plug in different values for x. You may want to start with x = 0.
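To make the coefficient comparison for S3 concrete, here is a sketch of the setup (the row reduction is left to you):

\[
\lambda_1(1+x^2)+\lambda_2(x+2x^3)+\lambda_3(x^2-x^3)+\lambda_4(3-2x-x^3)
=(\lambda_1+3\lambda_4)\cdot 1+(\lambda_2-2\lambda_4)\,x+(\lambda_1+\lambda_3)\,x^2+(2\lambda_2-\lambda_3-\lambda_4)\,x^3 ,
\]
so the homogeneous system is
\[
\lambda_1+3\lambda_4=0,\qquad \lambda_2-2\lambda_4=0,\qquad \lambda_1+\lambda_3=0,\qquad 2\lambda_2-\lambda_3-\lambda_4=0 .
\]

For S4 as written above, plugging x = 0 into λ1 x^2 + λ2 sin(x) + λ3 e^x = 0 gives λ1 · 0 + λ2 · 0 + λ3 · 1 = 0, so λ3 = 0; other values of x then take care of λ1 and λ2 .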
Theoretical Questions
Looking at the past tests, you will also find a couple of theoretical questions on LI and LD.
F15–T2 # 6b ⋆   F14–T3 # 6a ⋆   F14–T3 # 7
HINT: As you see, F14–T3 # 7 is a bonus question. You have to show that λ1 = λ2 = λ3 = 0 is the
only solution to λ1~u + λ2~v + λ3~w = ~0. You are not given much information, so you have to use the few
things you have. Observe that if you take the dot product with ~u on either side, then the left-hand side
(λ1~u + λ2~v + λ3~w) · ~u and the right-hand side ~0 · ~u of the equation must remain equal. This will allow you to
justify that λ1 = 0. Using similar arguments, you may then justify that λ2 = 0 and λ3 = 0.
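Spelled out, the dot product step looks like this; note that the computation below assumes the problem states that ~u, ~v, ~w are non-zero and mutually orthogonal (check the exact wording of F14–T3 # 7 — that assumption is what makes the argument work):

\[
(\lambda_1\vec{u}+\lambda_2\vec{v}+\lambda_3\vec{w})\cdot\vec{u}=\vec{0}\cdot\vec{u}
\quad\Longrightarrow\quad
\lambda_1(\vec{u}\cdot\vec{u})+\lambda_2(\vec{v}\cdot\vec{u})+\lambda_3(\vec{w}\cdot\vec{u})=0 .
\]
If \(\vec{v}\cdot\vec{u}=\vec{w}\cdot\vec{u}=0\) and \(\vec{u}\cdot\vec{u}\neq 0\), this collapses to \(\lambda_1(\vec{u}\cdot\vec{u})=0\), hence λ1 = 0.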
Non-Standard Operations
For those of you who like non-standard operations, here is a particularly nice one:
VSF # 7.1k
§2 – Linear Independence and Spanning Sets
VSF Chapter 8 (pp. 87 – 91)
Reducing LD Spanning Sets
There are two useful theorems. The first one says: “If W = span{~v1 , . . . , ~vm } and the set {~v1 , . . . , ~vm } is LD,
then one vector ~vi can be expressed as a linear combination of the others and can therefore be removed. The
remaining set {~v1 , . . . , ~vi−1 , ~vi+1 , . . . , ~vm } will still span W.” This allows us to reduce any LD spanning set
until it is a basis of the vector space. Well, at least any finite LD spanning set . . .
VSF # 8.1b ⋆
Recall the set S1 = { (1, 2, 3, 4), (5, 6, 7, 8), (9, 10, 11, 12) } and let W1 = span(S1 ).
Reduce the spanning set S1 until it is a basis of the vector space W1 .
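One way the reduction can go (a sketch — verify the relation yourself, for instance from the non-trivial solution found in §1):

\[
\vec{v}_3=(9,10,11,12)=2\,(5,6,7,8)-(1,2,3,4)=2\vec{v}_2-\vec{v}_1 ,
\]
so by the first theorem W1 = span{~v1 , ~v2 }; and since ~v1 and ~v2 are not multiples of each other, {~v1 , ~v2 } is LI and therefore a basis of W1 .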
Extending LI Sets
The second one says: “Let V be a vector space. Moreover, let {~v1 , . . . , ~vm } be a LI set of vectors in V and ~v ∈ V.
The set {~v1 , . . . , ~vm , ~v } is LI if and only if ~v ∉ span{~v1 , . . . , ~vm }.” This allows us to increase the size of any LI
set until it is a basis of the vector space.
VSF # 8.1f ⋆   F15–T2 # 3   F14–T3 # 2
Extend your basis of W1 until it is a basis of the surrounding vector space R4.
HINT: We will see on Tuesday: “If dim(V) = n, then (i) any LI set with n vectors is a basis of V and
(ii) any spanning set with n vectors is a basis of V.” Recall that dim(R4) = 4. So, as soon as you have
extended your basis of W1 to a LI set with four vectors in R4, you know that the latter must be a basis
of R4. And there is no need to check that the LI set is spanning. However, it can still be hard work to
find a vector ~v ∈ V that is not yet in span{~v1 , . . . , ~vm }. In Chapter 17.2 we will learn an algorithm that
does this unpleasant work for us. For now, I recommend that you guess a candidate ~v ∈ V and check
that ~v ∉ span{~v1 , . . . , ~vm }. As indicated in Remark 9.1, when you pick a random candidate ~v ∈ V, your
choice will work almost all the time.
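Again, a small Python/sympy sketch for checking a guessed candidate at home; the concrete numbers are only an illustration — it assumes you ended up with { (1, 2, 3, 4), (5, 6, 7, 8) } as your basis of W1 and that your guess is ~v = (1, 0, 0, 0):

from sympy import Matrix

basis_W1 = [[1, 2, 3, 4], [5, 6, 7, 8]]   # rows: a possible basis of W1
candidate = [1, 0, 0, 0]                  # the guessed vector ~v

# The candidate lies outside span(basis_W1) exactly when appending it as an
# extra row increases the rank.
print(Matrix(basis_W1 + [candidate]).rank() > Matrix(basis_W1).rank())
# True means the extended set is still LI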
§3 – Basis and Dimension
VSF Chapter 9 (pp. 93 – 98)
Now comes the really good stuff. This chapter introduced basis and dimension. The OTHER BIG THEOREM
tells us that for any vector space V , and in particular for any subspace, the following inequality holds:
size of any LI set in V ≤ dim(V ) ≤ size of any spanning set of V
Moreover, as mentioned in the hint a few lines above, we will see on Tuesday: “If dim(V ) = n, then (i) any
LI set with n vectors is a basis of V and (ii) any spanning set with n vectors is a basis of V .” So, knowing
the dimension of a vector space tells us a lot about sets of vectors: Any LI set with four vectors in R4 must
be a basis, any set of five vectors in R4 must be LD, . . .
Examples
The following examples are nice because they combine many of the ideas that we have covered so far. I
strongly recommend doing them. If you are short on time, do at least the ones with a red star.
in R4 . . . . . . . :   W15–T3 # 4 ⋆   F14–T3 # 4
in M22 . . . . . :   F15–T2 # 5   F15–T2 # 6c ⋆
in P2 . . . . . . . :   F15–T2 # 4 ⋆
HINT: You shouldn’t have any problems solving F15–T2 # 4bcd. However, F15–T2 # 4a is slightly more
advanced. The polynomials p(x) under consideration are of degree at most 2 and satisfy p(3) = 0. They
are therefore of the form p(x) = (x − 3) · q(x), where q(x) is a polynomial of degree at most 1. This allows
us to conclude that we can describe W in a different way, namely as W = { (x − 3) · (a + bx) | a, b ∈ R }. Now,
write the latter in terms of a · (. . .) + b · (. . .), and you are back on safe ground.
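For instance, the rewriting can be finished as follows (a sketch):

\[
W=\{(x-3)(a+bx)\mid a,b\in\mathbb{R}\}
=\{\,a\cdot(x-3)+b\cdot(x^2-3x)\mid a,b\in\mathbb{R}\,\}
=\operatorname{span}\{\,x-3,\;x^2-3x\,\} ,
\]
and the two spanning polynomials are LI because neither is a scalar multiple of the other, so { x − 3, x^2 − 3x } is a basis of W.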
Theoretical Questions
For the first two of the following problems keep in mind: When you think that some statement is false,
give a counterexample with numbers. The third problem is somewhat different and you will find a hint
below.
F14–T3 # 6b ⋆   W15–T3 # 7a   W15–T3 # 8
HINT: W15–T3 # 8 is again a bonus question. Here is an idea to start. Since ~w is in U = span{ ~u1 , . . . , ~u7 },
we can describe it as a linear combination ~w = λ1~u1 + . . . + λ7~u7 . Because ~w is non-zero, there must
be some λi ≠ 0. On the other hand, since ~w is in V = span{~v1 , . . . , ~v9 }, we can describe it as a linear
combination ~w = µ1~v1 + . . . + µ9~v9 . As a next step, try to use these two linear combinations of ~w to show
that { ~u1 , . . . , ~u7 , ~v1 , . . . , ~v9 } is LD.
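Concretely, subtracting the two linear combinations is one way to finish (a sketch):

\[
\lambda_1\vec{u}_1+\dots+\lambda_7\vec{u}_7-\mu_1\vec{v}_1-\dots-\mu_9\vec{v}_9=\vec{w}-\vec{w}=\vec{0} ,
\]
and since some λi ≠ 0, this is a non-trivial linear combination of ~u1 , . . . , ~u7 , ~v1 , . . . , ~v9 that gives ~0, so the set is LD.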