APSC 174J, Solutions #4
Posted: March 23, 2016
Section 5, Problem 1(c).
To see if {~v2 } is linearly independent, we need to solve the vector equation:
x · ~v2 = ~0
That is, x(3, 0) = (3x, 0) = (0, 0). The only solution is x = 0, so {~v2 } is linearly
independent.
Note: If a set contains only one vector, then it is linearly independent if and only if
the vector is a non-zero vector.
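The note above translates directly into a check: a one-vector set {~v } is linearly independent exactly when ~v is non-zero. A minimal Python sketch (the helper name `single_vector_independent` is ours, written for illustration, not part of the course material):

```python
def single_vector_independent(v):
    # {v} is linearly independent iff v is non-zero:
    # x * v = 0 forces x = 0 exactly when some component of v is non-zero.
    return any(c != 0 for c in v)

print(single_vector_independent((3, 0)))  # True: {~v2} from 1(c) is independent
print(single_vector_independent((0, 0)))  # False: {~0} is always dependent
```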
Section 5, Problem 1(g).
To see if {~v3 , ~v4 } is linearly independent, we need to solve the vector equation:
x1 · ~v3 + x2 · ~v4 = ~0
That is, x1 (0, 0) + x2 (1, 1) = (x2 , x2 ) = (0, 0). The solutions are x2 = 0 with x1
free; for example, choosing x1 = 1 gives
1 · ~v3 + 0 · ~v4 = ~0
This means we have found weights, not all zero, whose linear combination of ~v3 , ~v4
equals the zero vector, so {~v3 , ~v4 } is linearly dependent.
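As a sanity check on the weights found above, the combination can be evaluated numerically (the helper `lin_comb` is ours, written for illustration):

```python
def lin_comb(weights, vectors):
    # Componentwise linear combination: sum_i weights[i] * vectors[i].
    return tuple(sum(w * v[j] for w, v in zip(weights, vectors))
                 for j in range(len(vectors[0])))

v3, v4 = (0, 0), (1, 1)
# The weights x1 = 1, x2 = 0 found above, not all zero:
print(lin_comb((1, 0), (v3, v4)))  # (0, 0), so {~v3, ~v4} is dependent
```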
Section 5, Problem 1(k).
To see if {~v2 , ~v4 , ~v5 } is linearly independent, we need to solve the vector equation:
x1 · ~v2 + x2 · ~v4 + x3 · ~v5 = ~0
That is, x1 (3, 0) + x2 (1, 1) + x3 (2, 1) = (3x1 + x2 + 2x3 , x2 + x3 ) = (0, 0). The equations
are
(
3x1 + x2 + 2x3 = 0
x2 + x3 = 0
The solutions are x2 = −x3 , x1 = −(1/3)x3 with x3 free; for example, choosing
x3 = 3 gives x1 = −1, x2 = −3, and then
−1 · ~v2 + (−3) · ~v4 + 3 · ~v5 = ~0
This means we have found weights, not all zero, whose linear combination of ~v2 , ~v4 , ~v5
equals the zero vector, so {~v2 , ~v4 , ~v5 } is linearly dependent.
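The weights found above can be verified by evaluating the combination directly (the helper `lin_comb` is ours, written for illustration):

```python
def lin_comb(weights, vectors):
    # Componentwise linear combination: sum_i weights[i] * vectors[i].
    return tuple(sum(w * v[j] for w, v in zip(weights, vectors))
                 for j in range(len(vectors[0])))

v2, v4, v5 = (3, 0), (1, 1), (2, 1)
# The weights x1 = -1, x2 = -3, x3 = 3 found above:
print(lin_comb((-1, -3, 3), (v2, v4, v5)))  # (0, 0), so the set is dependent
```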
Section 5, Problem 2(b).
To see if {~v1 , ~v2 } is linearly independent, we need to solve the vector equation:
x1 · ~v1 + x2 · ~v2 = ~0
That is, x1 (1, 0, 0, 0) + x2 (2, 1, 0, 0) = (0, 0, 0, 0). This gives the equations:
x1 + 2x2 = 0
x2 = 0
0=0
0=0
The only solution is x1 = x2 = 0, so {~v1 , ~v2 } is linearly independent.
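Independence can also be checked by row reduction: k vectors are linearly independent exactly when the matrix having them as rows has rank k. A small sketch using exact rational arithmetic (the `rank` helper is ours, not part of the course material):

```python
from fractions import Fraction

def rank(rows):
    # Gaussian elimination over exact rationals; count pivot rows.
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

v1, v2 = (1, 0, 0, 0), (2, 1, 0, 0)
print(rank([v1, v2]))  # 2 = number of vectors, so {~v1, ~v2} is independent
```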
Section 5, Problem 2(f ).
To see if {~v2 , ~v4 } is linearly independent, we need to solve the vector equation:
x1 · ~v2 + x2 · ~v4 = ~0
That is, x1 (2, 1, 0, 0) + x2 (0, 0, 1, 1) = (0, 0, 0, 0). This gives the equations:
2x1 = 0
x1 = 0
x2 = 0
x2 = 0
The only solution is x1 = x2 = 0, so {~v2 , ~v4 } is linearly independent.
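For a pair of vectors there is a shortcut: {~u, ~v } is linearly dependent exactly when one vector is a scalar multiple of the other, i.e. when every 2×2 minor u_i v_j − u_j v_i vanishes. A sketch of that test (the helper name is ours, written for illustration):

```python
def is_scalar_multiple(u, v):
    # u and v are parallel iff all 2x2 minors vanish; for two vectors,
    # parallel is equivalent to linearly dependent (zero vectors included).
    n = len(u)
    return all(u[i] * v[j] == u[j] * v[i] for i in range(n) for j in range(n))

v2, v4 = (2, 1, 0, 0), (0, 0, 1, 1)
print(is_scalar_multiple(v2, v4))  # False: {~v2, ~v4} is independent
```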
Section 5, Problem 2(o).
To see if {~v3 , ~v4 , ~v5 } is linearly independent, we need to solve the vector equation:
x1 · ~v3 + x2 · ~v4 + x3 · ~v5 = ~0
That is, x1 (1, 1, 1, 0) + x2 (0, 0, 1, 1) + x3 (0, 1, 0, 1) = (0, 0, 0, 0). This gives the
equations:
x1 = 0
x1 + x3 = 0
x1 + x2 = 0
x2 + x3 = 0
The only solution is x1 = x2 = x3 = 0, so {~v3 , ~v4 , ~v5 } is linearly independent.
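The same row-reduction idea works for three vectors: they are independent exactly when the 3×4 matrix having them as rows has rank 3. A self-contained sketch (the `rank` helper is ours, not part of the course material):

```python
from fractions import Fraction

def rank(rows):
    # Gaussian elimination over exact rationals; count pivot rows.
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

v3, v4, v5 = (1, 1, 1, 0), (0, 0, 1, 1), (0, 1, 0, 1)
print(rank([v3, v4, v5]))  # 3 = number of vectors, so the set is independent
```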
Section 6, Problem 1(a).
Since our system of linear equations has two equations, we consider vectors in the
space R^2:
~v = (2, 4), ~w = (4, 5)
The system has a solution if and only if ~w is in the linear span of ~v . It is easy to
see that ~w is not in the linear span of ~v (no scalar multiple of (2, 4) equals (4, 5)),
so the system has no solutions.
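Membership in the span of a single non-zero vector is just a test of whether the two vectors are parallel, which can be written as the vanishing of all 2×2 minors (the helper `in_span_of_one` is ours, written for illustration):

```python
def in_span_of_one(w, v):
    # For non-zero v, w lies in span{v} iff w is a scalar multiple of v,
    # i.e. all 2x2 minors w[i]*v[j] - w[j]*v[i] vanish.
    n = len(v)
    return all(w[i] * v[j] == w[j] * v[i] for i in range(n) for j in range(n))

v, w = (2, 4), (4, 5)
print(in_span_of_one(w, v))  # False: the system has no solutions
```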
Section 6, Problem 1(b).
Since our system of linear equations has two equations, we consider vectors in the
space R^2:
~v = (2, 4), ~w = (4, 8)
The system has a solution if and only if ~w is in the linear span of ~v . Since ~w = 2~v ,
~w is in the linear span of ~v , so the system has solutions. To determine whether there
is a unique solution or infinitely many solutions, we need to check whether {~v } is
linearly independent or dependent. Since there is only one vector and ~v ≠ ~0, the set
{~v } is linearly independent. Hence the system has a unique solution.
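The claim ~w = 2~v can be checked in one line (illustrative only):

```python
v, w = (2, 4), (4, 8)
print(tuple(2 * c for c in v) == w)  # True: ~w = 2~v lies in span{~v}
```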
Section 6, Problem 1(g).
Since our system of linear equations has two equations, we consider vectors in the
space R^2:
~v1 = (1, 1), ~v2 = (1, 1), ~v3 = (−1, 0), ~v4 = (−1, 0), ~w = (0, 2)
The system has a solution if and only if ~w is in the linear span of {~v1 , ~v2 , ~v3 , ~v4 }.
Observe that ~w = ~v1 + ~v2 + ~v3 + ~v4 , so ~w is in the linear span of {~v1 , ~v2 , ~v3 , ~v4 },
hence the system has solutions. To determine whether there is a unique solution or
infinitely many solutions, we need to determine whether {~v1 , ~v2 , ~v3 , ~v4 } is linearly
independent or dependent. It is easy to see that ~v1 , ~v2 , ~v3 , ~v4 are linearly dependent;
for example,
1 · ~v1 + (−1) · ~v2 + 0 · ~v3 + 0 · ~v4 = ~0
This means we have found weights, not all zero, whose linear combination of ~v1 , ~v2 , ~v3 , ~v4
equals the zero vector, so {~v1 , ~v2 , ~v3 , ~v4 } is linearly dependent.
Hence the system has infinitely many solutions.
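Both observations, the particular solution ~w = ~v1 + ~v2 + ~v3 + ~v4 and the dependence relation ~v1 − ~v2 = ~0, can be verified numerically (the helper `lin_comb` is ours, written for illustration):

```python
def lin_comb(weights, vectors):
    # Componentwise linear combination: sum_i weights[i] * vectors[i].
    return tuple(sum(w * v[j] for w, v in zip(weights, vectors))
                 for j in range(len(vectors[0])))

v1, v2, v3, v4 = (1, 1), (1, 1), (-1, 0), (-1, 0)
w = (0, 2)
print(lin_comb((1, 1, 1, 1), (v1, v2, v3, v4)) == w)  # True: a solution exists
print(lin_comb((1, -1, 0, 0), (v1, v2, v3, v4)))      # (0, 0): dependent set
```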