CM222A
LINEAR ALGEBRA
Solutions 1
1. Determine whether the following sets of vectors are linearly independent subsets of the indicated vector space V.
(i) {(1, −1, 4), (1, 1, 5), (1, −5, 2)}, V = R^3.
(ii) {(2, 4, −4, 4), (2, −1, 0, 1), (2, 3, −2, 2), (1, −1, −2, 4)}, V = R^4.
(iii) {(3, 0, 2, −1), (4, −1, 1, −4), (2, 1, 1, 0), (3, −2, 3, −2)}, V = R^4.
(i) Put vectors in rows and reduce to echelon form.
\[
\begin{pmatrix} 1 & -1 & 4 \\ 1 & 1 & 5 \\ 1 & -5 & 2 \end{pmatrix}
\xrightarrow[R_3' = R_3 - R_1]{R_2' = R_2 - R_1}
\begin{pmatrix} 1 & -1 & 4 \\ 0 & 2 & 1 \\ 0 & -4 & -2 \end{pmatrix}
\xrightarrow{R_3' = R_3 + 2R_2}
\begin{pmatrix} 1 & -1 & 4 \\ 0 & 2 & 1 \\ 0 & 0 & 0 \end{pmatrix}
\]
and since the last row is zero, the vectors are linearly dependent.
[Of course, simply writing 3(1, −1, 4) − 2(1, 1, 5) − (1, −5, 2) = (0, 0, 0) would be a correct answer, but that
would hide the way the work is done.]
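[As a quick numerical cross-check, not part of the original solution, a short Python sketch can confirm both the explicit dependence relation and the rank of the matrix of row vectors; it assumes NumPy is available.

    import numpy as np

    # Vectors from 1(i) as rows.
    v1, v2, v3 = np.array([1, -1, 4]), np.array([1, 1, 5]), np.array([1, -5, 2])

    # The explicit relation 3*v1 - 2*v2 - v3 = 0 exhibits the linear dependence.
    print(3 * v1 - 2 * v2 - v3)                            # [0 0 0]

    # Equivalently, the rank of the 3x3 matrix is less than 3.
    print(np.linalg.matrix_rank(np.vstack([v1, v2, v3])))  # 2
]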
(ii)
\[
\begin{pmatrix} 2 & 4 & -4 & 4 \\ 2 & -1 & 0 & 1 \\ 2 & 3 & -2 & 2 \\ 1 & -1 & -2 & 4 \end{pmatrix}
\xrightarrow[\substack{R_3' = R_3 - R_1 \\ R_4' = R_4 - \frac{1}{2} R_1}]{R_2' = R_2 - R_1}
\begin{pmatrix} 2 & 4 & -4 & 4 \\ 0 & -5 & 4 & -3 \\ 0 & -1 & 2 & -2 \\ 0 & -3 & 0 & 2 \end{pmatrix}
\xrightarrow[\substack{R_3' = R_3 - 5R_2 \\ R_4' = R_4 - 3R_2}]{R_2 \leftrightarrow R_3 \text{, then}}
\begin{pmatrix} 2 & 4 & -4 & 4 \\ 0 & -1 & 2 & -2 \\ 0 & 0 & -6 & 7 \\ 0 & 0 & -6 & 8 \end{pmatrix}
\xrightarrow{R_4' = R_4 - R_3}
\begin{pmatrix} 2 & 4 & -4 & 4 \\ 0 & -1 & 2 & -2 \\ 0 & 0 & -6 & 7 \\ 0 & 0 & 0 & 1 \end{pmatrix}
\]
so with no zero rows, the vectors are linearly independent.
(iii) The same procedure as in (ii) results in a row of zeros and so shows that these vectors are linearly
dependent.
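[The same rank test settles parts (ii) and (iii) numerically, using the vectors as listed above; a minimal sketch, again assuming NumPy, and not part of the original solution.

    import numpy as np

    # Part (ii): rank 4 equals the number of vectors, so they are linearly independent.
    A2 = np.array([[2, 4, -4, 4], [2, -1, 0, 1], [2, 3, -2, 2], [1, -1, -2, 4]])
    print(np.linalg.matrix_rank(A2))   # 4

    # Part (iii): rank 3 < 4, so the vectors are linearly dependent.
    A3 = np.array([[3, 0, 2, -1], [4, -1, 1, -4], [2, 1, 1, 0], [3, -2, 3, -2]])
    print(np.linalg.matrix_rank(A3))   # 3
]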
2. For what real values of a and b is {(1, b, 0), (a, a, 1), (0, 0, 1 + a)} a linearly dependent subset of R^3?
\[
\begin{pmatrix} 1 & b & 0 \\ a & a & 1 \\ 0 & 0 & 1+a \end{pmatrix}
\xrightarrow{R_2' = R_2 - aR_1}
\begin{pmatrix} 1 & b & 0 \\ 0 & a-ab & 1 \\ 0 & 0 & 1+a \end{pmatrix}.
\]
If a = 0 or b = 1 then a − ab = 0, so the second row becomes (0, 0, 1) and the operation R_3' = R_3 − (1 + a)R_2 results in a zero row. If a = −1 the last row is already zero. Hence the vectors are linearly dependent if a = 0, a = −1 or b = 1.
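[Equivalently, the three row vectors are linearly dependent exactly when the determinant of the matrix above vanishes; a short symbolic sketch, assuming SymPy is installed and not part of the original solution, factors the determinant as a(1 − b)(1 + a), which is zero precisely for a = 0, a = −1 or b = 1.

    import sympy as sp

    a, b = sp.symbols('a b')
    M = sp.Matrix([[1, b, 0], [a, a, 1], [0, 0, 1 + a]])

    # The determinant factors as a*(1 - b)*(1 + a) (up to the sign of the factors),
    # so it vanishes exactly when a = 0, a = -1 or b = 1.
    print(sp.factor(M.det()))
]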
3. Let A = \begin{pmatrix} a & x & e \\ t & i & c \\ h & m & s \end{pmatrix} and let e_1, e_2, e_3 be the usual basis. Write down:
e_3^t A e_2, (A e_1)^t, e_1^t A e_3, e_3^t A e_2, e_1^t A e_1, e_2^t A, e_3^t A e_3.
The answer is: m, ath, e, m, a, tic, s.
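[Since e_i^t A e_j simply picks out the (i, j) entry of A, while A e_j is the j-th column and e_i^t A is the i-th row, the answer can be read off by indexing; a small illustrative Python sketch, not part of the original solution, with the entries written as strings purely for display.

    A = [["a", "x", "e"],
         ["t", "i", "c"],
         ["h", "m", "s"]]

    # e_i^t A e_j is the (i, j) entry, A e_1 is column 1, e_2^t A is row 2.
    pieces = [
        A[2][1],                              # e3^t A e2   -> "m"
        "".join(A[k][0] for k in range(3)),   # (A e1)^t    -> "ath"
        A[0][2],                              # e1^t A e3   -> "e"
        A[2][1],                              # e3^t A e2   -> "m"
        A[0][0],                              # e1^t A e1   -> "a"
        "".join(A[1]),                        # e2^t A      -> "tic"
        A[2][2],                              # e3^t A e3   -> "s"
    ]
    print("".join(pieces))                    # mathematics
]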
4. (i) Find a basis of R^3 containing (2, 1, 3) and (1, 1, 1).
(ii) Find a basis of R^5 containing (1, 4, 7, 0, 3), (2, 9, 13, 2, 6) and (1, 5, 6, 2, 8).
(i) Note that
\[
\begin{pmatrix} 1 & 1 & 1 \\ 2 & 1 & 3 \end{pmatrix}
\xrightarrow{R_2' = R_2 - 2R_1}
\begin{pmatrix} 1 & 1 & 1 \\ 0 & -1 & 1 \end{pmatrix}.
\]
Clearly adding (0, 0, 1) to these vectors
gives 3 vectors in echelon form with non-zero entries in all the diagonal positions and hence a basis. Therefore
{(2, 1, 3), (1, 1, 1), (0, 0, 1)} is a basis of R^3. [There are other correct answers.]
(ii)
\[
\begin{pmatrix} 1 & 4 & 7 & 0 & 3 \\ 2 & 9 & 13 & 2 & 6 \\ 1 & 5 & 6 & 2 & 8 \end{pmatrix}
\xrightarrow[R_3' = R_3 - R_1]{R_2' = R_2 - 2R_1}
\begin{pmatrix} 1 & 4 & 7 & 0 & 3 \\ 0 & 1 & -1 & 2 & 0 \\ 0 & 1 & -1 & 2 & 5 \end{pmatrix}
\xrightarrow{R_3' = R_3 - R_2}
\begin{pmatrix} 1 & 4 & 7 & 0 & 3 \\ 0 & 1 & -1 & 2 & 0 \\ 0 & 0 & 0 & 0 & 5 \end{pmatrix}.
\]
Thus suitable vectors to add are (0, 0, 1, 0, 0) and (0, 0, 0, 1, 0).
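[To verify that the proposed extensions really are bases, it is enough to check that the resulting square matrices have full rank; a brief numerical sketch, assuming NumPy, and not part of the original solution.

    import numpy as np

    # Part (i): the two given vectors together with (0, 0, 1).
    B1 = np.array([[2, 1, 3], [1, 1, 1], [0, 0, 1]])
    print(np.linalg.matrix_rank(B1))   # 3, so the three vectors form a basis of R^3

    # Part (ii): the three given vectors with (0,0,1,0,0) and (0,0,0,1,0) appended.
    B2 = np.array([[1, 4, 7, 0, 3],
                   [2, 9, 13, 2, 6],
                   [1, 5, 6, 2, 8],
                   [0, 0, 1, 0, 0],
                   [0, 0, 0, 1, 0]])
    print(np.linalg.matrix_rank(B2))   # 5, so the five vectors form a basis of R^5
]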
5. Prove that a set S of vectors of a vector space V is a basis if and only if
(i) S is linearly independent, and
(ii) every subset of V which properly contains S is linearly dependent.
Note: Properly contains means “contains and is not equal to”.
A set satisfying (i) and (ii) above is said to be a maximal linearly independent set.
This is little more than a paraphrase of the definition of a basis.
If S = {v_1, v_2, \ldots, v_n} is a basis it is certainly linearly independent. Also, if a set T properly contains S then
there is a vector w ∈ T with w ∉ S. If w = 0 then T is a linearly dependent set. Otherwise, since S is
spanning, w = \sum_{i=1}^{n} α_i v_i for some scalars α_i, and w − \sum_{i=1}^{n} α_i v_i = 0 shows that T is linearly dependent.
Conversely, if S satisfies (i) and (ii) it is linearly independent. To show that it is spanning, let v be any
vector. If v ∈ S then clearly v ∈ span S. If v ∉ S then S ∪ {v} is linearly dependent, so for v_i ∈ S we have
αv + \sum_i α_i v_i = 0 for some scalars α, α_i which are not all 0. We cannot have α = 0, for then S would be
linearly dependent. Therefore we can divide by α and rearrange the above relation to show that v ∈ span S.
Thus S is spanning. [Alternatively one could say that, since S is linearly independent, by Theorem 1.3 it
can be extended to a basis. But from (ii) any strictly larger set cannot be a basis, so S itself must be a
basis.]
2006/07