Basics from linear algebra
Definition. A vector space is a set V with an operation of addition
+ : V × V → V, denoted ~v + ~w = +(~v, ~w), where ~v, ~w ∈ V,
and an operation of multiplication by a scalar
· : R × V → V, denoted r~v = ·(r, ~v), where r ∈ R and ~v ∈ V,
such that the following hold:
(1) We have ~v + (~w + ~u) = (~v + ~w) + ~u for all ~v, ~w, ~u ∈ V.
(2) We have ~v + ~w = ~w + ~v for all ~v, ~w ∈ V.
(3) There exists an element ~0 ∈ V such that for every ~v ∈ V we have
~0 + ~v = ~v + ~0 = ~v. (One can prove that if an element ~0 with this
property exists, then such an element is unique.)
(4) For every ~v ∈ V there exists an element ~w ∈ V such that
~v + ~w = ~w + ~v = ~0. Again, one can show that for any given ~v an
element ~w with this property is unique, and it is denoted ~w = −~v.
(5) For every ~v ∈ V we have 1 · ~v = ~v.
(6) For every r ∈ R and all ~v, ~w ∈ V we have r(~v + ~w) = r~v + r~w.
(7) For every r, s ∈ R and every ~v ∈ V we have (r + s)~v = r~v + s~v.
Elements ~v of a vector space V are called vectors.
Examples:
(1) If n ≥ 1 is an integer, then the Euclidean space Rn , with the standard
operations of addition and multiplication by a scalar, is a vector
space.
(2) The set Mn,n (R) of all n×n matrices with entries in R, with the standard operations of matrix addition and multiplication by a scalar, is
a vector space.
(3) If X is a nonempty set, then the set F (X, R) of all functions f :
X → R, with point-wise addition and point-wise multiplication by
a scalar, is a vector space.
That is, for f, g : X → R, f + g : X → R is defined as (f + g)(x) =
f (x)+g(x) for all x ∈ X. Similarly, if r ∈ R and f : X → R, then the
function rf : X → R is defined as (rf )(x) := rf (x), where x ∈ X.
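The pointwise operations from example (3) are easy to model in code: a function f : X → R becomes a callable, and addition and scalar multiplication act pointwise. This is only an illustrative sketch; the helper names `add` and `scale` are hypothetical, not from the text.

```python
# Sketch of the vector-space operations on F(X, R) from example (3).
# Functions are represented as Python callables; addition and scalar
# multiplication are defined pointwise. Helper names are illustrative.

def add(f, g):
    """Pointwise sum: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(r, f):
    """Pointwise scalar multiple: (r f)(x) = r * f(x)."""
    return lambda x: r * f(x)

f = lambda x: x * x      # f(x) = x^2
g = lambda x: 3 * x      # g(x) = 3x

h = add(f, scale(2.0, g))   # h(x) = x^2 + 6x
print(h(1.0))               # 7.0
```

Note that the zero vector of F(X, R) is the constant function `lambda x: 0`, matching axiom (3).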
Basic properties of vector spaces. Let V be a vector space. Then:
(1) We have 0 · ~v = ~0 for every ~v ∈ V .
(2) We have (−1) · ~v = −~v for all ~v ∈ V .
Definition. Let V be a vector space and let ~v1, . . . , ~vm ∈ V be m vectors
in V (where m ≥ 1). We say that ~v1, . . . , ~vm are linearly independent in
V if whenever c1, . . . , cm ∈ R are such that c1~v1 + · · · + cm~vm = ~0, then
c1 = · · · = cm = 0. The vectors ~v1, . . . , ~vm are linearly dependent if they
are not linearly independent.
Thus ~v1, . . . , ~vm are linearly dependent if and only if there exist c1, . . . , cm ∈
R such that c1~v1 + · · · + cm~vm = ~0 but ci ≠ 0 for some i.
Example.
(1) The vectors ~v1 = (0, 1, 3), ~v2 = (−1, 1, 2) ∈ R3 are linearly independent in R3.
(2) The vectors ~v1 = (0, 1, 3), ~v2 = (−1, 1, 2), ~v3 = (−2, 3, 7) ∈ R3 are
linearly dependent in R3. Indeed 1 · ~v1 + 2~v2 + (−1)~v3 = (0, 0, 0) = ~0.
(3) The vectors ~v1 = (0, 1, 3), ~v2 = (0, 0, 0) ∈ R3 are linearly dependent
in R3. Indeed, 0 · ~v1 + 1 · ~v2 = (0, 0, 0) = ~0 and 1 ≠ 0.
(4) The functions x, x^2, 5x^3 are linearly independent in F(R, R) (try to
prove this fact).
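The dependence relations claimed in examples (2) and (3) can be checked by direct computation. A minimal sketch; the helper name `comb` is hypothetical:

```python
# Check the dependence relation from example (2):
# 1*v1 + 2*v2 + (-1)*v3 = (0, 0, 0) in R^3.

def comb(coeffs, vectors):
    """Linear combination c1*v1 + ... + cm*vm of vectors in R^n."""
    n = len(vectors[0])
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(n))

v1, v2, v3 = (0, 1, 3), (-1, 1, 2), (-2, 3, 7)
print(comb([1, 2, -1], [v1, v2, v3]))   # (0, 0, 0): the vectors are dependent
```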
Recall that the Euclidean space Rn is also equipped with the dot-product
operation (x^1, . . . , x^n) · (y^1, . . . , y^n) = x^1 y^1 + · · · + x^n y^n.
Recall that for ~x = (x^1, . . . , x^n) ∈ Rn the norm or length of ~x is
||~x|| := √(~x · ~x) = √((x^1)^2 + · · · + (x^n)^2).
Thus we always have ||~x|| ≥ 0 and, moreover ||~x|| = 0 if and only if ~x = ~0.
A system of vectors ~v1, . . . , ~vm ∈ Rn is called orthogonal if ~vi · ~vj = 0 for
all i ≠ j, 1 ≤ i, j ≤ m.
Fact: Let ~v1, . . . , ~vm ∈ Rn be an orthogonal system of vectors such that
~vi ≠ ~0 for i = 1, . . . , m. Then the vectors ~v1, . . . , ~vm are linearly independent.
Proof. Suppose c1, . . . , cm ∈ R are such that
c1~v1 + · · · + cm~vm = ~0.
Let i ∈ {1, . . . , m} be arbitrary, and take the dot-product of the above
equation with ~vi. Then
(c1~v1 + · · · + cm~vm) · ~vi = ~0 · ~vi = 0,
so that
c1(~v1 · ~vi) + · · · + cm(~vm · ~vi) = 0.
Because ~v1, . . . , ~vm is, by assumption, an orthogonal system, in the above
sum all terms ~vj · ~vi with j ≠ i are equal to 0. Thus we get
ci(~vi · ~vi) = ci ||~vi||^2 = 0.
Since, again by assumption, ~vi ≠ ~0, we have ||~vi|| > 0. Therefore from
ci ||~vi||^2 = 0 we get ci = 0. Since i ∈ {1, . . . , m} was arbitrary, we conclude
that c1 = · · · = cm = 0.
Thus ~v1, . . . , ~vm are linearly independent, as claimed.
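The key step of the proof is that dotting a linear combination with ~vi isolates the coefficient ci. This can be illustrated numerically: for an orthogonal system, each coefficient is recovered as (~w · ~vi)/||~vi||^2. The vectors and helper names below are hypothetical examples, not from the text.

```python
# Illustration of the proof's key step: dotting a combination w of an
# orthogonal system with v_i yields c_i * ||v_i||^2, so c_i can be
# recovered as (w . v_i) / ||v_i||^2.

def dot(x, y):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(x, y))

# A hypothetical orthogonal system of nonzero vectors in R^3.
v1, v2, v3 = (1, 1, 0), (1, -1, 0), (0, 0, 2)
assert dot(v1, v2) == dot(v1, v3) == dot(v2, v3) == 0

c = (2.0, -3.0, 0.5)
w = tuple(c[0] * a + c[1] * b + c[2] * d for a, b, d in zip(v1, v2, v3))

recovered = tuple(dot(w, v) / dot(v, v) for v in (v1, v2, v3))
print(recovered)   # (2.0, -3.0, 0.5)
```

In particular, if w were the zero vector, every recovered coefficient would be 0, which is exactly the statement of the Fact.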
Definition. Let V be a vector space and let W ⊆ V be a subset.
The subset W is called a linear subspace of V if it satisfies the following
properties:
(1) ~0 ∈ W .
(2) Whenever ~v ∈ W and r ∈ R then r~v ∈ W .
(3) For every ~v, ~w ∈ W we have ~v + ~w ∈ W.
If W is a linear subspace of V , we write W ≤ V .
Note that if W ≤ V then W is itself a vector space, with the operations
of addition and multiplication by a scalar restricted from V .
Example:
(1) The set W = {(x, y) ∈ R2 |y = 3x} is a linear subspace of R2 .
(2) The set W = {(x, y) ∈ R2 |y = 3x + 1} is not a linear subspace of R2 .
(3) The set W = {f : R → R : f (3) = 0} is a linear subspace of F (R, R).
(4) The set W = {f : R → R : f (3) = 2f (5)} is a linear subspace of
F (R, R).
(5) The set W = {f : R → R : f is continuous} is a linear subspace of
F (R, R).
(6) If A ∈ M2,2(R) is a 2 × 2 matrix, then
ker(A) := {(x^1, x^2) ∈ R2 : A (x^1, x^2)^T = (0, 0)^T}
is a linear subspace of R2. (Here (x^1, x^2)^T denotes the corresponding
column vector.)
(7) Let V be a vector space and let S ⊆ V be a nonempty subset. The
span of S is defined as:
Span(S) := {r1~v1 + · · · + rn~vn | n ≥ 1, ~v1, . . . , ~vn ∈ S, and r1, . . . , rn ∈ R}.
(Note that n in the above definition is not fixed, so that Span(S)
consists of all finite linear combinations of elements of S.)
Then Span(S) is a linear subspace of V.
(8) For S = {(0, 1, 2), (0, 0, −1)} ⊆ R3 try to prove that Span(S) =
{(0, y, z) | y, z ∈ R are arbitrary}.
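For example (8), the coefficients can be written down explicitly: r1(0, 1, 2) + r2(0, 0, −1) = (0, y, z) forces r1 = y and r2 = 2y − z. A quick numerical check of this observation (the helper names are hypothetical):

```python
# Example (8): every (0, y, z) lies in Span{(0,1,2), (0,0,-1)}, since
# y*(0,1,2) + (2*y - z)*(0,0,-1) = (0, y, z).

def in_span_coeffs(y, z):
    """Coefficients (r1, r2) with r1*(0,1,2) + r2*(0,0,-1) = (0,y,z)."""
    return y, 2 * y - z

def comb(r1, r2):
    s, t = (0, 1, 2), (0, 0, -1)
    return tuple(r1 * a + r2 * b for a, b in zip(s, t))

r1, r2 = in_span_coeffs(4, 7)
print(comb(r1, r2))   # (0, 4, 7)
```

The converse inclusion is immediate: every combination of the two spanning vectors has first coordinate 0.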
Definition. Let V be a vector space. A collection of vectors ~v1 , . . . , ~vn ∈
V is called a basis of V if the vectors ~v1 , . . . , ~vn are linearly independent and
if Span(~v1 , . . . , ~vn ) = V .
Fact. A collection of vectors ~v1 , . . . , ~vn ∈ V is a basis of V if and only if
for every ~v ∈ V there exists a unique n-tuple of real numbers c1 , . . . , cn such
that c1~v1 + · · · + cn~vn = ~v .
Basic properties of bases:
(1) If ~v1, . . . , ~vn ∈ V and ~w1, . . . , ~wm ∈ V are bases of V then n = m.
For this reason, if a vector space V admits a finite basis ~v1 , . . . , ~vn ∈
V , then the number n is called the dimension of V and denoted
n = dim V . If a vector space V does not admit a finite basis, we set
dim V := ∞.
(2) If ~v1 , . . . , ~vm ∈ V is a linearly independent collection of vectors then
~v1 , . . . , ~vm is a basis of the linear subspace Span(~v1 , . . . , ~vm ).
(3) If dim V = n < ∞ and ~v1, . . . , ~vm ∈ V is a linearly independent collection of vectors, then m ≤ n and there exist vectors ~vm+1, . . . , ~vn ∈ V
such that ~v1, . . . , ~vm, ~vm+1, . . . , ~vn is a basis of V.
(4) If dim(V ) = n < ∞ and if W ≤ V then dim W ≤ n.
Examples:
(1) dim Rn = n and ~e1, . . . , ~en is a basis of Rn, where ~ei = (0, . . . , 1, . . . , 0)
with 1 occurring in the i-th position.
(2) dim F(X, R) = |X|, the cardinality of the set X, whenever X is finite.
In particular, dim F(X, R) < ∞ if and only if X is a finite set.
(3) dim Mn,n (R) = n2 .
(4) Let ~v1, . . . , ~vn ∈ Rn be n vectors in Rn and let A = [~v1 | ~v2 | . . . | ~vn] be
the n × n matrix with the i-th column being the vector ~vi.
Then ~v1, . . . , ~vn is a basis of Rn if and only if det(A) ≠ 0.
(5) Let ~v1, . . . , ~vn ∈ Rn be an orthogonal system of vectors such that ~vi ≠ ~0
for i = 1, . . . , n. Then the vectors ~v1, . . . , ~vn form a basis of Rn.
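The determinant criterion in example (4) can be checked directly against the triple from the earlier linear-dependence example. A hand-coded 3 × 3 determinant sketch (the helper name `det3` is hypothetical):

```python
# Example (4): v1, ..., vn form a basis of R^n iff det[v1|...|vn] != 0.
# The dependent triple (0,1,3), (-1,1,2), (-2,3,7) from earlier gives
# determinant 0, while the standard basis e1, e2, e3 gives determinant 1.

def det3(c1, c2, c3):
    """Determinant of the 3x3 matrix with columns c1, c2, c3."""
    (a, d, g), (b, e, h), (c, f, i) = c1, c2, c3
    # Cofactor expansion along the first row of [[a,b,c],[d,e,f],[g,h,i]].
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(det3((0, 1, 3), (-1, 1, 2), (-2, 3, 7)))   # 0: not a basis
print(det3((1, 0, 0), (0, 1, 0), (0, 0, 1)))     # 1: the standard basis
```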
Definition. Let V and W be vector spaces. A function T : V → W is
called a linear map if for every ~v1 , ~v2 ∈ V and r1 , r2 ∈ R we have
T(r1~v1 + r2~v2) = r1 T(~v1) + r2 T(~v2).
Basic facts:
(1) If T : V → W is a linear map then T (~0) = ~0.
(2) If T : V → W is a linear map then ker(T ) := {~v ∈ V |T (~v ) = ~0} is a
linear subspace of V .
(3) If T : V → W is a linear map then T (V ) = {T (~v )|~v ∈ V } is a linear
subspace of W .
(4) Let V, W be vector spaces, and let ~v1 , . . . ~vn be a basis of V . Let
T, S : V → W be linear maps such that for i = 1, . . . , n we have
T (~vi ) = S(~vi ). Then T = S as functions, that is, T (~v ) = S(~v ) for all
~v ∈ V .
(5) Let V, W be vector spaces, let ~v1, . . . , ~vn be a basis of V, and let
~w1, . . . , ~wn ∈ W be arbitrary.
Then there exists a unique linear map T : V → W such that
T(~vi) = ~wi for i = 1, . . . , n.
Examples.
(1) The function T : R2 → R given by the formula T(x, y) = 3x − 5y is
a linear map.
(2) The function T : R2 → R given by the formula T(x, y) = 3x − 5y + 4
is not a linear map.
(3) Consider the function T : F (R, R) → R2 given by T (f ) = (f (0) +
3f (1), −5f (20)), where f : R → R is arbitrary. Then T is a linear
map.
(4) Consider the function T : M2,2(R) → R3 given by
T([x^1 x^2; x^3 x^4]) = (x^1 − 2x^3, 2x^2 + 5x^1, x^1 + x^2 + 4x^3),
where [x^1 x^2; x^3 x^4] denotes the 2 × 2 matrix with rows (x^1, x^2)
and (x^3, x^4). Then T is a linear map.
(5) Let A be an m × n matrix with entries in R. Consider the map
T : Rn → Rm given by T(~x) := A~x (where we think of elements of Rn
and of Rm as column-vectors and where the right-hand side of the
preceding formula refers to the matrix product).
Then T is a linear map.
(6) Let ~w ∈ Rn be an arbitrary fixed vector. Consider the function
T : Rn → R given by T(~x) := ~x · ~w. Then T is a linear map.
(7) Let ~v1, . . . , ~vn−1 ∈ Rn be arbitrary fixed n − 1 vectors.
Consider the map T : Rn → R given by T(~x) = det[~v1 | . . . | ~vn−1 | ~x]
for ~x ∈ Rn. Then T is a linear map.
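Example (5) can be sketched with a plain matrix-vector product together with a spot-check of the defining linearity identity. The matrix A below and the helper name `matvec` are hypothetical, chosen only for illustration:

```python
# Example (5): T : R^n -> R^m, T(x) = A x, is linear. Here A is a
# hypothetical 2 x 3 matrix, so T : R^3 -> R^2.

def matvec(A, x):
    """Matrix-vector product A x, with A given as a list of rows."""
    return tuple(sum(a * xi for a, xi in zip(row, x)) for row in A)

A = [(1, 0, 2),
     (0, -1, 3)]

x = (1, 2, 3)
y = (0, 1, -1)
r1, r2 = 2, -1

# Linearity: T(r1*x + r2*y) should equal r1*T(x) + r2*T(y).
lhs = matvec(A, tuple(r1 * a + r2 * b for a, b in zip(x, y)))
rhs = tuple(r1 * a + r2 * b for a, b in zip(matvec(A, x), matvec(A, y)))
print(lhs == rhs)   # True
```

A single numerical check does not prove linearity, of course; the general statement follows from distributivity of the matrix product, as the text asserts.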