CS131 Mathematics for Computer Scientists II
Part V, Abstract Algebra
Note 30
VECTOR SPACES
We now look at a useful generalisation of the set R^n of vectors in n-dimensional space.
A vector space V over a field F is a set V (whose elements are called vectors)
together with an operation (called scalar multiplication) which associates
a vector λ v with every v ∈ V and every λ ∈ F , such that the following
properties hold:
[V1 ] V is an abelian group (as usual we write the group operation as +
and the identity element as 0)
[V2 ] 1v = v for all v ∈ V
[V3 ] λ(µ v ) = (λ µ)v for all v ∈ V and all λ, µ ∈ F
[V4 ] (the distributive laws) for all v , w ∈ V and all λ, µ ∈ F :
λ(v + w ) = λ v + λ w
(λ + µ)v = λ v + µ v
Elements of the field F are called scalars.
When the field of scalars is R we talk of a real vector space; when it is C
then we talk of a complex vector space.
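For the familiar case V = R^3 over F = R the axioms can be checked numerically. The following is a minimal Python sketch, not part of the original notes; the helper names vadd and smul are our own:

def vadd(u, v):
    # vector addition, the abelian group operation of [V1]
    return tuple(ui + vi for ui, vi in zip(u, v))

def smul(lam, v):
    # scalar multiplication lambda * v
    return tuple(lam * vi for vi in v)

u, v = (1.0, 2.0, 3.0), (-4.0, 0.5, 2.0)
lam, mu = 2.5, -3.0
zero = (0.0, 0.0, 0.0)

assert vadd(u, v) == vadd(v, u) and vadd(v, zero) == v             # [V1] (abelian group)
assert smul(1, v) == v                                             # [V2]
assert smul(lam, smul(mu, v)) == smul(lam * mu, v)                 # [V3]
assert vadd(smul(lam, u), smul(lam, v)) == smul(lam, vadd(u, v))   # [V4], first law
assert vadd(smul(lam, v), smul(mu, v)) == smul(lam + mu, v)        # [V4], second law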
Examples of Real Vector Spaces.
(1) R^n is a real vector space
(2) the set Pn = {a0 + a1x + a2x^2 + · · · + an−1x^{n−1} | a0 , a1 , . . . , an−1 ∈ R}
of all polynomials of degree less than n is a real vector space
(3) The set of all functions from R to R is a real vector space. If f and g
are functions from R to R and λ ∈ R, then the functions f + g and
λ f are defined by
(f + g)(x ) = f (x ) + g(x )
(λ f )(x ) = λ f (x )
for all x ∈ R (a short code sketch of this example follows the list).
(4) The set R^∞ of all infinite sequences of real numbers is a real vector
space. If (x1 , x2 , . . .) and (y1 , y2 , . . .) are sequences of real numbers
then their sum is defined by
(x1 , x2 , . . .) + (y1 , y2 , . . .) = (x1 + y1 , x2 + y2 , . . .)
and for λ ∈ R we define
λ(x1 , x2 , . . .) = (λ x1 , λ x2 , . . .)
(5) the set of all m × n matrices whose entries are real numbers is a real
vector space. Here addition and scalar multiplication are defined in
the usual way.
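Example (3) can be sketched in plain Python (a minimal illustration; the helper names fadd and fsmul are our own):

def fadd(f, g):
    # (f + g)(x) = f(x) + g(x), pointwise addition
    return lambda x: f(x) + g(x)

def fsmul(lam, f):
    # (lam f)(x) = lam * f(x), pointwise scalar multiplication
    return lambda x: lam * f(x)

f = lambda x: x ** 2
g = lambda x: 3 * x + 1
h = fadd(f, fsmul(2.0, g))     # the function x -> x^2 + 2(3x + 1)
print(h(1.0))                  # prints 9.0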
Problem. Let V be a vector space over F . Show that 0v = 0 for every
v ∈ V (here the 0 on the left hand side is the zero scalar while the 0 on the
right hand side is the zero vector).
Show also that if v ∈ V , then the scalar multiple (−1)v is the inverse of v .
Solution. We have 0v = (0 + 0)v and by the second distributive law:
0v = 0v + 0v
or
0 + 0v = 0v + 0v
and the cancellation law for groups gives 0 = 0v .
To show that (−1)v is the inverse of v we need to verify that
v + (−1)v = 0. We have
v + (−1)v = 1v + (−1)v            by [V2]
          = (1 + (−1))v           by the second distributive law [V4]
          = 0v
          = 0                     by what was just proved above.
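The Problem's two claims can also be sanity-checked numerically in R^3 (an illustration only, not a substitute for the proof):

v = (2.0, -1.0, 4.0)
zero_v  = tuple(0 * vi for vi in v)       # the scalar multiple 0v
minus_v = tuple(-1 * vi for vi in v)      # the scalar multiple (-1)v

assert zero_v == (0.0, 0.0, 0.0)                                         # 0v is the zero vector
assert tuple(vi + wi for vi, wi in zip(v, minus_v)) == (0.0, 0.0, 0.0)   # v + (-1)v = 0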
Many of the concepts we defined earlier for R^n can be generalised to vector
spaces. For example if V is a vector space over the field F :
• A nonempty subset S of V is called a subspace of V if whenever
u, v ∈ S and λ ∈ F then u + v ∈ S and λ u ∈ S.
• A vector v ∈ V is called a linear combination of vectors u1 , u2 , . . . , un
∈ V if there are λ1 , λ2 , . . . , λn ∈ F with v = λ1 u1 +λ2 u2 +· · ·+λn un .
The set of all linear combinations of u1 , u2 , . . . , un is a subspace of V
called the subspace spanned by the set {u1 , u2 , . . . , un }.
• A subset {u1 , u2 , . . . , un } of V is called linearly dependent if there are
λ1 , λ2 , . . . , λn ∈ F , not all zero, with λ1 u1 + λ2 u2 + · · · + λn un = 0.
Otherwise it is called linearly independent.
• A vector space V is called finite dimensional if there is a finite linearly
independent set which spans V . Such a set is called a basis for
V . The number of vectors in a basis is called the dimension of V .
If {v1 , v2 , . . . , vn } is a basis for V then any x in V can be written
uniquely in the form
x = λ1 v1 + λ2 v2 + · · · + λn vn
for some λ1 , λ2 , . . . , λn ∈ F and we call [λ1 , λ2 , . . . , λn ] the coordinates
of x with respect to the basis.
Linear independence. A vector space is called infinite dimensional if it
is not finite dimensional. We say that an infinite set of vectors is linearly
independent if each of its finite subsets is linearly independent. To prove
that a vector space is infinite dimensional it is sufficient to show that it has
an infinite linearly independent subset.
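Two of the notions defined above, linear independence and coordinates with respect to a basis, can be sketched computationally in R^3 (a minimal sketch assuming numpy is available; the concrete vectors are our own choice): independence can be tested via matrix rank, and coordinates found by solving a linear system.

import numpy as np

u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
u3 = np.array([1.0, 1.0, 0.0])

B = np.column_stack([u1, u2, u3])         # the vectors u1, u2, u3 as columns
print(np.linalg.matrix_rank(B))           # 3, so {u1, u2, u3} is linearly independent

x = np.array([2.0, 3.0, 1.0])
coords = np.linalg.solve(B, x)            # the coordinates [lam1, lam2, lam3] of x
print(coords)                             # [0. 1. 2.]: x = 0*u1 + 1*u2 + 2*u3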
Examples.
• If V is any vector space then {0} and V are subspaces of V .
• For each n let Pn be the vector space of all polynomials of degree less
than n. Then Pn is a subspace of Pn+r for every r > 0.
• The set {1, x , x^2 , . . . , x^{n−1} } is a basis for the vector space Pn so the
dimension of Pn is n (see the sketch after these examples).
• The vector space of all functions from R to R is infinite dimensional
since if fn is the function x ↦ x^n then {f0 , f1 , f2 , . . .} is an infinite
linearly independent set.
• The set C of all complex numbers is a real vector space and {1, i } is
a basis of C. Hence the real vector space C has dimension 2.
• The set C is also a complex vector space having {1} as a basis. So
the complex vector space C has dimension 1.
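As a small illustration of the examples above (our own sketch, not part of the notes): a polynomial in P3 is determined by its coordinates with respect to the basis {1, x , x^2 }, and a complex number by its coordinates with respect to the basis {1, i } of C as a real vector space.

p_coords = [5.0, -2.0, 3.0]          # the polynomial 5 - 2x + 3x^2 in P3
z = 4 - 7j                           # the complex number 4 - 7i
z_coords = [z.real, z.imag]          # coordinates [4.0, -7.0] with respect to {1, i}
print(p_coords, z_coords)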
Linear transformations. If V and W are vector spaces over the same
field F , then a function T : V → W is called a linear transformation if
T (x + y) = T (x ) + T (y) and T (λ x ) = λ T (x ) for all x , y ∈ V and all
λ ∈ F . If B = {v1 , v2 , . . . , vm } is a basis for V and C = {w1 , w2 , . . . , wn } is
a basis for W then the matrix of a linear transformation T : V → W with
respect to B and C is defined to be the matrix whose columns contain the
coordinates of the vectors T (v1 ), T (v2 ), . . . , T (vm ) with respect to C.
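For the concrete case V = R^m and W = R^n this definition can be sketched as follows; this is our own illustration assuming numpy, and the helper matrix_of and the example map T are hypothetical names:

import numpy as np

def matrix_of(T, basis_V, basis_W):
    # column j holds the W-coordinates of T(vj), found by solving a linear system
    W_cols = np.column_stack(basis_W)
    return np.column_stack([np.linalg.solve(W_cols, T(v)) for v in basis_V])

# example: T(x, y) = (x + y, 2x), with the standard basis used for both V and W
T = lambda v: np.array([v[0] + v[1], 2 * v[0]])
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(matrix_of(T, [e1, e2], [e1, e2]))     # [[1. 1.] [2. 0.]]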
Example. The set C [0, 1] of all continuous functions from the interval [0, 1]
to R is a real vector space and the function T : C [0, 1] → R defined by
T (f ) = ∫_0^1 f (x ) dx      (f ∈ C [0, 1])
is a linear transformation.
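As an illustration (our own sketch, approximating the integral by a midpoint Riemann sum rather than computing it exactly), linearity can be checked numerically on two sample functions:

def T(f, n=10000):
    # midpoint-rule approximation of the integral of f over [0, 1]
    h = 1.0 / n
    return h * sum(f((i + 0.5) * h) for i in range(n))

f = lambda x: x ** 2
g = lambda x: 3 * x + 1
lam = 2.0

lhs = T(lambda x: f(x) + lam * g(x))      # T(f + lam*g)
rhs = T(f) + lam * T(g)                   # T(f) + lam*T(g)
print(abs(lhs - rhs) < 1e-9)              # True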
Problem. Let D : P3 → P3 be the linear transformation defined by
D(a + bx + cx^2 ) = d/dx (a + bx + cx^2 ).
Find the matrix of D with respect to the basis {1, x , x^2 } of P3 .
Solution. We apply D to each of the basis vectors and write the result as
a linear combination of the basis vectors:
D(1) = 0 = 0 × 1 + 0x + 0x^2
D(x ) = 1 = 1 × 1 + 0x + 0x^2
D(x^2 ) = 2x = 0 × 1 + 2x + 0x^2 .
The coefficients in these expansions form the columns of the matrix. Hence
the matrix of D with respect to the basis {1, x , x^2 } is
[ 0 1 0 ]
[ 0 0 2 ]
[ 0 0 0 ] .
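As a check (our own sketch, assuming numpy), applying this matrix to the coordinate vector [a, b, c] of a + bx + cx^2 gives the coordinates of its derivative b + 2cx:

import numpy as np

M = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])           # the matrix of D found above

p = np.array([5.0, -2.0, 3.0])            # coordinates of 5 - 2x + 3x^2
print(M @ p)                              # [-2.  6.  0.], i.e. -2 + 6x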
ABSTRACT
Content: definition of vector space, examples
The vector space is a generalisation of the better-known vectors in R^2 and R^3 . However, functions have
many properties which are very similar to those of vectors. Differentiation of functions fits neatly into
vector space theory and so opens up the prospect of using vector space techniques in the solution of
differential equations.
History
David Hilbert contributed much to the application of vector space theory. His work influenced the entire
world of modern mathematics. Hilbert believed that all mathematical ideas eventually fit together 'harmoniously'.
He believed that every mathematical problem can be settled 'either in the form of an actual
answer . . . or by the proof of the impossibility of its solution'.
Immanuel Lazarus Fuchs (1833–1902) was a German mathematician whose work on Georg Riemann's
method for the solution of differential equations led to a study of the theory of functions that was later
crucial to Henri Poincaré in his investigation of function theory. The first proof for solutions of linear
differential equations of order n was developed from his study of functions, as were the Fuchsian differential
equations and the Fuchsian theory on solutions for singular points. His work in this field was of great
importance to Poincaré.