Linear Algebra
Chapter 1.
Vectors in $\mathbb{R}^n$ and $\mathbb{C}^n$, spatial vectors
A vector $v$ is a member of a vector space $\mathbb{R}^n$ or $\mathbb{C}^n$.
If $u, v \in V$ where $V$ is a vector space, and $\alpha, \beta$ are scalars (numbers), then
$\alpha(u + v) = \alpha u + \alpha v$, $(\alpha + \beta)u = \alpha u + \beta u$, and $\alpha(\beta u) = (\alpha \beta)u$ are all in $V$.
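These closure and distributivity rules can be spot-checked numerically. Below is a minimal sketch for $\mathbb{R}^3$ in Python; the helper names `add` and `scale` are our own, not from the text.

```python
# Spot-check the vector-space identities for R^3 using plain Python lists.
def add(u, v):
    # componentwise vector addition
    return [a + b for a, b in zip(u, v)]

def scale(alpha, u):
    # scalar multiplication
    return [alpha * a for a in u]

u = [1.0, -2.0, 0.5]
v = [3.0, 0.0, -1.0]
alpha, beta = 2.0, -0.5

# alpha*(u + v) == alpha*u + alpha*v
assert scale(alpha, add(u, v)) == add(scale(alpha, u), scale(alpha, v))
# (alpha + beta)*u == alpha*u + beta*u
assert scale(alpha + beta, u) == add(scale(alpha, u), scale(beta, u))
# alpha*(beta*u) == (alpha*beta)*u
assert scale(alpha, scale(beta, u)) == scale(alpha * beta, u)
```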
A vector is an ordered collection of numbers (quantities), an array, e.g.
$v = [\,2.3,\ -3.5,\ 0.1\,]$
$v = (\text{on, on, off, on, on, off})$
$v = (\text{blue, green, blue, white, yellow})$
In physics, a vector is a quantity with a magnitude and a direction; equivalently, it is a point in a space, e.g. $(4, 3)$. Vectors support vector addition and scaling (scalar multiplication).
We may depict a vector either as a row of numbers or a column of numbers. It is up to us, but we have to be consistent.
If $u = (u_1, u_2, \ldots, u_n)$ is a row vector, then
$v = u^t = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}$
is the transpose of the vector $u$, a column vector. Conversely, $v^t = u$.
Vector addition, scalar multiplication, and negation of a vector (if $v$, then what is $-v$?).
A vector $u = (u_1, u_2, \ldots, u_n)$ is a tuple in an $n$-space. The components $u_i$ are also called coordinates, entries, elements, etc.
One may consider a set of unit vectors comprising this $\mathbb{R}^n$ (or $\mathbb{C}^n$) space. The vector space that they collectively span allows us to express each vector as follows:
$u = i_1 u_1 + i_2 u_2 + \cdots + i_k u_k + \cdots + i_n u_n = \sum_{k=1}^{n} i_k u_k$
If the vector space is orthogonal, each unit vector spanning the space is orthogonal to the others. The collection of unit vectors that together span every vector in the vector space is called a basis. In the above, the set of unit vectors $\{ i_j \}$ forms an orthonormal basis.
e.g. a three-dimensional orthogonal space (figure omitted)
Norm (or length) of a vector.
The norm or length of a vector $u \in \mathbb{R}^n$ is $\|u\|$. If $u = (u_1, u_2, \ldots, u_n)$ then
$\|u\| = \sqrt{\sum_{i=1}^{n} u_i^2}$
$u$ is a unit vector if $\|u\| = 1$. Therefore,
$\hat{v} = \dfrac{v}{\|v\|}$
is a unit vector.
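A minimal Python sketch of the $l_2$ norm and the normalization $\hat{v} = v / \|v\|$ (the function names are our own):

```python
import math

def norm(u):
    # l2 norm: square root of the sum of squared components
    return math.sqrt(sum(x * x for x in u))

def normalize(v):
    # divide each component by the norm to get a unit vector
    return [x / norm(v) for x in v]

v = [3.0, 4.0]
print(norm(v))             # 5.0
print(norm(normalize(v)))  # ~1.0 (up to rounding)
```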
Accordingly, if $u$ and $v$ are two points in the $n$-space $\mathbb{R}^n$, the distance between them is $\overline{PQ}$, where
$\overline{PQ} = \sqrt{\sum_{i=1}^{n} (u_i - v_i)^2}$
(with $P$ the terminal point of $u$ and $Q$ that of $v$; figure omitted), if the notion of a distance is 'meaningful' in that space.
In such a space, the dot product (or the inner product) between two vectors is defined in this way:
For $u = (u_1, u_2, u_3, \ldots, u_n)$ and $v = (v_1, v_2, \ldots, v_n)$, the dot product is $u \cdot v$ or $\langle u | v \rangle$ (this Dirac notation will be explained shortly):
$u \cdot v = \langle u | v \rangle = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n = \sum_i u_i v_i$
In terms of the physics metaphor,
$\langle u | v \rangle = |u|\,|v| \cos\theta$, where $\theta$ is the angle between the two vectors.
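The dot product and the angle formula $\langle u | v \rangle = |u||v|\cos\theta$ can be sketched directly (the helpers `dot` and `angle` are our own names):

```python
import math

def dot(u, v):
    # sum of componentwise products
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    # angle in radians, from cos(theta) = <u|v> / (||u|| ||v||)
    return math.acos(dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))))

u = [1.0, 0.0]
v = [0.0, 2.0]
print(dot(u, v))                  # 0.0 (orthogonal vectors)
print(math.degrees(angle(u, v)))  # 90.0
```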
The norm of a vector $u$ is then $\sqrt{\langle u | u \rangle}$. We should be careful: this is only one variety of norm that we can associate with a vector. In fact, we can have a number of them:
$l_1$ norm: $\|u\|_1 = \sum_{i=1}^{n} |u_i|$
$l_2$ norm: $\|u\|_2 = \|u\|$ as we have defined it.
$l_p$ norm: $\|u\|_p = \left( \sum_{i=1}^{n} |u_i|^p \right)^{1/p}$
$l_\infty$ norm: $\|u\|_\infty = \max_i |u_i|$
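The family of norms above can be sketched with one $l_p$ helper plus the $l_\infty$ special case (function names are our own):

```python
def lp_norm(u, p):
    # (sum of |u_i|^p)^(1/p); p = 1 gives l1, p = 2 gives l2
    return sum(abs(x) ** p for x in u) ** (1.0 / p)

def linf_norm(u):
    # the largest absolute component
    return max(abs(x) for x in u)

u = [3.0, -4.0]
print(lp_norm(u, 1))  # 7.0
print(lp_norm(u, 2))  # 5.0
print(linf_norm(u))   # 4.0
```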
For our purposes, we will mostly consider the $l_2$ norm.
An inner product space $V$ is a vector space in which the norm of a vector is defined and one can form a dot product $\langle u | v \rangle$ between any pair of vectors $u$ and $v$ in it, subject to the following conditions:
1. $\langle u | u \rangle \ge 0$: the length of a vector is never negative.
2. $\langle u | v \rangle = \langle v | u \rangle$: symmetry.
3. $\langle u + w | v \rangle = \langle u | v \rangle + \langle w | v \rangle$
4. $\langle \alpha u | v \rangle = \alpha \langle u | v \rangle$ with $\alpha$ a constant.
5. $\sqrt{\langle u + v | u + v \rangle} \le \sqrt{\langle u | u \rangle} + \sqrt{\langle v | v \rangle}$: triangle inequality.
6. $|\langle u | v \rangle| \le \sqrt{\langle u | u \rangle \langle v | v \rangle}$: Cauchy-Schwarz inequality.
ex. $u = (-2, 0, -1)$, $v = (1, 3, -2)$.
$\langle u | u \rangle = 4 + 0 + 1 = 5$
$\langle v | v \rangle = 1 + 9 + 4 = 14$, and
$\langle u | v \rangle = (-2)(1) + (0)(3) + (-1)(-2) = 0 \le \sqrt{5}\sqrt{14}$
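This worked example, with $u = (-2, 0, -1)$ and $v = (1, 3, -2)$, can be verified numerically along with the Cauchy-Schwarz bound (the `dot` helper is our own):

```python
import math

def dot(u, v):
    # inner product as a sum of componentwise products
    return sum(a * b for a, b in zip(u, v))

u = [-2.0, 0.0, -1.0]
v = [1.0, 3.0, -2.0]
print(dot(u, u))  # 5.0
print(dot(v, v))  # 14.0
print(dot(u, v))  # 0.0

# Cauchy-Schwarz: |<u|v>| <= sqrt(<u|u> <v|v>)
assert abs(dot(u, v)) <= math.sqrt(dot(u, u) * dot(v, v))
```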
ex. Let $V = \mathbb{R}^2$ be an inner product space where the dot products are defined in the following terms:
$\langle (a, b), (c, d) \rangle = ac + bd$
Then $\langle (a, b), (a, b) \rangle = a^2 + b^2 \ge 0$ (condition 1 is fulfilled), and $\mathbb{R}^2$ is an inner product space.
ex. A polynomial space $P_n$ is an inner product space where every vector is a polynomial of degree at most $n$, like
$p(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n$
$q(x) = b_0 + b_1 x + b_2 x^2 + \cdots + b_n x^n$
Then $\langle p | q \rangle = a_0 b_0 + a_1 b_1 + \cdots + a_n b_n$ is an inner product between two such vectors in $P_n$.
Another inner product in $P_n$ may be defined as
$\langle p | q \rangle = \int_0^1 p(x)\, q(x)\, dx$
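Both polynomial inner products can be sketched with polynomials stored as coefficient lists $[a_0, a_1, \ldots, a_n]$ (the helper names and the use of exact `Fraction` arithmetic are our own choices):

```python
from fractions import Fraction

def coeff_inner(p, q):
    # <p|q> = a0*b0 + a1*b1 + ... + an*bn
    return sum(a * b for a, b in zip(p, q))

def integral_inner(p, q):
    # <p|q> = integral_0^1 p(x) q(x) dx, computed exactly term by term
    # using integral_0^1 x^(i+j) dx = 1/(i+j+1) (integer coefficients assumed)
    return sum(Fraction(a * b, i + j + 1)
               for i, a in enumerate(p)
               for j, b in enumerate(q))

p = [1, 2]   # p(x) = 1 + 2x
q = [3, -1]  # q(x) = 3 - x
print(coeff_inner(p, q))     # 1*3 + 2*(-1) = 1
print(integral_inner(p, q))  # integral_0^1 (3 + 5x - 2x^2) dx = 29/6
```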
ex. Another inner product space is the space of continuous functions, where the dot product between $f(x)$ and $g(x)$ is defined as
$\langle f | g \rangle = \int_{-\pi}^{\pi} f(t)\, g(t)\, dt$
In this setup, the basis set $S = \{\hat{v}_1, \hat{v}_2, \ldots\}$ is an infinite set of vectors like
$\hat{v}_1 = \dfrac{1}{\sqrt{2\pi}}, \quad \hat{v}_2 = \dfrac{1}{\sqrt{\pi}} \cos t, \quad \hat{v}_3 = \dfrac{1}{\sqrt{\pi}} \sin t, \quad \ldots, \quad \hat{v}_{2n} = \dfrac{1}{\sqrt{\pi}} \cos nt, \quad \hat{v}_{2n+1} = \dfrac{1}{\sqrt{\pi}} \sin nt$
That these form an orthonormal basis is evident from the fact that
$\int_{-\pi}^{\pi} \cos mt \sin nt \, dt = 0$ for all $m, n$,
$\int_{-\pi}^{\pi} \cos mt \cos nt \, dt = 0$ for $m \ne n$,
$\int_{-\pi}^{\pi} \sin mt \sin nt \, dt = 0$ for $m \ne n$
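These orthogonality integrals can be spot-checked numerically. The sketch below uses a hand-rolled midpoint rule over $[-\pi, \pi]$, the interval consistent with the normalizations $1/\sqrt{2\pi}$ and $1/\sqrt{\pi}$ (the `integrate` helper is our own):

```python
import math

def integrate(f, a, b, n=10000):
    # midpoint-rule quadrature of f over [a, b] with n subintervals
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# cos(2t)*sin(3t), cos(2t)*cos(3t), sin(2t)*sin(3t) over [-pi, pi]
i1 = integrate(lambda t: math.cos(2 * t) * math.sin(3 * t), -math.pi, math.pi)
i2 = integrate(lambda t: math.cos(2 * t) * math.cos(3 * t), -math.pi, math.pi)
i3 = integrate(lambda t: math.sin(2 * t) * math.sin(3 * t), -math.pi, math.pi)
print(abs(i1) < 1e-6, abs(i2) < 1e-6, abs(i3) < 1e-6)  # True True True
```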
ex. A weighted Euclidean inner product space $\mathbb{R}^n$ may be designed to yield dot products like
$\langle u | v \rangle = \lambda_1 u_1 v_1 + \lambda_2 u_2 v_2 + \cdots + \lambda_n u_n v_n$
with $\lambda_1 + \lambda_2 + \cdots + \lambda_n = 1$.
Neural networks are developed on such spaces.
Further observations.
A set $S$ in an inner product space $V$ is called orthogonal if any two distinct vectors in $S$ are orthogonal. If in addition each vector is a unit vector, the set $S$ is called orthonormal.
The set of unit vectors in $\mathbb{R}^n$ comprises an orthonormal set if
$\langle i_j | i_k \rangle = \delta_{jk}$ (Kronecker delta), where $\delta_{jk} = 1$ if $j = k$, and $0$ otherwise.
Notice that a finite orthonormal set $S$ consists of linearly independent vectors.
Accordingly, if a set of vectors $S \subset V$ can be identified such that any vector $v \in V$ can be expressed uniquely as a linear combination of vectors in $S$, the set $S$ is a basis for $V$. Thus,
$v = \hat{i}_1 v_1 + \hat{i}_2 v_2 + \hat{i}_3 v_3 + \cdots + \hat{i}_n v_n$ with each $\hat{i}_k \in S$.
A vector space may have several distinct bases, but each will have the same number of basis vectors. The number of basis vectors spanning a vector space is called the dimension of the vector space.
Ex. The standard basis spanning $\mathbb{R}^3$ is the set of three vectors $S = \{e_1, e_2, e_3\}$ where
$e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$, and $e_3 = (0, 0, 1)$
Any three-dimensional vector can be expressed in this basis, e.g.
$v = (4, -5, 3) = 4e_1 - 5e_2 + 3e_3$
Obviously other bases in this vector space are possible. Suppose we choose as our basis set $S' = \{u_1, u_2, u_3\}$ with
$u_1 = (1, -2, 1)$, $u_2 = (0, 3, 2)$, and $u_3 = (2, 1, -1)$
Then
$v = (4, -5, 3) = \alpha_1 u_1 + \alpha_2 u_2 + \alpha_3 u_3$, and the coefficients are determined by the equations
$\alpha_1 + 2\alpha_3 = 4$
$-2\alpha_1 + 3\alpha_2 + \alpha_3 = -5$
$\alpha_1 + 2\alpha_2 - \alpha_3 = 3$
Distances, angles and projections
The distance between two vectors (the distance between the terminal points of the two vectors) is
$d(u, v) = \|u - v\| = \sqrt{\langle u - v \,|\, u - v \rangle}$
If $u$ and $v$ are vectors in an inner product space $V$, the angle $\theta$ between these two vectors is given by
$\cos\theta = \dfrac{\langle u | v \rangle}{\|u\|\,\|v\|}$
The two vectors are orthogonal (perpendicular to each other) if $\langle u | v \rangle = 0$.
Projection of a vector on another vector (see the diagram: $u$, $v$, the angle $\theta$ between them, and the projected length $\|u\| \cos\theta$; figure omitted).
The projection of $u$ on the vector $v$ is
$u_v = \mathrm{proj}(u, v) = \|u\| \cos\theta = \|u\| \, \dfrac{\langle u | v \rangle}{\|u\|\,\|v\|} = \dfrac{\langle u | v \rangle}{\|v\|}$
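The scalar projection $\|u\| \cos\theta = \langle u | v \rangle / \|v\|$ can be sketched directly (the helper names are our own):

```python
import math

def dot(u, v):
    # inner product as a sum of componentwise products
    return sum(a * b for a, b in zip(u, v))

def scalar_projection(u, v):
    # length of the projection of u onto v: <u|v> / ||v||
    return dot(u, v) / math.sqrt(dot(v, v))

u = [3.0, 4.0]
v = [1.0, 0.0]
print(scalar_projection(u, v))  # 3.0 (the x-component of u)
```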