Chapter 5
Orthogonality
1 The Scalar Product in Rn
For x and y in Rn, the product xTy is called the scalar product of x and y.
In particular, if x = (x1, …, xn)T and y = (y1, …, yn)T, then
xTy = x1y1 + x2y2 + ⋯ + xnyn
The Scalar Product in R2 and R3
Definition
Let x and y be vectors in either R2 or R3. The distance
between x and y is defined to be the number ‖x-y‖.
Example
If x=(3, 4)T and y=(-1, 7)T, then the distance
between x and y is given by
‖y − x‖ = ‖(−4, 3)T‖ = √(16 + 9) = 5
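As a quick numerical illustration (a minimal NumPy sketch, using the vectors from the example above):

    import numpy as np

    x = np.array([3.0, 4.0])
    y = np.array([-1.0, 7.0])

    # Scalar product: x^T y = x1*y1 + x2*y2
    print(x @ y)                   # 3*(-1) + 4*7 = 25.0

    # Distance between x and y: the norm of y - x
    print(np.linalg.norm(y - x))   # 5.0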
Theorem 5.1.1 If x and y are two nonzero vectors in either
R2 or R3 and θ is the angle between them, then
xTy = ‖x‖‖y‖ cos θ    (1)
Corollary 5.1.2 (Cauchy-Schwarz Inequality)
If x and y are vectors in either R2 or R3, then
|xTy| ≤ ‖x‖‖y‖    (2)
with equality holding if and only if one of the vectors is 0 or one
vector is a multiple of the other.
Definition
The vectors x and y in R2 (or R3) are said to be orthogonal if
xTy=0.
Example
(a) The vector 0 is orthogonal to every vector in R2.
(b) The vectors (3, 2)T and (−4, 6)T are orthogonal in R2.
(c) The vectors (2, −3, 1)T and (1, 1, 1)T are orthogonal in R3.
Scalar and Vector Projections
[Figure: the vectors x and y with angle θ between them; x is written as x = p + z, where p = αu is the vector projection of x onto y, u = (1/‖y‖)y is the unit vector in the direction of y, and z = x − p is orthogonal to y.]
α = ‖x‖ cos θ = (‖x‖‖y‖ cos θ)/‖y‖ = xTy/‖y‖
The scalar α is called the scalar projection of x onto y, and the vector p is called the vector projection of x onto y.
Scalar projection of x onto y:
α = xTy/‖y‖
Vector projection of x onto y:
p = αu = α·(1/‖y‖)y = (xTy/yTy)y
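The two formulas translate directly into code. A minimal NumPy sketch (the helper names are ours, not the text's):

    import numpy as np

    def scalar_projection(x, y):
        # alpha = x^T y / ||y||
        return (x @ y) / np.linalg.norm(y)

    def vector_projection(x, y):
        # p = (x^T y / y^T y) y
        return ((x @ y) / (y @ y)) * y

    x = np.array([2.0, 3.0])
    y = np.array([4.0, 0.0])
    p = vector_projection(x, y)
    print(p)            # [2. 0.]
    print((x - p) @ y)  # 0.0: z = x - p is orthogonal to y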
Example
Let Q be the point on the line y = (1/3)x that is closest to the point (1, 4). Determine the coordinates of Q.
[Figure: the line y = (1/3)x, the point (1, 4), the vector v from the origin to (1, 4), and its vector projection w onto the line; Q is the tip of w.]
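A sketch of the computation: any direction vector of the line works, and (3, 1)T is a natural choice since the line has slope 1/3.

    import numpy as np

    w = np.array([3.0, 1.0])   # direction vector of the line y = (1/3)x
    v = np.array([1.0, 4.0])   # vector from the origin to the point (1, 4)

    # Q is the tip of the vector projection of v onto w
    q = ((v @ w) / (w @ w)) * w
    print(q)                   # [2.1 0.7], so Q = (21/10, 7/10)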
Orthogonality in Rn
The vectors x and y are said to be orthogonal if xTy=0.
2 Orthogonal Subspaces
Definition
Two subspaces X and Y of Rn are said to be orthogonal if
xTy=0 for every x∈X and every y∈Y. If X and Y are
orthogonal, we write X⊥Y.
Example Let X be the subspace of R3 spanned by e1, and
let Y be the subspace spanned by e2. Then X⊥Y, since e1Te2 = 0.
Example Let X be the subspace of R3 spanned by e1 and e2,
and let Y be the subspace spanned by e3. Then X⊥Y; in fact, Y = X⊥.
Definition
Let Y be a subspace of Rn . The set of all vectors in Rn that
are orthogonal to every vector in Y will be denoted Y⊥. Thus
Y⊥ = { x∈Rn | xTy = 0 for every y∈Y }
The set Y⊥ is called the orthogonal complement of Y.
Remarks
1. If X and Y are orthogonal subspaces of Rn, then X∩Y={0}.
2. If Y is a subspace of Rn, then Y⊥ is also a subspace of Rn.
Fundamental Subspaces
Theorem 5.2.1 (Fundamental Subspaces Theorem)
If A is an m×n matrix, then N(A) = R(AT)⊥ and N(AT) = R(A)⊥.
Theorem 5.2.2 If S is a subspace of Rn, then
dim S+dim S⊥=n. Furthermore, if {x1, …, xr} is a basis for S and
{xr+1, …, xn} is a basis for S⊥, then {x1, …, xr, xr+1, …, xn}
is a basis for Rn.
Definition
If U and V are subspaces of a vector space W and each
w∈W can be written uniquely as a sum u+v, where u∈U and
v∈V, then we say that W is a direct sum of U and V, and we
write W = U ⊕ V.
Theorem 5.2.3 If S is a subspace of Rn, then Rn = S ⊕ S⊥.
Theorem 5.2.4 If S is a subspace of Rn, then (S⊥)⊥ = S.
Theorem 5.2.5 If A is an m×n matrix and b∈Rm, then
either there is a vector x∈Rn such that Ax=b or there is a
vector y∈Rm such that ATy=0 and yTb≠0.
Example
Let
A = ⎡ 1 1 2 ⎤
    ⎢ 0 1 1 ⎥
    ⎣ 1 3 4 ⎦
Find bases for N(A), R(AT), N(AT), and R(A).
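One way to check the four bases is symbolically, e.g. with SymPy (a sketch; the nullspace/rowspace/columnspace methods do the row reduction for us):

    from sympy import Matrix

    A = Matrix([[1, 1, 2],
                [0, 1, 1],
                [1, 3, 4]])

    print(A.nullspace())     # basis for N(A)
    print(A.rowspace())      # basis for R(A^T), as row vectors
    print(A.T.nullspace())   # basis for N(A^T)
    print(A.columnspace())   # basis for R(A)
    # Theorem 5.2.1: each vector in N(A) is orthogonal to every row of A,
    # and each vector in N(A^T) is orthogonal to every column of A.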
4 Inner Product Spaces
Definition
An inner product on a vector space V is an operation on V
that assigns to each pair of vectors x and y in V a real
number <x, y> satisfying the following conditions:
Ⅰ. <x, x>≥0 with equality if and only if x=0.
Ⅱ. <x, y>=<y, x> for all x and y in V.
Ⅲ. <αx+βy, z>=α<x, z>+β<y, z> for all x, y, z in V and all
scalars α and β.
The Vector Space Rm×n
Given A and B in Rm×n, we can define an inner product by
<A, B> = ∑ᵢ₌₁ᵐ ∑ⱼ₌₁ⁿ aᵢⱼbᵢⱼ
Basic Properties of Inner Product Spaces
If v is a vector in an inner product space V, the length or norm of v is given by
‖v‖ = √<v, v>
Theorem 5.4.1 (The Pythagorean Law)
If u and v are orthogonal vectors in an inner product space V,
then
‖u + v‖² = ‖u‖² + ‖v‖²
Example
If
A = ⎡ 1 1 ⎤            B = ⎡ −1 1 ⎤
    ⎢ 1 2 ⎥    and         ⎢  3 0 ⎥
    ⎣ 3 3 ⎦                ⎣ −3 4 ⎦
then
<A, B> = 6,   ‖A‖ = 5,   ‖B‖ = 6
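A quick check of these three values (a NumPy sketch of the Rm×n inner product defined above):

    import numpy as np

    A = np.array([[1.0, 1.0], [1.0, 2.0], [3.0, 3.0]])
    B = np.array([[-1.0, 1.0], [3.0, 0.0], [-3.0, 4.0]])

    print(np.sum(A * B))           # <A, B> = sum of a_ij * b_ij = 6.0
    print(np.sqrt(np.sum(A * A)))  # ||A|| = 5.0
    print(np.sqrt(np.sum(B * B)))  # ||B|| = 6.0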
Definition
If u and v are vectors in an inner product space V and v≠0,
then the scalar projection of u onto v is given by
α = <u, v>/‖v‖
and the vector projection of u onto v is given by
p = α·(1/‖v‖)v = (<u, v>/<v, v>)v
Theorem 5.4.2 (The Cauchy-Schwarz Inequality)
If u and v are any two vectors in an inner product space V, then
|<u, v>| ≤ ‖u‖‖v‖
Equality holds if and only if u and v are linearly dependent.
5 Orthonormal Sets
Definition
Let v1, v2, …, vn be nonzero vectors in an inner product
space V. If <vi, vj>=0 whenever i≠j, then { v1, v2, …, vn} is
said to be an orthogonal set of vectors.
Example The set {(1, 1, 1)T, (2, 1, -3)T, (4, -5, 1)T} is an
orthogonal set in R3.
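The claim is easy to verify: each pairwise scalar product is zero (a minimal NumPy sketch):

    import numpy as np

    v1 = np.array([1, 1, 1])
    v2 = np.array([2, 1, -3])
    v3 = np.array([4, -5, 1])

    print(v1 @ v2, v1 @ v3, v2 @ v3)   # 0 0 0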
Theorem 5.5.1 If { v1, v2, …, vn} is an orthogonal set of
nonzero vectors in an inner product space V, then v1, v2, …,vn
are linearly independent.
Definition
An orthonormal set of vectors is an orthogonal set of unit
vectors.
The set {u1, u2, …, un} will be orthonormal if and only if
<ui, uj> = δij
where δij = 1 if i = j and δij = 0 if i ≠ j.
Theorem 5.5.2 Let {u1, u2, …, un} be an orthonormal basis
for an inner product space V. If v = ∑ᵢ₌₁ⁿ ciui, then ci = <v, ui>.
Corollary 5.5.3 Let {u1, u2, …, un} be an orthonormal basis
for an inner product space V. If u = ∑ᵢ₌₁ⁿ aiui and v = ∑ᵢ₌₁ⁿ biui, then
<u, v> = ∑ᵢ₌₁ⁿ aibi
Corollary 5.5.4 If {u1, u2, …, un} is an orthonormal basis
for an inner product space V and v = ∑ᵢ₌₁ⁿ ciui, then
‖v‖² = ∑ᵢ₌₁ⁿ ci²
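These three results are easy to demonstrate numerically. A sketch that normalizes the orthogonal set from the earlier example into an orthonormal basis of R3 (the test vector v is our arbitrary choice):

    import numpy as np

    # Rows of U: an orthonormal basis u1, u2, u3 of R^3
    U = np.array([[1, 1, 1], [2, 1, -3], [4, -5, 1]], dtype=float)
    U = U / np.linalg.norm(U, axis=1, keepdims=True)

    v = np.array([1.0, 2.0, 3.0])

    c = U @ v                                 # c_i = <v, u_i>  (Theorem 5.5.2)
    print(np.allclose(U.T @ c, v))            # True: v = sum of c_i u_i
    print(np.allclose(np.sum(c**2), v @ v))   # True: ||v||^2 = sum of c_i^2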
Orthogonal Matrices
Definition
An n×n matrix Q is said to be an orthogonal matrix if the
column vectors of Q form an orthonormal set in Rn.
Theorem 5.5.5 An n×n matrix Q is orthogonal if and only if
QTQ=I.
Example
For any fixed θ, the matrix
Q = ⎡ cos θ  −sin θ ⎤
    ⎣ sin θ   cos θ ⎦
is orthogonal.
Properties of Orthogonal Matrices
If Q is an n×n orthogonal matrix, then
(a) The column vectors of Q form an orthonormal basis for Rn.
(b) QTQ=I
(c) QT=Q-1
(d) det(Q)=1 or -1
(e) The transpose of an orthogonal matrix is an orthogonal
matrix.
(f) The product of two orthogonal matrices is also an orthogonal
matrix.
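These properties can all be checked numerically for the rotation matrix of the example above (a NumPy sketch; the angle is an arbitrary choice):

    import numpy as np

    theta = 0.7    # any fixed angle
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    print(np.allclose(Q.T @ Q, np.eye(2)))     # (b) Q^T Q = I
    print(np.allclose(Q.T, np.linalg.inv(Q)))  # (c) Q^T = Q^{-1}
    print(np.linalg.det(Q))                    # (d) 1.0 here; -1 is also possible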
6 The Gram-Schmidt Orthogonalization Process
Theorem 5.6.1 (The Gram-Schmidt Process)
Let {x1, x2, …, xn} be a basis for the inner product space V. Let
u1 = (1/‖x1‖)x1
and define u2, …, un recursively by
uk+1 = (1/‖xk+1 − pk‖)(xk+1 − pk)   for k = 1, …, n−1
where
pk = <xk+1, u1>u1 + <xk+1, u2>u2 + ⋯ + <xk+1, uk>uk
is the projection of xk+1 onto Span(u1, u2, …, uk). The set
{u1, u2, …, un}
is an orthonormal basis for V.
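The recursion translates directly into code. A minimal sketch of classical Gram-Schmidt on the columns of a matrix, using the standard inner product on Rm (the function name is ours):

    import numpy as np

    def gram_schmidt(X):
        # Columns of X: a basis x1, ..., xn. Returns u1, ..., un as columns.
        U = np.zeros_like(X, dtype=float)
        for k in range(X.shape[1]):
            # p = projection of x_{k+1} onto Span(u1, ..., uk)
            p = U[:, :k] @ (U[:, :k].T @ X[:, k])
            w = X[:, k] - p
            U[:, k] = w / np.linalg.norm(w)
        return U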
Example Let
A = ⎡ 1 −1  4 ⎤
    ⎢ 1  4 −2 ⎥
    ⎢ 1  4  2 ⎥
    ⎣ 1 −1  0 ⎦
Find an orthonormal basis for the column space of A.
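Applying the gram_schmidt sketch above to the columns of A (with A as reconstructed here) gives the basis:

    import numpy as np

    A = np.array([[1, -1, 4],
                  [1, 4, -2],
                  [1, 4, 2],
                  [1, -1, 0]], dtype=float)

    U = gram_schmidt(A)
    print(U)
    # Columns: (1/2)(1,1,1,1)^T, (1/2)(-1,1,1,-1)^T, (1/2)(1,-1,1,-1)^T
    print(np.allclose(U.T @ U, np.eye(3)))   # True: orthonormal columns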