CHAPTER FIVE
Orthogonality

Why orthogonal?
- Least square problems
- Accuracy of numerical computation

Least square problem
Motivation: curve fitting.
Problem formulation: Given $A \in \mathbb{R}^{m\times n}$ and $b \in \mathbb{R}^m$, find $x_0$ such that
$$\|b - Ax_0\| = \min_{x\in\mathbb{R}^n}\|b - Ax\|$$

Outline
- Orthogonal subspaces
- Inner product spaces
- Least square problems
- Orthonormal sets
- Gram-Schmidt orthogonalization process
The scalar product in $\mathbb{R}^n$
Def: Let $x, y \in \mathbb{R}^n$. The inner product (or scalar product) of $x$ and $y$ is defined to be
$$\langle x, y\rangle = x^T y = \sum_{i=1}^n x_i y_i$$
The norm of $x$ is defined as $\|x\| = \sqrt{\langle x, x\rangle}$.
Theorem 5.1.1: Let $x, y \in \mathbb{R}^n$. Then
$$x^T y = \|x\|\,\|y\|\cos\theta$$
pf: By the law of cosines,
$$\|y - x\|^2 = \|x\|^2 + \|y\|^2 - 2\|x\|\,\|y\|\cos\theta$$
so
$$\|x\|\,\|y\|\cos\theta = \tfrac12\left(\|x\|^2 + \|y\|^2 - \|y - x\|^2\right) = \tfrac12\sum_i\left(x_i^2 + y_i^2 - (y_i - x_i)^2\right) = \sum_i x_i y_i = x^T y$$
Note: If $\theta$ is the angle between $x$ and $y$, then
$$\cos\theta = \frac{x^T y}{\|x\|\,\|y\|},\qquad\text{thus}\qquad -1 \le \frac{x^T y}{\|x\|\,\|y\|} \le 1$$
Def: $x \perp y \iff x^T y = 0$.
Cor (Cauchy-Schwarz inequality): Let $x, y \in \mathbb{R}^n$. Then $|x^T y| \le \|x\|\,\|y\|$. Moreover, equality holds $\iff x = \alpha y$ for some scalar $\alpha$ (or $y = 0$).
Scalar and Vector Projections
Let $\|u\| = 1$. Then the quantity $x^T u = \|x\|\cos\theta$ is the scalar projection of $x$ onto $u$, and $(x^T u)\,u$ is called the vector projection of $x$ onto $u$.
For a general $y \ne 0$, the scalar projection of $x$ onto $y$ is $\dfrac{x^T y}{\|y\|}$, and the vector projection of $x$ onto $y$ is
$$\frac{x^T y}{\|y\|}\cdot\frac{y}{\|y\|} = \frac{x^T y}{y^T y}\,y$$
Example: Find the point $Q$ on the line $y = \frac13 x$ that is closest to the point $(1, 4)$.
Sol: Note that the vector $w = (3, 1)^T$ lies on the line $y = \frac13 x$; let $v = (1, 4)^T$. Then the desired point is given by the vector projection
$$\frac{v^T w}{w^T w}\,w = \frac{7}{10}\binom{3}{1} = \binom{2.1}{0.7}$$
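The closest-point computation above can be sketched numerically; a minimal check with numpy (variable names are illustrative):

```python
import numpy as np

# Closest point on the line y = x/3 to the point (1, 4):
# project v onto a direction vector w of the line.
v = np.array([1.0, 4.0])
w = np.array([3.0, 1.0])          # lies on the line y = x/3

p = (v @ w) / (w @ w) * w         # vector projection of v onto w
print(p)                          # expected (2.1, 0.7)
```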
Example: Find the equation of the plane passing through $(2, 1, 3)$ and normal to $(2, 3, 4)$.
Sol:
$$(2,\ 3,\ 4)\cdot(x-2,\ y-1,\ z-3) = 0 \;\iff\; 2(x-2) + 3(y-1) + 4(z-3) = 0$$
Example: Find the distance from $P = (2, 0, 0)$ to the plane $x + 2y + 2z = 0$.
Sol: A normal vector to the plane is $n = (1, 2, 2)^T$; let $v = \overrightarrow{OP} = (2, 0, 0)^T$. The desired distance is the scalar projection
$$\frac{|v^T n|}{\|n\|} = \frac{2}{3}$$
Orthogonal Subspaces
Def: Let $X$ and $Y$ be two subspaces of $\mathbb{R}^n$. We say that $X \perp Y$ if $x^T y = 0$ for all $x \in X$ and $y \in Y$.
Example: Let $X = \operatorname{span}\{e_1\}$, $Y = \operatorname{span}\{e_2\}$, $Z = \operatorname{span}\{e_2, e_3\}$. Then $X \perp Y$ and $X \perp Z$, but $Y$ is not orthogonal to $Z$.
Def: Let $Y$ be a subspace of $\mathbb{R}^n$. Then the orthogonal complement of $Y$ is defined as
$$Y^\perp = \{x \in \mathbb{R}^n : x^T y = 0\ \forall\, y \in Y\}$$
Example: In $\mathbb{R}^3$: $Y = \operatorname{span}\{e_1\} \Rightarrow Y^\perp = \operatorname{span}\{e_2, e_3\}$, and $X = \operatorname{span}\{e_1, e_2\} \Rightarrow X^\perp = \operatorname{span}\{e_3\}$.
Lemma: (i) For two subspaces, $X \perp Y \Rightarrow X \cap Y = \{0\}$. (ii) If $X$ is a subspace of $\mathbb{R}^n$, then $X^\perp$ is also a subspace of $\mathbb{R}^n$.
Pf: (i) If $x \in X \cap Y$, then $\|x\|^2 = x^T x = 0 \Rightarrow x = 0$.
(ii) Let $y_1, y_2 \in X^\perp$, $x \in X$, and $\alpha, \beta$ be scalars. Then
$$(\alpha y_1 + \beta y_2)^T x = \alpha\, y_1^T x + \beta\, y_2^T x = 0$$
so $\alpha y_1 + \beta y_2 \in X^\perp$.
Four Fundamental Subspaces
Let $A \in \mathbb{R}^{m\times n}$, i.e. $A: \mathbb{R}^n \to \mathbb{R}^m$, $x \mapsto Ax$, so $A^T \in \mathbb{R}^{n\times m}$.
$$N(A) = \{x \in \mathbb{R}^n : Ax = 0\} \subseteq \mathbb{R}^n,\qquad N(A^T) = \{x \in \mathbb{R}^m : A^T x = 0\} \subseteq \mathbb{R}^m$$
$$R(A) = \{b : b = Ax \text{ for some } x \in \mathbb{R}^n\} \subseteq \mathbb{R}^m,\qquad R(A^T) = \{b : b = A^T x \text{ for some } x \in \mathbb{R}^m\} \subseteq \mathbb{R}^n$$
It will be shown later that
$$N(A) = R(A^T)^\perp,\qquad \mathbb{R}^n = N(A) \oplus R(A^T)$$
$$N(A^T) = R(A)^\perp,\qquad \mathbb{R}^m = N(A^T) \oplus R(A)$$
Theorem 5.2.1: Let $A \in \mathbb{R}^{m\times n}$. Then $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.
pf: Let $x \in N(A)$ and $y \in R(A^T)$. Then
$$A(i,:)\,x = 0,\quad i = 1,\dots,m \quad\text{.....(1)}$$
$$y = \sum_{i=1}^m \alpha_i\, A(i,:)^T \ \text{ for some } \alpha_i \quad\text{.....(2)}$$
By (1) and (2), $x^T y = \sum_i \alpha_i\, A(i,:)\,x = 0$, so $N(A) \subseteq R(A^T)^\perp$ .....(3)
Also, if $z \in R(A^T)^\perp$, then $z^T A(i,:)^T = 0$ for $i = 1,\dots,m$, so $Az = 0$; thus $R(A^T)^\perp \subseteq N(A)$ .....(4)
By (3) and (4), $N(A) = R(A^T)^\perp$. Similarly, $N(A^T) = R(A)^\perp$.
Example: Let $A = \begin{pmatrix}1 & 0\\ 2 & 0\end{pmatrix}$, so $A^T = \begin{pmatrix}1 & 2\\ 0 & 0\end{pmatrix}$. Then
$$N(A) = \operatorname{span}\left\{\binom{0}{1}\right\},\qquad N(A^T) = \operatorname{span}\left\{\binom{-2}{1}\right\}$$
$$R(A) = \operatorname{span}\left\{\binom{1}{2}\right\},\qquad R(A^T) = \operatorname{span}\left\{\binom{1}{0}\right\}$$
Clearly, $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.
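As a quick sanity check, the four subspaces of this example can be verified numerically (a minimal sketch; the basis vectors are copied from the example):

```python
import numpy as np

# For A = [[1,0],[2,0]], verify N(A) ⟂ R(A^T) and N(A^T) ⟂ R(A)
# using the spanning vectors found above.
A = np.array([[1.0, 0.0],
              [2.0, 0.0]])

n_A  = np.array([0.0, 1.0])    # spans N(A)
n_At = np.array([-2.0, 1.0])   # spans N(A^T)
r_A  = np.array([1.0, 2.0])    # spans R(A)
r_At = np.array([1.0, 0.0])    # spans R(A^T)

assert np.allclose(A @ n_A, 0)     # n_A really lies in N(A)
assert np.allclose(A.T @ n_At, 0)  # n_At really lies in N(A^T)
assert n_A @ r_At == 0             # N(A) ⟂ R(A^T)
assert n_At @ r_A == 0             # N(A^T) ⟂ R(A)
```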
Theorem 5.2.2: Let $S$ be a subspace of $\mathbb{R}^n$. Then
(i) $\dim S + \dim S^\perp = n$
(ii) If $\{x_1,\dots,x_r\}$ is a basis for $S$ and $\{x_{r+1},\dots,x_n\}$ is a basis for $S^\perp$, then $\{x_1,\dots,x_n\}$ is a basis for $\mathbb{R}^n$.
pf: If $S = \{0\}$, then $S^\perp = \mathbb{R}^n$ and the result follows.
Suppose $S \ne \{0\}$. Let $X = (x_1 \cdots x_r) \in \mathbb{R}^{n\times r}$, so $R(X) = S$ and $\operatorname{rank}(X) = r = \operatorname{rank}(X^T)$. By Theorem 5.2.1,
$$S^\perp = R(X)^\perp = N(X^T)$$
and by Theorem 3.6.4, $\dim(S^\perp) = \dim N(X^T) = n - r$.
To show that $\{x_1,\dots,x_n\}$ is a basis for $\mathbb{R}^n$, it remains to show independence. Suppose $\sum_{i=1}^n c_i x_i = 0$, and set
$$x = \sum_{i=1}^r c_i x_i \in S,\qquad y = \sum_{i=r+1}^n c_i x_i \in S^\perp$$
Then $x = -y \in S \cap S^\perp = \{0\}$, so $\sum_{i=1}^r c_i x_i = 0 \Rightarrow c_i = 0$ for $i = 1,\dots,r$, and similarly $\sum_{i=r+1}^n c_i x_i = 0 \Rightarrow c_i = 0$ for $i = r+1,\dots,n$. This completes the proof.
Def: Let $U$ and $V$ be two subspaces of $W$. We say that $W$ is a direct sum of $U$ and $V$, denoted by $W = U \oplus V$, if each $w \in W$ can be written uniquely as a sum $u + v$, where $u \in U$ and $v \in V$.
Example: Let
$$U = \operatorname{span}\left\{\begin{pmatrix}1\\0\\0\end{pmatrix}, \begin{pmatrix}0\\1\\0\end{pmatrix}\right\},\quad V = \operatorname{span}\left\{\begin{pmatrix}0\\1\\0\end{pmatrix}, \begin{pmatrix}0\\0\\1\end{pmatrix}\right\},\quad W = \operatorname{span}\left\{\begin{pmatrix}0\\0\\1\end{pmatrix}\right\}$$
Then $\mathbb{R}^3 = U \oplus W$ and $\mathbb{R}^3 = U + V$, but $\mathbb{R}^3 \ne U \oplus V$.
Theorem 5.2.3: If $S$ is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n = S \oplus S^\perp$.
pf: By Theorem 5.2.2, $\forall x \in \mathbb{R}^n$, $x = u + v$ with $u \in S$ and $v \in S^\perp$. To show uniqueness, suppose
$$x = u_1 + v_1 = u_2 + v_2,\quad\text{where } u_1, u_2 \in S \text{ and } v_1, v_2 \in S^\perp$$
Then $u_1 - u_2 = v_2 - v_1 \in S \cap S^\perp = \{0\}$, so $u_1 = u_2$ and $v_1 = v_2$.
Theorem 5.2.4: If $S$ is a subspace of $\mathbb{R}^n$, then $(S^\perp)^\perp = S$.
pf: Let $\dim(S) = r$. By Theorem 5.2.2, $\dim(S^\perp) = n - r$ and $\dim((S^\perp)^\perp) = r$. If $x \in S$, then $x^T y = 0$ for all $y \in S^\perp$, so $x \in (S^\perp)^\perp$; thus $S \subseteq (S^\perp)^\perp$. Since both have dimension $r$, $S = (S^\perp)^\perp$. (Why?)
Remark: Let $A \in \mathbb{R}^{m\times n}$, i.e. $A: \mathbb{R}^n \to \mathbb{R}^m$. Since $\mathbb{R}^n = N(A) \oplus R(A^T)$ and $\operatorname{rank}(A) = \operatorname{rank}(A^T)$,
$$n - \operatorname{nullity}(A) = \operatorname{rank}(A) = \operatorname{rank}(A^T)$$
The restricted maps
$$A: R(A^T) \to R(A),\qquad A^T: R(A) \to R(A^T)$$
are bijections. Schematically, $A$ annihilates $N(A)$ and maps $R(A^T)$ bijectively onto $R(A)$, while $A^T$ annihilates $N(A^T)$ and maps $R(A)$ bijectively onto $R(A^T)$.
Cor 5.2.5: Let $A \in \mathbb{R}^{m\times n}$ and $b \in \mathbb{R}^m$. Then either
(i) $\exists\, x \in \mathbb{R}^n$ such that $Ax = b$, or
(ii) $\exists\, y \in \mathbb{R}^m$ such that $A^T y = 0$ and $y^T b \ne 0$.
pf: Either $b \in R(A)$ or $b \notin R(A)$. Note that $\mathbb{R}^m = R(A) \oplus N(A^T)$. If $b \notin R(A)$, then since $R(A)^\perp = N(A^T)$, there exists $y \in N(A^T)$ with $y^T b \ne 0$, i.e. $A^T y = 0$ and $y^T b \ne 0$.
Example: Let $A = \begin{pmatrix}1&1&2\\0&1&1\\1&3&4\end{pmatrix}$. Find $N(A)$, $R(A)$, $N(A^T)$, $R(A^T)$.
The basic idea is that the row space and the solution set of $Ax = b$ are invariant under row operations.
Sol: (i) $A \xrightarrow{\text{row}} A_r = \begin{pmatrix}1&0&1\\0&1&1\\0&0&0\end{pmatrix}$, so $R(A^T) = \operatorname{span}\{(1,0,1)^T,\ (0,1,1)^T\}$ (Why?)
(ii) $A_r x = 0 \iff x_1 + x_3 = 0$ and $x_2 + x_3 = 0$, so $N(A) = \operatorname{span}\{(-1,-1,1)^T\}$.
(iii) Similarly, $A^T \xrightarrow{\text{row}} \begin{pmatrix}1&0&1\\0&1&2\\0&0&0\end{pmatrix}$, so $R(A) = \operatorname{span}\{(1,0,1)^T,\ (0,1,2)^T\}$ and $N(A^T) = \operatorname{span}\{(-1,-2,1)^T\}$.
(iv) Clearly, $N(A) \perp R(A^T)$ and $N(A^T) \perp R(A)$. (Why?)
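The same subspaces can be recovered numerically from the SVD $A = U\Sigma V^T$: the trailing columns of $V$ span $N(A)$ and the trailing columns of $U$ span $N(A^T)$. A minimal numpy sketch (tolerance choice is an assumption):

```python
import numpy as np

# Recover N(A) and N(A^T) of the 3x3 example from the SVD.
A = np.array([[1.0, 1.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))   # numerical rank; here r = 2

null_A  = Vt[r:].T           # columns span N(A)
null_At = U[:, r:]           # columns span N(A^T)

assert np.allclose(A @ null_A, 0, atol=1e-10)
assert np.allclose(A.T @ null_At, 0, atol=1e-10)
```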
Example: Let $A = \begin{pmatrix}2&0&0\\0&3&0\end{pmatrix}$, i.e. $A: \mathbb{R}^3 \to \mathbb{R}^2$.
(i) $\mathbb{R}^3 = N(A) \oplus R(A^T) = \operatorname{span}\{(0,0,1)^T\} \oplus \operatorname{span}\{(1,0,0)^T,\ (0,1,0)^T\}$, and $R(A) = \mathbb{R}^2$.
(ii) The mapping $A\big|_{R(A^T)}: R(A^T) \to R(A)$,
$$\begin{pmatrix}x_1\\x_2\\0\end{pmatrix} \mapsto \begin{pmatrix}2x_1\\3x_2\end{pmatrix}$$
is a bijection,
(iii) and $\left(A\big|_{R(A^T)}\right)^{-1}: R(A) \to R(A^T)$ is
$$\begin{pmatrix}y_1\\y_2\end{pmatrix} \mapsto \begin{pmatrix}\tfrac12 y_1\\ \tfrac13 y_2\\ 0\end{pmatrix}$$
(iv) What is the matrix representation of $A\big|_{R(A^T)}$?
Inner Product Spaces
A tool to measure the orthogonality of two vectors in a general vector space.
Def: An inner product on a vector space $V$ is a function $\langle\cdot,\cdot\rangle: V\times V \to \mathbb{R}$ (or $\mathbb{C}$) satisfying the following conditions:
(i) $\langle x, x\rangle \ge 0$, with equality iff $x = 0$
(ii) $\langle x, y\rangle = \langle y, x\rangle$
(iii) $\langle \alpha x + \beta y, z\rangle = \alpha\langle x, z\rangle + \beta\langle y, z\rangle$
Example: (i) Let $x, y \in \mathbb{R}^n$ and $w_i > 0$, $i = 1,\dots,n$. Then $\langle x, y\rangle = \sum_{i=1}^n w_i x_i y_i$ is an inner product on $\mathbb{R}^n$.
(ii) Let $A, B \in \mathbb{R}^{m\times n}$. Then $\langle A, B\rangle = \sum_{i=1}^m\sum_{j=1}^n a_{ij}b_{ij}$ is an inner product on $\mathbb{R}^{m\times n}$.
(iii) Let $f, g, w \in C[a, b]$ with $w(x) > 0$. Then $\langle f, g\rangle = \int_a^b w(x)f(x)g(x)\,dx$ is an inner product on $C[a, b]$.
(iv) Let $p, q \in P_n$, $w(x)$ a positive function, and $x_1,\dots,x_n$ distinct real numbers. Then $\langle p, q\rangle = \sum_{i=1}^n w(x_i)p(x_i)q(x_i)$ is an inner product on $P_n$.
Def: Let $\langle\cdot,\cdot\rangle$ be an inner product on a vector space $V$ and $u, v \in V$. We say $u \perp v \iff \langle u, v\rangle = 0$. The length or norm of $v$ is $\|v\| = \sqrt{\langle v, v\rangle}$.
Theorem 5.3.1 (The Pythagorean Law): $u \perp v \Rightarrow \|u + v\|^2 = \|u\|^2 + \|v\|^2$.
pf:
$$\|u+v\|^2 = \langle u+v,\ u+v\rangle = \langle u, u\rangle + 2\langle u, v\rangle + \langle v, v\rangle = \|u\|^2 + \|v\|^2$$
Example: Consider $C[-1,1]$ with inner product $\langle f, g\rangle = \int_{-1}^1 f(x)g(x)\,dx$.
(i) $\langle 1, x\rangle = \int_{-1}^1 x\,dx = 0$, so $1 \perp x$.
(ii) $\langle 1, 1\rangle = \int_{-1}^1 1\,dx = 2$, so $\|1\| = \sqrt2$.
(iii) $\langle x, x\rangle = \int_{-1}^1 x^2\,dx = \tfrac23$, so $\|x\| = \sqrt{2/3}$.
(iv) $\|1 + x\|^2 = \|1\|^2 + \|x\|^2 = 2 + \tfrac23 = \tfrac83$ (Pythagorean Law), or directly
$$\|1+x\|^2 = \langle 1+x,\ 1+x\rangle = \int_{-1}^1 (1+x)^2\,dx = \tfrac83$$
Example: Consider $C[-\pi,\pi]$ with inner product $\langle f, g\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)g(x)\,dx$. It can be shown that
(i) $\langle \cos nx,\ \sin mx\rangle = 0$
(ii) $\langle \cos mx,\ \cos nx\rangle = \delta_{mn}$
(iii) $\langle \sin mx,\ \sin nx\rangle = \delta_{mn}$
Thus $\{\cos nx,\ \sin nx : n \in \mathbb{N}\}$ is an orthonormal set.
Example: Let $A, B \in \mathbb{R}^{m\times n}$ with $\langle A, B\rangle = \sum_{i=1}^m\sum_{j=1}^n a_{ij}b_{ij}$ and $\|A\|_F = \sqrt{\langle A, A\rangle}$. Let
$$A = \begin{pmatrix}1&1\\1&2\\3&3\end{pmatrix},\qquad B = \begin{pmatrix}-1&1\\3&0\\-3&4\end{pmatrix}$$
Then $\langle A, B\rangle = 6 \ne 0$, so $A$ is not orthogonal to $B$, and $\|A\|_F = 5$, $\|B\|_F = 6$.
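The Frobenius inner product and norm in this example are easy to check numerically (a minimal sketch with numpy):

```python
import numpy as np

# Frobenius inner product <A,B> = sum_ij a_ij * b_ij, and ||A||_F.
A = np.array([[1.0, 1.0], [1.0, 2.0], [3.0, 3.0]])
B = np.array([[-1.0, 1.0], [3.0, 0.0], [-3.0, 4.0]])

ip     = np.sum(A * B)            # <A, B>
norm_A = np.sqrt(np.sum(A * A))   # ||A||_F
norm_B = np.linalg.norm(B)        # numpy's default matrix norm is Frobenius
```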
Def: Let $u$ and $v \ne 0$ be two vectors in an inner product space $V$. Then the scalar projection of $u$ onto $v$ is defined as
$$\alpha = \frac{\langle u, v\rangle}{\|v\|}$$
and the vector projection of $u$ onto $v$ is
$$P = \alpha\,\frac{v}{\|v\|} = \frac{\langle u, v\rangle}{\langle v, v\rangle}\,v$$
Lemma: Let $v \ne 0$ and $P$ be the vector projection of $u$ onto $v$. Then
(i) $(u - P) \perp P$
(ii) $u = P \iff u = kv$ for some $k$
pf: (i)
$$\langle P,\ u - P\rangle = \langle P, u\rangle - \langle P, P\rangle = \frac{\langle u, v\rangle^2}{\langle v, v\rangle} - \frac{\langle u, v\rangle^2}{\langle v, v\rangle} = 0$$
(ii) trivial.
Theorem 5.3.2 (Cauchy-Schwarz Inequality): Let $u$ and $v$ be two vectors in an inner product space $V$. Then
$$|\langle u, v\rangle| \le \|u\|\,\|v\|$$
Moreover, equality holds $\iff$ $u$ and $v$ are linearly dependent.
pf: If $v = 0$, trivial. If $v \ne 0$, let $P$ be the vector projection of $u$ onto $v$, so $\|P\|^2 = \dfrac{\langle u, v\rangle^2}{\langle v, v\rangle}$. By the Pythagorean theorem, $\|u\|^2 = \|P\|^2 + \|u - P\|^2$, hence
$$\langle u, v\rangle^2 = \|u\|^2\|v\|^2 - \|v\|^2\|u - P\|^2 \le \|u\|^2\|v\|^2$$
Equality holds $\iff v = 0$ or $u = P = \dfrac{\langle u, v\rangle}{\langle v, v\rangle}v$, i.e. iff $u$ and $v$ are linearly dependent.
Note: From the Cauchy-Schwarz inequality,
$$-1 \le \frac{\langle u, v\rangle}{\|u\|\,\|v\|} \le 1$$
so there exists a unique $\theta \in [0, \pi]$ with $\cos\theta = \dfrac{\langle u, v\rangle}{\|u\|\,\|v\|}$. Thus, we can define $\theta$ as the angle between the two vectors $u$ and $v$.
Def: Let $V$ be a vector space. A function $\|\cdot\|: V \to \mathbb{R}$, $v \mapsto \|v\|$, is said to be a norm if it satisfies
(i) $\|v\| \ge 0$, with equality $\iff v = 0$
(ii) $\|\alpha v\| = |\alpha|\,\|v\|$ for every scalar $\alpha$
(iii) $\|v + w\| \le \|v\| + \|w\|$
Theorem 5.3.3: If $V$ is an inner product space, then $\|v\| = \sqrt{\langle v, v\rangle}$ is a norm on $V$.
pf: trivial
Def: The distance between $u$ and $v$ is defined as $\|u - v\|$.
Example: Let $x \in \mathbb{R}^n$. Then
(i) $\|x\|_1 = \sum_{i=1}^n |x_i|$ is a norm
(ii) $\|x\|_\infty = \max_{1\le i\le n}|x_i|$ is a norm
(iii) $\|x\|_p = \left(\sum_{i=1}^n |x_i|^p\right)^{1/p}$ is a norm for any $p \ge 1$
In particular,
$$\|x\|_2 = \sqrt{\sum_{i=1}^n x_i^2} = \sqrt{\langle x, x\rangle}$$
is the Euclidean norm.
Example: Let $x = (4, -5, 3)^T$. Then $\|x\|_1 = 12$, $\|x\|_2 = 5\sqrt2$, $\|x\|_\infty = 5$.
Example: Let $x_1 = (4, 2)^T$ and $x_2 = (-1, 2)^T$, so $\langle x_1, x_2\rangle = 0$. Thus
$$\|x_1 + x_2\|_2^2 = 25 = 20 + 5 = \|x_1\|_2^2 + \|x_2\|_2^2$$
However,
$$\|x_1 + x_2\|_\infty^2 = 4^2 = 16,\qquad \|x_1\|_\infty^2 + \|x_2\|_\infty^2 = 16 + 4 = 20 \ne 16$$
(Why?)
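The contrast above, that the Pythagorean law holds for the 2-norm but fails for the $\infty$-norm, can be checked directly (a minimal numpy sketch):

```python
import numpy as np

# The Pythagorean law holds for the 2-norm (induced by an inner
# product) but fails for the infinity norm.
x1 = np.array([4.0, 2.0])
x2 = np.array([-1.0, 2.0])
assert x1 @ x2 == 0   # the two vectors are orthogonal

lhs2 = np.linalg.norm(x1 + x2)**2                     # 25
rhs2 = np.linalg.norm(x1)**2 + np.linalg.norm(x2)**2  # 20 + 5
lhs_inf = np.linalg.norm(x1 + x2, np.inf)**2          # 16
rhs_inf = (np.linalg.norm(x1, np.inf)**2
           + np.linalg.norm(x2, np.inf)**2)           # 20
```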
Example: The unit ball $B = \{x \in \mathbb{R}^2 : \|x\| \le 1\}$ depends on the norm (figure: $B_1$ is a diamond, $B_2$ a disk, $B_\infty$ a square).
Least square problem
A typical example: Given data $(x_i, y_i)$, $i = 1,\dots,n$, find the best line $y = c_0 + c_1 x$ to fit the data, i.e. solve
$$\begin{pmatrix}1 & x_1\\ 1 & x_2\\ \vdots & \vdots\\ 1 & x_n\end{pmatrix}\binom{c_0}{c_1} = \begin{pmatrix}y_1\\ y_2\\ \vdots\\ y_n\end{pmatrix}$$
or find $c_0, c_1$ such that $\|Ax - b\|$ is minimum.
Geometrical meaning: the line $y = c_0 + c_1 x$ passes as close as possible to the data points $(x_1, y_1), \dots, (x_n, y_n)$.
Least square problem: Given $A \in \mathbb{R}^{m\times n}$ and $b \in \mathbb{R}^m$, the equation $Ax = b$ may not have solutions, i.e. possibly $b \notin \operatorname{Col}(A) = R(A)$. The objective of the least square problem is to find $\hat x$ such that $\|b - A\hat x\|$ has minimum value, i.e. find $\hat x$ satisfying
$$\|b - A\hat x\| = \min_{x\in\mathbb{R}^n}\|b - Ax\|$$
Preview of the results: It will be shown that
$$\exists!\, P \in R(A):\quad \|b - P\| = \min_{y\in R(A)}\|b - y\|$$
Moreover, $b - P \perp R(A)$, i.e. $b - P \in R(A)^\perp = N(A^T)$, so
$$A^T(b - P) = 0 \;\Rightarrow\; A^T(b - A\hat x) = 0 \;\Rightarrow\; A^T A\hat x = A^T b$$
If the columns of $A$ are linearly independent,
$$\hat x = (A^T A)^{-1}A^T b,\qquad P = A\hat x$$
Theorem 5.4.1: H. Let $S$ be a subspace of $\mathbb{R}^m$ and $b \in \mathbb{R}^m$. Then $\exists!\, P \in S$ such that
C. (i) $\|b - y\| > \|b - P\|$ for all $y \in S\setminus\{P\}$
(ii) $\|b - P\| = \min_{y\in S}\|b - y\| \iff b - P \in S^\perp$
pf: (i) Since $\mathbb{R}^m = S \oplus S^\perp$, write $b = P + z$ where $P \in S$ and $z = b - P \in S^\perp$. If $y \in S\setminus\{P\}$, then since $P - y \in S$ and $b - P \in S^\perp$, the Pythagorean theorem gives
$$\|b - y\|^2 = \|(b - P) + (P - y)\|^2 = \|b - P\|^2 + \|P - y\|^2 > \|b - P\|^2$$
Since the decomposition $b = P + z$ is unique, result (i) is then proved.
(ii) follows directly from (i) by noting that $b - P = z \in S^\perp$.
Question: How to find $\hat x$ which solves $\|A\hat x - b\| = \min_{x\in\mathbb{R}^n}\|b - Ax\|$?
Answer: Let $A \in \mathbb{R}^{m\times n}$ and $P = A\hat x$. From the previous theorem, we know that
$$b - P \in R(A)^\perp = N(A^T)$$
$$\Rightarrow\; A^T(b - P) = 0 \;\Rightarrow\; A^T b - A^T A\hat x = 0 \;\Rightarrow\; A^T A\hat x = A^T b \quad\text{(normal equation)}$$
Theorem 5.4.2: Let $A \in \mathbb{R}^{m\times n}$ and $\operatorname{rank}(A) = n$. Then the normal equation $A^T A x = A^T b$ has a unique solution
$$\hat x = (A^T A)^{-1}A^T b$$
and $\hat x$ is the unique least square solution to $Ax = b$.
pf: Clearly, $A^T A$ is nonsingular (Why? $A$ has linearly independent columns), so $\hat x$ is the unique solution to the normal equation, and hence the unique solution to the least square problem (Why?).
Note: The projection vector
$$P = A\hat x = A(A^T A)^{-1}A^T b$$
is the element of $R(A)$ that is closest to $b$ in the least square sense. Thus, the matrix $\mathcal{P} = A(A^T A)^{-1}A^T$ is called the projection matrix (it projects any vector of $\mathbb{R}^m$ onto $R(A)$).
Example: Suppose a spring obeys Hooke's law $F = kx$, and a series of data are taken (with measurement error):
F: 3 5 8
x: 4 7 11
How to determine $k$?
sol: Note that the system $4k = 3$, $7k = 5$, $11k = 8$, i.e.
$$\begin{pmatrix}4\\7\\11\end{pmatrix}k = \begin{pmatrix}3\\5\\8\end{pmatrix}$$
is inconsistent. The normal equation is
$$(4\ \ 7\ \ 11)\begin{pmatrix}4\\7\\11\end{pmatrix}k = (4\ \ 7\ \ 11)\begin{pmatrix}3\\5\\8\end{pmatrix} \;\Rightarrow\; 186\,k = 135$$
Least square sol.: $k = 135/186 \approx 0.726$.
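The one-unknown normal equation above reduces to a single dot-product ratio; a minimal numpy sketch:

```python
import numpy as np

# Least squares estimate of the spring constant k in F ≈ k*x:
# the normal equation (x^T x) k = x^T F has one unknown.
x = np.array([4.0, 7.0, 11.0])
F = np.array([3.0, 5.0, 8.0])

k = (x @ F) / (x @ x)   # 135/186 ≈ 0.726
```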
Example: Given the data
x: 0 3 6
y: 1 4 5
find the best least square fit by a linear function.
sol: Let the desired linear function be $y = c_0 + c_1 x$. The problem becomes finding the least square solution of
$$\underbrace{\begin{pmatrix}1&0\\1&3\\1&6\end{pmatrix}}_{A}\underbrace{\binom{c_0}{c_1}}_{x} = \underbrace{\begin{pmatrix}1\\4\\5\end{pmatrix}}_{b}$$
Least square sol.:
$$\binom{c_0}{c_1} = (A^T A)^{-1}A^T b = \binom{4/3}{2/3}$$
The best linear least square fit is $y = \frac43 + \frac23 x$.
Example: Find the best quadratic least square fit to the data
x: 0 1 2 3
y: 3 2 4 4
sol: Let the desired quadratic function be $y = c_0 + c_1 x + c_2 x^2$. The problem becomes finding the least square solution of
$$\begin{pmatrix}1&0&0\\1&1&1\\1&2&4\\1&3&9\end{pmatrix}\begin{pmatrix}c_0\\c_1\\c_2\end{pmatrix} = \begin{pmatrix}3\\2\\4\\4\end{pmatrix}$$
Least square sol.: $c_0 = 2.75$, $c_1 = -0.25$, $c_2 = 0.25$, so the best quadratic least square fit is
$$y = 2.75 - 0.25x + 0.25x^2$$
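The quadratic fit can be reproduced by solving the normal equations $A^TA\,c = A^Tb$ directly (a minimal numpy sketch):

```python
import numpy as np

# Quadratic least squares fit y = c0 + c1*x + c2*x^2 via the
# normal equations A^T A c = A^T b.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([3.0, 2.0, 4.0, 4.0])

A = np.column_stack([np.ones_like(xs), xs, xs**2])
c = np.linalg.solve(A.T @ A, A.T @ ys)   # (2.75, -0.25, 0.25)
```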
Orthonormal Sets
- Simplify the least square solution (avoid computing an inverse)
- Numerical computational stability
Def: $\{v_1,\dots,v_n\}$ is said to be an orthogonal set in an inner product space $V$ if $\langle v_i, v_j\rangle = 0$ for all $i \ne j$. Moreover, if $\langle v_i, v_j\rangle = \delta_{ij}$, then $\{v_1,\dots,v_n\}$ is said to be orthonormal.
Example:
$$v_1 = \begin{pmatrix}1\\1\\1\end{pmatrix},\quad v_2 = \begin{pmatrix}2\\1\\-3\end{pmatrix},\quad v_3 = \begin{pmatrix}4\\-5\\1\end{pmatrix}$$
is an orthogonal set but not orthonormal. However,
$$u_1 = \frac{v_1}{\sqrt3},\quad u_2 = \frac{v_2}{\sqrt{14}},\quad u_3 = \frac{v_3}{\sqrt{42}}$$
is orthonormal.
Theorem 5.5.1: Let $v_1,\dots,v_n$ be an orthogonal set of nonzero vectors in an inner product space $V$. Then they are linearly independent.
pf: Suppose $\sum_{i=1}^n c_i v_i = 0$. Then for each $j$,
$$0 = \left\langle v_j,\ \sum_i c_i v_i\right\rangle = \sum_i c_i\langle v_j, v_i\rangle = c_j\langle v_j, v_j\rangle \;\Rightarrow\; c_j = 0,\quad j = 1,\dots,n$$
so $v_1,\dots,v_n$ are linearly independent.
Example: $\left\{\frac1{\sqrt2}\right\} \cup \{\cos nx,\ \sin nx : n \in \mathbb{N}\}$ is an orthonormal set of $C[-\pi,\pi]$ with inner product
$$\langle f, g\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)g(x)\,dx$$
Note: Now you know the meaning when one says that $\cos x \perp \sin x$.
Theorem 5.5.2: Let $\{u_1,\dots,u_n\}$ be an orthonormal basis for an inner product space $V$. If $v = \sum_{i=1}^n c_i u_i$, then $c_i = \langle u_i, v\rangle$.
pf:
$$\langle u_i, v\rangle = \left\langle u_i,\ \sum_{j=1}^n c_j u_j\right\rangle = \sum_{j=1}^n c_j\langle u_i, u_j\rangle = \sum_{j=1}^n c_j\delta_{ij} = c_i$$
Cor: Let $\{u_1,\dots,u_n\}$ be an orthonormal basis for an inner product space $V$. If $u = \sum_{i=1}^n a_i u_i$ and $v = \sum_{i=1}^n b_i u_i$, then
$$\langle u, v\rangle = \sum_{i=1}^n a_i b_i$$
pf:
$$\langle u, v\rangle = \left\langle\sum_i a_i u_i,\ v\right\rangle = \sum_i a_i\langle u_i, v\rangle \overset{\text{Thm 5.5.2}}{=} \sum_i a_i b_i$$
Cor (Parseval's Formula): If $\{u_1,\dots,u_n\}$ is an orthonormal basis for an inner product space $V$ and $v = \sum_{i=1}^n c_i u_i$, then
$$\|v\|^2 = \sum_{i=1}^n c_i^2$$
pf: direct from the previous corollary.
Example:
$$u_1 = \binom{1/\sqrt2}{1/\sqrt2} \quad\text{and}\quad u_2 = \binom{1/\sqrt2}{-1/\sqrt2}$$
form an orthonormal basis for $\mathbb{R}^2$. If $x = \binom{x_1}{x_2} \in \mathbb{R}^2$, then
$$\langle x, u_1\rangle = \frac{x_1 + x_2}{\sqrt2},\qquad \langle x, u_2\rangle = \frac{x_1 - x_2}{\sqrt2}$$
By Theorem 5.5.2,
$$x = \frac{x_1 + x_2}{\sqrt2}\,u_1 + \frac{x_1 - x_2}{\sqrt2}\,u_2$$
and
$$\|x\|^2 = \left(\frac{x_1+x_2}{\sqrt2}\right)^2 + \left(\frac{x_1-x_2}{\sqrt2}\right)^2 = x_1^2 + x_2^2$$
Example: Determine $\int_{-\pi}^{\pi}\sin^4 x\,dx$ without computing antiderivatives.
sol: With $\langle f, g\rangle = \frac1\pi\int_{-\pi}^{\pi} fg\,dx$,
$$\int_{-\pi}^{\pi}\sin^4 x\,dx = \pi\,\langle \sin^2 x,\ \sin^2 x\rangle = \pi\,\|\sin^2 x\|^2$$
Now
$$\sin^2 x = \frac{1 - \cos 2x}{2} = \frac1{\sqrt2}\cdot\frac1{\sqrt2} - \frac12\cos 2x$$
and $\left\{\frac1{\sqrt2},\ \cos 2x\right\}$ is an orthonormal set of $C[-\pi,\pi]$, so by Parseval's formula
$$\|\sin^2 x\|^2 = \left(\frac1{\sqrt2}\right)^2 + \left(\frac12\right)^2 = \frac34 \quad\Rightarrow\quad \int_{-\pi}^{\pi}\sin^4 x\,dx = \frac{3\pi}{4}$$
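The Parseval computation can be cross-checked by simple numerical quadrature (midpoint rule, grid size chosen arbitrarily):

```python
import numpy as np

# Numerically check ∫ sin^4(x) dx over [-π, π] = 3π/4,
# as predicted by Parseval's formula.
n = 200000
h = 2 * np.pi / n
xs = np.linspace(-np.pi, np.pi, n, endpoint=False) + h / 2  # midpoints

integral = np.sum(np.sin(xs)**4) * h
```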
Def: $Q \in \mathbb{R}^{n\times n}$ is said to be an orthogonal matrix if the column vectors of $Q$ form an orthonormal set in $\mathbb{R}^n$.
Example: The rotation matrix
$$\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}$$
and the elementary reflection matrix
$$\begin{pmatrix}\cos\theta & \sin\theta\\ \sin\theta & -\cos\theta\end{pmatrix}$$
are orthogonal matrices.
Properties of an orthogonal matrix: Let $Q \in \mathbb{R}^{n\times n}$ be orthogonal. Then
(i) the column vectors of $Q$ form an orthonormal basis for $\mathbb{R}^n$
(ii) $Q^T Q = I$
(iii) $Q^T = Q^{-1}$
(iv) $\langle Qx, Qy\rangle = \langle x, y\rangle$ (preserves inner product)
(v) $\|Qx\|_2 = \|x\|_2$ (preserves norm)
(vi) preserves angles.
Note: Let the columns of $A$ form an orthonormal set in $\mathbb{R}^m$. Then $A^T A = I$, and the least square solution to $Ax = b$ is
$$\hat x = (A^T A)^{-1}A^T b = A^T b$$
This avoids computing a matrix inverse.
Cor 5.5.9: Let $S$ be a nonzero subspace of $\mathbb{R}^m$ and $\{u_1,\dots,u_k\}$ an orthonormal basis for $S$. If $U = (u_1 \cdots u_k)$, then the projection $P$ of $b$ onto $S$ is
$$P = UU^T b$$
pf: $P = U(U^T U)^{-1}U^T b = UU^T b$.
Note: Let the columns of $U = (u_1 \cdots u_k)$ be an orthonormal set. Then
$$UU^T b = (u_1 \cdots u_k)\begin{pmatrix}u_1^T\\ \vdots\\ u_k^T\end{pmatrix}b = \sum_{i=1}^k (u_i^T b)\,u_i$$
i.e. the projection of $b$ onto $R(U)$ is the sum of the projections of $b$ onto each $u_i$.
Example: Let $S = \{(x, y, 0)^T : x, y \in \mathbb{R}\}$. Find the vector $P$ in $S$ that is closest to $w = (5, 3, 4)^T$.
Sol: Clearly $\{e_1, e_2\}$ is an orthonormal basis for $S$. Let $U = \begin{pmatrix}1&0\\0&1\\0&0\end{pmatrix}$. Thus
$$P = UU^T w = \begin{pmatrix}1&0&0\\0&1&0\\0&0&0\end{pmatrix}\begin{pmatrix}5\\3\\4\end{pmatrix} = \begin{pmatrix}5\\3\\0\end{pmatrix}$$
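The projection $P = UU^Tw$ is a one-liner to verify (a minimal numpy sketch):

```python
import numpy as np

# Project w onto the xy-plane via P = U U^T w, where the columns of U
# are the orthonormal basis {e1, e2} of S.
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
w = np.array([5.0, 3.0, 4.0])

p = U @ (U.T @ w)   # (5, 3, 0)
```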
Hw: Try
$$U = \frac1{\sqrt2}\begin{pmatrix}1 & 1\\ 1 & -1\\ 0 & 0\end{pmatrix}$$
What is $UU^T$?
Approximation of functions
Example: Find the best least square approximation to $e^x$ on $[0,1]$ by a linear function, i.e. find $p_0(x) \in P_2[0,1]$ such that
$$\|e^x - p_0(x)\|_2 = \min_{p\in P_2[0,1]}\|e^x - p(x)\|_2,\qquad \|f\|_2 = \sqrt{\langle f, f\rangle} = \left(\int_0^1 f^2\,dx\right)^{1/2}$$
Sol: (i) Clearly, $\operatorname{span}\{1, x\} = P_2[0,1]$, but $\{1, x\}$ is not orthonormal.
(ii) Seek a function of the form $x - a$ with
$$\langle 1,\ x - a\rangle = \int_0^1 (x - a)\,dx = \frac12 - a = 0 \;\Rightarrow\; a = \frac12$$
By calculation, $\left\|x - \frac12\right\| = \frac1{\sqrt{12}}$, so
$$u_1 = 1,\qquad u_2 = \sqrt{12}\left(x - \frac12\right)$$
(iii) $\{u_1, u_2\}$ is an orthonormal set of $P_2[0,1]$, and
$$\langle u_1, e^x\rangle = \int_0^1 e^x\,dx = e - 1,\qquad \langle u_2, e^x\rangle = \int_0^1 u_2\, e^x\,dx = \sqrt3\,(3 - e)$$
Thus the projection
$$p(x) = c_1 u_1 + c_2 u_2 = (e - 1) + \sqrt3(3 - e)\cdot\sqrt{12}\left(x - \frac12\right) = (4e - 10) + 6(3 - e)\,x$$
is the best linear least square approximation to $e^x$ on $[0, 1]$.
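The projection coefficients above can be confirmed by numerical integration (midpoint rule; grid size is an arbitrary choice):

```python
import numpy as np

# Verify the linear least squares approximation to e^x on [0,1]:
# project e^x onto the orthonormal pair u1 = 1, u2 = sqrt(12)(x - 1/2).
n = 100000
h = 1.0 / n
xs = np.linspace(0, 1, n, endpoint=False) + h / 2   # midpoints

u2 = np.sqrt(12) * (xs - 0.5)
f = np.exp(xs)

c1 = np.sum(f) * h        # ≈ e - 1
c2 = np.sum(f * u2) * h   # ≈ sqrt(3)*(3 - e)

# p(x) = c1 + c2*sqrt(12)*(x - 1/2) = intercept + slope*x
slope = c2 * np.sqrt(12)             # ≈ 6(3 - e)
intercept = c1 - c2 * np.sqrt(12) / 2  # ≈ 4e - 10
```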
Approximation by trigonometric polynomials
FACT: $\left\{\frac1{\sqrt2}\right\} \cup \{\cos nx,\ \sin nx : n \in \mathbb{N}\}$ forms an orthonormal set in $C[-\pi,\pi]$ with respect to the inner product
$$\langle f, g\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)g(x)\,dx$$
Problem: Given a $2\pi$-periodic function $f(x)$, find a trigonometric polynomial of degree $n$,
$$t_n(x) = \frac{a_0}{2} + \sum_{K=1}^n\left(a_K\cos Kx + b_K\sin Kx\right)$$
which is a best least square approximation to $f(x)$.
Sol: It suffices to find the projection of $f(x)$ onto the subspace
$$\operatorname{span}\left\{\frac1{\sqrt2},\ \cos Kx,\ \sin Kx : K = 1,\dots,n\right\}$$
The best approximation $t_n$ has coefficients
$$a_0 = \sqrt2\left\langle f,\ \frac1{\sqrt2}\right\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)\,dx$$
$$a_K = \langle f,\ \cos Kx\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)\cos Kx\,dx,\qquad b_K = \langle f,\ \sin Kx\rangle = \frac1\pi\int_{-\pi}^{\pi} f(x)\sin Kx\,dx$$
Example: Consider $C[-\pi,\pi]$ with inner product $\langle f, g\rangle = \frac1{2\pi}\int_{-\pi}^{\pi} f(x)\overline{g(x)}\,dx$.
(i) Check that $e^{iKx}$, $K = 0, \pm1, \dots, \pm n$, are orthonormal.
(ii) Let $t_n = \sum_{K=-n}^{n} c_K e^{iKx}$ with
$$c_K = \langle f,\ e^{iKx}\rangle = \frac1{2\pi}\int_{-\pi}^{\pi} f(x)e^{-iKx}\,dx = \frac12(a_K - ib_K)$$
Similarly, $c_{-K} = \overline{c_K}$.
(iii) $c_K e^{iKx} + c_{-K}e^{-iKx} = a_K\cos Kx + b_K\sin Kx$
(iv)
$$t_n = \sum_{K=-n}^{n} c_K e^{iKx} = \frac{a_0}{2} + \sum_{K=1}^n\left(a_K\cos Kx + b_K\sin Kx\right)$$
Gram-Schmidt Orthogonalization Process
Question: Given a set of linearly independent vectors, how to transform them into orthonormal ones while preserving the spanning set?
Given $x_1,\dots,x_k$:
$$u_1 = \frac{x_1}{\|x_1\|};\quad\text{clearly } \operatorname{span}\{u_1\} = \operatorname{span}\{x_1\}$$
$$p_1 = \langle x_2, u_1\rangle u_1,\qquad u_2 = \frac{x_2 - p_1}{\|x_2 - p_1\|}$$
Clearly $u_1 \perp u_2$ and $\operatorname{span}\{x_1, x_2\} = \operatorname{span}\{u_1, u_2\}$.
Similarly,
$$p_2 = \langle x_3, u_1\rangle u_1 + \langle x_3, u_2\rangle u_2,\qquad u_3 = \frac{x_3 - p_2}{\|x_3 - p_2\|}$$
Clearly $u_3 \perp u_1$, $u_3 \perp u_2$, and $\operatorname{span}\{x_1, x_2, x_3\} = \operatorname{span}\{u_1, u_2, u_3\}$. We have the next result.
Theorem 5.6.1 (The Gram-Schmidt process):
H. (i) $\{x_1,\dots,x_n\}$ is a basis for an inner product space $V$
(ii) $u_1 = \dfrac{x_1}{\|x_1\|}$ and
$$u_{K+1} = \frac{x_{K+1} - p_K}{\|x_{K+1} - p_K\|},\quad K = 1,\dots,n-1,\qquad\text{where } p_K = \sum_{j=1}^K\langle x_{K+1}, u_j\rangle u_j$$
C. $\{u_1,\dots,u_n\}$ is an orthonormal basis.
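The recursion in Theorem 5.6.1 translates directly into code; a minimal (classical) Gram-Schmidt sketch for column vectors under the standard inner product (the sample matrix is illustrative):

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X; returns U with orthonormal
    columns spanning the same subspace (assumes columns independent)."""
    U = []
    for x in X.T:
        p = sum((x @ u) * u for u in U)   # projection onto span of previous u's
        v = x - p                         # residual, orthogonal to span(U)
        U.append(v / np.linalg.norm(v))   # normalize
    return np.column_stack(U)

X = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 2.0]])
U = gram_schmidt(X)
```

Note that for numerical stability one usually prefers the modified Gram-Schmidt variant or a Householder QR, but the classical form above matches the theorem.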
Example: Find an orthonormal basis for $P_3$ with inner product given by
$$\langle p, q\rangle = \sum_{i=1}^3 p(x_i)q(x_i),\qquad x_1 = -1,\ x_2 = 0,\ x_3 = 1$$
Sol: Starting with the basis $\{1, x, x^2\}$:
$$u_1 = \frac{1}{\|1\|} = \frac1{\sqrt3}$$
$$p_1 = \langle x, u_1\rangle u_1 = \frac{-1 + 0 + 1}{\sqrt3}\,u_1 = 0,\qquad u_2 = \frac{x - p_1}{\|x - p_1\|} = \frac{x}{\sqrt2}$$
$$p_2 = \langle x^2, u_1\rangle u_1 + \langle x^2, u_2\rangle u_2 = \frac{2}{\sqrt3}\cdot\frac1{\sqrt3} + 0 = \frac23,\qquad u_3 = \frac{x^2 - p_2}{\|x^2 - p_2\|} = \sqrt{\frac32}\left(x^2 - \frac23\right)$$
QR-Decomposition
Given linearly independent vectors $a_1,\dots,a_n$ (the columns of $A$):
$$r_{11} = \|a_1\|,\qquad a_1 = r_{11}q_1 \quad\text{.....(1)}$$
$$p_1 = \langle a_2, q_1\rangle q_1 = r_{12}q_1 \quad\text{.....(2)}$$
$$r_{22} = \|a_2 - p_1\| \quad\text{.....(3)}$$
By (2) and (3), $q_2 = \dfrac{a_2 - p_1}{r_{22}}$ and $a_2 = p_1 + r_{22}q_2 = r_{12}q_1 + r_{22}q_2$.
In general,
$$p_{K-1} = \sum_{i=1}^{K-1}\langle q_i, a_K\rangle q_i = \sum_{i=1}^{K-1} r_{iK}q_i,\qquad r_{KK} = \|a_K - p_{K-1}\|,\qquad q_K = \frac{a_K - p_{K-1}}{r_{KK}}$$
$$a_K = \sum_{i=1}^{K-1} r_{iK}q_i + r_{KK}q_K$$
Define $Q = (q_1 \cdots q_n) \in \mathbb{R}^{m\times n}$ and $R = (r_{ij}) \in \mathbb{R}^{n\times n}$. Then $A = QR$, where $Q$ has orthonormal columns and $R$ is upper triangular.
To solve $Ax = b$ with $A \in \mathbb{R}^{m\times n}$ and $\operatorname{rank}(A) = n$: the least square solution satisfies
$$Rx = Q^T b$$
which can then be solved by back substitution, without computing an inverse (and exactly, if $A$ is square).
Example: Solve the least square problem $Ax \approx b$ with
$$A = \begin{pmatrix}1&-2&-1\\2&0&1\\2&-4&2\\4&0&0\end{pmatrix},\qquad b = \begin{pmatrix}-1\\1\\1\\-2\end{pmatrix}$$
By direct calculation,
$$A = QR = \frac15\begin{pmatrix}1&-2&-4\\2&1&2\\2&-4&2\\4&2&-1\end{pmatrix}\begin{pmatrix}5&-2&1\\0&4&-1\\0&0&2\end{pmatrix},\qquad Q^T b = \begin{pmatrix}-1\\-1\\2\end{pmatrix}$$
The solution can be obtained by back substitution from
$$\begin{pmatrix}5&-2&1\\0&4&-1\\0&0&2\end{pmatrix}x = \begin{pmatrix}-1\\-1\\2\end{pmatrix} \;\Rightarrow\; x_3 = 1,\quad x_2 = 0,\quad x_1 = -\frac25$$
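The worked QR example can be cross-checked with numpy's built-in factorization (numpy may flip signs of corresponding columns of $Q$ and rows of $R$, which leaves $QR$ and the solution unchanged):

```python
import numpy as np

# Solve the least squares problem via QR: R x = Q^T b.
A = np.array([[1.0, -2.0, -1.0],
              [2.0,  0.0,  1.0],
              [2.0, -4.0,  2.0],
              [4.0,  0.0,  0.0]])
b = np.array([-1.0, 1.0, 1.0, -2.0])

Q, R = np.linalg.qr(A)            # reduced QR: Q is 4x3, R is 3x3
x = np.linalg.solve(R, Q.T @ b)   # triangular system R x = Q^T b

assert np.allclose(Q @ R, A)      # the factorization reproduces A
```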