Math 20F Linear Algebra
Lecture 23
Orthogonal vectors, spaces and bases
Slide 1
• Review: Inner product → Norm → Distance.
• Orthogonal vectors and subspaces.
• Orthogonal projections.
• Orthogonal and orthonormal bases.
An inner product fixes the notions of angles,
length and distance
An inner product ( , ) must be positive definite, symmetric, and linear; that is,
Slide 2
1. (u, u) ≥ 0, and (u, u) = 0 ⇔ u = 0;
2. (u, v) = (v, u);
3. (au + bv, w) = a(u, w) + b(v, w).
‖u‖ = √(u, u),    dist(u, v) = ‖u − v‖.
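In ℝⁿ with the standard dot product these formulas can be checked numerically. A minimal sketch in plain Python (the helper names `inner`, `norm`, `dist` are my own):

```python
import math

def inner(u, v):
    # the standard dot product on R^n, one choice of inner product
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    # ||u|| = sqrt((u, u))
    return math.sqrt(inner(u, u))

def dist(u, v):
    # dist(u, v) = ||u - v||
    return norm([ui - vi for ui, vi in zip(u, v)])

print(norm([3.0, 4.0]))              # 5.0
print(dist([1.0, 1.0], [4.0, 5.0]))  # 5.0
```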
We transfer the notion of perpendicular vectors
from ℝ², ℝ³ to V
In ℝ² the following holds:
u ⊥ v ⇔ the Pythagorean formula ‖u + v‖² = ‖u‖² + ‖v‖² holds,
Slide 3
⇔ the diagonals of the parallelogram spanned by u and v
have the same length.
Definition 1 Let V, ( , ) be an inner product space.
Then u, v ∈ V are called orthogonal, or perpendicular,
⇔ (u, v) = 0.
Double-check: orthogonal vectors form a
generalized rectangle
Slide 4
Theorem 1 Let V, ( , ) be an inner product space and u, v ∈ V.
Then,
‖u + v‖ = ‖u − v‖ ⇔ (u, v) = 0.
Proof:
‖u + v‖² = (u + v, u + v) = ‖u‖² + ‖v‖² + 2(u, v),
‖u − v‖² = (u − v, u − v) = ‖u‖² + ‖v‖² − 2(u, v),
then,
‖u + v‖² − ‖u − v‖² = 4(u, v),
which vanishes exactly when (u, v) = 0.
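The identity in the proof can be verified numerically for vectors in ℝⁿ. A quick sketch with plain Python lists (the helper names are my own):

```python
def inner(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm_sq(u):
    return inner(u, u)

u = [1.0, 2.0, 0.0]
v = [-2.0, 1.0, 3.0]   # (u, v) = -2 + 2 + 0 = 0, so u ⊥ v

add = [a + b for a, b in zip(u, v)]
sub = [a - b for a, b in zip(u, v)]

# ||u + v||^2 - ||u - v||^2 = 4 (u, v): zero exactly when u ⊥ v
print(norm_sq(add) - norm_sq(sub))   # 0.0
print(4 * inner(u, v))               # 0.0
```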
The vectors cos(x), sin(x), which belong to C([0, 2π]),
are orthogonal
Slide 5
With the inner product (f, g) = ∫₀^{2π} f(x) g(x) dx,

(cos(x), sin(x)) = ∫₀^{2π} sin(x) cos(x) dx
                 = (1/2) ∫₀^{2π} sin(2x) dx
                 = −(1/4) cos(2x)|₀^{2π}
                 = 0.
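The orthogonality of sin and cos on [0, 2π] can also be seen numerically, approximating the integral inner product with a midpoint rule. A sketch (the function `inner_C` is my own, not part of the lecture):

```python
import math

def inner_C(f, g, a=0.0, b=2 * math.pi, n=10000):
    # midpoint-rule approximation of (f, g) = ∫_a^b f(x) g(x) dx
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

print(inner_C(math.sin, math.cos))  # ≈ 0
print(inner_C(math.sin, math.sin))  # ≈ π
```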
Even subspaces can be orthogonal!
Slide 6
Definition 2 Let V, ( , ) be an inner product space and
W ⊂ V a subspace. Then W⊥, the orthogonal complement of W,
is given by
W⊥ = {v ∈ V : (v, u) = 0 for all u ∈ W}.
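For a subspace W of ℝⁿ spanned by the columns of a matrix A, a vector v lies in W⊥ exactly when Aᵀv = 0, so a basis of W⊥ can be read off the null space of Aᵀ. A sketch using NumPy's SVD (the example matrix and variable names are my own):

```python
import numpy as np

# W = span{(1,0,0), (0,1,0)} in R^3; the columns of A span W
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# v ∈ W⊥ ⇔ A^T v = 0, so W⊥ = null(A^T); read it off the SVD of A^T
_, s, Vt = np.linalg.svd(A.T)
rank = int(np.sum(s > 1e-12))
W_perp_basis = Vt[rank:]   # rows spanning the null space of A^T

print(W_perp_basis)        # spans the z-axis, e.g. [[0. 0. 1.]] up to sign
```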
Orthogonal projection of a vector along any other
vector is always possible
Fix V, ( , ), and u ∈ V with u ≠ 0.
Slide 7
Can any vector x ∈ V be decomposed into orthogonal parts
with respect to u?
That is, x = au + x₀ with (u, x₀) = 0?
Is this decomposition unique?
Answer: Yes.
Here is how to compute a and x₀
Theorem 2 Let V, ( , ) be an inner product space, and
u ∈ V with u ≠ 0. Then any vector x ∈ V can be
uniquely decomposed as
Slide 8
x = au + x₀,   where a = (x, u)/‖u‖².
Therefore,
x₀ = x − ((x, u)/‖u‖²) u   ⇒   (u, x₀) = 0.
Orthogonal projection along a vector
Proof: Introduce x₀ by the equation x = au + x₀. The condition (u, x₀) = 0 implies that
(x, u) = a(u, u)   ⇒   a = (x, u)/‖u‖²,
then
x = ((x, u)/‖u‖²) u + x₀   ⇒   x₀ = x − ((x, u)/‖u‖²) u.
This decomposition is unique because, given a second decomposition x = bu + y₀ with
(u, y₀) = 0, then
au + x₀ = bu + y₀   ⇒   a = b,
by taking the inner product with u, and then
x₀ = y₀.
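The formula of Theorem 2 translates directly into code. A sketch with NumPy, using the standard dot product (the function name `project` is my own):

```python
import numpy as np

def project(x, u):
    # orthogonal decomposition x = a u + x0 with (u, x0) = 0,
    # where a = (x, u) / ||u||^2  (Theorem 2)
    a = np.dot(x, u) / np.dot(u, u)
    x0 = x - a * u
    return a, x0

x = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])
a, x0 = project(x, u)
print(a)               # 3.0
print(x0)              # [0. 4.]
print(np.dot(u, x0))   # 0.0
```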
Bases can be chosen to consist of mutually
orthogonal vectors
Slide 9
Definition 3 Let V, ( , ) be an n-dimensional inner
product space, and {u₁, · · · , uₙ} a basis of V.
The basis is orthogonal ⇔ (uᵢ, uⱼ) = 0 for all i ≠ j.
The basis is orthonormal ⇔ it is orthogonal, and
‖uᵢ‖ = 1 for all i,
where i, j = 1, · · · , n.
To write x in an orthogonal basis is to decompose
x along each basis vector direction
Slide 10
Theorem 3 Let V, ( , ) be an n-dimensional inner
product space, and {u₁, · · · , uₙ} an orthogonal
basis. Then any x ∈ V can be written as
x = c₁u₁ + · · · + cₙuₙ,
where the coefficients have the form
cᵢ = (x, uᵢ)/‖uᵢ‖²,   i = 1, · · · , n.
Proof: The set {u₁, · · · , uₙ} is a basis, so there exist coefficients cᵢ such that
x = c₁u₁ + · · · + cₙuₙ. The basis is orthogonal, so multiplying the expression for x by uᵢ,
and recalling (uᵢ, uⱼ) = 0 for all i ≠ j, one gets
(x, uᵢ) = cᵢ(uᵢ, uᵢ).
The uᵢ are nonzero, so (uᵢ, uᵢ) = ‖uᵢ‖² ≠ 0, and therefore cᵢ = (x, uᵢ)/‖uᵢ‖².
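Theorem 3 gives the coefficients without solving a linear system. A short NumPy sketch with an orthogonal (but not orthonormal) basis of ℝ², chosen by me as an illustration:

```python
import numpy as np

# an orthogonal basis of R^2 that is not orthonormal
u1 = np.array([1.0, 1.0])
u2 = np.array([1.0, -1.0])
x = np.array([2.0, 5.0])

# c_i = (x, u_i) / ||u_i||^2  (Theorem 3)
c1 = np.dot(x, u1) / np.dot(u1, u1)
c2 = np.dot(x, u2) / np.dot(u2, u2)

print(c1, c2)             # 3.5 -1.5
print(c1 * u1 + c2 * u2)  # [2. 5.] — recovers x
```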