5.3 Orthogonal Transformations and Orthogonal Matrices
Definition 1 (5.3.1). A linear transformation T : Rn → Rn is called orthogonal if it preserves the length
of vectors:
||T (x)|| = ||x|| , for all x ∈ Rn .
If T (x) = Ax is an orthogonal transformation, we say that A is an orthogonal matrix.
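The definition can be checked numerically for the standard 2 × 2 rotation matrix, which is the basic example of an orthogonal transformation (a NumPy sketch; the angle t and vector x are illustrative choices, not from the text):

```python
import numpy as np

# Rotation by angle t: the standard example of an orthogonal matrix.
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

x = np.array([3.0, 4.0])
length_before = np.linalg.norm(x)      # ||x|| = 5
length_after = np.linalg.norm(A @ x)   # ||Ax||, equal to ||x||
```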
Theorem 2 (5.3.2, orthogonal transformations preserve orthogonality). Let T : Rn → Rn be an
orthogonal linear transformation. If v, w ∈ Rn are orthogonal, then so are T (v), T (w).
Note 3. In fact, orthogonal transformations preserve all angles, not just right angles: the angle between
two nonzero vectors v, w ∈ Rn equals the angle between T (v), T (w). This is a homework problem.
Theorem 4 (orthogonal transformations preserve the dot product). A linear transformation
T : Rn → Rn is orthogonal if and only if T preserves the dot product:
v · w = T (v) · T (w)
for all v, w ∈ Rn .
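Theorem 4 can be illustrated with the same kind of rotation matrix: applying it to both vectors leaves their dot product unchanged (the specific vectors below are illustrative):

```python
import numpy as np

t = 1.2
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

v = np.array([1.0, 2.0])
w = np.array([-2.0, 0.5])

dot_before = v @ w            # v . w = -2 + 1 = -1
dot_after = (A @ v) @ (A @ w) # T(v) . T(w), equal to v . w
```

Since lengths and dot products together determine angles, this is the computational content of Note 3 as well.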
Theorem 5 (5.3.3, orthogonal matrices and orthonormal bases). An n × n matrix A is orthogonal if
and only if its columns form an orthonormal basis of Rn .
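One direction of Theorem 5 is easy to check by hand: each column of an orthogonal matrix must have length 1, and distinct columns must be perpendicular (a sketch with an illustrative orthogonal matrix):

```python
import numpy as np

A = np.array([[0.6, -0.8],
              [0.8,  0.6]])   # an orthogonal 2 x 2 matrix

col_lengths = np.linalg.norm(A, axis=0)  # both columns have length 1
cross_dot = A[:, 0] @ A[:, 1]            # distinct columns: dot product 0
```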
Theorem 6 (5.3.4, products and inverses of orthogonal matrices).
a) The product AB of two orthogonal n × n matrices A and B is orthogonal.
b) The inverse A−1 of an orthogonal n × n matrix A is orthogonal.
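Both parts of Theorem 6 can be verified numerically by checking that the product and the inverse still satisfy the defining property QT Q = I (the two matrices below are illustrative choices):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by 90 degrees, orthogonal
B = np.array([[0.6, -0.8],
              [0.8,  0.6]])   # another orthogonal matrix

AB = A @ B
A_inv = np.linalg.inv(A)

# Both AB and A^{-1} satisfy Q^T Q = I, so both are orthogonal.
prod_check = AB.T @ AB
inv_check = A_inv.T @ A_inv
```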
Definition 7 (5.3.5). For an m × n matrix A, the transpose AT of A is the n × m matrix whose ijth entry
is the jith entry of A:
[AT ]ij = Aji .
The rows of A become the columns of AT , and the columns of A become the rows of AT .
A square matrix A is symmetric if AT = A and skew-symmetric if AT = −A.
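The entry-wise definition [AT]ij = Aji, and the symmetric/skew-symmetric conditions, translate directly into code (illustrative matrices):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])     # 2 x 3, so A.T is 3 x 2

# [A^T]_ij = A_ji: entry (2,1) of A.T equals entry (1,2) of A.
entry_check = A.T[2, 1] == A[1, 2]

S = np.array([[1, 7],
              [7, 3]])        # symmetric: S^T = S
K = np.array([[0, 2],
              [-2, 0]])       # skew-symmetric: K^T = -K

is_symmetric = bool((S.T == S).all())
is_skew = bool((K.T == -K).all())
```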
Note 8 (5.3.6). If v and w are two (column) vectors in Rn , then
v · w = vT w.
(Here we choose to ignore the difference between the scalar a and the 1 × 1 matrix [a].)
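The identity v · w = vT w just says that the dot product equals a row vector times a column vector (illustrative vectors):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 2.0])

dot_product = float(np.dot(v, w))  # v . w = 4 - 2 + 6 = 8

# v^T w as a 1 x 1 matrix product: (1 x 3) times (3 x 1).
matrix_product = (v.reshape(1, 3) @ w.reshape(3, 1)).item()
```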
Theorem 9 (5.3.7, transpose criterion for orthogonal matrices). An n × n matrix A is orthogonal if
and only if AT A = In or, equivalently, if A has inverse A−1 = AT .
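Theorem 9 gives the most practical test for orthogonality: form AT A and compare with the identity, or compare A−1 with AT directly (a sketch with an illustrative orthogonal matrix):

```python
import numpy as np

A = np.array([[0.6, -0.8],
              [0.8,  0.6]])

gram = A.T @ A                 # equals I_2 exactly when A is orthogonal
inv_equals_transpose = np.allclose(np.linalg.inv(A), A.T)
```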
Theorem 10 (5.3.8, summary: orthogonal matrices). For an n × n matrix A, the following statements
are equivalent:
1. A is an orthogonal matrix.
2. ||Ax|| = ||x|| for all x ∈ Rn .
3. The columns of A form an orthonormal basis of Rn .
4. AT A = In .
5. A−1 = AT .
Theorem 11 (5.3.9, properties of the transpose).
a) If A is an n × p matrix and B a p × m matrix (so that AB is defined), then
(AB)T = B T AT .
b) If an n × n matrix A is invertible, then so is AT , and
(AT )−1 = (A−1 )T .
c) For any matrix A,
rank(A) = rank(AT ).
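All three parts of Theorem 11 can be spot-checked numerically; note in part a) how the order of the factors reverses (the matrices below are illustrative, with random entries for A and B):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # n x p
B = rng.standard_normal((4, 2))   # p x m

# a) (AB)^T = B^T A^T -- note the reversed order.
product_rule = np.allclose((A @ B).T, B.T @ A.T)

# b) (M^T)^{-1} = (M^{-1})^T for an invertible M.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
inverse_rule = np.allclose(np.linalg.inv(M.T), np.linalg.inv(M).T)

# c) rank(A) = rank(A^T).
rank_rule = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
```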

Theorem 12 (invertibility criteria involving rows). For an n × n matrix A with rows w1 , . . . , wn , the
following are equivalent:
1. A is invertible,
2. w1 , . . . , wn span Rn ,
3. w1 , . . . , wn are linearly independent,
4. w1 , . . . , wn form a basis of Rn .
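A numerical illustration of Theorem 12: a matrix with linearly independent rows is invertible (nonzero determinant), while a matrix with dependent rows is not (the matrices below are illustrative):

```python
import numpy as np

# Rows w1 = (1, 2) and w2 = (3, 4) are linearly independent,
# so they span R^2, form a basis, and A is invertible.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
invertible = abs(np.linalg.det(A)) > 1e-12   # det(A) = -2, nonzero

# Rows (1, 2) and (2, 4) are linearly dependent, so B is singular.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
singular = abs(np.linalg.det(B)) < 1e-12     # det(B) = 0
```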
Theorem 13 (column-row definition of matrix multiplication). Given an n × m matrix A with columns
v1 , . . . , vm and an m × n matrix B with rows w1 , . . . , wm , where v1 , . . . , vm , w1 , . . . , wm ∈ Rn , think of
the vi as n × 1 matrices and the wi as 1 × n matrices. Then the product of A and B can be computed as a
sum of m matrices, each of size n × n:
AB = v1 w1 + · · · + vm wm = ∑_{i=1}^{m} vi wi .
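The column-row description of the product can be verified numerically: summing the outer products of column i of A with row i of B reproduces the ordinary matrix product (a sketch with illustrative 2 × 2 matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])    # columns v1, v2
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])    # rows w1, w2

# Sum of the products v_i w_i (column i of A times row i of B),
# each an n x n matrix, equals the usual product AB.
outer_sum = sum(np.outer(A[:, i], B[i, :]) for i in range(2))
usual_product = A @ B
```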
Theorem 14 (5.3.10, matrix of an orthogonal projection). Let V be a subspace of Rn with orthonormal
basis u1 , . . . , um . Then the matrix of the orthogonal projection onto V is QQT , where Q is the n × m
matrix with columns u1 , . . . , um .
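Theorem 14 can be checked for a plane in R3 with an illustrative orthonormal basis: the matrix P = QQT is idempotent, fixes vectors in V, and sends any x to its orthogonal projection onto V:

```python
import numpy as np

# Orthonormal basis of a plane V in R^3 (an illustrative choice).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])

Q = np.column_stack([u1, u2])   # n x m matrix with columns u1, u2
P = Q @ Q.T                     # projection matrix onto V

# A projection satisfies P^2 = P and fixes vectors already in V.
idempotent = np.allclose(P @ P, P)
fixes_u1 = np.allclose(P @ u1, u1)

# Project x: the u1-component is (x.u1)u1 = (1,1,0),
# the u2-component is (x.u2)u2 = (0,0,5).
x = np.array([2.0, 0.0, 5.0])
proj_x = P @ x                  # = (1, 1, 5)
```

The residual x − proj_x = (1, −1, 0) is orthogonal to both u1 and u2, as expected of an orthogonal projection.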