Solution Set
Bretscher 2.1 - 3,4,24-30,38,42,43,44
6/28/17
2.1 / 3
In order for a transformation T to be linear, we need T(x + y) = T(x) + T(y). For the given transformation
$$T\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_2 - x_3 \\ x_1 x_3 \\ x_1 - x_2 \end{bmatrix},$$
we would expect
$$T\left(\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} + \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}\right) = T\begin{bmatrix} 2 \\ 2 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 4 \\ 0 \end{bmatrix}$$
to equal
$$T\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} + T\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 2 \\ 0 \end{bmatrix}.$$
Clearly, this is not the case. Thus, this transformation is not linear.
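A quick numerical sanity check of the counterexample above (a minimal sketch in Python, assuming the transformation $T(x) = (x_2 - x_3,\; x_1 x_3,\; x_1 - x_2)$ as written above):

```python
import numpy as np

def T(x):
    # The (nonlinear) transformation from 2.1/3, as written above.
    x1, x2, x3 = x
    return np.array([x2 - x3, x1 * x3, x1 - x2])

x = np.array([1, 1, 1])
y = np.array([1, 1, 1])

print(T(x + y))      # [0 4 0]
print(T(x) + T(y))   # [0 2 0]  -> the two disagree, so T is not linear
```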
2.1 / 4
We can rewrite the transformation
$$\begin{aligned}
y_1 &= 9x_1 + 3x_2 - 3x_3 \\
y_2 &= 2x_1 - 9x_2 + x_3 \\
y_3 &= 4x_1 - 9x_2 - 2x_3 \\
y_4 &= 5x_1 + x_2 + 5x_3
\end{aligned}$$
in matrix notation as
$$\begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{bmatrix} =
\begin{bmatrix} 9 & 3 & -3 \\ 2 & -9 & 1 \\ 4 & -9 & -2 \\ 5 & 1 & 5 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}.$$
Thus, the 4×3 matrix gives the matrix of the linear transformation.
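As an illustrative check (using the coefficients exactly as they appear in the matrix above), multiplying by the 4×3 matrix reproduces the four equations for an arbitrary input:

```python
import numpy as np

A = np.array([[9,  3, -3],
              [2, -9,  1],
              [4, -9, -2],
              [5,  1,  5]])

x1, x2, x3 = 1.0, 2.0, 3.0            # arbitrary test values
x = np.array([x1, x2, x3])

# Evaluate the four equations directly and compare with A @ x.
y = np.array([9*x1 + 3*x2 - 3*x3,
              2*x1 - 9*x2 + x3,
              4*x1 - 9*x2 - 2*x3,
              5*x1 + x2 + 5*x3])

assert np.allclose(A @ x, y)
```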
2.1 / 24
This is a rotation by 90° counterclockwise.
2.1 / 25
This is a scaling by 2.
2.1 / 26
This is a reflection across the line y = x.
2.1 / 27
This is a reflection about the x-axis.
2.1 / 28
This is a scaling in the y direction by 2.
2.1 / 29
This is a reflection across both the x-axis and the y-axis, or equivalently, a rotation by 180°.
2.1 / 30
This is a projection onto the y-axis. (The drawing should be completely flat.)
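These identifications can be double-checked numerically. The snippet below is illustrative only: the matrices shown are the standard ones for the transformations named above (they are assumptions here, since the exercise matrices themselves are not reproduced in this solution set). Applying each matrix to the basis vectors e1 and e2 recovers the geometric description.

```python
import numpy as np

candidates = {
    "rotation by 90 degrees counterclockwise": np.array([[0, -1], [1, 0]]),
    "scaling by 2":                            np.array([[2, 0], [0, 2]]),
    "reflection across y = x":                 np.array([[0, 1], [1, 0]]),
    "reflection about the x-axis":             np.array([[1, 0], [0, -1]]),
    "scaling in the y direction by 2":         np.array([[1, 0], [0, 2]]),
    "rotation by 180 degrees":                 np.array([[-1, 0], [0, -1]]),
    "projection onto the y-axis":              np.array([[0, 0], [0, 1]]),
}

e1, e2 = np.array([1, 0]), np.array([0, 1])
for name, A in candidates.items():
    # The images of e1 and e2 are the columns of A, which is enough
    # to recognize the geometric transformation.
    print(f"{name}: e1 -> {A @ e1}, e2 -> {A @ e2}")
```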
2.1 / 38
Since $\vec{v}_1$ and $\vec{v}_2$ are the column vectors of the matrix of $T$, we know that $T(\vec{e}_1) = \vec{v}_1$ and $T(\vec{e}_2) = \vec{v}_2$. Thus,
$$T\begin{bmatrix} 2 \\ -1 \end{bmatrix} = T(2\vec{e}_1 - \vec{e}_2) = 2T(\vec{e}_1) - T(\vec{e}_2) = 2\vec{v}_1 - \vec{v}_2.$$
(Sketch: $\vec{v}_1$, $\vec{v}_2$, and $2\vec{v}_1 - \vec{v}_2$ drawn as arrows in the plane.)
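As a numerical illustration (the particular vectors below are arbitrary stand-ins for the $\vec{v}_1$ and $\vec{v}_2$ sketched in the exercise), the matrix with columns $\vec{v}_1$, $\vec{v}_2$ indeed sends $[2, -1]^T$ to $2\vec{v}_1 - \vec{v}_2$:

```python
import numpy as np

# Arbitrary stand-ins for the v1, v2 given in the exercise's sketch.
v1 = np.array([3.0, 1.0])
v2 = np.array([1.0, 2.0])

A = np.column_stack([v1, v2])   # matrix with T(e1) = v1, T(e2) = v2

x = np.array([2.0, -1.0])
assert np.allclose(A @ x, 2 * v1 - v2)
```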
2.1 / 42
a. The images of the vertices of the unit cube are
$$T\begin{bmatrix}0\\0\\0\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}, \quad
T\begin{bmatrix}1\\0\\0\end{bmatrix} = \begin{bmatrix}-\tfrac{1}{2}\\-\tfrac{1}{2}\end{bmatrix}, \quad
T\begin{bmatrix}0\\1\\0\end{bmatrix} = \begin{bmatrix}1\\0\end{bmatrix}, \quad
T\begin{bmatrix}0\\0\\1\end{bmatrix} = \begin{bmatrix}0\\1\end{bmatrix},$$
$$T\begin{bmatrix}1\\1\\0\end{bmatrix} = \begin{bmatrix}\tfrac{1}{2}\\-\tfrac{1}{2}\end{bmatrix}, \quad
T\begin{bmatrix}1\\0\\1\end{bmatrix} = \begin{bmatrix}-\tfrac{1}{2}\\\tfrac{1}{2}\end{bmatrix}, \quad
T\begin{bmatrix}0\\1\\1\end{bmatrix} = \begin{bmatrix}1\\1\end{bmatrix}, \quad
T\begin{bmatrix}1\\1\\1\end{bmatrix} = \begin{bmatrix}\tfrac{1}{2}\\\tfrac{1}{2}\end{bmatrix}.$$
(Sketch: plotting these images and joining the appropriate edges gives the two-dimensional drawing of the unit cube, with the $x_1$, $x_2$, $x_3$ directions labeled.)
b. The points that are transformed to $\vec{0}$ satisfy the following system of equations:
$$-\tfrac{1}{2}x_1 + x_2 = 0, \qquad -\tfrac{1}{2}x_1 + x_3 = 0.$$
Row reducing the augmented matrix,
$$\begin{bmatrix} -\tfrac{1}{2} & 1 & 0 & 0 \\ -\tfrac{1}{2} & 0 & 1 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & -2 & 0 & 0 \\ 0 & 1 & -1 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & -1 & 0 \end{bmatrix},$$
so the solutions are
$$\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 2t \\ t \\ t \end{bmatrix}.$$
2.1 / 43
a. We can show that the transformation T is linear by finding a matrix of T. Recalling the definition of matrix multiplication, the formula for T is exactly the product of a row vector with a column vector; the row vector is simply $\vec{v}$ written as a row. Thus, the matrix of T is $\begin{bmatrix} 2 & 3 & 4 \end{bmatrix}$.
b. More generally, the matrix of T for any vector $\vec{v}$ is $\begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix}$, that is, $\vec{v}^{\,T}$ (the transpose of $\vec{v}$).
c. Conversely, given any linear transformation T from $\mathbb{R}^3$ to $\mathbb{R}$, we know that the corresponding matrix must be a $1 \times 3$ matrix $\begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix}$. From part b, we can immediately see that this is the matrix of the dot-product transformation with $\vec{v} = \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}$.
2.1 / 44
We can show that the cross-product transformation $T(\vec{x}) = \vec{v} \times \vec{x}$ is linear by finding the corresponding matrix directly. Alternatively, we can show that the cross product satisfies the two conditions for linearity:
1. $T(k\vec{x}) = kT(\vec{x})$: indeed, $T(k\vec{x}) = \vec{v} \times (k\vec{x}) = k(\vec{v} \times \vec{x}) = kT(\vec{x})$.
2. $T(\vec{x} + \vec{y}) = T(\vec{x}) + T(\vec{y})$: indeed, $T(\vec{x} + \vec{y}) = \vec{v} \times (\vec{x} + \vec{y}) = \vec{v} \times \vec{x} + \vec{v} \times \vec{y} = T(\vec{x}) + T(\vec{y})$.
In proving the above two conditions, we have used a few properties of the cross product. Since both conditions are satisfied, the cross-product transformation is linear.
Using the definition of the cross product, we can write
$$T(\vec{x}) = \vec{v} \times \vec{x}
= \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} \times \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
= \begin{bmatrix} v_2 x_3 - v_3 x_2 \\ v_3 x_1 - v_1 x_3 \\ v_1 x_2 - v_2 x_1 \end{bmatrix}
= \begin{bmatrix} 0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}.$$
Thus, we have found the transformation matrix.