Math 5535 – HW II – Solutions to selected problems.
1. iii. The equivalent first-order linear dynamical system on R^2 has matrix
$$A = \begin{pmatrix} 0 & 1 \\ -4 & 4 \end{pmatrix}.$$
The eigenvalues are repeated: λ = 2, 2. An eigenvector V1 = (1, 2)^T can be found by solving (A − 2I)V1 = 0, and a generalized eigenvector (as in problem 9) V2 = (0, 1)^T can be found from (A − 2I)V2 = V1. Then if C = [V1 | V2] is the matrix with these as columns, we will have A = CBC^{-1} where
$$B = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}.$$
Then
$$X_n = CB^nC^{-1}X_0 = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}\begin{pmatrix} 2^n & n2^{n-1} \\ 0 & 2^n \end{pmatrix}\begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
The first component gives the solution of the original scalar difference equation:
$$x_n = 2^n - n2^{n-1}.$$
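As a quick numerical sanity check (a minimal sketch in Python with numpy; the variable names are my own), one can verify the conjugacy A = CBC^{-1} and compare the iterates of the system with the closed form:

import numpy as np

A = np.array([[0, 1], [-4, 4]], dtype=float)
C = np.array([[1, 0], [2, 1]], dtype=float)   # columns V1 and V2
B = np.array([[2, 1], [0, 2]], dtype=float)   # Jordan block for lambda = 2

# the conjugacy A = C B C^{-1}
assert np.allclose(A, C @ B @ np.linalg.inv(C))

# iterate X_{n+1} = A X_n from X_0 = (1, 1)^T and compare the first
# component with the closed form x_n = 2^n - n 2^{n-1}
X = np.array([1.0, 1.0])
for n in range(10):
    assert np.isclose(X[0], 2.0**n - n * 2.0**(n - 1))
    X = A @ X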
4. To show A is not a contraction one could just exhibit a single vector whose length is increased under multiplication by A. For example, if V = (0, 1)^T then AV = (1, 2/3)^T and |AV| > |V|. Alternatively, one could calculate the matrix norm ||A|| and see that it is larger than 1. By the method described in class, the matrix norm is √σ1 where σ1 is the largest eigenvalue of the matrix A^T A. Calculating this we get
$$\|A\| = \sqrt{\frac{61 + \sqrt{3145}}{72}} > 1.$$
Although A is not a contraction, it is conjugate to
$$B = \begin{pmatrix} \tfrac{1}{2} & 0 \\ 0 & \tfrac{2}{3} \end{pmatrix},$$
which is a contraction with ||B|| = 2/3. By a theorem from class, any matrix conjugate to a contraction is an eventual contraction. The argument was that if A = CBC^{-1} (with ||B|| = λ < 1) then
$$|A^nV| = |CB^nC^{-1}V| \le \|C\|\,|B^nC^{-1}V| \le \|C\|\,\lambda^n|C^{-1}V| \le \|C\|\,\lambda^n\|C^{-1}\|\,|V| = k\lambda^n|V|$$
with k = ||C|| ||C^{-1}||.
Another way to understand A is to say that it is a contraction with respect to a different norm. The new norm is just ||X|| = |C^{-1}X| where C is the matrix in the conjugacy above, i.e., the matrix whose columns are the eigenvectors of A. Here we can use, for example,
$$C = \begin{pmatrix} 1 & 6 \\ 0 & 1 \end{pmatrix},$$
which gives ||X||^2 = x1^2 − 12 x1 x2 + 37 x2^2.
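The claims above can be confirmed numerically (a sketch with numpy; since the problem statement is not reproduced here, the matrix A is reconstructed from the conjugacy A = CBC^{-1} given above):

import numpy as np

B = np.diag([1/2, 2/3])                  # the contraction, ||B|| = 2/3
C = np.array([[1, 6], [0, 1]], float)    # columns are the eigenvectors of A
A = C @ B @ np.linalg.inv(C)             # reconstruct A from the conjugacy

# matrix norm ||A|| = sqrt(sigma_1), sigma_1 the largest eigenvalue of A^T A
sigma1 = np.max(np.linalg.eigvalsh(A.T @ A))
assert np.isclose(np.sqrt(sigma1), np.sqrt((61 + np.sqrt(3145)) / 72))
assert np.sqrt(sigma1) > 1               # so A itself is not a contraction

# but A is an eventual contraction: ||A^n|| eventually drops below 1
assert np.linalg.norm(np.linalg.matrix_power(A, 20), 2) < 1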
5. The complex map f(z) = (−1+i)z − 1 can be expanded as f(x+iy) = (−x − y − 1) + i(x − y), or in real terms:
$$F(x, y) = \begin{pmatrix} -1 & -1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} + \begin{pmatrix} -1 \\ 0 \end{pmatrix}.$$
The linear part can be understood as a scaling by √2 and a rotation by 3π/4, corresponding to the polar form −1 + i = √2 e^{i 3π/4}. The complete map also involves a translation by (−1, 0)^T. It is easy to find the fixed point by solving f(z) = z, with the result z = 1/(−2+i) = −2/5 − i/5.
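A short numerical check of the fixed point and of the polar form of the linear part (a sketch in Python; not part of the original solution):

import cmath, math

a = complex(-1, 1)              # the multiplier -1 + i
f = lambda z: a * z - 1         # f(z) = (-1+i)z - 1

z_fix = 1 / (a - 1)             # solving f(z) = z gives z = 1/(-2+i)
assert cmath.isclose(z_fix, complex(-2/5, -1/5))
assert cmath.isclose(f(z_fix), z_fix)

# polar form of the linear part: scaling by sqrt(2), rotation by 3*pi/4
r, theta = cmath.polar(a)
assert math.isclose(r, math.sqrt(2))
assert math.isclose(theta, 3 * math.pi / 4)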
6. The fixed points of f(z) are solutions of z^2 − z + (1−i)/4 = 0. The quadratic formula gives
$$z_\pm = \frac{1 \pm \sqrt{i}}{2} = \frac{\sqrt{2} \pm 1 \pm i}{2\sqrt{2}},$$
where the hint was used to simplify √i. To check stability we can go to the real form F(x, y) = (x^2 − y^2 + 1/4, 2xy − 1/4) and compute the Jacobian matrix
$$DF(x, y) = \begin{pmatrix} 2x & -2y \\ 2y & 2x \end{pmatrix}.$$
The fixed points are at (x, y) = ((√2 ± 1)/(2√2), ±1/(2√2)). Substituting into the matrix DF and calculating the eigenvalues (i.e., the multipliers of the fixed points) gives µ = (√2 + 1 ± i)/√2 at z_+ and µ = (√2 − 1 ± i)/√2 at z_−. The moduli of these numbers are |µ| = √(2 + √2) > 1 and |µ| = √(2 − √2) < 1 respectively. Hence z_+ is a repeller and z_− is an attractor.
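These fixed points and multipliers are easy to verify numerically (a sketch with numpy; the form f(z) = z^2 + (1−i)/4 is read off from the fixed-point equation and real form above):

import numpy as np

c = (1 - 1j) / 4
f = lambda z: z**2 + c                    # the map, read off from the real form

sq = np.sqrt(1 - 4*c)                     # = sqrt(i), as in the hint
z_plus, z_minus = (1 + sq) / 2, (1 - sq) / 2
for z in (z_plus, z_minus):
    assert np.isclose(f(z), z)            # both are fixed points

# Jacobian of the real form and its eigenvalues (the multipliers)
DF = lambda x, y: np.array([[2*x, -2*y], [2*y, 2*x]])
mu_plus = np.linalg.eigvals(DF(z_plus.real, z_plus.imag))
mu_minus = np.linalg.eigvals(DF(z_minus.real, z_minus.imag))
assert np.allclose(abs(mu_plus), np.sqrt(2 + np.sqrt(2)))    # > 1: repeller
assert np.allclose(abs(mu_minus), np.sqrt(2 - np.sqrt(2)))   # < 1: attractor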
9. Suppose v1 and v2 are chosen as described in the problem, so that
$$Av_1 = \lambda v_1, \qquad Av_2 = \lambda v_2 + v_1.$$
To show linear independence, suppose some linear combination c1 v1 + c2 v2 = 0 where the ci are constants. We must show c1 = c2 = 0. Multiply both sides of the dependence relation by the matrix A − λI. The choice of the vectors vi shows that (A − λI)v1 = 0 and (A − λI)v2 = v1. So we get c1 · 0 + c2 v1 = 0. Since v1 ≠ 0, we must have c2 = 0. Then the dependence relation reduces to c1 v1 = 0, which gives c1 = 0.
To verify the matrix equation
$$AC = C\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix},$$
note that
$$AC = A\,[\,v_1 \mid v_2\,] = [\,Av_1 \mid Av_2\,] = [\,\lambda v_1 \mid \lambda v_2 + v_1\,],$$
where the column vectors of the matrices are indicated. The same columns are obtained from
$$C\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix} = [\,v_1 \mid v_2\,]\begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix} = [\,\lambda v_1 \mid v_1 + \lambda v_2\,],$$
which completes the proof.
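For a concrete illustration (a sketch, reusing the matrix and vectors from problem 1.iii above rather than anything stated in problem 9 itself), the relation AC = CJ with J the 2×2 Jordan block can be checked numerically:

import numpy as np

A = np.array([[0, 1], [-4, 4]], dtype=float)   # matrix from problem 1.iii
lam = 2.0
v1 = np.array([1.0, 2.0])    # eigenvector: (A - 2I)v1 = 0
v2 = np.array([0.0, 1.0])    # generalized eigenvector: (A - 2I)v2 = v1

assert np.allclose(A @ v1, lam * v1)
assert np.allclose(A @ v2, lam * v2 + v1)

C = np.column_stack([v1, v2])
J = np.array([[lam, 1.0], [0.0, lam]])
assert np.allclose(A @ C, C @ J)               # AC = CJ, i.e. A = C J C^{-1}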