3.4 Complex Eigenvalues.
3.4.1 Concepts.
We are still concerned with linear systems of differential equations of the form

(1)
$$\frac{dx_1}{dt} = a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n$$
$$\frac{dx_2}{dt} = a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n$$
$$\vdots$$
$$\frac{dx_n}{dt} = a_{n1}x_1 + a_{n2}x_2 + \cdots + a_{nn}x_n$$

where the $a_{ij}$ are numbers and the $x_i(t)$ are unknown functions of $t$. In vector-matrix form this is

(2)
$$\frac{du}{dt} = Au$$

with
$$u = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} \qquad \text{and} \qquad A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix}.$$

In this section we look at the case where $A$ has complex eigenvalues.
We begin with an example with two equations.
Example 1.

(3)
$$\frac{dx}{dt} = -4x - 5y$$

(4)
$$\frac{dy}{dt} = x - 2y$$

In vector-matrix form this is
$$\frac{d}{dt}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} -4 & -5 \\ 1 & -2 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} \qquad \text{or} \qquad \frac{du}{dt} = Au \quad \text{where } u = \begin{pmatrix} x \\ y \end{pmatrix} \text{ and } A = \begin{pmatrix} -4 & -5 \\ 1 & -2 \end{pmatrix}.$$

We look for solutions that have the form
$$u = \begin{pmatrix} ae^{\lambda t} \\ be^{\lambda t} \end{pmatrix} = e^{\lambda t}\begin{pmatrix} a \\ b \end{pmatrix} = e^{\lambda t}w$$
where $a$, $b$ and $\lambda$ are numbers and $w = \begin{pmatrix} a \\ b \end{pmatrix}$. As before, $\lambda$ must be an eigenvalue of $A$ and $w$ an associated eigenvector. We find the eigenvalues and eigenvectors of $A = \begin{pmatrix} -4 & -5 \\ 1 & -2 \end{pmatrix}$.
A - I =
-4-
 1
0 = det( A - I ) =
-5 
-2-
-4-
 1
-5 
= (- 4 - )(- 2 - ) – (1)(- 5)
-2-
= 2 + 6 + 13
So the eigenvalues are
3.4.1 - 1
 =
-6
36 - (4)(13)
- 6  - 16
=
= - 3  2i
2
2
So
1 = - 3 + 2i
2 = - 3 + 2i
and
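The eigenvalue computation above can be double-checked numerically. The following is a minimal sketch (assuming NumPy is available; it is not part of these notes):

```python
import numpy as np

# Coefficient matrix A from Example 1
A = np.array([[-4.0, -5.0],
              [ 1.0, -2.0]])

# numpy.linalg.eigvals returns the roots of det(A - lambda*I) = 0
lams = np.linalg.eigvals(A)

# Sort by imaginary part so the order is deterministic
lams = sorted(lams, key=lambda z: z.imag)
print(lams)  # the hand computation gives -3 - 2i and -3 + 2i
```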
This example illustrates a general property of complex eigenvalues of a matrix with real entries – they occur
in complex conjugate pairs. One reason is that the characteristic equation det( A - I ) = 0 is a polynomial
equation in  with real coefficients and for such equations the roots occur in complex conjugate pairs. We
shall see another reason below.
If $z = x + yi$ is a complex number with real and imaginary parts $x$ and $y$, then the complex conjugate of $z$ is $\bar{z} = x - yi$. For example, $\overline{3 - 2i} = 3 + 2i$. The operation of taking complex conjugates has a number of simple algebraic properties. Some of these are

(5)
$$\overline{z+w} = \bar{z} + \bar{w} \qquad \overline{z-w} = \bar{z} - \bar{w}$$

(6)
$$\overline{zw} = \bar{z}\,\bar{w} \qquad \overline{z/w} = \bar{z}/\bar{w}$$
Now we find the eigenvectors. For $\lambda_1 = -3 + 2i$ one has
$$A - \lambda_1 I = A - (-3+2i)I = \begin{pmatrix} -1-2i & -5 \\ 1 & 1-2i \end{pmatrix}$$

So an eigenvector $w = \begin{pmatrix} x \\ y \end{pmatrix}$ satisfies
$$\begin{pmatrix} 0 \\ 0 \end{pmatrix} = (A - \lambda_1 I)w = \begin{pmatrix} -1-2i & -5 \\ 1 & 1-2i \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} (-1-2i)x - 5y \\ x + (1-2i)y \end{pmatrix}$$

So
$$(-1-2i)x - 5y = 0$$
$$x + (1-2i)y = 0$$

If one multiplies the second equation by $-1-2i$ one obtains the first, since $(-1-2i)(1-2i) = -5$. So any solution to the second equation is also a solution to the first, and it suffices to solve the second equation, whose solution is $x = (-1+2i)y$. So an eigenvector $v$ for $\lambda_1 = -3+2i$ has the form
$$v = \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} (-1+2i)y \\ y \end{pmatrix} = y\begin{pmatrix} -1+2i \\ 1 \end{pmatrix}$$

So any multiple of the vector $w_1 = \begin{pmatrix} -1+2i \\ 1 \end{pmatrix}$ is an eigenvector for $\lambda_1 = -3+2i$.
For $\lambda_2 = -3-2i$ all the computations that we did for $\lambda_1 = -3+2i$ remain the same except we replace $i$ by $-i$. So it is not hard to see that any multiple of the vector $w_2 = \begin{pmatrix} -1-2i \\ 1 \end{pmatrix}$ is an eigenvector for $\lambda_2 = -3-2i$.
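The eigenvectors can be verified the same way; a small sketch (again assuming NumPy) checking that $(A - \lambda I)w$ is the zero vector for both eigenpairs:

```python
import numpy as np

A = np.array([[-4.0, -5.0],
              [ 1.0, -2.0]])

lam1 = -3 + 2j
w1 = np.array([-1 + 2j, 1.0])   # eigenvector found above
lam2 = -3 - 2j
w2 = np.conj(w1)                # conjugate eigenvalue gets the conjugate eigenvector

# (A - lam*I) w should be the zero vector in both cases
r1 = (A - lam1 * np.eye(2)) @ w1
r2 = (A - lam2 * np.eye(2)) @ w2
print(np.max(np.abs(r1)), np.max(np.abs(r2)))
```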
This example illustrates another general feature of the eigenvalues and eigenvectors for complex
eigenvalues, namely the eigenvectors for complex conjugate eigenvalues have complex conjugate
components. It was not hard to see why this was true in the above example, and the same argument can be
used in general. However, in the Appendix there is a slightly different argument that is useful in other
similar situations.
Returning to the equations (3) and (4), it follows that two solutions of the system are
$$u_1(t) = e^{(-3+2i)t}\begin{pmatrix} -1+2i \\ 1 \end{pmatrix} \qquad \text{and} \qquad u_2(t) = e^{(-3-2i)t}\begin{pmatrix} -1-2i \\ 1 \end{pmatrix}$$

In fact, the second solution is just the complex conjugate of the first. By the superposition principle
$$\begin{pmatrix} x \\ y \end{pmatrix} = c_1 e^{(-3+2i)t}\begin{pmatrix} -1+2i \\ 1 \end{pmatrix} + c_2 e^{(-3-2i)t}\begin{pmatrix} -1-2i \\ 1 \end{pmatrix}$$
is a solution to (3) and (4) for any $c_1$ and $c_2$.
Often we want to express the solutions in terms of real-valued functions. In order to do this it is helpful to use the following.

Proposition 1. Suppose $A$ is a square matrix with real entries and $u(t)$ is a solution to $\frac{du}{dt} = Au$. Then $v_1(t) = \operatorname{Im}[u(t)]$, $v_2(t) = \operatorname{Re}[u(t)]$ and $\overline{u(t)}$ are also solutions.

If $z = x + yi$ is a complex number with real and imaginary parts $x$ and $y$, then $\operatorname{Re}(z) = x$ and $\operatorname{Im}(z) = y$. So Re and Im are the operations of taking the real and imaginary parts of a complex number. For example, $\operatorname{Re}(3 - 2i) = 3$ and $\operatorname{Im}(3 - 2i) = -2$.

Proof of Proposition 1. If $\frac{du}{dt} = Au$, then $\operatorname{Re}\!\left[\frac{du}{dt}\right] = \operatorname{Re}[Au]$. In general, taking the real part commutes with taking a derivative, i.e. $\operatorname{Re}\!\left[\frac{du}{dt}\right] = \frac{d[\operatorname{Re}(u)]}{dt}$. Also $\operatorname{Re}[Au] = A\operatorname{Re}(u)$ since $A$ has real entries. Combining, we get $\frac{d[\operatorname{Re}(u)]}{dt} = A\operatorname{Re}(u)$, which proves that $v_2(t) = \operatorname{Re}[u(t)]$ is a solution. The proof that $v_1(t) = \operatorname{Im}[u(t)]$ is also a solution is similar. The fact that $\overline{u(t)} = \operatorname{Re}[u(t)] - i\operatorname{Im}[u(t)]$ is a solution then follows from the superposition principle. //
Let's apply this to the solution $u(t) = e^{(-3+2i)t}\begin{pmatrix} -1+2i \\ 1 \end{pmatrix}$ of (3) and (4). Note that
$$e^{(-3+2i)t} = e^{-3t + 2it} = e^{-3t}e^{2it} = e^{-3t}(\cos 2t + i\sin 2t) = e^{-3t}(c + is)$$
where $c = \cos 2t$ and $s = \sin 2t$. So
$$u(t) = e^{(-3+2i)t}\begin{pmatrix} -1+2i \\ 1 \end{pmatrix} = e^{-3t}(c+is)\begin{pmatrix} -1+2i \\ 1 \end{pmatrix} = e^{-3t}\begin{pmatrix} (-1+2i)(c+is) \\ c+is \end{pmatrix} = e^{-3t}\begin{pmatrix} -c-2s + i(2c-s) \\ c+is \end{pmatrix}$$
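The split of $u(t)$ into real and imaginary parts can be spot-checked at a sample time; a NumPy sketch (the time value 0.7 is an arbitrary choice):

```python
import numpy as np

t = 0.7                              # any sample time works
c, s = np.cos(2*t), np.sin(2*t)

# Direct complex evaluation of u(t) = e^{(-3+2i)t} (-1+2i, 1)^T
u_direct = np.exp((-3 + 2j)*t) * np.array([-1 + 2j, 1.0])

# The split worked out above: e^{-3t} ( -c - 2s + i(2c - s),  c + i s )^T
u_split = np.exp(-3*t) * np.array([-c - 2*s + 1j*(2*c - s), c + 1j*s])

print(np.max(np.abs(u_direct - u_split)))
```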
Therefore, two solutions to (3) and (4) are
$$v_2(t) = \operatorname{Re}[u(t)] = e^{-3t}\begin{pmatrix} -\cos 2t - 2\sin 2t \\ \cos 2t \end{pmatrix} \qquad \text{and} \qquad v_1(t) = \operatorname{Im}[u(t)] = e^{-3t}\begin{pmatrix} 2\cos 2t - \sin 2t \\ \sin 2t \end{pmatrix}$$
and

(7)
$$\begin{pmatrix} x \\ y \end{pmatrix} = c_1 e^{-3t}\begin{pmatrix} 2\cos 2t - \sin 2t \\ \sin 2t \end{pmatrix} + c_2 e^{-3t}\begin{pmatrix} -\cos 2t - 2\sin 2t \\ \cos 2t \end{pmatrix}$$

is a solution for any $c_1$ and $c_2$. This expresses the solution in terms of real-valued functions. We can find $c_1$ and $c_2$ from two additional pieces of information, such as the values of $x$ and $y$ at some particular $t$.
For example, suppose $x(0) = 2$ and $y(0) = -4$. Substituting $t = 0$ into (7) one obtains
$$2c_1 - c_2 = 2$$
$$c_2 = -4$$
So $c_1 = -1$ and
$$\begin{pmatrix} x \\ y \end{pmatrix} = -e^{-3t}\begin{pmatrix} 2\cos 2t - \sin 2t \\ \sin 2t \end{pmatrix} - 4e^{-3t}\begin{pmatrix} -\cos 2t - 2\sin 2t \\ \cos 2t \end{pmatrix}$$
and
$$x = 2e^{-3t}\cos 2t + 9e^{-3t}\sin 2t$$
$$y = -4e^{-3t}\cos 2t - e^{-3t}\sin 2t$$

[Figure: graphs of $x(t)$ and $y(t)$ for $0 \le t \le 4$.] They are damped oscillations, although the damping is so fast compared to the oscillations that the oscillations are hard to see. Note that both $x(t)$ and $y(t)$ approach $0$ as $t \to \infty$. The solution where $x(t) = 0$ and $y(t) = 0$ for all $t$ is an equilibrium solution, and the vector $\begin{pmatrix} 0 \\ 0 \end{pmatrix}$ is an equilibrium point. Note that no matter what $c_1$ and $c_2$ are, the solution given by (7) approaches $\begin{pmatrix} 0 \\ 0 \end{pmatrix}$ as $t \to \infty$. Thus $\begin{pmatrix} 0 \\ 0 \end{pmatrix}$ is an asymptotically stable equilibrium point, or a sink.
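One can confirm numerically that these formulas satisfy the original equations (3) and (4) and the initial conditions, e.g. with a finite-difference check. A NumPy sketch (the step size and sample times are arbitrary choices):

```python
import numpy as np

def x(t):
    return np.exp(-3*t) * (2*np.cos(2*t) + 9*np.sin(2*t))

def y(t):
    return np.exp(-3*t) * (-4*np.cos(2*t) - np.sin(2*t))

# Check the ODEs dx/dt = -4x - 5y and dy/dt = x - 2y
# using a central finite difference for the derivatives.
h = 1e-6
ts = np.linspace(0.0, 2.0, 9)
dx = (x(ts + h) - x(ts - h)) / (2*h)
dy = (y(ts + h) - y(ts - h)) / (2*h)
err_x = np.max(np.abs(dx - (-4*x(ts) - 5*y(ts))))
err_y = np.max(np.abs(dy - (x(ts) - 2*y(ts))))
print(x(0.0), y(0.0), err_x, err_y)
```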
There is another way of writing the solution (7) that displays how the solution depends on the initial values. Note that (7) can be written as (again using $c = \cos 2t$ and $s = \sin 2t$)
$$\begin{pmatrix} x \\ y \end{pmatrix} = e^{-3t}\begin{pmatrix} 2c-s & -c-2s \\ s & c \end{pmatrix}\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}$$
This matrix can be factored, giving

(8)
$$\begin{pmatrix} x \\ y \end{pmatrix} = e^{-3t}\begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} c & -s \\ s & c \end{pmatrix}\begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = e^{-3t}\,T\,R_t\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}$$
where
$$T = \begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix} = \text{matrix whose columns are the imaginary and real parts of the eigenvector } \begin{pmatrix} -1+2i \\ 1 \end{pmatrix}$$
$$R_t = \begin{pmatrix} \cos 2t & -\sin 2t \\ \sin 2t & \cos 2t \end{pmatrix} = \text{matrix for rotation by angle } 2t$$
If we plug $t = 0$ into (8) and use the fact that $R_0 = I$ we get
$$\begin{pmatrix} x(0) \\ y(0) \end{pmatrix} = T\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}$$
So
$$\begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = T^{-1}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix} = \begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix}^{-1}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix}$$
and

(9)
$$\begin{pmatrix} x \\ y \end{pmatrix} = e^{-3t}\,T\,R_t\,T^{-1}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix} = e^{-3t}\begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} \cos 2t & -\sin 2t \\ \sin 2t & \cos 2t \end{pmatrix}\begin{pmatrix} 2 & -1 \\ 0 & 1 \end{pmatrix}^{-1}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix} = e^{tA}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix}$$
where

(10)
$$e^{tA} = e^{-3t}\,T\,R_t\,T^{-1} = \text{exponential of the matrix } tA$$

If we multiply out the right side of (10) we get
$$e^{tA} = \frac{e^{-3t}}{2}\begin{pmatrix} 2\cos 2t - \sin 2t & -5\sin 2t \\ \sin 2t & 2\cos 2t + \sin 2t \end{pmatrix}$$
So the formula (9) becomes
$$\begin{pmatrix} x \\ y \end{pmatrix} = \frac{e^{-3t}}{2}\begin{pmatrix} 2\cos 2t - \sin 2t & -5\sin 2t \\ \sin 2t & 2\cos 2t + \sin 2t \end{pmatrix}\begin{pmatrix} x(0) \\ y(0) \end{pmatrix}$$
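The closed form for $e^{tA}$ can be compared against the defining power series $e^{tA} = \sum_{k\ge 0} (tA)^k/k!$. A short NumPy sketch truncating the series (30 terms is an arbitrary but ample cutoff, and $t = 0.5$ is just a sample value):

```python
import numpy as np

A = np.array([[-4.0, -5.0],
              [ 1.0, -2.0]])
t = 0.5

# Truncated Taylor series for the matrix exponential e^{tA} = sum_k (tA)^k / k!
E_series = np.zeros((2, 2))
term = np.eye(2)                 # current term (tA)^k / k!, starting at k = 0
for k in range(30):
    E_series = E_series + term
    term = term @ (t * A) / (k + 1)

# Closed form derived above: e^{tA} = (e^{-3t}/2) [[2c - s, -5s], [s, 2c + s]]
c, s = np.cos(2*t), np.sin(2*t)
E_closed = (np.exp(-3*t)/2) * np.array([[2*c - s, -5*s],
                                        [s,       2*c + s]])
print(np.max(np.abs(E_series - E_closed)))
```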
Appendix.
The operation of taking complex conjugates can be extended to vectors and matrices. If
$$v = \begin{pmatrix} z_1 \\ z_2 \\ \vdots \\ z_n \end{pmatrix}$$
is a vector with complex components, then its complex conjugate is
$$\bar{v} = \begin{pmatrix} \bar{z}_1 \\ \bar{z}_2 \\ \vdots \\ \bar{z}_n \end{pmatrix}.$$
If
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$
is a matrix with complex components, then its complex conjugate is
$$\bar{A} = \begin{pmatrix} \bar{a}_{11} & \bar{a}_{12} & \cdots & \bar{a}_{1n} \\ \bar{a}_{21} & \bar{a}_{22} & \cdots & \bar{a}_{2n} \\ \vdots & & & \vdots \\ \bar{a}_{m1} & \bar{a}_{m2} & \cdots & \bar{a}_{mn} \end{pmatrix}.$$

Example 2. If $v = \begin{pmatrix} 2-3i \\ 5+4i \end{pmatrix}$ then $\bar{v} = \begin{pmatrix} 2+3i \\ 5-4i \end{pmatrix}$. If $A = \begin{pmatrix} 2-3i & 7+i \\ 5+4i & 6-8i \end{pmatrix}$ then $\bar{A} = \begin{pmatrix} 2+3i & 7-i \\ 5-4i & 6+8i \end{pmatrix}$.
The algebraic properties (5) and (6) of complex conjugates of numbers extend to complex conjugates of vectors and matrices; e.g. if $\lambda$ is a complex number, $u$ and $v$ are vectors, and $A$ and $B$ are matrices, then

(11)
$$\overline{u+v} = \bar{u} + \bar{v} \qquad \overline{A+B} = \bar{A} + \bar{B} \qquad \overline{\lambda u} = \bar{\lambda}\,\bar{u} \qquad \overline{\lambda A} = \bar{\lambda}\,\bar{A} \qquad \overline{Au} = \bar{A}\,\bar{u} \qquad \overline{AB} = \bar{A}\,\bar{B}$$

The following proposition shows that complex eigenvalues of matrices with real entries occur in conjugate pairs.

Proposition 2. Suppose $A$ is a matrix with real entries and $\lambda$ is an eigenvalue of $A$ with eigenvector $v$. Then $\bar{\lambda}$ is also an eigenvalue of $A$ and $\bar{v}$ is an eigenvector for $\bar{\lambda}$.

Proof. One has $Av = \lambda v$. Taking complex conjugates of both sides gives $\overline{Av} = \overline{\lambda v}$. Using (11) gives $\bar{A}\bar{v} = \bar{\lambda}\bar{v}$. Since $A$ has real entries, $\bar{A} = A$, so $A\bar{v} = \bar{\lambda}\bar{v}$, which proves the proposition. //
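This conjugate-pair behavior can be illustrated numerically: for a real matrix, conjugating any computed eigenpair should give another eigenpair. A NumPy sketch (the 3×3 matrix is an arbitrary example, chosen only because it has complex eigenvalues):

```python
import numpy as np

# Any real matrix will do; a fixed example keeps the output deterministic
A = np.array([[0.0, -2.0, 1.0],
              [1.0,  0.0, 3.0],
              [0.0,  1.0, 0.0]])

lams, vecs = np.linalg.eig(A)

# For each eigenpair (lam, v), the conjugate pair should satisfy A vbar = lambar vbar
ok = True
for j in range(3):
    lam, v = lams[j], vecs[:, j]
    ok = ok and np.allclose(A @ np.conj(v), np.conj(lam) * np.conj(v))
print(ok)
```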