Dynamical Systems and Markov Processes
A dynamical system is a finite set of variables whose values change with time. For instance, we may be interested in how the percentage of voters registered as Republicans, Democrats, Independents, or members of some other party changes over time, or in how population is shifting between a city and its surrounding suburbs.
For example, suppose two competing television channels, channel 1 and channel 2, each have 50% of the viewer
market at some initial point in time. Assume that over each one-year period channel 1 captures 30% of channel 2’s
share, and channel 2 captures 20% of channel 1’s share. What is each channel’s share after one year?
The values of the variables in a dynamical system at a point in time are referred to as states (or the state of the variable at that time). Suppose that we have $n$ states (we'll denote them $S_1, S_2, \dots, S_n$). The probability that a member of the population will change from state $S_j$ to state $S_i$ is denoted $p_{ij}$, where $0 \le p_{ij} \le 1$. If $p_{ij} = 0$, then the member is certain not to change from state $S_j$ to state $S_i$. If $p_{ij} = 1$, then the member is certain to change from state $S_j$ to state $S_i$. The collection of all such probabilities can be arranged in what is called the transition matrix (we'll call it P) as shown below:
$$
P =
\begin{pmatrix}
p_{11} & p_{12} & \cdots & p_{1n} \\
p_{21} & p_{22} & \cdots & p_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
p_{n1} & p_{n2} & \cdots & p_{nn}
\end{pmatrix}
$$

Here column $j$ carries the "From" label $S_j$ and row $i$ carries the "To" label $S_i$, so entry $p_{ij}$ is the probability of moving from $S_j$ to $S_i$.
At each transition period, each member must either move to another state or stay where it is. This means that the sum of the entries in any column of the above matrix must equal 1. Thus, for instance, we must have

$$p_{11} + p_{21} + \cdots + p_{n1} = 1.$$

Such a matrix (whose entries are nonnegative and whose columns each sum to one) is called a stochastic matrix. The state vector (we'll call it $\vec{x}_t$) is an $n \times 1$ matrix representing the value of each of the $n$ states at time $t$. By multiplying $\vec{x}_t$ on the left by P (that is, by forming $P\vec{x}_t$), we obtain the new population distribution for the states after one transition period. Such a process is called a Markov Process, in honor of Andrei Markov (1856–1922), a Russian mathematician known for his work in probability theory. A dynamical system undergoing a Markov Process is called a Markov Chain.
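As a concrete illustration, here is a minimal sketch (plain Python with NumPy; the entries come from the two-channel example above) of advancing a state vector by one transition period:

```python
import numpy as np

# Transition matrix for the two-channel example:
# column 1 = "from channel 1", column 2 = "from channel 2".
# Channel 1 keeps 80% of its viewers and captures 30% of channel 2's;
# channel 2 keeps 70% of its viewers and captures 20% of channel 1's.
P = np.array([[0.80, 0.30],
              [0.20, 0.70]])

x0 = np.array([0.50, 0.50])   # initial market shares

x1 = P @ x0                   # one transition period: x_{t+1} = P x_t
print(x1)                     # [0.55 0.45]
```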
Long-Term Behavior of a Markov Chain
If P is the transition matrix for a Markov Chain, what does the sequence of state vectors

$$\vec{x}_0,\; P\vec{x}_0,\; P^2\vec{x}_0,\; P^3\vec{x}_0,\; \dots$$

approach? If this sequence converges to a vector $\vec{q}$, then we call $\vec{q}$ the steady-state vector of the Markov Chain. In this case, the sequence of matrices

$$P,\; P^2,\; P^3,\; \dots$$

approaches the matrix Q, each of whose column vectors is $\vec{q}$.
When does a Markov Chain have a steady-state vector?
A stochastic matrix P is said to be regular if P or some power of P has all positive entries.
A sufficient condition for a Markov Chain to have a steady-state vector is that its transition matrix be regular. A Markov Chain with a regular transition matrix is called a regular Markov Chain.
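One way to approximate the steady-state vector numerically is simply to iterate $\vec{x} \mapsto P\vec{x}$ until the vector stops changing. The sketch below (plain NumPy, reusing the two-channel matrix from the earlier example) does exactly that:

```python
import numpy as np

P = np.array([[0.80, 0.30],
              [0.20, 0.70]])
x = np.array([0.50, 0.50])    # any starting distribution works for a regular chain

# Iterate x_{t+1} = P x_t until successive vectors agree to 10 decimal places.
for _ in range(1000):
    x_next = P @ x
    if np.allclose(x_next, x, atol=1e-10):
        break
    x = x_next

print(x_next)                 # approximately [0.6 0.4]
```

Alternatively, the steady-state vector can be found exactly by solving $(P - I)\vec{q} = \vec{0}$ together with the requirement that the entries of $\vec{q}$ sum to 1.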
Examples:
1. Two competing companies offer cable television service to a city. Currently 15% of all citizens use Cable
Company A and 20% use Cable Company B. Assume the change in cable subscriptions each year is as follows:
Twenty percent of Company A subscribers switch to Company B while 10% switch to no cable television.
Fifteen percent of Company B subscribers switch to Company A while 5% switch to no cable television.
Fifteen percent of citizens with no cable television switch to Company A.
Ten percent of citizens with no cable television switch to Company B.
a. Construct the matrix of transition probabilities for this process.
b. What percent of citizens will be using Company A one year from now? __________
Two years from now? __________
Five years from now? __________
c. Find the steady-state solution, showing your supporting work below. In the steady-state solution, what percent of citizens will be
using Company A? ________
using Company B? ________
using no cable company? _______
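For checking your work, here is a minimal sketch (plain NumPy; state order A, B, no cable, with columns as the "from" states, as read off the problem statement) that builds the transition matrix and runs the chain:

```python
import numpy as np

# Columns: from A, from B, from none; rows: to A, to B, to none.
# Stayers are what is left after the stated switches
# (e.g. Company A keeps 100% - 20% - 10% = 70% of its subscribers).
P = np.array([[0.70, 0.15, 0.15],
              [0.20, 0.80, 0.10],
              [0.10, 0.05, 0.75]])

x = np.array([0.15, 0.20, 0.65])   # current shares: A, B, no cable

for year in range(1, 6):
    x = P @ x
    if year in (1, 2, 5):
        print(f"year {year}: A={x[0]:.4f}, B={x[1]:.4f}, none={x[2]:.4f}")

# Steady state: keep multiplying until the vector stops changing.
q = np.array([0.15, 0.20, 0.65])
for _ in range(1000):
    q_next = P @ q
    if np.allclose(q_next, q, atol=1e-10):
        break
    q = q_next
print("steady state:", q_next)
```

Each column of P sums to 1, and the printed vectors correspond to parts b and c.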
2. A study has determined that the occupation of a boy, as an adult, depends upon the occupation of his father and is given by the following transition matrix, where P = professional, F = farmer, and L = laborer. The columns correspond to the father's occupation and the rows to the son's, each in the order P, F, L:

$$
P =
\begin{pmatrix}
0.8 & 0.3 & 0.2 \\
0.1 & 0.5 & 0.2 \\
0.1 & 0.2 & 0.6
\end{pmatrix}
$$
Thus, for instance, the probability that the son of a professional will also be a professional is 0.8.
a. What is the probability that the grandchild of a professional will also be a professional?
b. In the long run, what proportion of the population will be farmers?
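As with the previous example, a short NumPy check (state order P, F, L; part a asks for the top-left entry of $P^2$, since two generations correspond to two transition periods):

```python
import numpy as np

P = np.array([[0.8, 0.3, 0.2],
              [0.1, 0.5, 0.2],
              [0.1, 0.2, 0.6]])

# a. Grandchild of a professional: apply the transition matrix twice.
P2 = P @ P
print("professional -> professional over two generations:", P2[0, 0])

# b. Long-run proportions: iterate until the distribution settles.
q = np.array([1.0, 0.0, 0.0])   # starting point does not matter for a regular chain
for _ in range(1000):
    q_next = P @ q
    if np.allclose(q_next, q, atol=1e-10):
        break
    q = q_next
print("long-run proportions (P, F, L):", q_next)
```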