Math 5652: Introduction to Stochastic Processes
Homework 3: due Tuesday, March 10
You are welcome and encouraged to discuss the problems with your classmates; please write
up your own solutions, and indicate collaborators on your write-up.
(1) (10 points) This problem is taken from: J. Norris, Markov Chains, Cambridge University
Press 1997.
Construct a sequence of non-negative integers as follows. Let F0 = 0 and F1 = 1. Once
F0 , . . . , Fn are known, let Fn+1 be either the sum of Fn and Fn−1 , or (the absolute value
of) their difference, each with probability 1/2.
(a) Is (Fn )n≥0 a Markov chain?
(b) Let Xn = (Fn−1 , Fn ). Find the transition probabilities for this Markov chain. (Give
a formula, don’t try to write them into a matrix.) Using your formula, find the
probability that (Fn )n≥0 reaches 3 before first returning to 0.
Hint 1: draw the first few transitions of the chain Xn . What states correspond to “Fn
reaches 3”? What states correspond to “Fn reaches 0”?
Hint 2: to figure out the probability of reaching one set of states before another, consider
the Markov chain in which the states of interest are all absorbing.
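A quick simulation can serve as a sanity check on your answer to part (b). The following Python sketch (the function names are my own, not part of the assignment) runs the chain (Fn−1, Fn) from (F0, F1) = (0, 1) until Fn first hits 3 or 0, and estimates the probability of hitting 3 first:

```python
import random

def run_chain(rng):
    """Run (F_{n-1}, F_n) from (F_0, F_1) = (0, 1) until F_n hits 3 or 0."""
    prev, cur = 0, 1
    while cur not in (0, 3):
        # Next term: sum or absolute difference, each with probability 1/2.
        nxt = prev + cur if rng.random() < 0.5 else abs(cur - prev)
        prev, cur = cur, nxt
    return cur == 3

rng = random.Random(0)
trials = 100_000
estimate = sum(run_chain(rng) for _ in range(trials)) / trials
print(estimate)  # Monte Carlo estimate, to compare with your part (b) answer
```

Note that the simulation tracks the pair (Fn−1, Fn), exactly the chain Xn from part (b); the sequence Fn alone is not enough to decide the next step.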
(2) (20 points) In this problem, we will look at the Wright–Fisher model of population
genetics. Consider a Markov chain with state space {0, 1, . . . , m} and transition
probabilities
pij = C(m, j) (i/m)^j ((m − i)/m)^(m−j),
where C(m, j) denotes the binomial coefficient "m choose j".
The biological interpretation is as follows. We have k individuals, each of whom carries
two copies of a single gene (as is usual for people). The total number of gene copies floating
around is m = 2k. Let’s call the two versions (alleles) of the gene A and a: so each
individual has either AA, Aa, or aa (the order doesn’t matter). Xn is the total number
of A alleles in the nth generation. To get from one generation to the next, pick two genes
uniformly at random, with replacement, from the gene pool in the parent generation:
equivalently, pick two individuals with replacement, and take one allele from each of
them. (This works well as a model of plant reproduction for self-pollinating plants, and
is only an approximation in bisexual species.)
For example, if in generation n the population has genotypes
AA Aa AA AA aa
then in generation n + 1, each gene will be A with probability 0.7 (the proportion of
A’s in the nth generation), and a with probability 0.3. For the Markov chain, we’re just
counting the proportion of each allele in the population, so the structure of the pairs
isn’t important.
(a) What are the closed irreducible classes for this Markov chain?
(b) In the long run, genetic diversity disappears in this model, and we arrive at one
of the absorbing states. Calculate the hitting probabilities of reaching the state
Xn = m for m = 3 and m = 4. That is, compute ρim for all i = 0, . . . , m for those
two cases. (Note: I’m asking for m = 3 or 4, not k = 3 or 4.) Can you guess the
form of the answer in general?
(c) Check that your guess in part (b) satisfies the equations for hitting probabilities for
general m. In a finite-state chain, the solution is unique, so you have just figured
out the probability that the final population will be all-A starting from every initial
distribution of genes.
It’s possible to extend this model to take various other biological features into account.
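The transition probabilities above are easy to tabulate numerically, which can help you check parts (b) and (c) for small m. A minimal Python sketch (the function name is my own) builds the matrix and verifies that each row is a probability distribution:

```python
from math import comb

def wright_fisher_matrix(m):
    """Transition matrix with p_ij = C(m, j) (i/m)^j ((m - i)/m)^(m - j)."""
    return [[comb(m, j) * (i / m) ** j * ((m - i) / m) ** (m - j)
             for j in range(m + 1)]
            for i in range(m + 1)]

P = wright_fisher_matrix(4)
for row in P:
    assert abs(sum(row) - 1) < 1e-12  # rows sum to 1 (binomial theorem)
print(P[0][0], P[4][4])  # states 0 and m are absorbing
```

From here one could iterate the matrix, or solve the hitting-probability equations numerically, to test a conjectured formula before proving it in part (c).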
(3) (10 points) Let Y1 , Y2 , ... be a sequence of iid random variables with all moments finite.
Let N be a nonnegative-integer-valued random variable, independent from all the Yi ,
also with all moments finite. Let S = Y1 + . . . + YN , and define S = 0 if N = 0. Find the
mean E[S] and the variance Var(S) in terms of the moments (mean, variance, second
moment) of Y and N .
Hint: begin by conditioning on the value of N , i.e. write
E[S] = Σn≥0 P(N = n) E[S | N = n],
E[S²] = Σn≥0 P(N = n) E[S² | N = n].
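Once you have expressions for E[S] and Var(S), you can test them against a simulation for some concrete choice of distributions. This sketch (my own choices: N uniform on {0, . . . , 4}, Y exponential with mean 1) compares the empirical mean and variance of S with whatever your formulas predict:

```python
import random

random.seed(0)

def sample_S():
    """One draw of S = Y_1 + ... + Y_N, with N uniform on {0,...,4}, Y ~ Exp(1)."""
    n = random.randrange(5)  # N, drawn independently of the Y_i
    return sum(random.expovariate(1.0) for _ in range(n))  # S = 0 when N = 0

draws = [sample_S() for _ in range(200_000)]
mean = sum(draws) / len(draws)
var = sum((s - mean) ** 2 for s in draws) / len(draws)
print(mean, var)  # compare with your formulas, using E[N], Var(N), E[Y], Var(Y)
```

For this choice, E[N] = 2, Var(N) = 2, E[Y] = 1, and Var(Y) = 1, so plugging into your answer gives concrete numbers to compare against.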
(4) (10 points) Durrett 2.10: Consider a bank with two tellers. Three people – Alice, Bob,
and Carol – walk into the bank at almost the same time, but in that order. Alice and
Bob go directly into service, while Carol waits for the first available teller. Suppose that
the service time of each of the three customers is independent, exponentially distributed,
with a mean of 4 minutes. (Careful: this is the mean, not the rate!)
(a) What is the expected amount of time Carol spends in the bank, including both the
wait and her service time?
(b) What is the expected time until the last of the customers leaves? (The last customer
may or may not be Carol.)
(c) What is the probability that Carol is the last customer to leave?
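All three parts can be sanity-checked by direct simulation of the two-teller scenario. A minimal Python sketch (names are my own; service times are exponential with mean 4, so rate 1/4):

```python
import random

random.seed(0)
MEAN = 4.0  # mean service time in minutes; the rate is 1/MEAN

def one_visit():
    """Simulate one arrival of Alice, Bob, Carol; return the three quantities asked."""
    a = random.expovariate(1 / MEAN)   # Alice's service time
    b = random.expovariate(1 / MEAN)   # Bob's service time
    c = random.expovariate(1 / MEAN)   # Carol's service time
    wait = min(a, b)                   # Carol starts when the first teller frees up
    carol_total = wait + c             # part (a): Carol's time in the bank
    last_exit = max(a, b, wait + c)    # part (b): time until the last customer leaves
    carol_last = wait + c > max(a, b)  # part (c): is Carol the last to leave?
    return carol_total, last_exit, carol_last

trials = [one_visit() for _ in range(100_000)]
avg_carol = sum(t[0] for t in trials) / len(trials)
avg_last = sum(t[1] for t in trials) / len(trials)
p_carol_last = sum(t[2] for t in trials) / len(trials)
print(avg_carol, avg_last, p_carol_last)
```

The analytical answers should use the memoryless property; the simulation is only a numerical cross-check.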
(5) (10 points) Durrett 2.24: Suppose that the number of calls to an answering service follows
a Poisson process with a rate of 4 per hour.
(a) What is the probability that (strictly) fewer than two calls come in the first hour?
(b) Suppose that six calls come in the first hour. What is the probability that (strictly)
fewer than two come in the second hour?
(c) Suppose that the operator gets to take a break after she has answered ten calls.
How long on average are her work periods?
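For the numerical parts, the Poisson pmf is easy to evaluate directly; a small Python helper (the function names are my own) for checking probabilities of the form "strictly fewer than n events":

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam ** k / factorial(k)

def poisson_cdf_below(n, lam):
    """P(X < n): strictly fewer than n events."""
    return sum(poisson_pmf(k, lam) for k in range(n))

print(poisson_cdf_below(2, 4.0))  # strictly fewer than 2 calls in one hour at rate 4
```

For part (b), think about what the independent-increments property says about the second hour; for part (c), think about what kind of random variable the time to answer ten calls is.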
(6) (10 points) Durrett 2.34: Edwin catches trout at the times of a Poisson process with a
rate of 3 per hour. Suppose that the trout weigh an average of 4 pounds, with a standard
deviation (careful: not variance!) of 2 pounds. Find the mean and standard deviation
(careful: not variance!) of the total weight of fish Edwin catches in two hours. (Note:
you will want to use your answer to problem 3 in this problem.)
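Here too a simulation can cross-check your answer from problem 3. This sketch (my own; it assumes normally distributed weights purely for concreteness, since only the mean and standard deviation are given, and samples the Poisson count with Knuth's multiplication method):

```python
import random
from math import exp

random.seed(0)

def poisson_sample(lam):
    """Knuth's method: multiply uniforms until the product drops below e^{-lam}."""
    threshold, k, p = exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def total_weight():
    """Total catch over 2 hours: N ~ Poisson(3 * 2), weights ~ Normal(4, 2^2).

    The normal assumption is mine; the mean/variance answer does not depend on it."""
    n = poisson_sample(6.0)
    return sum(random.gauss(4.0, 2.0) for _ in range(n))

draws = [total_weight() for _ in range(100_000)]
mean = sum(draws) / len(draws)
sd = (sum((w - mean) ** 2 for w in draws) / len(draws)) ** 0.5
print(mean, sd)  # compare with the mean and sd your problem-3 formula predicts
```

When comparing, remember the problem gives the standard deviation of the weights (2 pounds), not the variance, and asks for the standard deviation of the total, not its variance.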