UCSD ECE250
Prof. Young-Han Kim
Handout #19
Wednesday, March 11, 2015
Solutions to Exercise Set #7
(Prepared by TA Fatemeh Arbabjolfaei)
1. Symmetric random walk. Let Xn be a random walk defined by
$$X_0 = 0, \qquad X_n = \sum_{i=1}^{n} Z_i,$$
where $Z_1, Z_2, \ldots$ are i.i.d. with $P\{Z_1 = -1\} = P\{Z_1 = 1\} = \frac{1}{2}$.
(a) Find P{X10 = 10}.
(b) Approximate P{−10 ≤ X100 ≤ 10} using the central limit theorem.
(c) Find P{Xn = k}.
Solution:
(a) Since the event $\{X_{10} = 10\}$ is equivalent to $\{Z_1 = \cdots = Z_{10} = 1\}$, we have $P\{X_{10} = 10\} = 2^{-10}$.
(b) Since $E(Z_j) = 0$ and $E(Z_j^2) = 1$, by the central limit theorem,
$$P\{-10 \le X_{100} \le 10\} = P\left\{-1 \le \frac{1}{\sqrt{100}} \sum_{i=1}^{100} Z_i \le 1\right\} \approx 1 - 2Q(1) = 2\Phi(1) - 1 \approx 0.682.$$
(c)
$$P\{X_n = k\} = P\{(n+k)/2 \text{ heads in } n \text{ independent coin tosses}\} = \binom{n}{(n+k)/2} 2^{-n}$$
for $-n \le k \le n$ with $n + k$ even.
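Below is a minimal Monte Carlo sketch (not part of the original handout) that sanity-checks parts (a)-(c); the sample size, seed, and use of NumPy are illustrative choices.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
trials = 200_000
# Z_i uniform on {-1, +1}; each row is an independent length-100 walk.
Z = 2 * rng.integers(0, 2, size=(trials, 100), dtype=np.int8) - 1
X = Z.cumsum(axis=1)                      # X[:, n-1] is X_n

# (a) P{X_10 = 10} = 2**-10 ~ 0.000977.
print((X[:, 9] == 10).mean())

# (b) P{-10 <= X_100 <= 10}; the CLT approximation 2*Phi(1) - 1 ~ 0.682
# ignores the continuity correction, so expect a modest gap.
print(((X[:, 99] >= -10) & (X[:, 99] <= 10)).mean())

# (c) P{X_10 = 2} = C(10, 6) * 2**-10 ~ 0.205.
print((X[:, 9] == 2).mean(), comb(10, 6) * 2**-10)
```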
2. Absolute-value random walk. Consider the symmetric random walk Xn in the previous problem. Define the absolute value random process Yn = |Xn |.
(a) Find P{Yn = k}.
(b) Find $P\{\max_{1 \le i < 20} Y_i = 10 \mid Y_{20} = 0\}$.
Solution:
(a) If $k \ge 0$, then
$$P\{Y_n = k\} = P\{X_n = +k \text{ or } X_n = -k\}.$$
If $k > 0$, then $P\{Y_n = k\} = 2P\{X_n = k\}$, while $P\{Y_n = 0\} = P\{X_n = 0\}$. Thus
$$P\{Y_n = k\} = \begin{cases} \dbinom{n}{(n+k)/2} \left(\tfrac{1}{2}\right)^{n-1}, & k > 0,\ n-k \text{ even},\ n-k \ge 0 \\[4pt] \dbinom{n}{n/2} \left(\tfrac{1}{2}\right)^{n}, & k = 0,\ n \text{ even},\ n \ge 0 \\[4pt] 0, & \text{otherwise.} \end{cases}$$
(b) If $Y_{20} = |X_{20}| = 0$, then there are only two sample paths with $\max_{1 \le i < 20} |X_i| = 10$, namely $Z_1 = \cdots = Z_{10} = +1,\ Z_{11} = \cdots = Z_{20} = -1$, or $Z_1 = \cdots = Z_{10} = -1,\ Z_{11} = \cdots = Z_{20} = +1$. Since, conditioned on $Y_{20} = 0$, all $\binom{20}{10}$ sample paths are equally likely,
$$P\left\{\max_{1 \le i < 20} Y_i = 10 \,\Big|\, Y_{20} = 0\right\} = \frac{2}{\binom{20}{10}} = \frac{2}{184756} = \frac{1}{92378}.$$
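An exhaustive-enumeration sketch (my addition, not from the handout) checks this answer directly; all $2^{20}$ equally likely sign sequences are generated explicitly, which is feasible at this size.

```python
import numpy as np

n = 20
ids = np.arange(2**n, dtype=np.uint32)
# Bits of 0..2**20-1 give every sign pattern; map {0,1} -> {-1,+1}.
bits = ((ids[:, None] >> np.arange(n, dtype=np.uint32)) & 1).astype(np.int8)
Z = 2 * bits - 1                      # rows are all +-1 sequences Z_1..Z_20
X = Z.cumsum(axis=1)                  # partial sums X_1..X_20

paths_y20_zero = X[:, -1] == 0                      # Y_20 = |X_20| = 0
max_is_10 = np.abs(X[:, :-1]).max(axis=1) == 10     # max_{1<=i<20} Y_i = 10
print(int((paths_y20_zero & max_is_10).sum()),      # expect 2
      int(paths_y20_zero.sum()))                    # expect C(20,10) = 184756
```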
3. A random process. Let Xn = Zn−1 + Zn for n ≥ 1, where Z0 , Z1 , Z2 , . . . are i.i.d. ∼ N(0, 1).
(a) Find the mean and autocorrelation function of {Xn }.
(b) Is {Xn } wide-sense stationary? Justify your answer.
(c) Is {Xn } Gaussian? Justify your answer.
(d) Is {Xn } strict-sense stationary? Justify your answer.
(e) Find E(X3 |X1 , X2 ).
(f) Find E(X3 |X2 ).
(g) Is {Xn } Markov? Justify your answer.
(h) Does $\{X_n\}$ have independent increments? Justify your answer.
Solution:
(a)
$$E(X_n) = E(Z_{n-1}) + E(Z_n) = 0.$$
$$R_X(m,n) = E(X_m X_n) = E\left[(Z_{m-1} + Z_m)(Z_{n-1} + Z_n)\right] = \begin{cases} E[Z_{n-1}^2], & n - m = 1 \\ E[Z_{n-1}^2] + E[Z_n^2], & n = m \\ E[Z_n^2], & m - n = 1 \\ 0, & \text{otherwise} \end{cases} = \begin{cases} 2, & n = m \\ 1, & |n - m| = 1 \\ 0, & \text{otherwise.} \end{cases}$$
(b) Since the mean and autocorrelation functions are time-invariant, the process is WSS.
(c) Since $(X_1, \ldots, X_n)$ is a linear transformation of the Gaussian random vector $(Z_0, Z_1, \ldots, Z_n)$, the process is Gaussian.
(d) Since the process is WSS and Gaussian, it is SSS.
(e) Since the process is Gaussian, the conditional expectation (MMSE estimate) is linear.
Hence,
$$E(X_3 \mid X_1, X_2) = \begin{bmatrix} 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}^{-1} \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} = \frac{1}{3}(2X_2 - X_1).$$
(f) Similarly, E(X3 |X2 ) = (1/2)X2 .
(g) Since $E(X_3 \mid X_1, X_2) \ne E(X_3 \mid X_2)$, the process is not Markov.
(h) Since any independent-increment process is Markov and this process is not Markov, it does not have independent increments.
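A quick simulation sketch (added for illustration, not from the handout) estimates both the autocorrelation of part (a) and the linear MMSE coefficients of part (e); sample size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((500_000, 4))  # columns Z_0, Z_1, Z_2, Z_3
X = Z[:, :-1] + Z[:, 1:]               # columns X_1, X_2, X_3

# Autocorrelation matrix: expect 2 on the diagonal, 1 at lag 1, 0 at lag 2.
print((X.T @ X / len(X)).round(2))

# E(X_3 | X_1, X_2) = (2*X_2 - X_1)/3: least squares recovers [-1/3, 2/3].
coef, *_ = np.linalg.lstsq(X[:, :2], X[:, 2], rcond=None)
print(coef.round(3))
```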
4. Moving average process. Let $X_n = \frac{1}{2} Z_{n-1} + Z_n$ for $n \ge 1$, where $Z_0, Z_1, Z_2, \ldots$ are i.i.d. $\sim N(0,1)$. Find the mean and autocorrelation function of $X_n$.
Solution:
$$E(X_n) = \tfrac{1}{2}E(Z_{n-1}) + E(Z_n) = 0.$$
$$R_X(m,n) = E(X_m X_n) = E\left[\left(\tfrac{1}{2}Z_{m-1} + Z_m\right)\left(\tfrac{1}{2}Z_{n-1} + Z_n\right)\right] = \begin{cases} \tfrac{1}{2}E[Z_{n-1}^2], & n - m = 1 \\ \tfrac{1}{4}E[Z_{n-1}^2] + E[Z_n^2], & n = m \\ \tfrac{1}{2}E[Z_n^2], & m - n = 1 \\ 0, & \text{otherwise} \end{cases} = \begin{cases} \tfrac{5}{4}, & n = m \\ \tfrac{1}{2}, & |n - m| = 1 \\ 0, & \text{otherwise.} \end{cases}$$
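A short simulation sketch (added, not from the handout) checks these values; the sequence length is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.standard_normal(1_000_001)
X = 0.5 * Z[:-1] + Z[1:]    # X_n = Z_{n-1}/2 + Z_n

# Lags 0, 1, 2 should give approximately 5/4, 1/2, 0.
for lag in (0, 1, 2):
    print(lag, np.mean(X[:len(X) - lag] * X[lag:]).round(3))
```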
5. Autoregressive process. Let $X_0 = 0$ and $X_n = \frac{1}{2}X_{n-1} + Z_n$ for $n \ge 1$, where $Z_1, Z_2, \ldots$ are i.i.d. $\sim N(0,1)$. Find the mean and autocorrelation function of $X_n$.
Solution:
$$E[X_n] = \tfrac{1}{2}E[X_{n-1}] + E[Z_n] = \tfrac{1}{4}E[X_{n-2}] + \tfrac{1}{2}E[Z_{n-1}] + E[Z_n] = \cdots = \tfrac{1}{2^{n-1}}E[X_1] = \tfrac{1}{2^{n-1}}E[Z_1] = 0.$$
For n > m we can write
$$X_n = \frac{1}{2^{n-m}}X_m + \frac{1}{2^{n-m-1}}Z_{m+1} + \cdots + \frac{1}{2}Z_{n-1} + Z_n = \frac{1}{2^{n-m}}X_m + \sum_{i=0}^{n-m-1}\frac{1}{2^i}Z_{n-i}.$$
Therefore,
$$R_X(n,m) = E(X_n X_m) = 2^{-(n-m)}E[X_m^2],$$
since $X_m$ and $Z_{n-i}$, $i = 0, \ldots, n-m-1$, are independent and $E[X_m] = E[Z_{n-i}] = 0$. To find $E[X_m^2]$, consider
$$E[X_1^2] = 1,$$
$$E[X_2^2] = \tfrac{1}{4}E[X_1^2] + E[Z_2^2] = \tfrac{1}{4} + 1,$$
$$\vdots$$
$$E[X_n^2] = \frac{1}{4^{n-1}} + \cdots + \frac{1}{4} + 1 = \frac{4}{3}\left(1 - \frac{1}{4^n}\right).$$
Thus in general,
$$R_X(n,m) = E(X_n X_m) = 2^{-|n-m|}\,\tfrac{4}{3}\left[1 - \left(\tfrac{1}{4}\right)^{\min\{n,m\}}\right].$$
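A simulation sketch (illustrative, not from the handout) compares an empirical second moment against $E[X_n^2] = \frac{4}{3}(1 - 4^{-n})$; the number of sample paths is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n = 200_000, 10
X = np.zeros(trials)                      # X_0 = 0 for every sample path
for _ in range(n):
    X = 0.5 * X + rng.standard_normal(trials)

# Both values should be near 4/3 for n = 10.
print(np.mean(X**2).round(3), round((4 / 3) * (1 - 4.0**(-n)), 3))
```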
6. Random binary waveform. In a digital communication channel the symbol “1” is represented
by the fixed duration rectangular pulse
$$g(t) = \begin{cases} 1, & 0 \le t < 1 \\ 0, & \text{otherwise,} \end{cases}$$
and the symbol "0" is represented by $-g(t)$. The data transmitted over the channel is represented by the random process
$$X(t) = \sum_{k=0}^{\infty} A_k g(t - k), \qquad t \ge 0,$$
where $A_0, A_1, \ldots$ are i.i.d. random variables with
$$A_i = \begin{cases} +1 & \text{w.p. } \tfrac{1}{2} \\ -1 & \text{w.p. } \tfrac{1}{2}. \end{cases}$$
(a) Find its first and second order pmfs.
(b) Find the mean and the autocorrelation function of the process X(t).
Solution:
(a) The first order pmf is
$$p_{X(t)}(x) = P(X(t) = x) = P\left(\sum_{k=0}^{\infty} A_k g(t-k) = x\right) = P\left(A_{\lfloor t \rfloor} = x\right) = P(A_0 = x) = \begin{cases} \tfrac{1}{2}, & x = \pm 1 \\ 0, & \text{otherwise,} \end{cases}$$
where the second-to-last equality holds since the $A_k$ are i.i.d.
Now note that X(t1 ) and X(t2 ) are dependent only if t1 and t2 fall within the same time
interval. Otherwise, they are independent. Thus, the second order pmf is
$$p_{X(t_1)X(t_2)}(x,y) = P(X(t_1) = x,\, X(t_2) = y) = P\left(\sum_{k=0}^{\infty} A_k g(t_1 - k) = x,\ \sum_{k=0}^{\infty} A_k g(t_2 - k) = y\right) = P\left(A_{\lfloor t_1 \rfloor} = x,\, A_{\lfloor t_2 \rfloor} = y\right)$$
$$= \begin{cases} P(A_0 = x,\, A_0 = y), & \lfloor t_1 \rfloor = \lfloor t_2 \rfloor \\ P(A_0 = x,\, A_1 = y), & \text{otherwise} \end{cases} = \begin{cases} \tfrac{1}{2}, & \lfloor t_1 \rfloor = \lfloor t_2 \rfloor \text{ and } (x,y) = (1,1), (-1,-1) \\ \tfrac{1}{4}, & \lfloor t_1 \rfloor \ne \lfloor t_2 \rfloor \text{ and } (x,y) = (1,1), (1,-1), (-1,1), (-1,-1) \\ 0, & \text{otherwise.} \end{cases}$$
(b) For $t \ge 0$,
$$E[X(t)] = E\left[\sum_{k=0}^{\infty} A_k g(t-k)\right] = \sum_{k=0}^{\infty} g(t-k)\,E[A_k] = 0.$$
For the autocorrelation $R_X(t_1, t_2)$, note once again that $X(t_1)$ and $X(t_2)$ are dependent only if $t_1$ and $t_2$ fall within the same unit interval; otherwise they are independent. Then,
$$R_X(t_1, t_2) = E[X(t_1)X(t_2)] = \sum_{k=0}^{\infty} g(t_1-k)\,g(t_2-k)\,E[A_k^2] = \begin{cases} 1, & \lfloor t_1 \rfloor = \lfloor t_2 \rfloor \\ 0, & \text{otherwise.} \end{cases}$$
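A sketch (added, with illustrative time points and sample size, not from the handout) of the waveform's autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(4)
A = 2 * rng.integers(0, 2, size=(200_000, 4), dtype=np.int8) - 1  # A_0..A_3

def X(t):
    # X(t) = A_floor(t), since g(t - k) = 1 only for k = floor(t).
    return A[:, int(t)]

print(np.mean(X(0.2) * X(0.7)))  # same unit interval: exactly 1
print(np.mean(X(0.2) * X(1.7)))  # different intervals: expect ~0
```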
7. QAM random process. Consider the random process
$$X(t) = Z_1 \cos \omega t + Z_2 \sin \omega t, \qquad -\infty < t < \infty,$$
where $Z_1$ and $Z_2$ are i.i.d. discrete random variables such that $p_{Z_i}(+1) = p_{Z_i}(-1) = \frac{1}{2}$.
(a) Is X(t) wide-sense stationary? Justify your answer.
(b) Is X(t) strict-sense stationary? Justify your answer.
Solution:
(a) We first check the mean.
E(X(t)) = E(Z1 ) cos ωt + E(Z2 ) sin ωt = 0 · cos(ωt) + 0 · sin(ωt) = 0 .
The mean is independent of t. Next we consider the autocorrelation function.
$$E(X(t+\tau)X(t)) = E\big((Z_1\cos(\omega(t+\tau)) + Z_2\sin(\omega(t+\tau)))(Z_1\cos(\omega t) + Z_2\sin(\omega t))\big)$$
$$= E(Z_1^2)\cos(\omega(t+\tau))\cos(\omega t) + E(Z_2^2)\sin(\omega(t+\tau))\sin(\omega t)$$
(the cross terms vanish since $E(Z_1 Z_2) = E(Z_1)E(Z_2) = 0$)
$$= \cos(\omega(t+\tau))\cos(\omega t) + \sin(\omega(t+\tau))\sin(\omega t) = \cos(\omega(t+\tau) - \omega t) = \cos \omega\tau.$$
The autocorrelation function is also time invariant. Therefore X(t) is WSS.
(b) Note that X(0) = Z1 cos 0 + Z2 sin 0 = Z1 , so X(0) has the same pmf as Z1 . On the
other hand,
$$X\!\left(\frac{\pi}{4\omega}\right) = Z_1\cos(\pi/4) + Z_2\sin(\pi/4) = \frac{1}{\sqrt{2}}(Z_1 + Z_2) = \begin{cases} \sqrt{2} & \text{w.p. } \tfrac{1}{4} \\ 0 & \text{w.p. } \tfrac{1}{2} \\ -\sqrt{2} & \text{w.p. } \tfrac{1}{4}. \end{cases}$$
This shows that $X(\pi/(4\omega))$ does not have the same pmf, or even the same range, as $X(0)$. Therefore $X(t)$ is not first-order stationary and consequently is not SSS.
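A small check (my addition, with an arbitrary sample size) of part (b): the empirical pmf of $X(\pi/(4\omega))$ differs from that of $X(0) = Z_1$.

```python
import numpy as np

rng = np.random.default_rng(5)
Z1 = 2 * rng.integers(0, 2, 1_000_000) - 1
Z2 = 2 * rng.integers(0, 2, 1_000_000) - 1
Xq = (Z1 + Z2) / np.sqrt(2)               # X(pi/(4*omega))

vals, counts = np.unique(Xq, return_counts=True)
print(dict(zip(vals.round(3), (counts / len(Xq)).round(3))))
# expect {-1.414: 0.25, 0.0: 0.5, 1.414: 0.25}, unlike X(0) on {-1, +1}
```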
8. Mixture of two WSS processes. Let X(t) and Y (t) be two zero-mean WSS processes with
autocorrelation functions RX (τ ) and RY (τ ), respectively. Define the process
$$Z(t) = \begin{cases} X(t), & \text{with probability } \tfrac{1}{2} \\ Y(t), & \text{with probability } \tfrac{1}{2}. \end{cases}$$
Find the mean and autocorrelation functions for Z(t). Is Z(t) a WSS process? Justify your
answer.
Solution: To show that Z(t) is WSS, we show that its mean and autocorrelation functions
are time invariant. Consider
$$\mu_Z(t) = E[Z(t)] = E(Z \mid Z = X)\,P\{Z = X\} + E(Z \mid Z = Y)\,P\{Z = Y\} = \tfrac{1}{2}(\mu_X + \mu_Y) = 0,$$
and similarly
$$R_Z(t+\tau, t) = E[Z(t+\tau)Z(t)] = \tfrac{1}{2}\left(R_X(\tau) + R_Y(\tau)\right).$$
Since µZ (t) is independent of time and RZ (t + τ, t) depends only on τ , Z(t) is WSS.
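A verification sketch with illustrative inputs (my choices, since the problem leaves $X$ and $Y$ abstract): $X_n$ from Problem 3 ($R_X = 2, 1, 0$ at lags $0, 1, 2$) and white noise $Y_n$ ($R_Y = 1, 0, 0$) serve as the two WSS processes, so $R_Z$ should come out near $1.5, 0.5, 0$.

```python
import numpy as np

rng = np.random.default_rng(6)
trials, n = 200_000, 3
W = rng.standard_normal((trials, n + 1))
X = W[:, :-1] + W[:, 1:]                  # one path of X per row
Y = rng.standard_normal((trials, n))      # one path of Y per row

pick = rng.random(trials) < 0.5           # a single coin flip selects the path
Zt = np.where(pick[:, None], X, Y)

for lag in (0, 1, 2):
    print(lag, np.mean(Zt[:, 0] * Zt[:, lag]).round(2))
```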
9. Stationary Gauss-Markov process. Let
$$X_0 \sim N(0, a), \qquad X_n = \tfrac{1}{2}X_{n-1} + Z_n, \quad n \ge 1,$$
where Z1 , Z2 , Z3 , . . . are i.i.d. N(0, 1) independent of X0 .
(a) Find a such that Xn is stationary. Find the mean and autocorrelation functions of Xn .
(b) (Difficult.) Consider the sample mean $S_n = \frac{1}{n}\sum_{i=1}^n X_i$, $n \ge 1$. Show that $S_n$ converges to the process mean in probability even though the sequence $X_n$ is not i.i.d. (A stationary process for which the sample mean converges to the process mean is called mean ergodic.)
Solution:
(a) We are asked to find $a$ such that $E(X_n)$ is independent of $n$ and $R_X(n_1, n_2)$ depends only on $n_1 - n_2$. For $X_n$ to be stationary, $E(X_n^2)$ must be independent of $n$. Thus
$$E(X_n^2) = \tfrac{1}{4}E(X_{n-1}^2) + E(Z_n^2) + E(X_{n-1}Z_n) = \tfrac{1}{4}E(X_n^2) + 1.$$
Therefore, $a = E(X_0^2) = E(X_n^2) = \frac{4}{3}$. Using the method of Problem 5, we can easily verify that $E(X_n) = 0$ for every $n$ and that
$$R_X(n_1, n_2) = E(X_{n_1}X_{n_2}) = \tfrac{4}{3}\,2^{-|n_1 - n_2|}.$$
(b) To prove convergence in probability, we first prove convergence in mean square and then
use the fact that mean square convergence implies convergence in probability.
$$E(S_n) = E\left[\frac{1}{n}\sum_{i=1}^n X_i\right] = \frac{1}{n}\sum_{i=1}^n E(X_i) = 0.$$
To show convergence in mean square, we show that $\mathrm{Var}(S_n) \to 0$ as $n \to \infty$:
$$\mathrm{Var}(S_n) = \mathrm{Var}\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = E\left[\left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2\right] \qquad \text{(since } E(X_i) = 0\text{)}$$
$$= \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n R_X(i,j) = \frac{4}{3n^2}\left[n + 2\sum_{i=1}^{n-1}(n-i)2^{-i}\right]$$
$$\le \frac{4}{3n}\left[1 + 2\sum_{i=1}^{n-1}2^{-i}\right] \le \frac{4}{3n}\left[1 + 2\sum_{i=1}^{\infty}2^{-i}\right] = \frac{4}{n}.$$
Thus $S_n$ converges to the process mean, even though the sequence is not i.i.d.
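A final simulation sketch (added for illustration; trial counts and seed are arbitrary): with $a = 4/3$ the process is stationary, and the empirical $\mathrm{Var}(S_n)$ decays with $n$, staying below the $4/n$ bound derived above.

```python
import numpy as np

rng = np.random.default_rng(7)
trials = 100_000
for n in (10, 100, 1000):
    X = np.sqrt(4 / 3) * rng.standard_normal(trials)  # X_0 ~ N(0, 4/3)
    S = np.zeros(trials)
    for _ in range(n):
        X = 0.5 * X + rng.standard_normal(trials)
        S += X                             # accumulate X_1 + ... + X_n
    S /= n
    print(n, round(S.var(), 4), "bound:", 4 / n)
```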