3. DISCRETE-TIME RANDOM PROCESSES
Outline
Random variables
Random processes
Filtering random processes
Spectral factorization
Special types of random processes
• Autoregressive moving average processes
• Autoregressive processes
• Moving average processes
3 Random processes
September 9, 2010
Random variables
Definitions
A random variable x is a function that assigns a number to each outcome of a random experiment.
Probability distribution function:
Fx (α) = Pr{x ≤ α}
Probability density function:
fx(α) = d Fx(α) / dα
Mean or expected value:
mx = E{x} = ∫_{−∞}^{∞} α fx(α) dα
Variance:
σ²x = Var{x} = E{(x − mx)²} = ∫_{−∞}^{∞} (α − mx)² fx(α) dα = E{x²} − mx²
Random variables
Definitions
Joint probability distribution function:
Fx,y (α, β) = Pr{x ≤ α, y ≤ β}
Joint probability density function:
fx,y(α, β) = ∂² Fx,y(α, β) / ∂α∂β
Correlation:
rxy = E {xy ∗ }
Covariance:
cxy = Cov(x, y ) = E {(x − mx )(y − my )∗ } = rxy − mx my∗
Correlation coefficient:
ρxy = cxy / (σx σy) = (rxy − mx my∗) / (σx σy),   |ρxy| ≤ 1
Random variables
[Scatter plots: x and y uncorrelated; x and y strongly correlated, e.g. y = αx + n with small noise n; x and y linearly dependent]
Random variables
Definitions
Two random variables x and y are independent if
fx,y (α, β) = fx (α) fy (β)
Two random variables x and y are uncorrelated if
E{xy∗} = E{x}E{y∗},   i.e.   rxy = mx my∗   or   cxy = 0
Two random variables x and y are orthogonal if
rxy = 0
Orthogonal random variables are not necessarily uncorrelated
Zero-mean uncorrelated random variables are orthogonal
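These definitions are easy to check numerically; below is a minimal NumPy sketch (sample size, distributions, and the slope 2 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Independent zero-mean variables: uncorrelated (c_xy = 0) and,
# because the means are zero, also orthogonal (r_xy = 0).
x = rng.normal(size=n)
y = rng.normal(size=n)
r_xy = np.mean(x * y)                      # sample estimate of E{x y*}
c_xy = r_xy - x.mean() * y.mean()          # sample covariance

# Strongly correlated pair: y2 = 2x + small noise, so |rho_xy| is close to 1.
y2 = 2.0 * x + 0.01 * rng.normal(size=n)
rho = np.mean((x - x.mean()) * (y2 - y2.mean())) / (x.std() * y2.std())
```

Both `r_xy` and `c_xy` come out near zero, while `rho` sits just below 1, matching the bound |ρxy| ≤ 1.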
Random processes
Definitions
A random process x(n) is an indexed sequence of random variables (a “signal”)
Mean and variance:
mx (n) = E {x(n)}
σ2x (n) = E {|x(n) − mx (n)|2 }
Autocorrelation and autocovariance:
rx (k , l ) = E {x(k )x ∗(l )}
cx (k , l ) = E {[x(k ) − mx (k )][x(l ) − mx (l )]∗ } = rx (k , l ) − mx (k )mx∗ (l )
Cross-correlation and cross-covariance
rxy (k , l ) = E {x(k )y ∗(l )}
cxy (k , l ) = E {[x(k ) − mx (k )][y (l ) − my (l )]∗ } = rxy (k , l ) − mx (k )my∗ (l )
Uncorrelated and orthogonal processes are defined as for variables but now ∀k , l
Random processes
Stationarity
First-order stationarity if fx(n) (α) = fx(n+k ) (α). Implies mx (n) = mx (0) := mx
Second-order stationarity if fx(n1 ),x(n2 ) (α1 , α2 ) = fx(n1 +k ),x(n2 +k ) (α1 , α2 ).
Implies rx (k , l ) = rx (k − l , 0) := rx (k − l )
Stationarity in the strict sense, if the process is stationary for all orders L > 0
Wide-sense stationarity, if i) mx (n) = mx ; ii) rx (k , l ) = rx (k − l ), and iii) cx (0) < ∞
Two processes x(n) and y (n) jointly wide-sense stationary if i) both x(n) and y (n)
are wide-sense stationary and ii) rxy (k , l ) = rxy (k − l , 0) := rxy (k − l )
Properties of WSS processes:
• symmetry: rx(k) = rx∗(−k)
• mean-square value: rx(0) = E{|x(n)|²} ≥ 0
• maximum value: rx(0) ≥ |rx(k)|
• mean-square periodicity: rx(k0) = rx(0) ⇔ rx(k) is periodic with period k0
Random processes
Autocorrelation and autocovariance matrices
We consider a WSS process x(n) and collect p + 1 samples in a vector
x = [x(0), x(1), . . ., x(p)]T
Autocorrelation matrix:

Rx = E{xxᴴ} =
  [ rx(0)    rx∗(1)     ···  rx∗(p)     ]
  [ rx(1)    rx(0)      ···  rx∗(p − 1) ]
  [   ⋮        ⋮        ⋱        ⋮      ]
  [ rx(p)    rx(p − 1)  ···  rx(0)      ]
Autocovariance matrix:
Cx = E{(x − mx)(x − mx)ᴴ} = Rx − mx mxᴴ
where mx = [mx , mx , . . . , mx ]T
The autocorrelation matrix of a WSS process x(n) is Toeplitz, Hermitian, and nonnegative definite; hence the eigenvalues of Rx are nonnegative
Random processes
Sample mean (time average over one realization):
⟨x⟩ = (1/N) ∑_{n=1}^{N} x(n)
[Figure: realizations 1–5 of the process; the ensemble mean E{x(n)} is an average across realizations at each fixed n]
When is the sample mean equal to the ensemble mean (expectation)?
Random processes
Ergodicity
Sample mean:
m̂x(N) = (1/N) ∑_{n=0}^{N−1} x(n)
A WSS process is ergodic in the mean if
lim_{N→∞} E{|m̂x(N) − mx|²} = 0   or   lim_{N→∞} m̂x(N) = mx
Necessary and sufficient condition:
lim_{N→∞} (1/N) ∑_{k=0}^{N−1} cx(k) = 0
Sufficient condition:
lim_{k→∞} cx(k) = 0
Similar derivations exist for higher-order averages
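For white noise with a constant mean the sufficient condition holds trivially (cx(k) = 0 for k ≠ 0), so the sample mean converges to mx. A small illustration (the mean 3.0 and the sample sizes are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
mx = 3.0  # assumed ensemble mean for this illustration

def sample_mean(N):
    """Time average of one realization of x(n) = mx + white noise."""
    x = mx + rng.normal(size=N)
    return x.sum() / N

# The estimator variance decays like 1/N, so m_hat(N) -> mx as N grows.
estimates = {N: sample_mean(N) for N in (10, 1_000, 100_000)}
```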
Random processes
White noise
White noise is a discrete-time random process v (n) with autocovariance:
cv (k ) = σ2v δ(k )
i.e. cv(k) = 0 for k ≠ 0.
All variables are uncorrelated with variance σ2v (probability density not important)
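A quick numerical check of this definition (variance σ²v = 4 and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2_v = 4.0
v = np.sqrt(sigma2_v) * rng.normal(size=200_000)  # zero-mean white noise

def autocov(x, k):
    """Biased estimate of c_x(k) for a zero-mean sequence x."""
    n = len(x)
    return np.dot(x[: n - k], x[k:]) / n

c0, c1, c5 = autocov(v, 0), autocov(v, 1), autocov(v, 5)
# c0 is close to sigma2_v; c1 and c5 are close to 0, as c_v(k) = sigma2_v * delta(k)
```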
The power spectrum of zero-mean white noise is constant:
Pv(e^{jω}) = ∑_{k=−∞}^{∞} rv(k) e^{−jkω} = σ²v
Random processes
Power spectrum
The power spectrum of a WSS process is the DTFT of the autocorrelation:
Px(e^{jω}) = ∑_{k=−∞}^{∞} rx(k) e^{−jkω},   also   Px(z) = ∑_{k=−∞}^{∞} rx(k) z^{−k}
Since the autocorrelation is conjugate symmetric, the power spectrum is real:
Px(z) = Px∗(1/z∗)   ⇒   Px(e^{jω}) = Px∗(e^{jω})
If the stochastic process is real, the power spectrum is even:
Px(z) = Px∗(z∗)   ⇒   Px(e^{jω}) = Px∗(e^{−jω}) = Px(e^{−jω})
The power spectrum is nonnegative:
Px (e j ω ) ≥ 0
The total power is proportional to the area under the power spectrum:
E{|x(n)|²} = rx(0) = (1/2π) ∫_{−π}^{π} Px(e^{jω}) dω   (use the inverse DTFT and take k = 0)
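As a sanity check of these properties, take the toy autocorrelation rx(k) = a^|k| (the value a = 0.6 is an assumption for this sketch), evaluate the DTFT on a frequency grid, and verify nonnegativity and that the average of Px over one period recovers rx(0) = 1:

```python
import numpy as np

a = 0.6                          # assumed decay of the toy autocorrelation
K = 200                          # truncation lag; a**200 is negligible
k = np.arange(-K, K + 1)
r = a ** np.abs(k)               # r_x(k) = a^|k|, real and even

w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
# P_x(e^{jw}) = sum_k r_x(k) e^{-jkw}; real up to rounding since r is even
P = (r[:, None] * np.exp(-1j * np.outer(k, w))).sum(axis=0).real

nonnegative = P.min() >= -1e-9   # P_x(e^{jw}) >= 0
total_power = P.mean()           # = (1/2pi) * integral of P over one period
# total_power is close to r_x(0) = 1
```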
Random processes
Power spectrum
The eigenvalues λi of the n × n autocorrelation matrix are upper and lower bounded
by the maximum and minimum value, respectively, of the power spectrum:
min_ω Px(e^{jω}) ≤ λi ≤ max_ω Px(e^{jω})
The power spectrum is related to the mean of |X(e^{jω})|² as
Px(e^{jω}) = lim_{N→∞} (1/(2N + 1)) E{ | ∑_{n=−N}^{N} x(n) e^{−jnω} |² }
If x(n) has a nonzero mean or a periodicity, the power spectrum contains impulses
Filtering random processes
Suppose x(n) is a WSS process with mean mx and correlation rx (k ) that is filtered
by a stable LSI filter with unit sample response h(n); then the output y (n) is also
WSS with
my = mx H(e^{j0})
ry(k) = rx(k) ∗ h(k) ∗ h∗(−k) = rx(k) ∗ rh(k)
where rh(k) is the "(deterministic) autocorrelation" of h(n):
rh(k) = h(k) ∗ h∗(−k) = ∑_{n=−∞}^{∞} h(n + k) h∗(n)
The power of y(n) is given by
E{|y(n)|²} = ry(0) = ∑_{l=−∞}^{∞} ∑_{m=−∞}^{∞} h(l) rx(m − l) h∗(m) = hᴴ Rx h
where we assume h(n) is zero outside [0, N − 1] and h = [h(0), h(1), . . . , h(N − 1)]T
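For a white input Rx = σ²x I, so the quadratic form reduces to σ²x ∑|h(n)|². A numerical check of this (the FIR coefficients and input variance are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
h = np.array([1.0, 0.5, 0.25])   # example unit sample response (assumed)
sigma2_x = 2.0
x = np.sqrt(sigma2_x) * rng.normal(size=500_000)   # white WSS input

y = np.convolve(x, h)[: len(x)]  # y(n) = sum_l h(l) x(n - l)

# For a white input R_x = sigma2_x * I, so h^H R_x h = sigma2_x * sum |h(n)|^2
predicted = sigma2_x * np.sum(np.abs(h) ** 2)      # = 2.625 here
measured = np.mean(np.abs(y) ** 2)                 # sample estimate of r_y(0)
```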
Filtering random processes
In terms of the power spectrum, this means that
Py (e j ω ) = Px (e j ω )|H(e j ω )|2
Py (z) = Px (z)H(z)H ∗ (1/z ∗ )
So assuming no pole/zero cancelations between Px (z) and H(z), if H(z) has a
pole (zero) at z = z0 , then Py (z) also has a pole (zero) at z = z0 and another at
the conjugate reciprocal location z = 1/z0∗
If H(e j ω ) is a narrow-band bandpass filter with center frequency ω0 , bandwidth
∆ω, and magnitude 1, then the output power is
E{|y(n)|²} = ry(0) = (1/2π) ∫_{−π}^{π} |H(e^{jω})|² Px(e^{jω}) dω ≈ (Δω/2π) Px(e^{jω0})
so the power spectrum describes how the power is distributed over frequency ω
Spectral factorization
If the power spectrum Px (e j ω ) of a WSS process is a continuous function of ω,
then Px (z) may be factored as
Px(z) = ∑_{k=−∞}^{∞} rx(k) z^{−k} = σ²0 Q(z) Q∗(1/z∗)
Proof:
If ln[Px (z)] is analytic in ρ < |z| < 1/ρ then we can write
ln[Px(z)] = ∑_{k=−∞}^{∞} c(k) z^{−k}   and   ln[Px(e^{jω})] = ∑_{k=−∞}^{∞} c(k) e^{−jkω}
so c(k ) is the IDTFT of ln[Px (e j ω )], and since ln[Px (e j ω )] is real, c(k ) = c ∗ (−k )
Spectral factorization
Proof (continued):
Now we can write
Px(z) = exp{c(0)} · exp{ ∑_{k=1}^{∞} c(k) z^{−k} } · exp{ ∑_{k=−∞}^{−1} c(k) z^{−k} }
If we now define the second exponential as
Q(z) = exp{ ∑_{k=1}^{∞} c(k) z^{−k} },   |z| > ρ
then we can express the third exponential as
exp{ ∑_{k=−∞}^{−1} c(k) z^{−k} } = ( exp{ ∑_{k=1}^{∞} c(k) (1/z∗)^{−k} } )∗ = Q∗(1/z∗),   |z| < 1/ρ
and so we obtain
Px(z) = σ²0 Q(z) Q∗(1/z∗)   with   σ²0 = exp{c(0)}
Spectral factorization
The filter Q(z) is causal, stable, and minimum phase; moreover it is monic:
Q(z) = 1 + q(1)z −1 + q(2)z −2 + · · ·
A process that can be factorized as described earlier is a regular process
Properties of a regular process
• A regular process can be realized as the output of a filter H(z) driven by white
noise with variance σ20
• If the process is filtered by the inverse filter 1/H(z), then the output is white
noise with variance σ20 (whitening)
• The process and the white noise contain the same information (compression)
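A sketch of the whitening property for the simplest regular process, x(n) = v(n) + q(1) v(n − 1) (the coefficient 0.5 and σ²0 = 1 are assumptions for illustration): filtering x(n) by 1/Q(z) reproduces the white input.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
v = rng.normal(size=N)                 # white noise, sigma0^2 = 1

# Regular process via Q(z) = 1 + 0.5 z^-1 (minimum phase: zero at -0.5)
x = np.convolve(v, [1.0, 0.5])[:N]

# Whitening with the inverse filter 1/Q(z): w(n) = x(n) - 0.5 w(n-1)
w = np.zeros(N)
for n in range(N):
    w[n] = x[n] - 0.5 * (w[n - 1] if n > 0 else 0.0)

# w recovers v: the stable recursion inverts the convolution exactly here
```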
Spectral factorization
Suppose the power spectrum is a rational function
Px(z) = N(z) / D(z)
then the spectral factorization tells us we can factor this as
Px(z) = σ²0 Q(z) Q∗(1/z∗) = σ²0 · [B(z) B∗(1/z∗)] / [A(z) A∗(1/z∗)]
where
B(z) = 1 + b(1)z −1 + · · · + b(q)z −q
A(z) = 1 + a(1)z −1 + · · · + a(p)z −p
whose roots are all inside the unit circle
Since Px(e^{jω}) is real, we have Px(z) = Px∗(1/z∗); so the poles and zeros occur in
conjugate reciprocal pairs, and we simply relate the zeros inside the unit circle to
the zeros of B(z) and the poles inside the unit circle to the zeros of A(z)
Special types of random processes
Autoregressive moving average processes
Suppose we filter white noise v (n) of variance σ2v with the filter
H(z) = Bq(z) / Ap(z) = ( ∑_{k=0}^{q} bq(k) z^{−k} ) / ( 1 + ∑_{k=1}^{p} ap(k) z^{−k} )
The power spectrum of the output x(n) can then be written as
Px(z) = σ²v · Bq(z) Bq∗(1/z∗) / ( Ap(z) Ap∗(1/z∗) )
Px(e^{jω}) = σ²v · |Bq(e^{jω})|² / |Ap(e^{jω})|²
Such a process is known as an autoregressive moving average process of order
(p, q), or ARMA(p, q)
The power spectrum of an ARMA(p, q) process has 2p poles and 2q zeros with
conjugate reciprocal symmetry
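To make the spectrum formula concrete, here is a sketch for an assumed ARMA(1,1) model x(n) = 0.5 x(n−1) + v(n) + 0.4 v(n−1) with σ²v = 1: evaluating Px(e^{jω}) on a grid and averaging over one period recovers rx(0), which for this model is 1 + 0.9²/(1 − 0.5²) = 2.08.

```python
import numpy as np

# Assumed ARMA(1,1): x(n) = 0.5 x(n-1) + v(n) + 0.4 v(n-1), sigma_v^2 = 1,
# so B_q(z) = 1 + 0.4 z^-1 and A_p(z) = 1 - 0.5 z^-1 (i.e. a_p(1) = -0.5).
sigma2_v = 1.0
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
e = np.exp(-1j * w)              # e^{-jw}
B = 1.0 + 0.4 * e                # B_q(e^{jw})
A = 1.0 - 0.5 * e                # A_p(e^{jw})

Px = sigma2_v * np.abs(B) ** 2 / np.abs(A) ** 2   # real and nonnegative

# (1/2pi) * integral of Px over one period = r_x(0); here 2.08 analytically
rx0 = Px.mean()
```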
Special types of random processes
Autoregressive moving average processes
From the LCCDE between v (n) and x(n):
x(n) + ∑_{l=1}^{p} ap(l) x(n − l) = ∑_{l=0}^{q} bq(l) v(n − l)
we can multiply both sides with x∗(n − k) and take the expectation:
rx(k) + ∑_{l=1}^{p} ap(l) rx(k − l) = ∑_{l=0}^{q} bq(l) E{v(n − l) x∗(n − k)} = ∑_{l=0}^{q} bq(l) rvx(k − l)
The crosscorrelation between v (n) and x(n) can further be expressed as
rvx(k − l) = E{v(k) x∗(l)} = ∑_{m=−∞}^{∞} E{v(k) v∗(l − m)} h∗(m) = σ²v h∗(l − k)
For k ≥ 0, this leads to the Yule-Walker equations
rx(k) + ∑_{l=1}^{p} ap(l) rx(k − l) =
  { σ²v ∑_{l=0}^{q} bq(l) h∗(l − k) = σ²v cq(k) ,   0 ≤ k ≤ q
  { 0 ,                                             k > q
Special types of random processes
Autoregressive moving average processes
The Yule-Walker equations can be stacked for k = 0, 1, . . . , p + q:

  [ rx(0)      rx(−1)        ···  rx(−p)        ]             [ cq(0) ]
  [ rx(1)      rx(0)         ···  rx(−p + 1)    ] [ 1     ]   [ cq(1) ]
  [   ⋮          ⋮           ⋱        ⋮         ] [ ap(1) ]   [   ⋮   ]
  [ rx(q)      rx(q − 1)     ···  rx(q − p)     ] [   ⋮   ] = σ²v [ cq(q) ]
  [ rx(q + 1)  rx(q)         ···  rx(q − p + 1) ] [ ap(p) ]   [ 0     ]
  [   ⋮          ⋮           ⋱        ⋮         ]             [   ⋮   ]
  [ rx(q + p)  rx(q + p − 1) ···  rx(q)         ]             [ 0     ]
Given the filter coefficients a p (k ) and bq (k ), it gives a recursion for the autocorrelation
Given the autocorrelation, we may compute the filter coefficients a p (k ) and bq (k )
Special types of random processes
Autoregressive processes
An ARMA(p, 0) process is an autoregressive process, or AR(p):
Px(z) = σ²v |b(0)|² / ( Ap(z) Ap∗(1/z∗) )
Px(e^{jω}) = σ²v |b(0)|² / |Ap(e^{jω})|²
The Yule-Walker equations are given by
rx(k) + ∑_{l=1}^{p} ap(l) rx(k − l) = σ²v |b(0)|² δ(k) ,   k ≥ 0
Stacking the Yule-Walker equations for k = 0, 1, . . . , p:

  [ rx(0)  rx(−1)     ···  rx(−p)     ] [ 1     ]                [ 1 ]
  [ rx(1)  rx(0)      ···  rx(−p + 1) ] [ ap(1) ] = σ²v |b(0)|²  [ 0 ]
  [   ⋮      ⋮        ⋱        ⋮      ] [   ⋮   ]                [ ⋮ ]
  [ rx(p)  rx(p − 1)  ···  rx(0)      ] [ ap(p) ]                [ 0 ]
Estimating a p (k ) from the Yule-Walker equations is easy (linear)
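A sketch of that linear estimation for an assumed AR(2) model x(n) = 0.75 x(n−1) − 0.5 x(n−2) + v(n), i.e. ap(1) = −0.75 and ap(2) = 0.5: estimate rx(k) from one realization, then solve the equations for k = 1, …, p.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000
a_true = np.array([-0.75, 0.5])   # a_p(1), a_p(2); poles inside the unit circle
p = len(a_true)

# Generate x(n) + a_p(1) x(n-1) + a_p(2) x(n-2) = v(n), unit-variance white v
v = rng.normal(size=N)
x = np.zeros(N)
for n in range(p, N):
    x[n] = v[n] - a_true[0] * x[n - 1] - a_true[1] * x[n - 2]

# Biased autocorrelation estimates r_x(0..p); the process is real, so r(-k) = r(k)
r = np.array([np.dot(x[: N - k], x[k:]) / N for k in range(p + 1)])

# Yule-Walker for k = 1..p:  [r(|i-j|)] a = -[r(1), ..., r(p)]^T
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
a_hat = np.linalg.solve(R, -r[1 : p + 1])   # close to a_true
```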
Special types of random processes
Moving average processes
An ARMA(0, q) process is a moving average process, or MA(q):
Px (z) = σ2v Bq (z)Bq∗ (1/z ∗ )
Px (e j ω ) = σ2v |Bq (e j ω )|2
The Yule-Walker equations are given by
rx(k) = σ²v ∑_{l=0}^{q} bq(l) bq∗(l − k) = σ²v bq(k) ∗ bq∗(−k)
The autocorrelation function is zero outside [−q, q]
Estimating bq (k ) from the Yule-Walker equations is not easy (nonlinear)
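The finite support of the MA autocorrelation is easy to verify numerically. With assumed MA(2) coefficients bq = [1, 0.8, −0.3] and σ²v = 1, the theory above gives rx(0) = 1.73, rx(1) = 0.56, rx(2) = −0.3, and rx(k) = 0 for |k| > 2:

```python
import numpy as np

rng = np.random.default_rng(6)
b = np.array([1.0, 0.8, -0.3])     # assumed MA(2) coefficients, q = 2
N = 500_000
v = rng.normal(size=N)             # white noise, sigma_v^2 = 1
x = np.convolve(v, b)[:N]          # MA(2) process

def r_hat(k):
    """Biased autocorrelation estimate for the real sequence x."""
    return np.dot(x[: N - k], x[k:]) / N

# Theory: r_x(k) = sigma_v^2 * sum_l b(l) b(l - k), zero outside [-q, q]
inside = [r_hat(0), r_hat(1), r_hat(2)]   # close to [1.73, 0.56, -0.30]
outside = [r_hat(3), r_hat(6)]            # close to [0, 0]
```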