Random Variables and Stochastic
Processes – 0903720
Lecture#21
Dr. Ghazi Al Sukkar
Email: [email protected]
Office Hours: Refer to the website
Course Website:
http://www2.ju.edu.jo/sites/academic/ghazi.alsukkar
Chapter 9
Systems with Stochastic Inputs
• Deterministic Systems
• Memoryless Systems
• Linear Time-Invariant (LTI) Systems
• Differentiators
• Upcrossings and Downcrossings of a Stationary Gaussian Process
• Vector Processes and Multiterminal Systems
• Discrete-Time Systems
Systems with Stochastic Inputs
• A deterministic system1 transforms each input waveform X(t, ζi) into an output waveform Y(t, ζi) = T[X(t, ζi)] by operating only on the time variable t. Thus a set of realizations at the input corresponding to a process X(t) generates a new set of realizations Y(t, ζ) at the output associated with a new process Y(t).
X (t ,  i )
X (t )


T []
Y (t ,  i )
(t )
Y

t
t
• Our goal is to study the output process statistics in terms of
the input process statistics and the system function.
1A stochastic system, on the other hand, operates on both the variables t and ζ.
Deterministic Systems
A deterministic system is classified along three axes:
• Memory: memoryless, Y(t) = g[X(t)], versus with memory.
• Linearity: linear, Y(t) = L[X(t)], versus non-linear.
• Time variance: time-invariant versus time-varying.
A system that is both linear and time-invariant is a linear time-invariant (LTI) system. An LTI system is completely characterized by its impulse response h(t), and its output is the convolution of the input with h(t):
Y(t) = X(t) ∗ h(t) = ∫_{−∞}^{∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ.
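In discrete time the convolution integral becomes a Riemann sum, which a short Python sketch can approximate. The exponential impulse response h(t) = e^{−t} and the unit-step input are illustrative choices, not from the lecture:

```python
import math

def convolve(x, h, dt):
    """Riemann-sum approximation of Y(t) = integral of h(tau) X(t - tau) dtau."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k in range(len(h)):
            if n - k >= 0:               # input assumed zero before t = 0
                acc += h[k] * x[n - k] * dt
        y.append(acc)
    return y

dt = 0.01
h = [math.exp(-k * dt) for k in range(500)]   # h(t) = e^{-t}, t in [0, 5)
x = [1.0] * 1000                              # unit-step input realization
y = convolve(x, h, dt)
# for a unit step, y(t) should approach the integral of e^{-tau}, i.e. 1,
# for large t
print(round(y[-1], 2))
```
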
Memoryless Systems:
The output Y(t) in this case depends only on the present value of the input X(t), i.e.,
Y(t) = g{X(t)}
• Strict-sense stationary input → memoryless system → strict-sense stationary output.
• Wide-sense stationary input → memoryless system → output need not be stationary in any sense.
• X(t) stationary Gaussian with R_XX(τ) → memoryless system → Y(t) stationary, but not Gaussian, with R_XY(τ) = η R_XX(τ).
Y(t) = g[X(t)]
Here g(x) is a function of x.
• Therefore, the first-order density f_Y(y; t) of Y(t) can be expressed in terms of the corresponding density f_X(x; t) of X(t). Furthermore:
E{Y(t)} = ∫_{−∞}^{∞} g(x) f_X(x; t) dx
• Since Y(t1) = g[X(t1)] and Y(t2) = g[X(t2)], the second-order density f_Y(y1, y2; t1, t2) of Y(t) can be determined in terms of the corresponding density f_X(x1, x2; t1, t2) of X(t). Furthermore:
E{Y(t1)Y(t2)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x1) g(x2) f_X(x1, x2; t1, t2) dx1 dx2
• In the same way, the nth-order density f_Y(y1, …, yn; t1, …, tn) of Y(t) can be determined in terms of the corresponding density f_X(x1, …, xn; t1, …, tn) of X(t).
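The expectation formula E{Y(t)} = ∫ g(x) f_X(x; t) dx can be sanity-checked by Monte Carlo. A hypothetical square-law device g(x) = x² with X(t) standard Gaussian at a fixed t is used below (both illustrative choices, not from the lecture), for which the integral evaluates to E{X²} = 1:

```python
import random

random.seed(0)
g = lambda x: x * x                 # hypothetical square-law device
# sample X at a fixed t and average g(X) over many realizations
samples = [g(random.gauss(0.0, 1.0)) for _ in range(200_000)]
est = sum(samples) / len(samples)
print(abs(est - 1.0) < 0.05)        # sample mean is close to E{X^2} = 1
```
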
• If the input X(t) to a memoryless system is an SSS process, then to determine the nth-order density of the output Y(t) we solve the system:
g(x1) = y1, g(x2) = y2, …, g(xn) = yn
If this system of equations has a unique solution, then:
f_Y(y1, …, yn; t1, …, tn) = f_X(x1, …, xn; t1, …, tn) / [g′(x1) ⋯ g′(xn)]
Since X(t) is SSS, the numerator is invariant to a time shift and the denominator does not depend on t, so Y(t) is also an SSS process.
Theorem: If X(t) is a zero-mean stationary Gaussian process, and Y(t) = g[X(t)], where g(·) represents a nonlinear memoryless device, then
R_XY(τ) = η R_XX(τ),   η = E{g′(X)}.
Proof:
R_XY(τ) = E{X(t)Y(t + τ)} = E[X(t) g{X(t + τ)}]
        = ∫∫ x1 g(x2) f_{X1X2}(x1, x2) dx1 dx2
where X1 = X(t), X2 = X(t + τ) are jointly Gaussian random variables, and hence
f_{X1X2}(x1, x2) = (1 / 2π|A|^{1/2}) e^{−x* A^{−1} x / 2}
with
X = (X1, X2)^T,   x = (x1, x2)^T,
A = E{X X*} = [ R_XX(0)  R_XX(τ) ; R_XX(τ)  R_XX(0) ] = L L*
where L is an upper triangular factor matrix with positive diagonal entries, i.e.,
L = [ l11  l12 ; 0  l22 ].
Consider the transformation
Z = L^{−1} X = (Z1, Z2)^T,   z = L^{−1} x = (z1, z2)^T
so that
E{Z Z*} = L^{−1} E{X X*} L*^{−1} = L^{−1} A L*^{−1} = I
and hence Z1, Z2 are zero-mean independent Gaussian random variables. Also
x = L z  ⟹  x1 = l11 z1 + l12 z2,   x2 = l22 z2
and hence
x* A^{−1} x = z* L* A^{−1} L z = z* z = z1² + z2².
The Jacobian of the transformation is given by
|J| = |L^{−1}| = |A|^{−1/2}.
Then
R_XY(τ) = ∫∫ (l11 z1 + l12 z2) g(l22 z2) f_{Z1}(z1) f_{Z2}(z2) dz1 dz2
        = l11 ∫ z1 f_{Z1}(z1) dz1 ∫ g(l22 z2) f_{Z2}(z2) dz2 + l12 ∫ z2 g(l22 z2) f_{Z2}(z2) dz2.
The first term vanishes since E{Z1} = 0, leaving
R_XY(τ) = l12 ∫ z2 g(l22 z2) (1/√(2π)) e^{−z2²/2} dz2
        = (l12 / l22²) ∫ u g(u) (1/√(2π)) e^{−u²/(2 l22²)} du,   letting u = l22 z2.
Recognizing f_u(u) = (1/(√(2π) l22)) e^{−u²/(2 l22²)} as a Gaussian density with variance l22², this becomes
R_XY(τ) = l12 l22 ∫ g(u) (u / l22²) f_u(u) du,
and since (u / l22²) f_u(u) = −d f_u(u)/du,
R_XY(τ) = −R_XX(τ) ∫ g(u) f_u′(u) du,
since A = L L* gives l12 l22 = R_XX(τ). Hence, integrating by parts,
R_XY(τ) = R_XX(τ) { −g(u) f_u(u) |_{−∞}^{∞} + ∫ g′(u) f_u(u) du }
        = R_XX(τ) E{g′(X)} = η R_XX(τ),
the desired result, where η = E[g′(X)] and the boundary term is zero. Thus if the input to a memoryless device is stationary Gaussian, the cross-correlation function between the input and the output is proportional to the input autocorrelation function.
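A numerical sketch of the theorem (illustrative, not from the lecture): pass a stationary zero-mean, unit-variance Gaussian sequence through g(x) = x³, for which η = E{g′(X)} = E{3X²} = 3, and compare R_XY with η R_XX at lag 1. A first-order Gauss-Markov recursion with R_XX[k] = a^{|k|} stands in for X(t):

```python
import math
import random

random.seed(1)
a, n = 0.5, 400_000
x = [random.gauss(0.0, 1.0)]
for _ in range(n - 1):
    # AR(1) recursion chosen so the sequence stays unit-variance Gaussian
    x.append(a * x[-1] + math.sqrt(1 - a * a) * random.gauss(0.0, 1.0))
y = [v ** 3 for v in x]                      # memoryless nonlinearity g(x) = x^3

lag = 1
rxx = sum(x[i] * x[i + lag] for i in range(n - lag)) / (n - lag)
rxy = sum(x[i] * y[i + lag] for i in range(n - lag)) / (n - lag)
eta = 3.0                                    # E{g'(X)} for unit-variance X
print(abs(rxy - eta * rxx) < 0.1)            # R_XY(tau) is close to eta * R_XX(tau)
```
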
Linear Systems
L[·] represents a linear system if
L{a1 X(t1) + a2 X(t2)} = a1 L{X(t1)} + a2 L{X(t2)}.        (1)
Let
Y(t) = L{X(t)}
represent the output of a linear system.
Time-Invariant System: L[·] represents a time-invariant system if
Y(t) = L{X(t)}  ⟹  L{X(t − t0)} = Y(t − t0),               (2)
i.e., a shift in the input results in the same shift in the output.
If L[·] satisfies both (1) and (2), then it corresponds to a linear time-invariant (LTI) system.
LTI systems can be uniquely represented in terms of their output to a delta function (the impulse response):
h(t) = L{δ(t)}
[Figure: an impulse δ(t) applied to the LTI system produces the impulse response h(t).]
[Figure: an arbitrary input X(t) applied to the LTI system produces the output Y(t).]
Y(t) = ∫_{−∞}^{∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ
This follows by expressing X(t) as
X(t) = ∫_{−∞}^{∞} X(τ) δ(t − τ) dτ
and applying (1) and (2) to Y(t) = L{X(t)}. Thus
Y(t) = L{X(t)} = L{ ∫_{−∞}^{∞} X(τ) δ(t − τ) dτ }
     = ∫_{−∞}^{∞} L{X(τ) δ(t − τ)} dτ                               (by linearity)
     = ∫_{−∞}^{∞} X(τ) L{δ(t − τ)} dτ                               (by linearity)
     = ∫_{−∞}^{∞} X(τ) h(t − τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ.   (by time-invariance)
Output Statistics:
• The mean of the output process is given by:
η_Y(t) = E{Y(t)} = E{ ∫_{−∞}^{∞} X(τ) h(t − τ) dτ } = ∫_{−∞}^{∞} η_X(τ) h(t − τ) dτ = η_X(t) ∗ h(t).
• Similarly, the cross-correlation function between the input and output processes is given by:
R_XY(t1, t2) = E{X(t1) Y*(t2)}
            = E{X(t1) ∫_{−∞}^{∞} X*(t2 − α) h*(α) dα}
            = ∫_{−∞}^{∞} E{X(t1) X*(t2 − α)} h*(α) dα
            = ∫_{−∞}^{∞} R_XX(t1, t2 − α) h*(α) dα
            = R_XX(t1, t2) ∗ h*(t2) = L2*{R_XX(t1, t2)},
where L2 means that the system operates on the variable t2.
• Finally, the output autocorrelation function is given by:
R_YY(t1, t2) = E{Y(t1) Y*(t2)}
            = E{ ∫_{−∞}^{∞} X(t1 − β) h(β) dβ Y*(t2) }
            = ∫_{−∞}^{∞} E{X(t1 − β) Y*(t2)} h(β) dβ
            = ∫_{−∞}^{∞} R_XY(t1 − β, t2) h(β) dβ
            = R_XY(t1, t2) ∗ h(t1) = L1{R_XY(t1, t2)},
or
R_YY(t1, t2) = R_XX(t1, t2) ∗ h*(t2) ∗ h(t1).
[Block diagram: η_X(t) → h(t) → η_Y(t); and R_XX(t1, t2) → h*(t2) → R_XY(t1, t2) → h(t1) → R_YY(t1, t2).]
• Also, the autocovariance C_YY(t1, t2) of Y(t) is the autocorrelation of the centered process
Y̅(t) = Y(t) − η_Y(t),
and Y̅(t) = h(t) ∗ X̅(t), where X̅(t) = X(t) − η_X(t):
R_X̅X̅(t1, t2) = E{[X(t1) − η_X(t1)][X(t2) − η_X(t2)]*} = C_XX(t1, t2)
R_Y̅Y̅(t1, t2) = E{[Y(t1) − η_Y(t1)][Y(t2) − η_Y(t2)]*} = C_YY(t1, t2)
R_X̅Y̅(t1, t2) = E{[X(t1) − η_X(t1)][Y(t2) − η_Y(t2)]*} = C_XY(t1, t2)
Then:
C_XY(t1, t2) = C_XX(t1, t2) ∗ h*(t2)
C_YY(t1, t2) = C_XY(t1, t2) ∗ h(t1)
In particular, if X(t) is wide-sense stationary, then η_X(t) = η_X (a constant), so that:
η_Y(t) = η_X ∫_{−∞}^{∞} h(τ) dτ = η_X c,   a constant.
Also, R_XX(t1, t2) = R_XX(t1 − t2), so:
R_XY(t1, t2) = ∫_{−∞}^{∞} R_XX(t1 − t2 + α) h*(α) dα = R_XX(τ) ∗ h*(−τ) = R_XY(τ),   τ = t1 − t2.
Thus X(t) and Y(t) are jointly WSS. Further, the output autocorrelation simplifies to:
R_YY(t1, t2) = ∫_{−∞}^{∞} R_XY(t1 − β − t2) h(β) dβ = R_XY(τ) ∗ h(τ) = R_YY(τ),   τ = t1 − t2.
We obtain
R_YY(τ) = R_XX(τ) ∗ h*(−τ) ∗ h(τ).
Thus the output process is also wide-sense stationary. This gives rise to the following representation:
• X(t) wide-sense stationary → LTI system h(t) → Y(t) wide-sense stationary.
• X(t) strict-sense stationary → LTI system h(t) → Y(t) strict-sense stationary.
• X(t) Gaussian process (also stationary) → linear system h(t) → Y(t) Gaussian process (also stationary).
White Noise Process:
• Recall that W(t) is said to be a white noise process if
R_WW(t1, t2) = q(t1) δ(t1 − t2),
i.e., E[W(t1) W*(t2)] = 0 unless t1 = t2.
• W(t) is said to be wide-sense stationary (WSS) white noise if E[W(t)] = constant, and
R_WW(t1, t2) = q δ(t1 − t2) = q δ(τ).
• If W(t) is also a Gaussian process (white Gaussian process), then all of its samples are independent random variables.
• White noise W(t) → LTI system h(t) → colored noise N(t) = h(t) ∗ W(t).
• For WSS white noise input W(t), we have:
E[N(t)] = η_W ∫_{−∞}^{∞} h(τ) dτ,   a constant,
and
R_NN(τ) = q δ(τ) ∗ h*(−τ) ∗ h(τ) = q h*(−τ) ∗ h(τ) = q ρ(τ),
where
ρ(τ) = h(τ) ∗ h*(−τ) = ∫_{−∞}^{∞} h(α) h*(α − τ) dα.
• Thus when the input of an LTI system is a white noise process, the output represents a (colored) noise process.
Note: White noise need not be Gaussian. "White" and "Gaussian" are two different concepts!
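A discrete sketch of coloring white noise (illustrative taps, not from the lecture): filtering unit-intensity white noise W[n] through a short FIR h gives colored noise whose zero-lag autocorrelation should match q ρ(0) = q Σ_k |h[k]|²:

```python
import random

random.seed(2)
h = [1.0, 0.5, 0.25]                 # illustrative FIR impulse response
q = 1.0                              # white-noise intensity
n = 200_000
w = [random.gauss(0.0, q ** 0.5) for _ in range(n)]
# colored noise N[i] = sum_k h[k] W[i - k]
colored = [sum(h[k] * w[i - k] for k in range(len(h))) for i in range(len(h), n)]
r0_est = sum(v * v for v in colored) / len(colored)
r0_theory = q * sum(c * c for c in h)    # q * rho(0) = 1.3125 here
print(abs(r0_est - r0_theory) < 0.05)
```
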
Case Study: Differentiators
• A differentiator is a linear system whose output is the derivative of the input:
Y(t) = L{X(t)} = dX(t)/dt = X′(t)
• The mean of the output:
η_X′(t) = L{η_X(t)} = dη_X(t)/dt
• The cross-correlation between X(t) and X′(t) is:
R_XX′(t1, t2) = L2{R_XX(t1, t2)} = ∂R_XX(t1, t2)/∂t2
• The output autocorrelation:
R_X′X′(t1, t2) = L1{R_XX′(t1, t2)} = ∂R_XX′(t1, t2)/∂t1
• Combining both:
R_X′X′(t1, t2) = ∂²R_XX(t1, t2)/∂t1∂t2
• If X(t) is WSS, then η_X(t) = η_X, which is constant, hence:
η_X′(t) = 0
Also, since R_XX(t1, t2) = R_XX(τ), τ = t1 − t2, then
∂R_XX(t1 − t2)/∂t2 = −dR_XX(τ)/dτ
∂²R_XX(t1 − t2)/∂t1∂t2 = −d²R_XX(τ)/dτ²
Hence:
R_XX′(τ) = −R′_XX(τ)
R_X′X′(τ) = −R″_XX(τ)
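The relation R_X′X′(0) = −R″_XX(0) can be checked numerically with a central finite difference. The autocorrelation R_XX(τ) = e^{−τ²} is an illustrative choice (not from the lecture), for which −R″_XX(0) = 2:

```python
import math

R = lambda t: math.exp(-t * t)       # illustrative autocorrelation R_XX(tau)
d = 1e-4
# central second difference approximates R''_XX(0)
second = (R(d) - 2.0 * R(0.0) + R(-d)) / (d * d)
print(round(-second, 3))             # -R''_XX(0) = 2 for this R_XX
```
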
Upcrossings and Downcrossings of a Stationary Gaussian Process:
Consider a zero-mean stationary Gaussian process X(t) with autocorrelation function R_XX(τ). An upcrossing over the mean value occurs whenever the realization X(t) passes through zero with positive slope. Let ρΔt represent the probability of such an upcrossing in the interval (t, t + Δt). We wish to determine ρ.
[Figure: a realization of X(t) with its upcrossings and a downcrossing of the zero level marked.]
Since X(t) is a stationary Gaussian process, its derivative process X′(t) is also zero-mean stationary Gaussian with autocorrelation function R_X′X′(τ) = −R″_XX(τ). Further, X(t) and X′(t) are jointly Gaussian stationary processes, and since
R_XX′(τ) = −R′_XX(τ)
we have
R_XX′(−τ) = −dR_XX(−τ)/d(−τ) = dR_XX(τ)/dτ = −R_XX′(τ)   (an odd function)
which for τ = 0 gives
R_XX′(0) = 0  ⟹  E{X(t) X′(t)} = 0,
i.e., the jointly Gaussian zero-mean random variables
X1 = X(t)  and  X2 = X′(t)
are uncorrelated and hence independent, with variances
σ1² = R_XX(0)  and  σ2² = R_X′X′(0) = −R″_XX(0) > 0,
respectively. Thus
f_{X1X2}(x1, x2) = f_{X1}(x1) f_{X2}(x2) = (1 / 2π σ1 σ2) e^{−(x1²/2σ1² + x2²/2σ2²)}.
To determine ρ, the upcrossing probability rate,
we argue as follows: In an interval (t, t + Δt), the realization moves from X(t) = X1 to
X(t + Δt) ≈ X(t) + X′(t)Δt = X1 + X2 Δt,
and hence the realization upcrosses the zero level somewhere in that interval if
X1 < 0,  X2 > 0,  and  X(t + Δt) = X1 + X2 Δt > 0,  i.e.,  X1 > −X2 Δt.
[Figure: X(t) below zero at t rises to X(t + Δt) above zero within (t, t + Δt).]
Hence the probability of an upcrossing in (t, t + Δt) is given by
ρΔt = ∫_{0}^{∞} ∫_{−x2Δt}^{0} f_{X1X2}(x1, x2) dx1 dx2
    = ∫_{0}^{∞} f_{X2}(x2) dx2 ∫_{−x2Δt}^{0} f_{X1}(x1) dx1.
Differentiating both sides with respect to Δt, we get
ρ = ∫_{0}^{∞} f_{X2}(x2) x2 f_{X1}(−x2 Δt) dx2,
and letting Δt → 0, this equation reduces to
ρ = ∫_{0}^{∞} x2 f_{X2}(x2) f_{X1}(0) dx2 = (1 / √(2π R_XX(0))) ∫_{0}^{∞} x2 f_{X2}(x2) dx2
  = (1 / √(2π R_XX(0))) · (1/2) √(2σ2²/π)
  = (1/2π) √(−R″_XX(0) / R_XX(0))
[where we have made use of (5-78) in the text]. There is an equal probability for downcrossings, and hence the total probability for crossing the zero line in an interval (t, t + Δt) equals ψ0 Δt, where
ψ0 = (1/π) √(−R″_XX(0) / R_XX(0)) > 0.
It follows that in a long interval T, there will be approximately ψ0 T crossings of the mean value. If −R″_XX(0) is large, then the autocorrelation function R_XX(τ) decays more rapidly as τ moves away from zero, implying large random variation around the origin (mean value) for X(t), and the likelihood of zero crossings increases with increase in −R″_XX(0).
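The crossing-rate formula (Rice's formula) can be checked by simulation. A sum of many random-phase sinusoids approximates a stationary Gaussian process; with equal amplitudes, −R″_XX(0)/R_XX(0) is just the mean of the squared frequencies. The number of components, frequency range, duration, and tolerance below are all illustrative assumptions:

```python
import math
import random

random.seed(3)
N = 16
freqs = [random.uniform(0.5, 3.0) for _ in range(N)]         # rad/s components
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(N)]

def X(t):
    # sum of random-phase sinusoids: approximately Gaussian for large N
    return sum(math.cos(w * t + p) for w, p in zip(freqs, phases))

T, dt = 600.0, 0.002
crossings = 0
prev = X(0.0)
for i in range(1, int(T / dt) + 1):
    cur = X(i * dt)
    if prev * cur < 0.0:             # sign change = zero crossing
        crossings += 1
    prev = cur

rate_sim = crossings / T
# (1/pi) sqrt(-R''(0)/R(0)) reduces to sqrt(mean of w^2) / pi here
rate_rice = math.sqrt(sum(w * w for w in freqs) / N) / math.pi
print(abs(rate_sim - rate_rice) < 0.2 * rate_rice)
```
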
Vector Processes and Multiterminal Systems
• We now consider a MIMO system, which is a system with n input random processes {X_i(t)}, i = 1, …, n, and r output random processes {Y_j(t)}, j = 1, …, r.
[Block diagram: inputs X1(t), X2(t), …, Xn(t) enter the MIMO system; outputs Y1(t), Y2(t), …, Yr(t) leave it.]
• Let X(t) = [X_k(t)] = [X1(t), X2(t), …, Xn(t)]^T be the input column random process vector, whose n elements are random processes.
• Let Y(t) = [Y_l(t)] = [Y1(t), Y2(t), …, Yr(t)]^T be the output column random process vector, whose r elements are random processes.
• The mean of X(t) is:
E[X(t)] = η_X(t) = [η_i(t)],  where η_i(t) = E[X_i(t)].
• The autocorrelation R_XX(t1, t2) of the vector process X(t) is an n × n matrix:
R_XX(t1, t2) = E[X(t1) X†(t2)]
with elements E[X_k(t1) X_l*(t2)], k, l = 1, …, n.
• The cross-correlation between X(t) and Y(t) is an n × r matrix:
R_XY(t1, t2) = E[X(t1) Y†(t2)]
with elements E[X_k(t1) Y_l*(t2)], where k = 1, …, n and l = 1, …, r.
• If the MIMO system is an LTI system, then it is specified in terms of its impulse response matrix
h(t) = [h_lk(t)],  k = 1, …, n and l = 1, …, r,
which is an r × n matrix whose component h_lk(t) is the response of the l-th output terminal when the k-th input terminal has input δ(t) and all other input terminals have input 0.
• The response of the l-th output terminal Y_l(t) to an arbitrary input X(t) = [X_k(t)] is:
Y_l(t) = ∫_{−∞}^{∞} h_l1(α) X1(t − α) dα + ⋯ + ∫_{−∞}^{∞} h_ln(α) Xn(t − α) dα.
Hence
Y(t) = ∫_{−∞}^{∞} h(α) X(t − α) dα.
• By setting t = t2, taking the Hermitian of Y(t), and premultiplying by X(t1):
X(t1) Y†(t2) = ∫_{−∞}^{∞} X(t1) X†(t2 − α) h†(α) dα.
Then, by taking the mean of both sides:
R_XY(t1, t2) = ∫_{−∞}^{∞} R_XX(t1, t2 − α) h†(α) dα.
• By setting t = t1 and postmultiplying Y(t) by Y†(t2):
Y(t1) Y†(t2) = ∫_{−∞}^{∞} h(α) X(t1 − α) Y†(t2) dα.
Then, by taking the mean of both sides:
R_YY(t1, t2) = ∫_{−∞}^{∞} h(α) R_XY(t1 − α, t2) dα.
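A minimal discrete sketch of the MIMO relation Y(t) = ∫ h(α) X(t − α) dα: an r × n impulse-response matrix acts tap by tap on an n-vector input sequence. The 2 × 2 shape and tap values are illustrative; feeding an impulse into terminal 0 should reproduce column 0 of each h[k], exactly as the definition of h_lk(t) requires:

```python
def mimo_output(h_taps, x_seq):
    """h_taps: list of r x n matrices h[k]; x_seq: list of n-vectors X[m].
    Returns Y[m] = sum_k h[k] . X[m - k]  (discrete form of the convolution)."""
    r, n = len(h_taps[0]), len(h_taps[0][0])
    y = []
    for m in range(len(x_seq)):
        out = [0.0] * r
        for k in range(len(h_taps)):
            if m - k >= 0:
                for i in range(r):
                    for j in range(n):
                        out[i] += h_taps[k][i][j] * x_seq[m - k][j]
        y.append(out)
    return y

h_taps = [[[1.0, 2.0], [3.0, 4.0]],      # h[0]  (r = n = 2, two taps)
          [[0.5, 0.0], [0.0, 0.5]]]      # h[1]
delta0 = [[1.0, 0.0], [0.0, 0.0], [0.0, 0.0]]   # impulse on input terminal 0
y = mimo_output(h_taps, delta0)
print(y[0], y[1])   # columns 0 of h[0] and h[1]: [1.0, 3.0] [0.5, 0.0]
```
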
Discrete-Time Systems
• A discrete-time system is a system whose input and output are discrete-time signals.
• An LTI discrete-time system is specified through its impulse response h[n], which is the response of the system when the input is a discrete impulse
δ[n] = 1 for n = 0,  0 for n ≠ 0.
• The output Y[n] of a discrete-time system when the input is a stochastic process X[n] is given by:
Y[n] = X[n] ∗ h[n] = Σ_{k=−∞}^{∞} X[n − k] h[k] = Σ_{k=−∞}^{∞} h[n − k] X[k]
[Block diagram: X[n] → h[n] → Y[n].]
Output Statistics
• The mean of the output process is given by:
η_Y[n] = E{Y[n]} = E{ Σ_{k=−∞}^{∞} h[n − k] X[k] } = Σ_{k=−∞}^{∞} h[n − k] E{X[k]} = η_X[n] ∗ h[n]
• The cross-correlation function between the input and output processes is given by:
R_XY[n1, n2] = E{X[n1] Y*[n2]} = E{X[n1] Σ_{k=−∞}^{∞} X*[n2 − k] h*[k]}
             = Σ_{k=−∞}^{∞} E{X[n1] X*[n2 − k]} h*[k] = Σ_{k=−∞}^{∞} R_XX[n1, n2 − k] h*[k]
             = R_XX[n1, n2] ∗ h*[n2]
• The output autocorrelation function is given by:
R_YY[n1, n2] = E{Y[n1] Y*[n2]} = E{ Σ_{m=−∞}^{∞} X[n1 − m] h[m] Y*[n2] }
             = Σ_{m=−∞}^{∞} E{X[n1 − m] Y*[n2]} h[m] = Σ_{m=−∞}^{∞} R_XY[n1 − m, n2] h[m]
             = R_XY[n1, n2] ∗ h[n1]
• Then, by combining both relations:
R_YY[n1, n2] = R_XX[n1, n2] ∗ h*[n2] ∗ h[n1]
Discrete-Time WSS Input
• If the input process X[n] is WSS, then Y[n] is also WSS with:
• Mean:
η_Y = η_X Σ_{k=−∞}^{∞} h[k] = η_X C,  a constant.
• The cross-correlation between X[n] and Y[n] is:
R_XY[m] = R_XX[m] ∗ h*[−m]
• The output autocorrelation is:
R_YY[m] = R_XY[m] ∗ h[m]
or:
R_YY[m] = R_XX[m] ∗ h*[−m] ∗ h[m] = R_XX[m] ∗ ρ[m]
where ρ[m] = Σ_{k=−∞}^{∞} h[m + k] h*[k].
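The discrete cross-correlation relation can be checked with a white WSS input, for which R_XX[m] = δ[m] and therefore R_XY[m] = h*[−m]; equivalently, E{X[t] Y[t + j]} = h[j] for a real filter. The 3-tap h below is an illustrative choice:

```python
import random

random.seed(4)
h = [1.0, -0.5, 0.25]                # illustrative real FIR taps
n = 200_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]   # unit-variance white input
off = len(h)
# y[t - off] is Y at time t, valid for t = off .. n-1
y = [sum(h[k] * x[t - k] for k in range(len(h))) for t in range(off, n)]

# estimate E{X[t] Y[t + j]} for j = 0, 1, 2; each should be close to h[j]
r_xy = []
for j in range(len(h)):
    num = n - off - j
    r_xy.append(sum(x[t] * y[t + j - off] for t in range(off, off + num)) / num)
print(r_xy)   # entries close to the corresponding taps of h
```
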