Lecture on Communication Theory
Chapter 4. Random Processes
4.1 Introduction
1. Deterministic signals: the class of signals that may be
modeled as completely specified functions of time.
2. Random signals: the class of signals whose precise values cannot be
predicted in advance. ex) thermal noise
3. Random variable: A function whose domain is a sample
space and whose range is some set of real numbers.
– obtained by observing a random process at a fixed
instant of time.
4. Random process: ensemble (family) of sample
functions, ensemble of random variables.
4.2 Probability Theory
1. Requirements for a random experiment
1) Repeatable under identical conditions
2) Outcome is unpredictable
3) For a large number of trials of the experiment, the outcomes
exhibit statistical regularity, i.e., a definite average pattern of
outcomes is observed for a large number of trials.
2. Relative-Frequency Approach
1) Relative frequency: 0 ≤ N_n(A)/n ≤ 1, where N_n(A) is the number of times event A occurs in n trials
2) Statistical regularity ⇒ probability of event A:
P(A) = lim_{n→∞} [N_n(A)/n]
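As an illustration of statistical regularity, the following sketch (assuming NumPy is available; the fair-die event is a hypothetical example, not from the lecture) estimates P(A) by the relative frequency N_n(A)/n for increasing n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Event A: a fair die shows an even number; true P(A) = 1/2.
for n in (100, 10_000, 1_000_000):
    rolls = rng.integers(1, 7, size=n)        # n independent die rolls
    rel_freq = np.mean(rolls % 2 == 0)        # N_n(A) / n
    print(f"n = {n:>9d}: relative frequency = {rel_freq:.4f}")
# The relative frequency settles near 0.5 as n grows (statistical regularity).
```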


3. Axioms of Probability.
1) Terminology
a) Sample points sk: kth outcome of experiment
b) Sample space S: totality of sample points
c) Sure event: entire sample space S
d) ∅: null or impossible event
e) Elementary event: a single sample point
2) Definition of probability
a) A sample space S of elementary events
b) A class of events that are subsets of S.
c) A probability measure P(·) assigned to each event A in the class,
which has the following properties (axioms of probability):
(i) P(S) = 1
(ii) 0 ≤ P(A) ≤ 1
(iii) If A ∪ B is the union of two mutually exclusive events in the class, then
P(A ∪ B) = P(A) + P(B)
3) Property 1. P(Ā) = 1 - P(A), where Ā is the complement of A
4) Property 2. If M mutually exclusive events A1, A2, ..., AM have the exhaustive property
A1 ∪ A2 ∪ ... ∪ AM = S, then P(A1) + P(A2) + ... + P(AM) = 1
5) Property 3. P(A ∪ B) = P(A) + P(B) - P(AB)
4. Conditional Probability
1) Conditional probability of B given A
(given A means that event A has occurred)
P(B | A) = P(AB) / P(A), where P(AB) = joint probability of A and B
P(AB) = P(B | A)P(A) = P(A | B)P(B)
⇒ P(B | A) = P(A | B)P(B) / P(A) ; Bayes' rule
2) Statistically independent: P(AB) = P(A)P(B)
ex1) BSC (Binary Symmetric Channel): a discrete memoryless channel with inputs A0 = [0], A1 = [1] and outputs B0 = [0], B1 = [1]. Each transmitted symbol is received correctly with probability 1 - p and flipped with probability p (transition diagram omitted).
A priori prob.
P(A0) = p0, P(A1) = p1, where p0 + p1 = 1
Conditional prob. or likelihood
P(B1 | A0) = P(B0 | A1) = p ; probability of receiving [1] given [0] was transmitted
P(B0 | A0) = P(B1 | A1) = 1 - p ; probability of receiving [0] given [0] was transmitted
Output prob.
P(B0) = (1 - p)p0 + p·p1
P(B1) = p·p0 + (1 - p)p1
A posteriori prob.
P(A0 | B0) = P(B0 | A0)P(A0) / P(B0) = (1 - p)p0 / [(1 - p)p0 + p·p1] ; probability that [0] was transmitted given [0] was received
P(A1 | B1) = P(B1 | A1)P(A1) / P(B1) = (1 - p)p1 / [p·p0 + (1 - p)p1] ; probability that [1] was transmitted given [1] was received
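A small numerical check of the BSC posterior formulas above, written as a sketch in Python (the values p = 0.1 and p0 = 0.6 are arbitrary illustrative choices, not from the lecture):

```python
# Bayes' rule for the binary symmetric channel of ex1).
p = 0.1            # crossover (error) probability, illustrative value
p0, p1 = 0.6, 0.4  # priors P(A0), P(A1); must satisfy p0 + p1 = 1

# Output probabilities
P_B0 = (1 - p) * p0 + p * p1
P_B1 = p * p0 + (1 - p) * p1

# Posterior probabilities via Bayes' rule
P_A0_given_B0 = (1 - p) * p0 / P_B0
P_A1_given_B1 = (1 - p) * p1 / P_B1

print(f"P(B0) = {P_B0:.3f}, P(B1) = {P_B1:.3f}")
print(f"P(A0|B0) = {P_A0_given_B0:.3f}, P(A1|B1) = {P_A1_given_B1:.3f}")
```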
4.3 Random variables
1. Overview
1) Random variable: A function whose domain is a sample
space and whose range is some set of real numbers
2) Discrete r. v.: X(k), the k-th sample. ex) a die, with range {1, ..., 6}
Continuous r. v.: X. ex) bus arrival time between 8:00 and 8:10
3) Cumulative distribution function (cdf) or distribution fct.
F_X(x) = P(X ≤ x)
a) 0 ≤ F_X(x) ≤ 1
b) if x1 < x2, F_X(x1) ≤ F_X(x2): monotone-nondecreasing fct.
4) pdf (probability density fct.)
f_X(x) = dF_X(x)/dx
F_X(x) = ∫_{-∞}^{x} f_X(ξ) dξ
∫_{-∞}^{∞} f_X(x) dx = 1
P(x1 < X ≤ x2) = ∫_{x1}^{x2} f_X(x) dx
pdf: nonnegative fct., total area = 1
ex2) (figure omitted)
2. Several random variables (2 random variables)
1) Joint distribution fct. F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y)
2) Joint pdf
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y
3) Total area
∫_{-∞}^{∞} ∫_{-∞}^{∞} f_{X,Y}(ξ, η) dξ dη = 1
F_X(x) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f_{X,Y}(ξ, η) dη dξ
f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, η) dη ; marginal density
4) Conditional prob. density fct. (given that X = fixed x)
f_Y(y | x) = f_{X,Y}(x, y) / f_X(x), if f_X(x) ≠ 0
∫_{-∞}^{∞} f_Y(y | x) dy = 1
If X, Y are statistically independent:
f_Y(y | x) = f_Y(y) ⇔ f_{X,Y}(x, y) = f_X(x) f_Y(y)
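The marginal and conditional density relations can be checked numerically. The sketch below (NumPy assumed; the particular joint pdf, a product of two Gaussians, is a hypothetical example) integrates the joint pdf over y to recover f_X(x) and verifies that f_Y(y|x) integrates to 1.

```python
import numpy as np

# Hypothetical joint pdf: X ~ N(0,1) and Y ~ N(0,4) taken independent,
# so f_XY(x, y) = f_X(x) f_Y(y).
def f_X(x):  return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
def f_Y(y):  return np.exp(-y**2 / 8) / np.sqrt(8 * np.pi)
def f_XY(x, y): return f_X(x) * f_Y(y)

y = np.linspace(-20, 20, 4001)
dy = y[1] - y[0]
x0 = 0.7                                     # fixed observation X = x0

# Marginal density: f_X(x0) = integral of f_XY(x0, y) over y
marginal = np.sum(f_XY(x0, y)) * dy
print(f"numerical marginal {marginal:.6f}  vs  f_X(x0) {f_X(x0):.6f}")

# Conditional density f_Y(y | x0) = f_XY(x0, y) / f_X(x0) integrates to 1
cond = f_XY(x0, y) / f_X(x0)
print(f"integral of f_Y(y|x0) dy = {np.sum(cond) * dy:.6f}")
```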
4.4 Statistical Average
1. Mean or expected value
1) Continuous

μ_X = E[X] = ∫_{-∞}^{∞} x f_X(x) dx
ex) Uniform pdf f_X(x) = 1/10 for 0 ≤ x ≤ 10:
E[X] = ∫_0^{10} x·(1/10) dx = x²/20 |_0^{10} = 5
2) Discrete
E[X] = Σ_k x_k lim_{n→∞} [N_n(k)/n] = Σ_k x_k p(k)
ex) A fair die: E[X] = (1 + 2 + 3 + 4 + 5 + 6)/6 = 7/2
2. Function of r. v.
Y = g(X), where X, Y : r. v.
E[Y] = E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx
ex) Y = g(X) = cos(X)
where f_X(x) = 1/(2π) for -π ≤ x ≤ π, and 0 otherwise
E[Y] = ∫_{-π}^{π} cos(x)·(1/(2π)) dx = (1/(2π)) sin(x) |_{-π}^{π} = 0
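A quick Monte Carlo check of the example above, as a sketch assuming NumPy: draw X uniformly over (-π, π) and average cos(X); the sample mean should be close to the analytical value E[Y] = 0.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-np.pi, np.pi, size=1_000_000)  # X ~ uniform on (-π, π)
y = np.cos(x)                                   # Y = g(X) = cos(X)
print(f"Monte Carlo E[Y] ≈ {y.mean():+.4f} (analytical value: 0)")
```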
3. Moments
1) n-th moments
E[X^n] = ∫_{-∞}^{∞} x^n f_X(x) dx
n = 1 ⇒ E[X] = μ_X : mean
n = 2 ⇒ E[X²] : mean square value of X
2) Central moments
E[(X - μ_X)^n] = ∫_{-∞}^{∞} (x - μ_X)^n f_X(x) dx
n = 2 ⇒ σ_X² = var[X] = E[(X - μ_X)²]
where σ_X is the standard deviation
Meaning of σ_X²: randomness, effective width of f_X(x).
This can be seen from the Chebyshev inequality:
P(|X - μ_X| ≥ ε) ≤ σ_X²/ε² ; Chebyshev inequality
σ_X² = E[(X - μ_X)²] = E[X²] - 2μ_X E[X] + μ_X² = E[X²] - μ_X²
If μ_X = 0, σ_X² = E[X²]
σ_X² : variance, E[X²] : mean square value
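The Chebyshev bound can be checked empirically; the sketch below (NumPy assumed, with an arbitrary exponential distribution as the test case) compares the observed tail probability P(|X - μ_X| ≥ ε) with the bound σ_X²/ε².

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1_000_000)  # X with μ_X = 2, σ_X² = 4
mu, var = x.mean(), x.var()

for eps in (2.0, 4.0, 6.0):
    tail = np.mean(np.abs(x - mu) >= eps)       # empirical P(|X - μ_X| ≥ ε)
    bound = var / eps**2                        # Chebyshev bound σ_X²/ε²
    print(f"ε = {eps}: P = {tail:.4f} ≤ bound {bound:.4f}")
```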
4. Characteristic function
Characteristic function φ_X(v) ↔ pdf f_X(x):
φ_X(v) = E[exp(jvX)] = ∫_{-∞}^{∞} f_X(x) exp(jvx) dx
f_X(x) = (1/2π) ∫_{-∞}^{∞} φ_X(v) exp(-jvx) dv
ex4) Gaussian Random Variable
f_X(x) = [1/(√(2π) σ_X)] exp[-(x - μ_X)²/(2σ_X²)], -∞ < x < ∞
φ_X(v) = exp(jvμ_X - v²σ_X²/2)
If μ_X = 0:
f_X(x) = [1/(√(2π) σ_X)] exp[-x²/(2σ_X²)]
φ_X(v) = exp(-v²σ_X²/2)
Central moments:
E[(X - μ_X)^n] = 1·3·5···(n - 1)·σ_X^n for n even
= 0 for n odd
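The even/odd central-moment formula for a Gaussian r. v. can be verified by sampling; a sketch assuming NumPy (μ_X = 1 and σ_X = 2 are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 1.0, 2.0
x = rng.normal(mu, sigma, size=2_000_000)

for n in (2, 3, 4):
    sample = np.mean((x - mu)**n)
    # 1·3·5···(n-1)·σ^n for even n, 0 for odd n
    theory = {2: sigma**2, 3: 0.0, 4: 3 * sigma**4}[n]
    print(f"n = {n}: sample {sample:8.3f}  theory {theory:8.3f}")
```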
5. Joint moments
E[X^i Y^j] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x^i y^j f_{X,Y}(x, y) dx dy
Correlation
E[XY] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} xy f_{X,Y}(x, y) dx dy
Covariance
cov[XY] = E[(X - E[X])(Y - E[Y])] = E[XY] - μ_X μ_Y
Correlation coefficient
ρ = cov[XY] / (σ_X σ_Y)
– X and Y are uncorrelated ⇔ cov[XY] = 0
– X and Y are orthogonal ⇔ E[XY] = 0
If E[X] = 0 or E[Y] = 0: uncorrelated ⇔ orthogonal
X, Y statistically independent ⇒ uncorrelated (O); uncorrelated ⇏ statistically independent (X)
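A classical illustration that uncorrelated does not imply statistically independent, written as a sketch (NumPy assumed): with X ~ N(0,1) and Y = X², the covariance is approximately zero even though Y is completely determined by X.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, size=1_000_000)
y = x**2                                    # fully dependent on X

cov = np.mean(x * y) - x.mean() * y.mean()  # cov[XY] = E[XY] - μ_X μ_Y
rho = cov / (x.std() * y.std())             # correlation coefficient
print(f"cov[XY] ≈ {cov:+.4f}, ρ ≈ {rho:+.4f} (≈ 0: uncorrelated, yet dependent)")
```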
4.5 Transformations of Random variables: Y=g(X)
1. Monotone transformations: one-to-one
(figure: monotone mapping y = g(x) between X and Y omitted)
f_Y(y) = f_X(x)/|dy/dx| = f_X(x)/|dg/dx|, evaluated at x = g⁻¹(y)
2. Many-to-one transformations
f_Y(y) = Σ_k f_X(x_k)/|dg/dx|, evaluated at x = x_k
where the x_k are the solutions of g(x) = y
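The monotone-transformation formula can be checked by simulation. In the sketch below (NumPy assumed, hypothetical example), X is uniform on (0, 1) and Y = g(X) = -ln X; the formula f_Y(y) = f_X(x)/|dg/dx| at x = g⁻¹(y) = e^{-y} gives f_Y(y) = e^{-y}, which the histogram of simulated Y should follow.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = -np.log(x)                          # monotone transformation Y = g(X) = -ln X

# Predicted density: f_Y(y) = f_X(x)/|dg/dx| at x = e^{-y}, which equals e^{-y}
edges = np.linspace(0.0, 6.0, 61)
hist, _ = np.histogram(y, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
for c, h in list(zip(centers, hist))[::12]:
    print(f"y = {c:4.2f}: histogram {h:.3f}  vs  e^(-y) {np.exp(-c):.3f}")
```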
4.6 Random processes or stochastic processes
– r. v. {X}: the outcome of a random experiment is mapped into a number
– r. p. {X(t)} or {X(t,s)}: the outcome of a random experiment is mapped into a waveform that is a fct. of time
⇒ indexed ensemble (family) of r. v.
– Sample function x_j(t) = X(t, s_j); sample space {x1(t), x2(t), ..., xn(t)}
– {x1(t_k), x2(t_k), ..., xn(t_k)} = {X(t_k, s1), X(t_k, s2), ..., X(t_k, sn)}
constitutes a random variable
– Examples of r. p.: X(t) = A cos(2πf_c t + Θ), random binary wave, Gaussian noise
4.7 Stationarity
1. A r. p. X(t) is stationary in the strict sense
– if F_{X(t1+τ),...,X(tk+τ)}(x1, ..., xk) = F_{X(t1),...,X(tk)}(x1, ..., xk)
for all time shifts τ, all k, and all possible observation times t1, ..., tk.
< observation >
1) k = 1: F_{X(t)}(x) = F_{X(t+τ)}(x) = F_X(x) for all t and τ.
The 1st-order distribution fct. of a stationary r. p. is independent of time.
2) k = 2 and τ = -t1: F_{X(t1),X(t2)}(x1, x2) = F_{X(0),X(t2-t1)}(x1, x2) for all t1 and t2.
The 2nd-order distribution fct. of a stationary r. p. depends only on the difference between the observation times.
2. Two r. p. X(t) and Y(t) are jointly stationary if the joint
distribution functions of the r. v. X(t1), ..., X(tk) and Y(t1'),
..., Y(tj') are invariant with respect to the location of the
origin t = 0, for all k and j and all choices of observation
times t1, ..., tk and t1', ..., tj'.
ex6)
probability of the joint event
A = {a_i < X(t_i) ≤ b_i}, i = 1, 2, 3
P(A) = F_{X(t1),X(t2),X(t3)}(b1, b2, b3) - F_{X(t1),X(t2),X(t3)}(a1, a2, a3)
4.8 Mean, Correlation, and Covariance functions
1. Mean of r. p.: μ_X(t) = E[X(t)] = ∫_{-∞}^{∞} x f_{X(t)}(x) dx, where X(t) is a r. v. for fixed t
– For a stationary r. p.: μ_X(t) = μ_X = constant, for all t
2. Autocorrelation fct. of r. p. X(t)
R_X(t1, t2) = E[X(t1)X(t2)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x1 x2 f_{X(t1),X(t2)}(x1, x2) dx1 dx2
– For a stationary r. p.: R_X(t1, t2) = R_X(t2 - t1)
3. Autocovariance fct. of stationary r. p. X(t)
C_X(t1, t2) = E[(X(t1) - μ_X)(X(t2) - μ_X)] = R_X(t2 - t1) - μ_X²
4. Wide-sense stationary
μ_X(t) = μ_X = constant, for all t
R_X(t1, t2) = R_X(t2 - t1) for all t1 and t2
strict-sense stationary ⇒ wide-sense stationary (O); wide-sense stationary ⇏ strict-sense stationary (X)
5. Properties of the Autocorrelation Function
 Autocorrelation fct. of stationary process X(t)
R_X(τ) = E[X(t+τ)X(t)] for all t
– Properties
a) Mean-square value: setting τ = 0 gives R_X(0) = E[X²(t)]
b) R_X(τ) is an even fct.: R_X(τ) = R_X(-τ)
c) R_X(τ) has its maximum at τ = 0: |R_X(τ)| ≤ R_X(0)
pf. of c): E[(X(t + τ) ± X(t))²] ≥ 0
⇒ E[X²(t + τ)] ± 2E[X(t + τ)X(t)] + E[X²(t)] ≥ 0
⇒ 2R_X(0) ± 2R_X(τ) ≥ 0
⇒ -R_X(0) ≤ R_X(τ) ≤ R_X(0)
– Physical meaning of R_X(τ): the "interdependence" of X(t) and X(t + τ)
– Decorrelation time τ_0: for τ > τ_0, R_X(τ) < 0.01 R_X(0)
ex7) Sinusoidal wave with Random phase
X(t) = A cos(2πf_c t + Θ)
where f_Θ(θ) = 1/(2π) for -π < θ ≤ π, and 0 otherwise
R_X(τ) = E[X(t + τ)X(t)]
= E[A² cos(2πf_c t + 2πf_c τ + Θ) cos(2πf_c t + Θ)]
= (A²/2) cos(2πf_c τ)
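An ensemble-average check of ex7), as a sketch assuming NumPy: for each lag τ, E[X(t+τ)X(t)] is estimated by averaging over many random phases Θ and compared with (A²/2)cos(2πf_cτ). The values A = 2, f_c = 5 and the fixed t are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
A, fc, t = 2.0, 5.0, 0.013                         # amplitude, carrier, fixed t

theta = rng.uniform(-np.pi, np.pi, size=200_000)   # ensemble of random phases
for tau in (0.0, 0.05, 0.1):
    samples = (A * np.cos(2*np.pi*fc*(t + tau) + theta)
               * A * np.cos(2*np.pi*fc*t + theta))
    est = samples.mean()                           # ensemble average E[X(t+τ)X(t)]
    theory = (A**2 / 2) * np.cos(2*np.pi*fc*tau)
    print(f"τ = {tau:4.2f}: estimate {est:+.3f}  theory {theory:+.3f}")
```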
ex8) Random Binary Wave
P(+A) = P(-A) = 1/2 ⇒ E[X(t)] = 0
Random start delay T_d: f_{T_d}(t_d) = 1/T for 0 ≤ t_d ≤ T, and 0 otherwise
R_X(0) = E[X(t)X(t)] = A²
R_X(T) = E[X(t)X(t + T)] = 0
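A time-average estimate of the autocorrelation of a simulated random binary wave, as a sketch assuming NumPy (the parameters A = 1 and T = 1 are illustrative). The estimate should be A² at τ = 0, fall off for 0 < τ < T, and be essentially 0 at τ = T, consistent with R_X(0) = A² and R_X(T) = 0 above.

```python
import numpy as np

rng = np.random.default_rng(7)
A, T, dt = 1.0, 1.0, 0.01                   # amplitude, symbol duration, sample step
n_sym, spb = 20_000, int(T / dt)            # number of symbols, samples per symbol

# Random binary wave: ±A levels, random start delay t_d uniform on [0, T)
levels = rng.choice([-A, A], size=n_sym)
x = np.repeat(levels, spb)
x = x[rng.integers(0, spb):]                # apply the random delay t_d

for lag in (0, spb // 2, spb):              # τ = 0, T/2, T
    n = len(x) - lag
    r = np.mean(x[:n] * x[lag:lag + n])     # time-averaged autocorrelation
    print(f"τ = {lag * dt:.2f}: R_X(τ) ≈ {r:+.3f}")
```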
6. Cross-correlation Functions
– r. p. X(t) with autocorrelation R_X(t, u)
– r. p. Y(t) with autocorrelation R_Y(t, u)
– Cross-correlation fcts. of X(t) and Y(t):
R_XY(t, u) = E[X(t)Y(u)]
R_YX(t, u) = E[Y(t)X(u)]
– Correlation matrix of r. p. X(t) and Y(t):
R(t, u) = [ R_X(t, u)   R_XY(t, u)
            R_YX(t, u)  R_Y(t, u) ]
– If X(t) and Y(t) are each w. s. s. and jointly w. s. s.:
R(τ) = [ R_X(τ)   R_XY(τ)
         R_YX(τ)  R_Y(τ) ]
where τ = t - u
Note that R_XY(τ) ≠ R_XY(-τ), i.e., not an even fct.,
R_XY(0) is not necessarily the maximum, and
R_XY(τ) = R_YX(-τ)
ex9) Quadrature-Modulated Processes
X1(t) and X2(t) are obtained from a w. s. s. r. p. X(t):
X1(t) = X(t) cos(2πf_c t + Θ)
X2(t) = X(t) sin(2πf_c t + Θ)
where Θ is uniform over [0, 2π]: f_Θ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π, and Θ is independent of X(t)
Cross-correlation fct.
R_12(τ) = E[X1(t)X2(t - τ)]
= E[X(t)X(t - τ)] E[cos(2πf_c t + Θ) sin(2πf_c t - 2πf_c τ + Θ)]
= -(1/2) R_X(τ) sin(2πf_c τ)
R_12(0) = E[X1(t)X2(t)] = 0
⇒ X1(t) and X2(t) are orthogonal (at the same time instant)
4.9 Ergodicity
Expectation or ensemble average of r. p. X(t) ⇒ average "across the process"
Time average or long-term sample average ⇒ average "along the process"
For a sample function x(t) of a w. s. s. r. p. X(t), observed over -T ≤ t ≤ T:
– Time average (dc value)
μ_X(T) = (1/2T) ∫_{-T}^{T} x(t) dt
– Mean of the time average μ_X(T)
E[μ_X(T)] = (1/2T) ∫_{-T}^{T} E[x(t)] dt = (1/2T) ∫_{-T}^{T} μ_X dt = μ_X ; the mean of the r. p. X(t)
Thus μ_X(T) is an unbiased estimate of the ensemble-averaged mean μ_X.
1. A w. s. s. r. p. X(t) is ergodic in the mean if
lim_{T→∞} μ_X(T) = μ_X and
lim_{T→∞} var[μ_X(T)] = 0
2. A w. s. s. r. p. X(t) is ergodic in the autocorrelation fct. if
lim_{T→∞} R_X(τ, T) = R_X(τ) and
lim_{T→∞} var[R_X(τ, T)] = 0
where R_X(τ, T) = (1/2T) ∫_{-T}^{T} x(t + τ)x(t) dt
= time-averaged autocorrelation fct. of the sample fct. x(t) from the w. s. s. r. p. X(t)
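A sketch (NumPy assumed) of ergodicity in the mean for the random-phase sinusoid: the time average μ_X(T) of a single sample function tends to the ensemble mean μ_X = 0 as T grows. The parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
A, fc, dt = 1.0, 5.3, 1e-3
theta = rng.uniform(-np.pi, np.pi)          # one outcome -> one sample function

for T in (1.0, 10.0, 100.0):
    t = np.arange(-T, T, dt)
    x = A * np.cos(2 * np.pi * fc * t + theta)
    mu_T = np.mean(x)                       # time average μ_X(T) = (1/2T)∫ x(t) dt
    print(f"T = {T:6.1f}: μ_X(T) = {mu_T:+.5f}  (ensemble mean μ_X = 0)")
```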
4.10 Transmission of a r. p. through a linear filter
A w. s. s. r. p. X(t) applied to a linear time-invariant filter with impulse response h(t) produces an output r. p. Y(t) (also w. s. s., as shown below). In general, the output joint distribution F_{Y(t1),...,Y(tk)}(y1, ..., yk) cannot be obtained from F_{X(t1),...,X(tk)}(x1, ..., xk), so Y(t) is characterized by its mean and autocorrelation function instead.
1. Mean of Y(t)

μ_Y(t) = E[Y(t)] = E[∫_{-∞}^{∞} h(τ1) X(t - τ1) dτ1]
= ∫_{-∞}^{∞} h(τ1) E[X(t - τ1)] dτ1
= ∫_{-∞}^{∞} h(τ1) μ_X(t - τ1) dτ1
= μ_X ∫_{-∞}^{∞} h(τ1) dτ1    (for w. s. s. X(t))
⇒ μ_Y = μ_X H(0): the mean of Y(t) is constant
2. Autocorrelation fct.
R_Y(t, u) = E[Y(t)Y(u)]
= E[∫_{-∞}^{∞} h(τ1) X(t - τ1) dτ1 ∫_{-∞}^{∞} h(τ2) X(u - τ2) dτ2]
= ∫ dτ1 h(τ1) ∫ dτ2 h(τ2) R_X(t - τ1, u - τ2)
= ∫ dτ1 h(τ1) ∫ dτ2 h(τ2) R_X(τ - τ1 + τ2), where τ = t - u    (for w. s. s. X(t))
⇒ Y(t) is also w. s. s.
– Mean square value: E[Y²(t)] = R_Y(0) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} h(τ1) h(τ2) R_X(τ2 - τ1) dτ1 dτ2 = constant
4.11 Power Spectral Density
1. Expressing the mean square value of Y(t) in terms of the p. s. d.
(filter with impulse response h(τ1) and transfer function H(f))
Power spectral density or power spectrum of a w. s. s. r. p. X(t):
S_X(f) = ∫_{-∞}^{∞} R_X(τ) exp(-j2πfτ) dτ    [watt/Hz]
Mean square value of Y(t):
E[Y²(t)] = ∫∫ [∫_{-∞}^{∞} H(f) exp(j2πfτ1) df] h(τ2) R_X(τ2 - τ1) dτ1 dτ2
= ∫ df H(f) ∫ dτ2 h(τ2) ∫ R_X(τ2 - τ1) exp(j2πfτ1) dτ1    (let τ = τ2 - τ1)
= ∫ df H(f) ∫ dτ2 h(τ2) exp(j2πfτ2) ∫ R_X(τ) exp(-j2πfτ) dτ
= ∫_{-∞}^{∞} |H(f)|² S_X(f) df
∴ For an ideal narrowband filter of bandwidth Δf centered at f_c: E[Y²(t)] ≈ (2Δf) S_X(f_c)
⇒ S_X(f) is the frequency density of the average power in the r. p. X(t).
2. Properties of the Power Spectral Density
1) Einstein-Wiener-Khintchine relations
S_X(f) = ∫_{-∞}^{∞} R_X(τ) exp(-j2πfτ) dτ
R_X(τ) = ∫_{-∞}^{∞} S_X(f) exp(j2πfτ) df
where X(t) : w. s. s. r. p.
2) Property 1.
For a w. s. s. r. p., S_X(0) = ∫_{-∞}^{∞} R_X(τ) dτ
3) Property 2.
Mean square value of a w. s. s. r. p.: E[X²(t)] = R_X(0) = ∫_{-∞}^{∞} S_X(f) df
4) Property 3.
For a w. s. s. r. p., S_X(f) ≥ 0 for all f.
5) Property 4.
S_X(-f) = S_X(f): even fct. (since R_X(-τ) = R_X(τ))
6) Property 5.
The p. s. d., appropriately normalized, has the properties usually associated with a probability density fct.:
p_X(f) = S_X(f) / ∫_{-∞}^{∞} S_X(f) df
7) rms bandwidth of a w. s. s. r. p. X(t)
W_rms = [ ∫_{-∞}^{∞} f² p_X(f) df ]^{1/2}
ex10) Sinusoidal wave with Random Phase
r. p. X(t) = A cos(2πf_c t + Θ), where Θ is a uniform r. v. over [-π, π]
R_X(τ) = (A²/2) cos(2πf_c τ)
⇒ S_X(f) = (A²/4) [δ(f - f_c) + δ(f + f_c)]
ex11) Random Binary wave with +A & -A
R_X(τ) = A²(1 - |τ|/T) for |τ| < T, and 0 for |τ| ≥ T
S_X(f) = ∫_{-T}^{T} A²(1 - |τ|/T) exp(-j2πfτ) dτ = A²T sinc²(fT)
Energy spectral density of a rectangular pulse g(t)
E_g(f) = A²T² sinc²(fT)
⇒ S_X(f) = E_g(f)/T
ex12) Mixing of a r. p. with a sinusoidal process
Y(t) = X(t) cos(2πf_c t + Θ)
where X(t) is a w. s. s. r. p. and Θ is a r. v. independent of X(t) (uniformly distributed phase)
R_Y(τ) = (1/2) R_X(τ) cos(2πf_c τ)
S_Y(f) = (1/4) [S_X(f - f_c) + S_X(f + f_c)]
3. Relation between the Power Spectral Densities of the Input
and Output Random Processes
S_Y(f) = ∫_{-∞}^{∞} R_Y(τ) exp(-j2πfτ) dτ
= ∫∫∫ h(τ1) h(τ2) R_X(τ - τ1 + τ2) exp(-j2πfτ) dτ1 dτ2 dτ
(let τ - τ1 + τ2 = τ0, i.e., τ = τ0 + τ1 - τ2)
S_Y(f) = H(f) H*(f) S_X(f)
⇒ S_Y(f) = |H(f)|² S_X(f)
ex13) Comb filter
H(f)  1 - exp(-j2π fT)
 1 - cos (2π fT)  jsin(2π fT)
H(f)  1 - cos(2π fT)   sin 2 (2π fT)
2
2
 21 - cos(2π fT) 
 4sin 2 (π fT)
 SY (f)  4sin 2 (π fT)S X (f)
For small f , i. e., π fT  1 , sin(π fT)  π fT
SY (f)  4π 2 f 2 T 2SX (f)
differentiator
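A numerical sketch (NumPy assumed, parameters illustrative) of S_Y(f) = |H(f)|²S_X(f) for the comb filter: white noise, whose p. s. d. is approximately flat, is passed through y[n] = x[n] - x[n - D] (a discrete-time comb with delay T = D/fs), and the ratio of averaged output/input periodograms is compared with 4 sin²(πfT); agreement is within estimation noise.

```python
import numpy as np

rng = np.random.default_rng(9)
fs, D, N, n_avg = 1000.0, 10, 4096, 200      # sample rate, delay, FFT size, averages
T = D / fs                                   # comb delay in seconds

Pxx = np.zeros(N // 2)
Pyy = np.zeros(N // 2)
for _ in range(n_avg):
    x = rng.normal(size=N + D)               # white noise input
    y = x[D:] - x[:-D]                       # comb filter y[n] = x[n] - x[n-D]
    Pxx += np.abs(np.fft.rfft(x[D:]))[: N // 2] ** 2
    Pyy += np.abs(np.fft.rfft(y))[: N // 2] ** 2

f = np.fft.rfftfreq(N, d=1 / fs)[: N // 2]
ratio = Pyy / Pxx                            # estimate of S_Y(f) / S_X(f) = |H(f)|^2
theory = 4 * np.sin(np.pi * f * T) ** 2
for k in (50, 100, 205):
    print(f"f = {f[k]:6.1f} Hz: ratio {ratio[k]:.3f}  theory {theory[k]:.3f}")
```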
4. Relation between the Power Spectral Density and the
Amplitude Spectrum of a Sample Function
Sample fct. x(t) of a w. s. s. and ergodic r. p. X(t) with p. s. d. S_X(f)
X(f, T): Fourier transform of the truncated sample fct. x(t)
X(f, T) = ∫_{-T}^{T} x(t) exp(-j2πft) dt
Obtain R_X(τ) using the time-average formula:
R_X(τ) = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t + τ)x(t) dt
S_X(f) = lim_{T→∞} (1/2T) E[|X(f, T)|²]
= lim_{T→∞} (1/2T) E[ |∫_{-T}^{T} x(t) exp(-j2πft) dt|² ]
Conclusion: S_X(f) can be obtained from a sample function.
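Following the formula above, S_X(f) can be estimated from a single long sample function via |X(f, T)|²/(2T). The sketch below (NumPy assumed, parameters illustrative) does this for the random-phase sinusoid; the estimate concentrates near f = ±f_c and its total area approximates E[X²] = A²/2.

```python
import numpy as np

rng = np.random.default_rng(10)
A, fc, fs, dur = 1.0, 50.0, 1000.0, 20.0      # amplitude, carrier, sample rate, 2T
t = np.arange(0.0, dur, 1.0 / fs)
x = A * np.cos(2 * np.pi * fc * t + rng.uniform(-np.pi, np.pi))  # one sample fct.

X = np.fft.fft(x) / fs                        # approximate X(f, T) = ∫ x(t) e^{-j2πft} dt
Sx = np.abs(X) ** 2 / dur                     # S_X estimate: |X(f, T)|^2 / 2T
f = np.fft.fftfreq(len(x), d=1.0 / fs)
df = 1.0 / dur

k = np.argmax(Sx)
print(f"spectral peaks at f = ±{abs(f[k]):.2f} Hz (carrier f_c = {fc})")
print(f"total area ≈ {np.sum(Sx) * df:.3f}  (E[X²] = A²/2 = {A**2 / 2})")
```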
5. Cross Spectral Density
A measure of the frequency interrelationship between two random processes:
S_XY(f) = ∫_{-∞}^{∞} R_XY(τ) exp(-j2πfτ) dτ
S_YX(f) = ∫_{-∞}^{∞} R_YX(τ) exp(-j2πfτ) dτ
Since R_XY(τ) = R_YX(-τ): S_XY(f) = S_YX(-f) = S*_YX(f)
ex14)
– X(t) and Y(t) are zero-mean, w. s. s. r. p.
– Consider Z(t) = X(t) + Y(t)
– Autocorrelation of Z(t):
R_Z(t, u) = E[Z(t)Z(u)] = R_X(t, u) + R_XY(t, u) + R_YX(t, u) + R_Y(t, u)
(let τ = t - u)
R_Z(τ) = R_X(τ) + R_XY(τ) + R_YX(τ) + R_Y(τ)
⇒ S_Z(f) = S_X(f) + S_XY(f) + S_YX(f) + S_Y(f)
When X(t) and Y(t) are uncorrelated: S_Z(f) = S_X(f) + S_Y(f)
ex15)
X(t), Y(t): jointly w. s. s. r. p.
X(t) → h1(t) → V(t)
Y(t) → h2(t) → Z(t)
where h1, h2 are stable, linear, time-invariant filters
Cross-correlation fct. of V(t) and Z(t):
R_VZ(τ) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} h1(τ1) h2(τ2) R_XY(τ - τ1 + τ2) dτ1 dτ2
⇒ S_VZ(f) = H1(f) H2*(f) S_XY(f)
4.12 Gaussian Process
1. Definition
A process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian r. v.:
Y = ∫_0^T g(t) X(t) dt
where g(t) is some function and Y is a r. v.
If the r. v. Y is Gaussian distributed for every g(t), then X(t) is a Gaussian process.
Here
f_Y(y) = [1/(√(2π) σ_Y)] exp[-(y - μ_Y)²/(2σ_Y²)]
Normalized (μ_Y = 0, σ_Y² = 1) Gaussian distribution N(0, 1):
f_Y(y) = (1/√(2π)) exp(-y²/2)
2. Virtues of Gaussian process
1) A Gaussian process has many properties that make analytic
results possible.
2) Random processes produced by physical phenomena are
often such that a Gaussian model is appropriate.
3. Central Limit Theorem
1) Let X_i, i = 1, 2, ..., N be a set of r. v. that satisfies:
a) The X_i are statistically independent
b) The X_i have the same p. d. f. with mean μ_X and variance σ_X²
⇒ X_i : set of independently and identically distributed (i. i. d.) r. vs.
Now define the normalized r. v.
Y_i = (X_i - μ_X)/σ_X, i = 1, 2, ..., N
⇒ E[Y_i] = 0, var[Y_i] = 1
Define the r. v. V_N = (1/√N) Σ_{i=1}^{N} Y_i
< Central limit theorem >
The probability distribution of V_N approaches a normalized
Gaussian distribution N(0, 1) in the limit as N approaches
infinity. That is, combining many normalized r. vs. into a single
r. v. in this way yields a r. v. that tends to N(0, 1).
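A simulation sketch (NumPy assumed) of the central limit theorem using i.i.d. uniform X_i: the normalized sum V_N has sample statistics approaching those of N(0, 1), e.g. P(V_N ≤ 1) → Φ(1) ≈ 0.841.

```python
import numpy as np

rng = np.random.default_rng(11)
n_trials = 100_000
mu_X, sigma_X = 0.5, np.sqrt(1.0 / 12.0)     # mean/std of Uniform(0, 1)

for N in (2, 10, 100):
    X = rng.uniform(0.0, 1.0, size=(n_trials, N))   # i.i.d. X_i
    Y = (X - mu_X) / sigma_X                        # normalized: E[Y_i]=0, var[Y_i]=1
    V = Y.sum(axis=1) / np.sqrt(N)                  # V_N = (1/√N) Σ Y_i
    print(f"N = {N:3d}: var[V_N] = {V.var():.3f}, "
          f"P(V_N <= 1) = {np.mean(V <= 1.0):.3f} (Φ(1) ≈ 0.841)")
```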
4. Properties of Gaussian Process
1) Property 1.
X(t) (Gaussian process) → h(t) (stable, linear filter) → Y(t) (Gaussian process)
If a Gaussian process X(t) is applied to a stable linear filter,
then the random process Y(t) developed at the output of the
filter is also Gaussian.
2) Property 2.
Consider the set of r. v. or samples X(t1), X(t2), , X(tn)
obtained by observing a r. p. X(t) at times t1, t2, , tn.
If the process X(t) is Gaussian, then this set of r. vs. are
jointly Gaussian for any n, with their n-fold joint p. d. f. being
completely determined by specifying the set of means
μ_{X(t_i)} = E[X(t_i)], i = 1, 2, ..., n
and the set of autocovariance functions
C_X(t_k, t_i) = E[(X(t_k) - μ_{X(t_k)})(X(t_i) - μ_{X(t_i)})]
3) Property 3.
If random variables X(t1), X(t2), , X(tn) from Gaussian
process X(t) are uncorrelated, i. e.
E[(X(t_k) - μ_{X(t_k)})(X(t_i) - μ_{X(t_i)})] = 0 for i ≠ k,
then these random variables are statistically independent
4.13 Noise
– External: e. g. atmospheric, galactic, man-made noise
– Internal: e. g. spontaneous fluctuations of current or voltage in electric
circuits ⇒ shot noise, thermal noise
Channel Test Model (block diagram omitted): the path from modulator to demodulator is modeled as a cascade of impairments, including attenuation, multipath h1(t)/H1(f) (micro-reflections), white noise, impulse noise, burst noise, thermal noise (e.g. appliances switching on/off), hum (120 Hz + harmonics) and amplitude modulation, ingress (shortwave radio, CB, ham), phase noise and frequency offset, nonlinearity f(x) (amplifier clipping, laser), common-path distortion products (nonlinear devices), plant response H2(f) (group delay), co-channel interference, and adjacent-channel interference.
< H. W > Chap 4, 4.6, 4.15, 4.23