ECE 461
Fall 2006
August 31, 2006
Gaussian Random Variables and Vectors
The Gaussian Probability Density Function
This is the most important pdf for this course. It is also called a normal pdf.
f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left( -\frac{(x-m)^2}{2\sigma^2} \right).
It can be shown that this fX integrates to 1 (i.e., it is a valid pdf), and that the mean of the random
variable X with the above pdf is m and the variance is σ².
The statement “X is Gaussian with mean m and variance σ²” is compactly written as “X ∼ N(m, σ²).”
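As an illustration, the pdf can be evaluated directly in Matlab (a minimal sketch; the values of m and σ below are arbitrary examples):

    % Evaluate the N(m, sigma^2) pdf on a grid of x values and plot it.
    % The parameter values here are arbitrary, for illustration only.
    m = 0; sigma = 1;
    x = linspace(m - 4*sigma, m + 4*sigma, 200);
    fx = exp(-(x - m).^2 / (2*sigma^2)) / (sigma * sqrt(2*pi));
    plot(x, fx), xlabel('x'), ylabel('f_X(x)')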
The cdf corresponding to the Gaussian pdf is given by
F_X(x) = \int_{-\infty}^{x} f_X(u)\,du = \int_{-\infty}^{x} \frac{1}{\sigma\sqrt{2\pi}} \exp\left( -\frac{(u-m)^2}{2\sigma^2} \right) du.
This integral cannot be computed in closed form, but if we make the change of variable v = (u − m)/σ,
we get

F_X(x) = \int_{-\infty}^{(x-m)/\sigma} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{v^2}{2} \right) dv = \Phi\left( \frac{x-m}{\sigma} \right),

where Φ is the cdf of a N(0, 1) random variable, i.e.,
\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{u^2}{2} \right) du.
Note that due to the symmetry of the Gaussian pdf,
Φ(−x) = 1 − Φ(x).
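This identity follows by substituting v = −u in the defining integral and using the fact that the standard Gaussian pdf integrates to 1:

\Phi(-x) = \int_{-\infty}^{-x} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{u^2}{2} \right) du = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{v^2}{2} \right) dv = 1 - \Phi(x).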
A function closely related to Φ is the Q function, which is defined by
Q(x) = 1 - \Phi(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{u^2}{2} \right) du.
Some endpoint properties of Φ and Q are given below:
Q(∞) = Φ(−∞) = 0,
Q(−∞) = Φ(∞) = 1,
Q(0) = Φ(0) = 0.5.
For computing the Q function in Matlab, we may use the Matlab functions erf or erfc after
modifying them appropriately.
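For example, since Φ(x) = (1 + erf(x/√2))/2, we have Q(x) = erfc(x/√2)/2. A minimal sketch:

    % Q function in terms of the built-in erfc: Q(x) = 0.5*erfc(x/sqrt(2)).
    Q = @(x) 0.5 * erfc(x / sqrt(2));
    Q(0)                    % returns 0.5, matching the endpoint properties above
    % The Gaussian cdf then follows by standardization: F_X(x) = Phi((x-m)/sigma).
    Phi = @(x) 1 - Q(x);
    Phi(0)                  % returns 0.5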
Jointly Gaussian Random Variables
Two random variables X and Y are said to be jointly Gaussian if their joint density satisfies the
equation
f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-m_X)^2}{\sigma_X^2} - \frac{2\rho(x-m_X)(y-m_Y)}{\sigma_X\sigma_Y} + \frac{(y-m_Y)^2}{\sigma_Y^2} \right] \right\}.
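As a sanity check, this density can be coded directly and integrated numerically in Matlab (a minimal sketch; the parameter values are arbitrary examples):

    % Bivariate Gaussian density, coded directly from the formula above.
    mX = 1; mY = -2; sX = 2; sY = 0.5; rho = 0.3;   % example parameters
    f = @(x, y) exp(-(1/(2*(1 - rho^2))) * ((x - mX).^2 / sX^2 ...
          - 2*rho*(x - mX).*(y - mY) / (sX*sY) + (y - mY).^2 / sY^2)) ...
          / (2*pi*sX*sY*sqrt(1 - rho^2));
    % Integrating over a region wide enough to capture essentially all
    % of the probability mass should give a value very close to 1.
    integral2(f, mX - 10*sX, mX + 10*sX, mY - 10*sY, mY + 10*sY)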
Note that the following properties hold:
• X is Gaussian with mean mX and variance σX²
• Y is Gaussian with mean mY and variance σY²
• The conditional densities fX|Y(x|y) and fY|X(y|x) are also Gaussian
• ρ is the correlation coefficient between X and Y. If ρ = 0, then X and Y are independent.
• Z = aX + bY is also Gaussian (what are the mean and variance of Z? see the worked answer below)
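To answer the question in the last item: by linearity of expectation and the bilinearity of covariance,

E[Z] = a\,m_X + b\,m_Y, \qquad \mathrm{var}(Z) = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,\rho\,\sigma_X\sigma_Y.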
The definition of jointly Gaussian random variables extends quite naturally to n variables X1, X2, ..., Xn.
Let the vectors X and m, and the matrix Σ, be defined by

X = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}, \quad
m = E[X] = \begin{bmatrix} m_1 \\ m_2 \\ \vdots \\ m_n \end{bmatrix}, \quad
\Sigma = E\left[ (X - m)(X - m)^\top \right] = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} & \cdots & \Sigma_{1n} \\ \Sigma_{21} & \Sigma_{22} & \cdots & \Sigma_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ \Sigma_{n1} & \Sigma_{n2} & \cdots & \Sigma_{nn} \end{bmatrix},
where mi = E[Xi] and Σij = cov(Xi, Xj). Then the random variables X1, X2, ..., Xn are jointly
Gaussian if their joint density is given by
f(x) = \frac{1}{\sqrt{(2\pi)^n \det(\Sigma)}} \exp\left( -\frac{1}{2} (x - m)^\top \Sigma^{-1} (x - m) \right).
The statement “X1 , X2 , . . . , Xn are jointly Gaussian with mean m and covariance matrix Σ” can
be compactly written as “X ∼ N (m, Σ)”.
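A minimal Matlab sketch that evaluates this density at a single point (the mean, covariance, and evaluation point below are arbitrary examples; with the Statistics Toolbox, mvnpdf gives the same value):

    % Evaluate the joint Gaussian density at a point x.
    m = [0; 1]; Sigma = [2 0.5; 0.5 1];   % example mean and covariance
    x = [0.3; 0.8];                       % example evaluation point
    n = numel(m);
    d = x - m;
    % (d' / Sigma) * d computes d' * inv(Sigma) * d without forming inv(Sigma).
    fx = exp(-0.5 * (d' / Sigma) * d) / sqrt((2*pi)^n * det(Sigma))
    % With the Statistics Toolbox: mvnpdf(x', m', Sigma) returns the same value.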
Properties of jointly Gaussian random variables include:
• Any subset of jointly Gaussian random variables is also jointly Gaussian.
• Any subset of jointly Gaussian random variables conditioned on any other subset of the
original random variables is also jointly Gaussian.
• Jointly Gaussian random variables that are uncorrelated are also independent.
• Linear combinations of jointly Gaussian random variables are also jointly Gaussian. In particular, suppose we produce the vector Y = [Y1 Y2 . . . Ym]⊤ using the linear transformation Y = AX, where A is an m × n matrix. Then

Y ∼ N(A mX, A ΣX A⊤),

i.e., Y is jointly Gaussian with mean mY = A mX and covariance matrix ΣY = A ΣX A⊤. This property is illustrated in the numerical sketch below.
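As a quick numerical check of the last property (a minimal Matlab sketch; the matrix A, the dimensions, and the sample size are arbitrary choices), we can transform i.i.d. N(0, 1) samples, for which mX = 0 and ΣX = I, and compare the sample covariance of Y with A ΣX A⊤ = A A⊤:

    % X has i.i.d. N(0,1) components, so mX = 0 and SigmaX = eye(n).
    n = 3; m = 2; N = 1e5;            % dimensions and sample size (arbitrary)
    A = [1 0.5 0; -1 2 0.3];          % example m-by-n matrix
    X = randn(n, N);                  % each column is one realization of X
    Y = A * X;                        % each column is one realization of Y
    cov(Y')                           % sample covariance; should be close to A*A'
    A * A'                            % theoretical covariance A*SigmaX*A' = A*A'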
© V.V. Veeravalli, 2006