Outline
• Random variables.
• CDF and pdf.
• Joint random variables.
• Correlated, independent, orthogonal.
• Correlation, convolution, correlation coefficient.
• Normal distribution.
Random Variables
M. Sami Fadali
Professor of Electrical Engineering
University of Nevada, Reno
Random Variable
[Figure: X(·) maps each elementary event s1 in the sample space S to a point on the real line R]
X : S → R
• Function mapping the elements of the sample space S to the real line R.
• Equivalent Event: real value associated with the elementary event(s).
• Probability of the real value = sum of probabilities of the original associated elementary event(s) in the sample space.
Examples
• Throw a die, outcome 1–6 dots. Random variable: maps i dots to i, i = 1, …, 6.
• Measurement of any physical quantity with additive random error (noise).
• Pitch, card game: collect tricks.
Probabilities of equivalent events: e.g., for a fair die, P[X = i] = 1/6, i = 1, …, 6.
Cumulative Probability Distribution Function
Definition: The cumulative distribution function (CDF) of a random variable X is a function defined for each real number x as follows:
F_X(x) = P[X ≤ x]

Properties of the CDF
1. F_X(x) → 0 as x → −∞
2. F_X(x) → 1 as x → +∞
3. F_X(x) is a nondecreasing function of x.
Probability Density Function (pdf)
• Continuous random variable X.
• Nonnegative function f_X defined on the real line.
• For every real interval [a, b]:
P[a ≤ X ≤ b] = ∫_a^b f_X(x) dx

Properties of the pdf
1. f_X(x) ≥ 0
2. ∫_{−∞}^{∞} f_X(x) dx = 1
3. F_X(x) = ∫_{−∞}^{x} f_X(u) du
• First two properties follow from the axioms of probability.
• Integrate over a small interval [x, x + dx]:
P[x ≤ X ≤ x + dx] ≈ f_X(x) dx
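The two pdf properties above can be checked numerically. A minimal Python sketch (the exponential density and the parameter value are illustrative choices, not from the slides): the total area under the pdf is 1, and the probability of an interval is the integral of the density over it.

```python
import math

# Illustrative density: exponential pdf f(x) = lam*exp(-lam*x), x >= 0
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

# Property: total area under the pdf is 1 (Riemann sum over [0, 20])
dx = 1e-4
total = sum(f(i * dx) * dx for i in range(int(20 / dx)))

# P[a <= X <= b] = integral of the pdf over [a, b]
a, b = 0.5, 1.5
p = sum(f(a + i * dx) * dx for i in range(int((b - a) / dx)))
exact = math.exp(-lam * a) - math.exp(-lam * b)  # F(b) - F(a) in closed form
```

The Riemann sums agree with the closed-form CDF difference to within the step size.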
Example: Spin the Pointer

Uniform Distribution
f_X(x) = 1/(b − a), a ≤ x ≤ b
       = 0, elsewhere
[Figure: rectangular pdf of height 1/(b − a) over [a, b], with a subinterval [x1, x2] marked]
• Probability of a value in any subinterval [x1, x2] of [a, b] is proportional to its length:
P[x1 ≤ X ≤ x2] = (x2 − x1)/(b − a)
• Area of rectangle must be unity.
MATLAB:
>> rand(m, n) % a=0, b=1, m by n
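The proportional-length property can be seen by simulation. A Python analogue of MATLAB's `rand` (the subinterval endpoints below are illustrative):

```python
import random

# Python analogue of MATLAB rand: uniform over [0, 1] (a = 0, b = 1)
random.seed(0)
n = 100_000
samples = [random.random() for _ in range(n)]

# Fraction landing in [x1, x2] should be near its length (x2 - x1)/(b - a)
x1, x2 = 0.2, 0.5
frac = sum(x1 <= s <= x2 for s in samples) / n
```

With 100,000 samples the empirical fraction is within about 1% of the interval length 0.3.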
Density & Distribution
The density is the derivative of the distribution:
f_X(x) = dF_X(x)/dx,  F_X(x) = ∫_{−∞}^{x} f_X(u) du

Expectation of a Random Variable
• Expected value or mean of X.
• Justified by relative frequency.
• Discrete: E[X] = Σ_i x_i P[X = x_i]
• Continuous: E[X] = ∫_{−∞}^{∞} x f_X(x) dx
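The discrete expectation formula, sketched in Python using the fair-die example from earlier in the slides:

```python
# Discrete expectation: fair-die example, E[X] = sum_i x_i * P[X = x_i]
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6                                # fair-die assumption
mean = sum(x * p for x, p in zip(values, probs))   # E[X] = 3.5
```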
Function of a Random Variable
• Expected value of the function g(X):
• Discrete: E[g(X)] = Σ_i g(x_i) P[X = x_i]
• Continuous: E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx

Properties of Expectation
• Linearity: E[aX + bY] = a E[X] + b E[Y] for constants a, b.

Jensen's Inequality
• For a random variable X and a convex function g:
g(E[X]) ≤ E[g(X)]
• Convex function: g(αx1 + (1 − α)x2) ≤ α g(x1) + (1 − α) g(x2), 0 ≤ α ≤ 1.

Moments
• kth moment: expectation of the kth power, E[X^k].
• First moment is the mean.
• Second moment is the mean square.
• Discrete: E[X^k] = Σ_i x_i^k P[X = x_i]
• Continuous: E[X^k] = ∫_{−∞}^{∞} x^k f_X(x) dx
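Jensen's inequality can be verified numerically. A Python sketch using the convex function g(x) = x² and the fair-die distribution as a stand-in random variable (both choices are illustrative):

```python
# Jensen's inequality check: g(E[X]) <= E[g(X)] for convex g(x) = x**2
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
g = lambda x: x ** 2

mean = sum(x * p for x, p in zip(values, probs))       # E[X] = 3.5
e_g = sum(g(x) * p for x, p in zip(values, probs))     # E[g(X)] = 91/6
lhs, rhs = g(mean), e_g                                # 12.25 <= 15.166...
```

Here E[g(X)] is also the second moment (mean square) of the die outcome.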
Variance
• Second moment about the mean:
σ_X² = E[(X − m_X)²], m_X = E[X]
• Standard Deviation: square root of variance.
• Discrete: σ_X² = Σ_i (x_i − m_X)² P[X = x_i]
• Continuous: σ_X² = ∫_{−∞}^{∞} (x − m_X)² f_X(x) dx

Properties of the Variance
• σ_X² = E[X²] − m_X²
• Var[aX + b] = a² Var[X]
• Uncorrelated X, Y: Var[X + Y] = Var[X] + Var[Y]

Variance Property: Proof
E[(X − m_X)²] = E[X² − 2 m_X X + m_X²]
             = E[X²] − 2 m_X E[X] + m_X² = E[X²] − m_X²

Example: Uniform Distribution
Find the mean and variance.
Mean: E[X] = ∫_a^b x/(b − a) dx = (a + b)/2
Mean Square: E[X²] = ∫_a^b x²/(b − a) dx = (a² + ab + b²)/3
Variance: σ_X² = E[X²] − (E[X])² = (b − a)²/12
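The uniform-distribution results above can be confirmed by numeric integration. A Python sketch (the endpoints a, b are illustrative):

```python
# Numeric check of the uniform moments on [a, b]:
# mean (a + b)/2 and variance (b - a)**2/12, via a midpoint Riemann sum.
a, b = 2.0, 5.0
dx = 1e-5
xs = [a + (i + 0.5) * dx for i in range(int((b - a) / dx))]  # midpoints
f = 1.0 / (b - a)                                            # uniform pdf

mean = sum(x * f * dx for x in xs)         # ~ (a + b)/2 = 3.5
mean_sq = sum(x * x * f * dx for x in xs)  # ~ (a^2 + ab + b^2)/3 = 13
var = mean_sq - mean ** 2                  # ~ (b - a)^2/12 = 0.75
```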
Moment Generating Function
• M_X(s) = E[e^{sX}] = ∫_{−∞}^{∞} e^{sx} f_X(x) dx
• Laplace transform of the pdf (also defined with s replaced by −s).
• Use 2-sided Laplace transform table.
• Moments: E[X^k] = d^k M_X(s)/ds^k evaluated at s = 0.

Series Expansion and Moments
M_X(s) = E[e^{sX}] = Σ_{k=0}^{∞} (s^k/k!) E[X^k]

Characteristic Function
• Φ_X(ω) = E[e^{jωX}]
• Fourier transform of the pdf: use Fourier transform tables.
• Moments: E[X^k] = (1/j^k) d^k Φ_X(ω)/dω^k evaluated at ω = 0.

Normal or Gaussian Density
f_X(x) = (1/(σ√(2π))) e^{−(x − m)²/(2σ²)}
• Symmetric about the mean m.
• Peak value 1/(σ√(2π)) at x = m (larger for sharper peak).
• Mode (most likely value) = mean.
• Standard normal distribution: zero mean, unit variance.
Right Tail Probability
• Probability of exceeding a given value:
Q(x) = P[X > x] = 1 − F_X(x)
• Complementary cumulative distribution.
• Q(threshold) = Prob. of false alarm.
MATLAB:
>> p = normspec([-Inf,1],0,1,'outside')

Why is it important?
• Fits many physical phenomena.
• Central limit theorem.
• Completely described by mean and variance.
• Independent ⇔ uncorrelated.
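The right-tail probability of the standard normal can be computed from the complementary error function, as a Python analogue of the MATLAB call above (the 1-sigma threshold is illustrative):

```python
import math

# Right tail of the standard normal via erfc: Q(x) = 0.5 * erfc(x / sqrt(2))
Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))

p_fa = Q(1.0)   # false-alarm probability for a threshold of 1 sigma
# Q(0) = 0.5 by symmetry, and Q is monotonically decreasing in x
```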
Error Function
• erf: error function
erf(x) = (2/√π) ∫_0^x e^{−t²} dt
• erfc: complementary error function (invertible)
erfc(x) = 1 − erf(x)
[Figure: erf(x) rising from 0 toward 1 and erfc(x) falling from 1 toward 0, for 0 ≤ x ≤ 3]

Inverse
• Q is monotonically decreasing ⇒ invertible.
• Inverse is important in some applications (signal detection: prob. of false alarm).
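Because Q is monotonically decreasing, it can be inverted numerically. A Python sketch using bisection as a stand-in for MATLAB's `erfinv`-based formula (the 5% false-alarm target and search bounds are illustrative):

```python
import math

# Q is monotonically decreasing, so the detection threshold for a desired
# false-alarm probability can be found by bisection.
Q = lambda x: 0.5 * math.erfc(x / math.sqrt(2))

def Q_inv(p, lo=-10.0, hi=10.0, iters=100):
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if Q(mid) > p:
            lo = mid      # tail probability too large: move threshold right
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = Q_inv(0.05)           # threshold giving a 5% false-alarm probability
```

The result matches the standard-normal critical value for a 5% right tail (about 1.645).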
Erf and Gaussian Density
• erf corresponds to a Gaussian density with mean zero and variance 1/2:
erf(x) = 2 ∫_0^x (1/√π) e^{−t²} dt
[Figure: standard normal density with the area below the critical value shaded; probability less than the upper bound is 0.9452]

Relation to Normal Distribution
1. Standard normal CDF: Φ(x) = (1/2)[1 + erf(x/√2)]
2. Right tail: Q(x) = P[X > x] = (1/2) erfc(x/√2)
3. For negative x, use erf(−x) = −erf(x).
MATLAB: Computing Probabilities
(similar for Maple)
>> erf(x) % Error function
>> erfc(x) % Complementary error function
>> 0.5*(1+erf(x/sqrt(2))) % St. Normal P (t < x)
>> 0.5*erfc(x/sqrt(2)) % St. Normal P (t > x)
>> Qinv = sqrt(2)*erfinv(1-2*P) % Inverse Q(P)

Example: Test Scores
• Test scores are normally distributed with mean 83 and variance 64 (σ = 8):
>> fun = @(x) exp(-(x-83).^2/128)./sqrt(128*pi);
>> integral(fun,83-16,83+16) % within 2 sigma
ans =
    0.9545
Pseudorandom Number Generators
>> rand % Uniform distribution over [0,1]
>> randn % Standard normal
Shifting and Scaling (also see random):
>> y = sigmay*randn + ybar

Impulsive pdf
• Use impulse for discrete or mixed random variables.
• For a probability mass p at y = y0, the density contains a term p δ(y − y0) and the distribution jumps by p at y0.
[Figure: mixed density f(y) with an impulse of weight 0.5 at y = 0, and the corresponding distribution F(y) with a step of 0.5 at y = 0]

Example: Half-wave Rectifier
Density/Distribution
Half-wave rectifier driven by noise: positive values pass (R⁺ → R⁺) and negative values map to 0 (R⁻ → 0), i.e. Y = X for X > 0 and Y = 0 for X ≤ 0.
For zero-mean Gaussian input noise:
• P[Y = 0] = P[X ≤ 0] = 1/2: impulse of weight 1/2 at y = 0.
• For y > 0, f_Y(y) equals the input Gaussian density.
f_Y(y) = (1/2) δ(y) + f_X(y) u(y), u = unit step
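A quick simulation sketch in Python of the half-wave rectifier driven by zero-mean Gaussian noise: half the probability mass piles up at Y = 0, matching the impulse weight of 1/2.

```python
import random

# Half-wave rectifier Y = max(X, 0) with zero-mean, unit-variance Gaussian X
random.seed(1)
n = 100_000
ys = [max(random.gauss(0.0, 1.0), 0.0) for _ in range(n)]

p_zero = sum(y == 0.0 for y in ys) / n   # ~ 0.5 (impulse weight at y = 0)
```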
Multiple Random Variables
Multivariate: vector of random variables
X = [X1 X2 … Xn]^T = n by 1 vector
Bivariate: 2 variables.
Discrete case: joint prob. = 2-dim. array.
Obtain marginal prob. by adding a column or row.

Multivariate Distributions
[Figure: incremental rectangle A with sides dx1, dx2 at the point (x10, x20) in the (x1, x2) plane]
P[x10 < X1 ≤ x10 + dx1, x20 < X2 ≤ x20 + dx2] ≈ f_{X1X2}(x10, x20) dx1 dx2
Generalize: F_X(x) = P[X1 ≤ x1, …, Xn ≤ xn]
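The discrete bivariate case can be sketched directly: the joint probabilities form a 2-D array, and summing a row or column gives a marginal probability. A minimal Python example (the joint values are illustrative):

```python
# Discrete bivariate example: joint probabilities as a 2-D array.
# Marginals are obtained by adding along a row or a column.
joint = [[0.10, 0.20, 0.10],
         [0.15, 0.25, 0.20]]

marg_x1 = [sum(row) for row in joint]        # sum across columns -> P[X1 = i]
marg_x2 = [sum(col) for col in zip(*joint)]  # sum across rows    -> P[X2 = j]
total = sum(marg_x1)                         # all probabilities sum to 1
```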
Marginal Distributions
Marginal pdf: integrate the joint pdf over the remaining variables.
f_{X1}(x1) = ∫ … ∫ f(x1, x2, …, xn) dx2 … dxn

Conditional Distribution
• Conditional density of X1 given X2, …, Xn:
f(x1 | x2, …, xn) = f(x1, x2, …, xn) / f(x2, …, xn)
• Example: for two variables,
f(x1 | x2) = f(x1, x2) / f(x2)
Bayes Rule for Random Vars.
f(x | y) = f(y | x) f(x) / f(y)
f(x1 | x2, …, xn) = f(x2, …, xn | x1) f(x1) / f(x2, …, xn)

Independence
Independent: the joint density factors into the product of the marginals.
f(x1, …, xn) = f(x1) f(x2) … f(xn)
Independent vs. Uncorrelated
• Independent: f_{XY}(x, y) = f_X(x) f_Y(y)
• Uncorrelated: E[XY] = E[X] E[Y]
• Independent ⇒ Uncorrelated.
• Uncorrelated ⇒ Independent?
  – Not true, in general.
  – True for multivariate Gaussian.

Sum of Independent Random Vars.
z = x + y
[Figure: strip z < x + y ≤ z + dz of width dz in the (x, y) plane]
f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx = f_X * f_Y (convolution)
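The convolution formula for the density of z = x + y can be evaluated numerically. A Python sketch for two independent uniform [0, 1] variables (an illustrative choice), whose densities convolve to a triangle peaking at z = 1:

```python
# Density of z = x + y for independent x, y: convolution of the pdfs.
# Two uniform [0, 1] densities give a triangular density peaking at z = 1.
dx = 1e-3
n = int(1 / dx)
f = [1.0] * n                 # uniform pdf sampled on [0, 1)

def conv_at(z):
    # f_Z(z) = integral of f_X(x) * f_Y(z - x) dx, as a Riemann sum
    total = 0.0
    for i in range(n):
        y = z - i * dx
        if 0.0 <= y < 1.0:
            total += f[i] * 1.0 * dx
    return total

peak = conv_at(1.0)           # ~ 1.0 at the apex of the triangle
half = conv_at(0.5)           # ~ 0.5 on the rising edge
```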
Central Limit Theorem
Given n independent random variables,
Z = X1 + X2 + … + Xn
• Property of Convolutions: Convolution of a large number of positive functions is approximately Gaussian.
• Central Limit Theorem: Z is asymptotically Gaussian as n → ∞.

Correlation Coefficient
• Normalized measure of correlation between X and Y:
ρ_XY = Cov(X, Y)/(σ_X σ_Y)
• Value between −1 and 1.
• Zero for uncorrelated X and Y.

Zero Correlation Coefficient
Var[X + Y] = σ_X² + σ_Y² + 2 ρ_XY σ_X σ_Y
• Uncorrelated: ρ_XY = 0.
• For uncorrelated X, Y:
Var[X + Y] = σ_X² + σ_Y²
• Reduces to the variance property for ρ_XY = 0.

Unity Correlation Coefficient
• |ρ_XY| = 1 when Y is a linear function of X, Y = aX + b.
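The central limit theorem can be illustrated by simulation. A Python sketch summing n = 12 independent uniform [0, 1] variables (an illustrative n): the sum has mean n/2 and variance n/12, and its histogram is close to Gaussian.

```python
import random

# CLT sketch: the sum of n independent uniform [0, 1] variables approaches
# a Gaussian with mean n/2 and variance n/12.
random.seed(2)
n, trials = 12, 20_000
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

mean = sum(sums) / trials                           # ~ n/2 = 6
var = sum((s - mean) ** 2 for s in sums) / trials   # ~ n/12 = 1
```

Summing 12 uniforms was a classic way to generate approximately standard-normal samples after subtracting 6.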
Range of Correlation Coefficient
−1 ≤ ρ_XY ≤ 1
Proof: Let X* = X − m_X, Y* = Y − m_Y. For any real a,
E[(aX* + Y*)²] = a² σ_X² + 2a Cov(X, Y) + σ_Y² ≥ 0
Quadratic in a has no real roots (or equal roots if zero).
Discriminant: 4 Cov²(X, Y) − 4 σ_X² σ_Y² ≤ 0
⇒ negative or zero discriminant for the quadratic in a (zero discriminant, equal roots, for |ρ_XY| = 1)
⇒ ρ_XY² ≤ 1

Orthogonal Random Variables
E[XY] = 0

Correlation and Covariance
• Correlation: E[XY]
• Covariance: Cov(X, Y) = E[(X − m_X)(Y − m_Y)] = E[XY] − m_X m_Y

Covariance Matrix
Generalization of 2nd moment & variance to the vector case:
C_X = E[(X − m_X)(X − m_X)^T]
• Can be written in terms of variances and correlation coefficients.
• Diagonal for uncorrelated (& independent) variables.
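A numeric sketch of the covariance matrix and correlation coefficient, assuming NumPy is available (the linear relationship y = 2x + noise is illustrative): the diagonal of the 2×2 covariance matrix holds the variances, and ρ is the normalized off-diagonal entry.

```python
import numpy as np

# Covariance matrix and correlation coefficient for two correlated sequences
rng = np.random.default_rng(0)
x = rng.standard_normal(50_000)
y = 2.0 * x + rng.standard_normal(50_000)  # correlated with x: rho = 2/sqrt(5)

C = np.cov(x, y)                 # 2x2 covariance matrix; diagonal = variances
rho = np.corrcoef(x, y)[0, 1]    # normalized: always between -1 and 1
```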
Multivariate Normal
f_X(x) = exp{−(1/2)(x − m)^T C⁻¹ (x − m)} / ((2π)^{n/2} |C|^{1/2})
• Generalization of the normal distribution to n linearly independent random variables.
• If the variables are mutually uncorrelated they are also independent.

Independent/Uncorrelated
• If Gaussian random variables are mutually uncorrelated they are also mutually independent.
Proof: For uncorrelated variables C is diagonal, so the quadratic form
(x − m)^T C⁻¹ (x − m) = Σ_i (x_i − m_i)²/σ_i²
and the joint density factors into the product of the marginal densities.

Bivariate Gaussian
f(x1, x2) = [1/(2π σ1 σ2 √(1 − ρ²))]
            × exp{−[(x1 − m1)²/σ1² − 2ρ(x1 − m1)(x2 − m2)/(σ1σ2) + (x2 − m2)²/σ2²] / (2(1 − ρ²))}
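The factorization argument can be checked numerically for the bivariate case: with ρ = 0 the bivariate Gaussian density equals the product of the two marginal normal densities. A Python sketch (the evaluation point and parameters are illustrative):

```python
import math

# Bivariate Gaussian density; for rho = 0 it factors into the product of
# the two marginal normal densities (uncorrelated => independent).
def bivariate(x1, x2, m1, m2, s1, s2, rho):
    q = ((x1 - m1) ** 2 / s1 ** 2
         - 2 * rho * (x1 - m1) * (x2 - m2) / (s1 * s2)
         + (x2 - m2) ** 2 / s2 ** 2) / (1 - rho ** 2)
    return math.exp(-q / 2) / (2 * math.pi * s1 * s2 * math.sqrt(1 - rho ** 2))

def normal(x, m, s):
    return math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

joint = bivariate(0.3, -0.7, 0.0, 1.0, 1.0, 2.0, 0.0)
product = normal(0.3, 0.0, 1.0) * normal(-0.7, 1.0, 2.0)
```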
Properties of Multivariate Normal
• Density N(m, C) completely defined by m and C.
• If joint pdf is normal:
  – Uncorrelated ⇔ Independent.
  – All marginal and conditional pdfs are normal.
• Linear transformation of a normal vector gives a normal vector (next presentation).

Conclusion
• Probabilistic description of random variables.
• Moments, characteristic function, moment generating function.
• Correlation and covariance.
• Correlated, independent, orthogonal.
• Normal (Gaussian) random variable.