Main topics in the course on probability theory
The concept of probability – Repetition of basic skills
Multivariate random variables – Chapter 1
Conditional distributions – Chapter 2
Transforms – Chapter 3
Order statistics – Chapter 4
The multivariate normal distribution – Chapter 5
The exponential family of distributions - Slides
Convergence in probability and distribution – Chapter 6
Probability theory 2011
Objectives
Provide a solid understanding of major concepts in probability theory
Increase the ability to derive probabilistic relationships in given probability models
Facilitate reading scientific articles on inference based on probability models
The concept of probability – Repetition of basic skills
“Gut: Introduction” + More
Whiteboard
Multivariate random variables
Gut: Chapter 1
Slides
Joint distribution function - Copula
$F_{(X,Y)}(x, y) = P(X \le x,\ Y \le y)$
provides a complete description of the two-dimensional distribution of
the random vector (X , Y)
Joint distribution function
$F_{(X,Y)}(x, y) = P(X \le x,\ Y \le y)$

$P(x < X \le x + \Delta x,\ y < Y \le y + \Delta y)$
$\quad = F(x + \Delta x,\ y + \Delta y) - F(x,\ y + \Delta y) - F(x + \Delta x,\ y) + F(x, y)$

Dividing by $\Delta x\,\Delta y$ and letting $\Delta x, \Delta y \to 0$ gives the mixed partial derivative $\partial^2 F(x, y)/\partial x\,\partial y$.
Joint probability density
$f_{(X,Y)}(x, y) = \dfrac{\partial^2 F(x, y)}{\partial x\,\partial y}$

$\iint_{\mathbb{R}^2} f_{(X,Y)}(x, y)\, dx\, dy = 1$

$P((X, Y) \in D) = \iint_D f_{(X,Y)}(x, y)\, dx\, dy$
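These identities can be checked numerically. Below is a small sketch using SciPy; the density $f(x,y) = e^{-x-y}$ on the positive quadrant is an illustrative choice (not from the slides):

```python
import numpy as np
from scipy.integrate import dblquad

# Illustrative joint density: f(x, y) = exp(-x - y) for x, y > 0
# (two independent Exp(1) variables), zero elsewhere.
f = lambda y, x: np.exp(-x - y)  # dblquad integrates func(y, x)

# Total mass over the support should be 1.
total, _ = dblquad(f, 0, np.inf, 0, np.inf)

# P((X, Y) in D) for D = (0, 1] x (0, 1].
p_D, _ = dblquad(f, 0, 1, 0, 1)

print(total)  # ~ 1.0
print(p_D)    # ~ (1 - e^{-1})^2 ≈ 0.3996
```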
Joint probability function
$\sum_{(x, y)} f_{(X,Y)}(x, y) = 1$

$P((X, Y) \in D) = \sum_{(x, y) \in D} f_{(X,Y)}(x, y)$
Marginal distributions
$P(a < X \le b) = \int_a^b \left( \int_{-\infty}^{\infty} f_{(X,Y)}(x, y)\, dy \right) dx$

The inner integral is the marginal probability density of X.
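To make this concrete, here is a sketch that recovers the marginal density of X by integrating out y; the joint density $f(x,y) = e^{-x-y}$ on the positive quadrant is again an illustrative choice, whose X-marginal is $e^{-x}$:

```python
import numpy as np
from scipy.integrate import quad

# Illustrative joint density: f(x, y) = exp(-x - y) for x, y > 0.
def joint(x, y):
    return np.exp(-x - y)

def marginal_x(x):
    # Integrate the joint density over all y to get the marginal of X.
    val, _ = quad(lambda y: joint(x, y), 0, np.inf)
    return val

for x in (0.5, 1.0, 2.0):
    print(x, marginal_x(x), np.exp(-x))  # the two columns should agree
```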
Independence
Independent events
Independent stochastic variables
X and Y are independent if and only if

$F_{(X,Y)}(x, y) = F_X(x)\, F_Y(y)$ for all x and y.
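The factorization criterion can be verified exactly for a simple discrete case. A minimal sketch with two independent fair dice (an illustrative example, not from the slides):

```python
from fractions import Fraction

# Two independent fair dice: joint pmf p(x, y) = 1/36 on {1..6}^2.
p = Fraction(1, 36)

def F_joint(x, y):
    # Joint CDF: sum the pmf over all outcomes (i, j) with i <= x, j <= y.
    return sum(p for i in range(1, 7) for j in range(1, 7) if i <= x and j <= y)

def F_marg(x):
    # Marginal CDF of one fair die.
    return Fraction(sum(1 for i in range(1, 7) if i <= x), 6)

# The joint CDF factorizes at every point, confirming independence.
for x in range(1, 7):
    for y in range(1, 7):
        assert F_joint(x, y) == F_marg(x) * F_marg(y)
print("CDF factorizes for all (x, y)")
```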
Covariance
Assume that E(X) = E(Y) = 0. Then, E(XY) can be regarded as a
measure of covariance between X and Y
More generally, we set
$\mathrm{Cov}(X, Y) = E\big[(X - E(X))(Y - E(Y))\big]$
Cov(X , Y) = 0 if X and Y are independent. The converse need not
be true.
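The classic counterexample for the failing converse is X uniform on $(-1, 1)$ with $Y = X^2$: here $\mathrm{Cov}(X, Y) = E[X^3] = 0$, yet Y is a deterministic function of X. A Monte Carlo sketch (the specific distribution is the standard textbook example, not taken from the slides):

```python
import numpy as np

# X uniform on (-1, 1), Y = X^2: uncorrelated but strongly dependent.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200_000)
y = x ** 2

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
print(cov_xy)  # close to 0

# Dependence is immediate: whenever |X| > 0.5, Y must exceed 0.25.
assert np.all(y[np.abs(x) > 0.5] > 0.25)
```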
Covariance rules
$\mathrm{Cov}(X, X) = \mathrm{Var}(X)$
$\mathrm{Cov}(X_1 + X_2, Y) = \ldots$
$\mathrm{Cov}(aX, Y) = \ldots$
$\mathrm{Cov}(X, b) = \ldots$
$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$
Covariance and correlation
Scale-invariant covariance
$\rho = \mathrm{Cov}\!\left(\frac{X}{\sigma_X}, \frac{Y}{\sigma_Y}\right) = E\!\left[\frac{X - E(X)}{\sigma_X} \cdot \frac{Y - E(Y)}{\sigma_Y}\right] = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\, \sigma_Y}$
Inequalities
$\mathrm{Cov}(X, Y)^2 \le \mathrm{Var}(X)\,\mathrm{Var}(Y)$, and hence $\rho^2 \le 1$.

Proof: Assume that $\mathrm{Var}(X) = \mathrm{Var}(Y) = 1$.

Then, observe that

$\mathrm{Var}(aX + Y) = a^2 + 2a\,\mathrm{Cov}(X, Y) + 1 \ge 0$ for $a = 1$ and $a = -1$,

which gives $-1 \le \mathrm{Cov}(X, Y) \le 1$.
Functions of random variables
Let Y = a + bX
Derive the relationship between the probability density
functions of Y and X
Functions of random variables
Let X be uniformly distributed on (0,1) and set
$Y = -\ln X$
Derive the probability density function of Y
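Assuming the intended transformation is $Y = -\ln X$ (the sign is the standard choice here), Y is Exp(1): $P(Y \le y) = P(X \ge e^{-y}) = 1 - e^{-y}$. A Monte Carlo sketch of this claim:

```python
import numpy as np

# If X ~ U(0, 1), then Y = -ln(X) satisfies P(Y <= y) = 1 - e^{-y},
# i.e. Y ~ Exp(1). Check the mean and the CDF at y = 1 by simulation.
rng = np.random.default_rng(1)
x = rng.uniform(size=500_000)
y = -np.log(x)

print(y.mean())           # ~ 1 (mean of Exp(1))
print(np.mean(y <= 1.0))  # ~ 1 - e^{-1} ≈ 0.632
```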
Functions of random variables
Let X have an arbitrary continuous distribution, and suppose
that g is a (differentiable) strictly increasing function. Set
Y g( X )
Then
$F_Y(y) = P(Y \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y))$
and
$f_Y(y) = f_X(g^{-1}(y))\, \dfrac{d}{dy}\, g^{-1}(y) = f_X(g^{-1}(y))\, \dfrac{1}{g'(g^{-1}(y))}$
Linear functions of random vectors
Let (X1, X2) have a uniform distribution on
D = {(x , y); 0 < x <1, 0 < y <1}
Set

$\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix} = \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} + \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix} \begin{pmatrix} X_1 \\ X_2 \end{pmatrix}$, i.e. $Y = a + BX$.

Then

$f_{(Y_1, Y_2)}(y_1, y_2) = \begin{cases} \dfrac{1}{|\det(B)|} = |\det(B^{-1})|, & (y_1, y_2) \in a + BD \\ 0, & \text{otherwise} \end{cases}$
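The constant density $1/|\det(B)|$ can be checked by simulation. A sketch with an illustrative choice of a and B (these specific numbers are not from the slides):

```python
import numpy as np

# Y = a + B X with X uniform on the unit square D = (0, 1)^2.
a = np.array([1.0, -1.0])
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# The two expressions for the constant density value agree:
assert np.isclose(1 / abs(np.linalg.det(B)),
                  abs(np.linalg.det(np.linalg.inv(B))))

# Monte Carlo: inside the image parallelogram a + B D the density is
# 1/|det(B)| = 1/6, so a rectangle of area 0.25 inside it gets mass 0.25/6.
rng = np.random.default_rng(2)
X = rng.uniform(size=(100_000, 2))
Y = a + X @ B.T

inside = ((Y[:, 0] >= 2.0) & (Y[:, 0] <= 2.5) &
          (Y[:, 1] >= 1.0) & (Y[:, 1] <= 1.5))
frac = inside.mean()
print(frac)  # ~ 0.25 / 6 ≈ 0.0417
```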
Functions of random vectors
Let (X1, X2) have an arbitrary continuous distribution, and
suppose that g is a (differentiable) one-to-one transformation.
Set
(Y1 , Y2 ) g ( X1 , X 2 )
Then
$f_{(Y_1, Y_2)}(y_1, y_2) = f_{(X_1, X_2)}(h_1(y_1, y_2), h_2(y_1, y_2)) \left| \det \begin{pmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{pmatrix} \right|$

where $h = (h_1, h_2)$ is the inverse of g.

Proof: Use the variable transformation theorem
Random number generation
Uniform distribution
Bin(2; 0.5)
Po(4)
Exp(1)
Random number generation
- the inversion method
Let F denote the cumulative distribution function of a probability
distribution.
Let Z be uniformly distributed on the interval (0,1)
Then, $X = F^{-1}(Z)$ will have the cumulative distribution function F.
How can we generate normally distributed random numbers?
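For the normal distribution, $F^{-1} = \Phi^{-1}$ has no closed form, so the inversion method needs a numerical implementation of the inverse CDF. A sketch using SciPy's `norm.ppf`:

```python
import numpy as np
from scipy.stats import norm

# Inversion method: if Z ~ U(0, 1), then X = F^{-1}(Z) has CDF F.
# Here F = Phi (standard normal), inverted numerically by scipy.
rng = np.random.default_rng(3)
z = rng.uniform(size=200_000)
x = norm.ppf(z)  # standard normal variates via inversion

print(x.mean(), x.std())  # ~ 0 and ~ 1
```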
Random number generation:
- method 3 (the envelope-rejection method)
Generate x from a probability density g(x) such that $c\,g(x) \ge f(x)$ for all x
Draw u from a uniform distribution on (0,1)
Accept x if $u < f(x)/(c\,g(x))$
Justification: Let X denote a random number from the probability density g. Then

$P(t < X \le t + h,\ \{X \text{ is accepted}\}) \approx h\,g(t) \cdot \dfrac{f(t)}{c\,g(t)} = \dfrac{h\,f(t)}{c}$
[Figure: the target density f(x) and the scaled envelope c·g(x), plotted for x between −6 and 6]
How can we generate normally distributed random numbers?
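One answer to this question: reject from a Laplace envelope. This specific choice of f, g, and c is an illustration, not prescribed by the slides; the ratio $f/g$ is maximized at $|x| = 1$, which gives $c = \sqrt{2e/\pi} \approx 1.3155$:

```python
import numpy as np

# Envelope rejection for the standard normal f with Laplace envelope
# g(x) = 0.5 * exp(-|x|) and c = sqrt(2e/pi), so c*g(x) >= f(x) everywhere.
rng = np.random.default_rng(4)
c = np.sqrt(2 * np.e / np.pi)

def f(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def g(x):
    return 0.5 * np.exp(-np.abs(x))

n = 200_000
# Sample from g by inversion: a Laplace variate is a randomly signed Exp(1).
sign = np.where(rng.uniform(size=n) < 0.5, -1.0, 1.0)
x = -np.log(rng.uniform(size=n)) * sign
u = rng.uniform(size=n)
accepted = x[u < f(x) / (c * g(x))]

print(len(accepted) / n)                # acceptance rate ~ 1/c ≈ 0.76
print(accepted.mean(), accepted.std())  # ~ 0 and ~ 1
```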
Random number generation
- LCGs
Linear congruential generators are defined by the recurrence relation
$Y_{j+1} = a\,Y_j + b \pmod{M}$
Numerical Recipes in C advocates a generator of this form with:
a = 1664525, b = 1013904223, M = $2^{32}$
Drawback: Serial correlation
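The recurrence is a few lines of code. A minimal sketch using the Numerical Recipes constants quoted above (the scaling of the integer state to [0, 1) is a common convention, not part of the slide):

```python
# Linear congruential generator with the Numerical Recipes constants:
# Y_{j+1} = (a * Y_j + b) mod M.
A, B, M = 1664525, 1013904223, 2**32

def lcg(seed, n):
    """Generate n pseudo-random numbers in [0, 1) from the LCG."""
    y = seed
    out = []
    for _ in range(n):
        y = (A * y + B) % M  # the integer state stays in [0, M)
        out.append(y / M)    # scale to [0, 1)
    return out

print(lcg(0, 3))  # deterministic: same seed, same sequence
```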
Exercises: Chapter I
1.3, 1.8, 1.14, 1.18, 1.30, 1.31, 1.33