Statistics 510: Notes 19
Reading: Sections 6.3, 6.4, 6.5, 6.7
I. Sums of Independent Random Variables (Chapter 6.3)
It is often important to be able to calculate the distribution
of $X+Y$ from the distribution of X and Y when X and Y are
independent. At the end of last class, we derived the
results:
$$F_{X+Y}(a) = P\{X+Y \le a\} = \int_{-\infty}^{\infty} F_X(a-y)\, f_Y(y)\, dy$$
and
$$f_{X+Y}(a) = \frac{d}{da}\int_{-\infty}^{\infty} F_X(a-y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(a-y)\, f_Y(y)\, dy.$$
Example 1: Sum of two independent uniform random
variables. If X and Y are two independent random
variables, both uniformly distributed on (0,1), calculate the
pdf of $X+Y$.
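A sketch of the computation using the convolution formula above: since $f_Y(y) = 1$ on (0,1) and $f_X(a-y) = 1$ exactly when $0 < a-y < 1$, the integrand equals 1 on the overlap of the two intervals, giving the triangular density
$$f_{X+Y}(a) = \int_0^1 f_X(a-y)\, dy = \begin{cases} a, & 0 \le a \le 1 \\ 2-a, & 1 < a \le 2 \\ 0, & \text{otherwise.} \end{cases}$$
For $0 \le a \le 1$ the overlap is $(0,a)$, of length a; for $1 < a \le 2$ it is $(a-1,1)$, of length $2-a$.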
II. Conditional Distributions (Chapters 6.4-6.5)
(1) The Discrete Case:
If X and Y are jointly distributed discrete random variables,
then for any $y_j$ such that $p_Y(y_j) > 0$, the conditional
probability mass function of X given $Y = y_j$ is
$$p_{X|Y}(x_i \mid y_j) = P(X = x_i \mid Y = y_j) = \frac{P(X = x_i, Y = y_j)}{P(Y = y_j)} = \frac{p_{X,Y}(x_i, y_j)}{p_Y(y_j)}.$$
This is just the conditional probability of the event
$X = x_i$ given that $Y = y_j$.
If X and Y are independent random variables, then the
conditional probability mass function is the same as the
unconditional one. This follows because if X is
independent of Y, then
$$p_{X|Y}(x \mid y) = P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{P(X = x)\, P(Y = y)}{P(Y = y)} = P(X = x).$$
Example 3: In Notes 17, we considered the situation in
which a fair coin is tossed three times independently. Let X
denote the number of heads on the first toss and Y the total
number of heads.
The joint pmf is given in the following table:

               y
   x       0      1      2      3
   0      1/8    2/8    1/8     0
   1       0     1/8    2/8    1/8
What is the conditional probability mass function of X
given Y? Are X and Y independent?
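As a sketch of the computation for one value of y: from the table, $p_Y(1) = 2/8 + 1/8 = 3/8$, so
$$p_{X|Y}(0 \mid 1) = \frac{2/8}{3/8} = \frac{2}{3}, \qquad p_{X|Y}(1 \mid 1) = \frac{1/8}{3/8} = \frac{1}{3}.$$
Since $p_{X|Y}(0 \mid 1) = 2/3 \neq 1/2 = p_X(0)$, the conditional and unconditional pmfs differ, so X and Y are not independent.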
(2) The Continuous Case:
If X and Y have a joint probability density function
$f_{X,Y}(x, y)$, then the conditional pdf of X, given that $Y = y$, is
defined for all values of y such that $f_Y(y) > 0$ by
$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}.$$
To motivate this definition, multiply the left-hand side by
dx and the right-hand side by $(dx\, dy)/dy$ to obtain
$$f_{X|Y}(x \mid y)\, dx = \frac{f_{X,Y}(x, y)\, dx\, dy}{f_Y(y)\, dy} \approx \frac{P\{x \le X \le x+dx,\; y \le Y \le y+dy\}}{P\{y \le Y \le y+dy\}} = P\{x \le X \le x+dx \mid y \le Y \le y+dy\}.$$
In other words, for small values of dx and dy,
$f_{X|Y}(x \mid y)\, dx$ represents the conditional probability that X is
between x and $x+dx$ given that Y is between y and $y+dy$.
The use of conditional densities allows us to define
conditional probabilities of events associated with one
random variable when we are given the value of a second
random variable. That is, if X and Y are jointly continuous,
then for any set A,
$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x \mid y)\, dx.$$
In particular, by letting $A = (-\infty, a]$, we can define the
conditional cdf of X given that $Y = y$ by
$$F_{X|Y}(a \mid y) = P(X \le a \mid Y = y) = \int_{-\infty}^{a} f_{X|Y}(x \mid y)\, dx.$$
Note that we have been able to give workable expressions
for conditional probabilities even though the event on
which we are conditioning (namely the event $Y = y$) has
probability zero.
Example 4: Suppose X and Y are two independent random
variables, both uniformly distributed on (0,1). Let
$T_1 = \min\{X, Y\}$, $T_2 = \max\{X, Y\}$ (these are called the order
statistics of the sample; see Section 6.6). What is the
conditional distribution of $T_2$ given that $T_1 = t$? Are $T_1$ and
$T_2$ independent?
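A sketch of one way to carry this out: the joint density of the order statistics of two independent U(0,1) variables is $f_{T_1,T_2}(t_1, t_2) = 2$ for $0 < t_1 < t_2 < 1$ (either ordering of (X, Y) produces the same pair), and since $P(T_1 > t) = (1-t)^2$, the marginal of $T_1$ is $f_{T_1}(t) = 2(1-t)$ for $0 < t < 1$. Hence
$$f_{T_2|T_1}(t_2 \mid t) = \frac{2}{2(1-t)} = \frac{1}{1-t}, \qquad t < t_2 < 1,$$
so given $T_1 = t$, $T_2$ is uniformly distributed on $(t, 1)$. Because this conditional density depends on t, $T_1$ and $T_2$ are not independent.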
III. Joint Probability Distribution of Functions of Random
Variables
Let $X_1$ and $X_2$ be jointly continuous random variables with
joint pdf $f_{X_1,X_2}$. It is sometimes of interest to obtain the
joint distribution of random variables $Y_1$ and $Y_2$, which arise
as functions of $X_1$ and $X_2$. Specifically, suppose that
$Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ for some functions
$g_1$ and $g_2$.
Assume that the functions $g_1$ and $g_2$ satisfy the following
conditions:
1. The equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be
uniquely solved for $x_1$ and $x_2$, with solutions given by, say,
$x_1 = h_1(y_1, y_2)$, $x_2 = h_2(y_1, y_2)$.
2. The functions $g_1$ and $g_2$ have continuous partial
derivatives at all points $(x_1, x_2)$ and are such that the
following $2 \times 2$ determinant
$$J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\ \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} = \frac{\partial g_1}{\partial x_1}\frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2}\frac{\partial g_2}{\partial x_1} \neq 0$$
at all points $(x_1, x_2)$.
Under these two conditions, it can be shown that the
random variables $Y_1$ and $Y_2$ are jointly continuous with joint
density function given by
$$f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2)\, |J(x_1, x_2)|^{-1}, \qquad (1.1)$$
where $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.
A proof of equation (1.1) proceeds along the following
lines:
$$P\{Y_1 \le y_1, Y_2 \le y_2\} = \iint_{\{(x_1, x_2):\; g_1(x_1, x_2) \le y_1,\; g_2(x_1, x_2) \le y_2\}} f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2.$$
The joint density function can now be obtained by
differentiating the above equation with respect to $y_1$ and
$y_2$. That the result of this differentiation will be equal to
the right-hand side of equation (1.1) is an advanced
calculus result.
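As a quick illustration of formula (1.1) (this transformation is chosen here for simplicity and is not part of the examples below): take $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$. Solving gives $x_1 = (y_1 + y_2)/2$, $x_2 = (y_1 - y_2)/2$, and
$$J(x_1, x_2) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2,$$
so formula (1.1) yields
$$f_{Y_1,Y_2}(y_1, y_2) = \tfrac{1}{2}\, f_{X_1,X_2}\!\left(\tfrac{y_1+y_2}{2}, \tfrac{y_1-y_2}{2}\right).$$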
Example 5: Let $(X, Y)$ denote a random point in the plane,
and assume that the rectangular coordinates X and Y are
independent standard normal random variables. We are
interested in the joint distribution of $(R, \Theta)$, the polar
coordinate representation of the point:
$$R = \sqrt{X^2 + Y^2}, \qquad \Theta = \tan^{-1}(Y/X).$$
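A sketch of how formula (1.1) applies here: the inverse transformation is $x = r\cos\theta$, $y = r\sin\theta$, and a direct computation gives $J(x, y) = 1/r$, so $|J(x, y)|^{-1} = r$. Since $f_{X,Y}(x, y) = \frac{1}{2\pi} e^{-(x^2+y^2)/2}$,
$$f_{R,\Theta}(r, \theta) = \frac{1}{2\pi}\, r\, e^{-r^2/2}, \qquad r > 0,\; 0 \le \theta < 2\pi.$$
This factors into a function of r alone times a constant in $\theta$, so R and $\Theta$ are independent, with $\Theta$ uniform on $(0, 2\pi)$ and R having the Rayleigh density $r e^{-r^2/2}$.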