3.4 Joint Probability Distributions

Joint Probability of two discrete random
variables

Joint probability of two continuous
random variables

Marginal distributions

Conditional probability distributions

Independence of two or more random
variables
1
Reality

In many problems, two or more random variables need to be studied simultaneously. For example:
1. We might wish to study the number of available check-in counters at an airport in conjunction with the number of customers waiting in the queue.
2. We might wish to study the yield of a chemical reaction together with the temperature at which the reaction is run.
Typical questions to ask are:
– "What is the average number of customers in the queue, given that the number of available counters is 5?"
– "Is the yield independent of the temperature?"
– "What is the average yield if the temperature is 40 °C?"
To answer questions of this type, we need to study what are called two-dimensional or multi-dimensional random variables, of both discrete and continuous types.
1. Joint Probability Distribution of Discrete Random Variables
Definition 3.8 Let X and Y be random variables. The ordered pair (X, Y) is called a two-dimensional random variable.
[Figure: a sample point s in the sample space S is mapped to the point (X(s), Y(s)) in the xy-plane.]
Definition 3.8 A function f(x, y) is the joint probability distribution function, or probability mass function, of the two-dimensional discrete random variable (X, Y) if:
1. $f(x, y) \ge 0$ for all (x, y);
2. $\sum_x \sum_y f(x, y) = 1$;
3. $P(X = x, Y = y) = f(x, y)$.
For any region A in the xy-plane,
$P[(X, Y) \in A] = \sum\sum_{(x, y) \in A} f(x, y)$.
f(x, y) represents the probability distribution for the simultaneous occurrence of (X, Y) at any pair (x, y) within the range of the random variables X and Y.
Table for a Joint Probability Distribution
$P(X = x_i, Y = y_j) = p_{ij}$

 X \ Y |  y_1    y_2    ...   y_j   ...
 ------+--------------------------------
  x_1  |  p_11   p_12   ...   p_1j  ...
  x_2  |  p_21   p_22   ...   p_2j  ...
  ...  |
  x_i  |  p_i1   p_i2   ...   p_ij  ...
  ...  |

(1) $p_{ij} \ge 0$ for $i, j = 1, 2, \ldots$
(2) $\sum_i \sum_j p_{ij} = 1$
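As a quick illustration in code (this sketch is an addition, not part of the lecture, and the table values are made up), a joint pmf can be stored as a mapping from pairs (x_i, y_j) to exact fractions and the two conditions checked mechanically:

```python
# A minimal sketch: store a discrete joint pmf p_ij as a dict keyed by (x_i, y_j).
# The values below are hypothetical, chosen only so both conditions hold.
from fractions import Fraction as F

pmf = {
    (0, 0): F(1, 4), (0, 1): F(1, 4),
    (1, 0): F(1, 4), (1, 1): F(1, 4),
}

# Condition (1): every p_ij is non-negative.
assert all(p >= 0 for p in pmf.values())

# Condition (2): the probabilities sum to 1 over all (x_i, y_j).
assert sum(pmf.values()) == 1
```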
Example 3.8, page 75
Two refills for a ballpoint pen are selected at random
from a box that contains 3 blue refills, 2 red refills, and
3 green refills.
If X is the number of blue refills and Y is the number of
red refills selected, find
(a) the joint probability function f(x,y), and
(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
(a) Find the joint probability function f(x, y).
$f(0, 0) = P(X = 0, Y = 0) = \dfrac{\binom{3}{0}\binom{2}{0}\binom{3}{2}}{\binom{8}{2}} = \dfrac{3}{28}$
$f(0, 1) = P(X = 0, Y = 1) = \dfrac{\binom{3}{0}\binom{2}{1}\binom{3}{1}}{\binom{8}{2}} = \dfrac{3}{14}$
In general,
$f(x, y) = P(X = x, Y = y) = \dfrac{\binom{3}{x}\binom{2}{y}\binom{3}{2-x-y}}{\binom{8}{2}}$
The resulting values are shown in Table 3.1, page 75:

 f(x, y) | x = 0   x = 1   x = 2
 --------+----------------------
  y = 0  |  3/28    9/28    3/28
  y = 1  |  3/14    3/14    0
  y = 2  |  1/28    0       0
(b) Find P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
Because (0, 0), (0, 1), and (1, 0) are the only pairs satisfying x + y ≤ 1,
P[(X, Y) ∈ A] = f(0, 0) + f(0, 1) + f(1, 0) = 3/28 + 3/14 + 9/28 = 9/14.
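The whole calculation can also be reproduced in a few lines of Python; the sketch below is an added illustration (not from the textbook), using math.comb for the binomial coefficients and exact fractions:

```python
# Sketch of Example 3.8: f(x, y) = C(3,x) C(2,y) C(3, 2-x-y) / C(8,2).
from fractions import Fraction as F
from math import comb

def f(x, y):
    # Zero outside the range of (X, Y): only two refills are drawn.
    if x < 0 or y < 0 or x + y > 2:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

# The full table (Table 3.1) sums to 1.
assert sum(f(x, y) for x in range(3) for y in range(3)) == 1

# (b) P[(X, Y) in A] with A = {(x, y) : x + y <= 1}.
p_A = sum(f(x, y) for x in range(3) for y in range(3) if x + y <= 1)
print(p_A)  # 9/14
```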
2. Joint probability of two continuous random variables
Definition 3.9 Let X and Y be continuous random variables. The ordered pair (X, Y) is called a two-dimensional continuous random variable. A function f(x, y) is the joint density function of (X, Y) if:
1. $f(x, y) \ge 0$ for all (x, y);
2. $\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$;
3. $P[(X, Y) \in A] = \iint_A f(x, y)\,dx\,dy$ for any region A in the xy-plane.
Example 3.9, page 76
A candy company distributes boxes of chocolates with a
mixture of creams, toffees, and nuts coated in both light and
dark chocolate. For a randomly selected box, let X and Y,
respectively, be the proportions of the light and dark
chocolates that are creams and suppose that the joint density
function is
$f(x, y) = \begin{cases} c(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0, & \text{elsewhere.} \end{cases}$
(Here the support is the unit square; in general it may be circular, triangular, or some other region.)
(a) Determine c.
(b) Find P[(X, Y) ∈ A], where A is the region A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
(c) Find P[(X, Y) ∈ B], where B is the region B = {(x, y) | 0 < x < y < 1}.
Solution
(a) We need $\int_0^1\!\int_0^1 c(2x + 3y)\,dx\,dy = 1$, so
$c = \dfrac{1}{\int_0^1\!\int_0^1 (2x + 3y)\,dx\,dy} = \dfrac{2}{5}$.
(b) $P[(X, Y) \in A] = \int_{1/4}^{1/2}\!\int_0^{1/2} \tfrac{2}{5}(2x + 3y)\,dx\,dy = \dfrac{13}{160}$.
(c) $P[(X, Y) \in B] = \iint_B \tfrac{2}{5}(2x + 3y)\,dx\,dy = \int_0^1\!\left[\int_x^1 \tfrac{2}{5}(2x + 3y)\,dy\right] dx$.
[Figure: the regions A and B inside the unit square; B is the triangle above the line y = x.]
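One way to verify parts (a)–(c) is symbolic integration. The sketch below is an added illustration that assumes sympy (the lecture itself works by hand); it reproduces c = 2/5 and the 13/160 of part (b), and evaluates the part (c) integral set up above.

```python
# Sketch of Example 3.9 with sympy: solve for c, then evaluate parts (b) and (c).
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
f = c * (2*x + 3*y)

# (a) The double integral over the unit square must equal 1.
c_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 1)) - 1, c)[0]
print(c_val)  # 2/5
f = f.subs(c, c_val)

# (b) P(0 < X < 1/2, 1/4 < Y < 1/2).
print(sp.integrate(f, (x, 0, sp.Rational(1, 2)),
                      (y, sp.Rational(1, 4), sp.Rational(1, 2))))  # 13/160

# (c) P(0 < X < Y < 1): integrate y from x to 1, then x from 0 to 1.
print(sp.integrate(f, (y, x, 1), (x, 0, 1)))  # 8/15
```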
3.4 Joint Probability Distributions
– Joint probability of two discrete random variables
– Joint probability of two continuous random variables
– Marginal distributions
– Conditional probability distributions
– Independence of two or more random variables
3. Marginal distributions
Definition 3.10 The marginal distributions of X alone and of Y alone are
$g(x) = P(X = x) = \sum_y f(x, y) = \sum_y P(X = x, Y = y)$,
$h(y) = P(Y = y) = \sum_x f(x, y) = \sum_x P(X = x, Y = y)$
for the discrete case, and
$g(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$,  $h(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$
for the continuous case.
Example 3.10, page 77
Use Table 3.1, page 75, to find the marginal distributions of X and Y in Example 3.8.

 f(x, y) | x = 0   x = 1   x = 2 |  h(y)
 --------+-----------------------+-------
  y = 0  |  3/28    9/28    3/28 | 15/28
  y = 1  |  3/14    3/14    0    |  3/7
  y = 2  |  1/28    0       0    |  1/28
 --------+-----------------------+-------
  g(x)   |  5/14   15/28    3/28 |   1

 x      0       1       2          y      0       1      2
 g(x)   5/14    15/28   3/28       h(y)   15/28   3/7    1/28

g(0) = P(X = 0) = f(0, 0) + f(0, 1) + f(0, 2) = 3/28 + 3/14 + 1/28 = 5/14
g(1) = P(X = 1) = f(1, 0) + f(1, 1) + f(1, 2) = 9/28 + 3/14 + 0 = 15/28
g(2) = P(X = 2) = f(2, 0) + f(2, 1) + f(2, 2) = 3/28 + 0 + 0 = 3/28
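The same marginal sums can be produced mechanically from the joint pmf; the short Python sketch below (an added illustration, reusing the f(x, y) of Example 3.8) sums over one variable at a time.

```python
# Sketch of Example 3.10: marginals g(x) and h(y) by summing the joint pmf.
from fractions import Fraction as F
from math import comb

def f(x, y):
    if x < 0 or y < 0 or x + y > 2:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

g = {x: sum(f(x, y) for y in range(3)) for x in range(3)}  # marginal of X
h = {y: sum(f(x, y) for x in range(3)) for y in range(3)}  # marginal of Y

for x in range(3):
    print('g(%d) =' % x, g[x])  # 5/14, 15/28, 3/28
for y in range(3):
    print('h(%d) =' % y, h[y])  # 15/28, 3/7, 1/28
```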
Example
Suppose (X, Y) has the joint density function
$f(x, y) = \begin{cases} 6e^{-2x-3y}, & x > 0,\ y > 0 \\ 0, & \text{elsewhere.} \end{cases}$
(a) Find $P(X \le x, Y \le y)$ for x > 0, y > 0.
$P(X \le x, Y \le y) = \int_0^x\!\int_0^y 6e^{-2t-3s}\,ds\,dt = (1 - e^{-2x})(1 - e^{-3y})$
(b) Find the marginal distribution of X.
$g(x) = \int_0^{\infty} 6e^{-2x-3y}\,dy = 2e^{-2x}$ for x ≥ 0, and g(x) = 0 elsewhere.
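Both integrals can be checked symbolically; the following sympy sketch is an added illustration (sympy is my assumption, not part of the lecture).

```python
# Sketch: verify the joint CDF and the marginal of X for f(x, y) = 6 exp(-2x - 3y).
import sympy as sp

t, s, x, y = sp.symbols('t s x y', positive=True)

# (a) P(X <= x, Y <= y); the slide gives (1 - e^{-2x})(1 - e^{-3y}).
F_xy = sp.integrate(6 * sp.exp(-2*t - 3*s), (t, 0, x), (s, 0, y))
print(sp.factor(F_xy))

# (b) Marginal density of X; the slide gives 2 e^{-2x} for x >= 0.
g = sp.integrate(6 * sp.exp(-2*x - 3*y), (y, 0, sp.oo))
print(g)  # 2*exp(-2*x)
```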
4. Conditional probability distributions
Clearly, for the discrete case,
$P(X = x \mid Y = y) = \dfrac{P(X = x, Y = y)}{P(Y = y)} = \dfrac{f(x, y)}{h(y)}$.
For the continuous case, it can be shown that
$P(X \le x \mid Y = y) = \int_{-\infty}^{x} \dfrac{f(t, y)}{h(y)}\,dt$.
Hence it is natural to define the conditional probability distributions as follows.
Definition
Definition 3.11 Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable X, given Y = y, is
$f(x \mid y) = \dfrac{f(x, y)}{h(y)}$,  h(y) > 0.
Similarly, the conditional distribution of the random variable Y, given X = x, is
$f(y \mid x) = \dfrac{f(x, y)}{g(x)}$,  g(x) > 0.
Then
$P(a < X < b \mid Y = y) = \sum_{a < x < b} f(x \mid y)$ for the discrete case, and
$P(a < X < b \mid Y = y) = \int_a^b f(x \mid y)\,dx$ for the continuous case.
Example 3.12, page 79
Referring to Example 3.8 and Table 3.1, find the conditional distribution of X, given that Y = 1, and use it to determine P(X = 0 | Y = 1).

Since h(1) = 3/7 (see Example 3.10), for x = 0, 1, 2:
f(x | 1) = f(x, 1)/h(1) = (7/3) f(x, 1)
f(0 | 1) = (7/3) f(0, 1) = (7/3)(3/14) = 1/2
f(1 | 1) = (7/3) f(1, 1) = (7/3)(3/14) = 1/2
f(2 | 1) = (7/3) f(2, 1) = (7/3)(0) = 0

 x          0     1     2
 f(x | 1)   1/2   1/2   0

Therefore P(X = 0 | Y = 1) = f(0 | 1) = f(0, 1)/h(1) = (3/14)/(3/7) = 1/2.
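The same conditional pmf can be read off Table 3.1 by dividing the Y = 1 row by h(1); the Python sketch below is an added illustration with the table values hard-coded as exact fractions.

```python
# Sketch of Example 3.12: f(x | Y = 1) = f(x, 1) / h(1) from Table 3.1.
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

h1 = sum(p for (x, y), p in joint.items() if y == 1)   # h(1) = 3/7
cond = {x: joint[(x, 1)] / h1 for x in range(3)}       # f(x | 1)
print(cond[0], cond[1], cond[2])  # 1/2 1/2 0
```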
Example 3.14, page 81
Given the joint density function
$f(x, y) = \begin{cases} x(1 + 3y^2)/4, & 0 < x < 2,\ 0 < y < 1 \\ 0, & \text{elsewhere,} \end{cases}$
find g(x), h(y), f(x | y), and evaluate P(1/4 < X < 1/2 | Y = 1/3).
Solution:
By definition,
$g(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_0^1 \dfrac{x(1 + 3y^2)}{4}\,dy = \left.\left(\dfrac{xy}{4} + \dfrac{xy^3}{4}\right)\right|_{y=0}^{y=1} = \dfrac{x}{2}$,  0 < x < 2,
$h(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_0^2 \dfrac{x(1 + 3y^2)}{4}\,dx = \left.\left(\dfrac{x^2}{8} + \dfrac{3x^2 y^2}{8}\right)\right|_{x=0}^{x=2} = \dfrac{1 + 3y^2}{2}$,  0 < y < 1.
Therefore,
$f(x \mid y) = \dfrac{f(x, y)}{h(y)} = \dfrac{x(1 + 3y^2)/4}{(1 + 3y^2)/2} = \dfrac{x}{2}$,  0 < x < 2,
and
$P\left(\tfrac{1}{4} < X < \tfrac{1}{2} \mid Y = \tfrac{1}{3}\right) = \int_{1/4}^{1/2} \dfrac{x}{2}\,dx = \dfrac{3}{64}$.
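As a cross-check, the marginals, the conditional density, and the final probability can be recomputed symbolically; the sympy sketch below is an added illustration, not the textbook's method.

```python
# Sketch of Example 3.14: g(x), h(y), f(x | y), and P(1/4 < X < 1/2 | Y = 1/3).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x * (1 + 3*y**2) / 4

g = sp.integrate(f, (y, 0, 1))       # g(x) = x/2 on 0 < x < 2
h = sp.integrate(f, (x, 0, 2))       # h(y) = (1 + 3y^2)/2 on 0 < y < 1
f_x_given_y = sp.simplify(f / h)     # f(x | y) = x/2
print(g, h, f_x_given_y)

p = sp.integrate(f_x_given_y, (x, sp.Rational(1, 4), sp.Rational(1, 2)))
print(p)  # 3/64
```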
5. Independence of two or more random variables
Definition 3.12 Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if
f(x, y) = g(x)h(y)
for all (x, y) within their range.
Example for the discrete case:

 Y \ X |  -1     1/2     2   |  h(y)
 ------+---------------------+------
   0   |  1/20   1/20   2/20 |  1/5
   1   |  2/20   2/20   4/20 |  2/5
   2   |  2/20   2/20   4/20 |  2/5
 ------+---------------------+------
  g(x) |  1/4    1/4    1/2  |   1

Since f(x, y) = g(x)h(y) for all (x, y), X and Y are statistically independent.
Examples
1. Example 3.15, page 82. From Table 3.1, page 75:
f(0, 1) = 3/14, g(0) = 5/14, h(1) = 3/7,
so f(0, 1) ≠ g(0)h(1), and X and Y are therefore not statistically independent.
2. Example 3.14, page 81: g(x)h(y) = (x/2) · (1 + 3y²)/2 = x(1 + 3y²)/4 = f(x, y), so X and Y are statistically independent.
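The failing cell in Example 3.15 can also be found mechanically by comparing every entry of Table 3.1 with the product of its marginals; the Python sketch below is an added illustration.

```python
# Sketch of Example 3.15: test f(x, y) = g(x) h(y) over every cell of Table 3.1.
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
         (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

g = {x: sum(p for (xx, yy), p in joint.items() if xx == x) for x in range(3)}
h = {y: sum(p for (xx, yy), p in joint.items() if yy == y) for y in range(3)}

independent = all(joint[(x, y)] == g[x] * h[y] for x in range(3) for y in range(3))
print(independent)                 # False
print(joint[(0, 1)], g[0] * h[1])  # 3/14 and 15/98 differ
```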
Read pages 82–83:
• Joint probability distributions for more than two random variables.
• The different marginal distributions and conditional distributions.
Definition 3.13 Let X1, X2, …, Xn be n random variables, discrete or continuous, with joint distribution f(x1, x2, …, xn) and marginal distributions f1(x1), f2(x2), …, fn(xn), respectively. The random variables X1, X2, …, Xn are said to be mutually statistically independent if and only if
f(x1, x2, …, xn) = f1(x1) f2(x2) ··· fn(xn)
for all x1, x2, …, xn.
• Remark: Distribution of X = X1 + X2 + ··· + Xn.
Example 3.16, page 83
Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given by
$f(x) = \begin{cases} e^{-x}, & x > 0 \\ 0, & \text{elsewhere.} \end{cases}$
Let X1, X2, and X3 represent the shelf lives of three of these containers selected independently. Find P(X1 < 2, 1 < X2 < 3, X3 > 2).
$P(X_1 < 2,\ 1 < X_2 < 3,\ X_3 > 2) = \int_2^{\infty}\!\int_1^3\!\int_0^2 e^{-x_1 - x_2 - x_3}\,dx_1\,dx_2\,dx_3 = (1 - e^{-2})(e^{-1} - e^{-3})e^{-2} \approx 0.0372$
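Because the three shelf lives are independent, the triple integral factors into a product of three one-dimensional integrals; the sympy sketch below is an added illustration of that factorization.

```python
# Sketch of Example 3.16: P(X1 < 2, 1 < X2 < 3, X3 > 2) for independent X1, X2, X3.
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-x)  # common density e^{-x}, x > 0

p = (sp.integrate(f, (x, 0, 2))         # P(X1 < 2)     = 1 - e^{-2}
     * sp.integrate(f, (x, 1, 3))       # P(1 < X2 < 3) = e^{-1} - e^{-3}
     * sp.integrate(f, (x, 2, sp.oo)))  # P(X3 > 2)     = e^{-2}
print(float(p))  # about 0.0372
```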
Example 4
The joint density function of X and Y is given by
$f(x, y) = x e^{-x(y+1)}$,  x > 0, y > 0.
Determine the conditional densities.
Solution: The marginal densities of X and Y are
$g(x) = e^{-x}$,  x > 0,
$h(y) = \dfrac{1}{(y+1)^2}$,  y > 0.
Hence, from the formulas for conditional densities, we have
$f(x \mid y) = \dfrac{x e^{-x(y+1)}}{1/(y+1)^2} = x(y+1)^2 e^{-x(y+1)}$,
$f(y \mid x) = \dfrac{x e^{-x(y+1)}}{e^{-x}} = x e^{-xy}$.
Because f(x, y) ≠ g(x)h(y), it is clear that X and Y are not independent.
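The marginals and conditionals of this last example can be re-derived symbolically as well; the sympy sketch below is an added illustration and also confirms that f(x, y) and g(x)h(y) differ.

```python
# Sketch of Example 4: marginals and conditionals of f(x, y) = x e^{-x(y+1)}.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x * sp.exp(-x * (y + 1))

g = sp.integrate(f, (y, 0, sp.oo))   # g(x) = e^{-x}
h = sp.integrate(f, (x, 0, sp.oo))   # h(y) = 1/(y + 1)^2
print(sp.simplify(g), sp.simplify(h))

f_x_given_y = sp.simplify(f / h)     # x (y + 1)^2 e^{-x(y + 1)}
f_y_given_x = sp.simplify(f / g)     # x e^{-x y}
print(f_x_given_y, f_y_given_x)

print(sp.simplify(f - g * h) == 0)   # False, so X and Y are not independent
```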