Joint Distributed Random Variables
Joint Distribution Function (§ 6.1)
Independent Random Variables (§ 6.2)
Sums of Independent Random Variables (§ 6.3)
Conditional Distribution
Discrete Case (§ 6.4)
Continuous Case (§ 6.5)
Qihao Xie
Introduction to Probability and Basic Statistical Inference
♦ Joint Distribution Function
Definition
For any two random variables X and Y, the joint cumulative probability distribution function of X and Y is defined to be
F(x, y) = P{X ≤ x, Y ≤ y},  −∞ < x, y < ∞.
Note 5.1: The joint cumulative probability distribution function satisfies the following conditions:
1. F(∞, ∞) = 1.
2. F(−∞, ∞) = F(−∞, y) = F(x, −∞) = 0.
3. If a1 ≤ a2 and b1 ≤ b2, then F(a1, b1) ≤ F(a2, b2).
Note 5.2: The marginal distribution functions of X and Y can be obtained from the joint cumulative probability distribution function of X and Y such that
FX(x) = lim_{y→∞} F(x, y) = F(x, ∞);
FY(y) = lim_{x→∞} F(x, y) = F(∞, y).
Note 5.3: In theory, all joint probability statements about X and Y can be answered in terms of their joint distribution function; for example,
P{a1 < X ≤ a2, b1 < Y ≤ b2} = F(a2, b2) − F(a1, b2) − F(a2, b1) + F(a1, b1).
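As a numerical sanity check, the sketch below evaluates both sides of this identity for a bivariate normal distribution; the correlation 0.5 and the rectangle endpoints are arbitrary illustrative choices, not from the slides.

```python
# Sketch: check the rectangle identity of Note 5.3 against a Monte Carlo estimate.
import numpy as np
from scipy.stats import multivariate_normal

rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

def F(x, y):
    """Joint CDF F(x, y) = P{X <= x, Y <= y}."""
    return rv.cdf(np.array([x, y]))

a1, a2, b1, b2 = -0.5, 1.0, -1.0, 0.5

# P{a1 < X <= a2, b1 < Y <= b2} by inclusion-exclusion on the joint CDF
rect = F(a2, b2) - F(a1, b2) - F(a2, b1) + F(a1, b1)

# Monte Carlo estimate of the same probability
xy = rv.rvs(size=200_000, random_state=0)
mc = np.mean((xy[:, 0] > a1) & (xy[:, 0] <= a2) & (xy[:, 1] > b1) & (xy[:, 1] <= b2))

print(rect, mc)  # the two values agree to about two decimal places
```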
Definition
The joint probability mass function of the discrete random variables X and Y is defined to be
p(x, y) = P{X = x, Y = y}.
Note 5.4: The joint cumulative distribution function of the discrete random variables X and Y can be expressed as
F(x, y) = P{X ≤ x, Y ≤ y} = ∑_{s≤x} ∑_{t≤y} p(s, t).
Note 5.5: The joint probability mass function of X and Y satisfies the following conditions:
1. p(x, y) ≥ 0 for every possible pair (x, y).
2. ∑x ∑y p(x, y) = 1, where the sum runs over all possible pairs (x, y).
Note 5.6: The marginal probability mass functions for the
univariate random variables X and Y can be derived from the
joint probability mass function of X and Y such that
pX(x) = P{X = x} = ∑_{y: p(x,y)>0} p(x, y);
pY(y) = P{Y = y} = ∑_{x: p(x,y)>0} p(x, y).
Expected Value of g(X, Y)
Let p(x, y) be the joint probability mass function of the discrete random variables X and Y. Then the expected value of a real-valued function g(X, Y) is given by
E[g(X, Y)] = ∑x ∑y g(x, y) p(x, y).
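A minimal sketch of this formula in Python; the joint pmf table below is a hypothetical toy example, not one from the slides.

```python
# Sketch: E[g(X, Y)] = sum over x, y of g(x, y) p(x, y) for a discrete joint pmf.
# The pmf table is a made-up toy example.
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def expected_value(pmf, g):
    """Expected value of g(X, Y) when the joint pmf is given as {(x, y): p}."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expected_value(pmf, lambda x, y: x * y))  # E[XY] = 0.4
print(expected_value(pmf, lambda x, y: x + y))  # E[X + Y] = 1.2
```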
Example 5.1
Consider a random experiment in which a ball is drawn at random from a box containing 10 balls. Each ball has an ordered pair of numbers on it: (1, 1), (2, 1), and (1, 2) appear on 1 ball each; (3, 1) and (2, 2) appear on 2 balls each; and (3, 2) appears on 3 balls. Let X and Y be the random variables representing, respectively, the first and second values of the ordered pair. Find (1) the joint probability mass function of X and Y, (2) pX(x) and pY(y), and (3) E(X) and E(Y).
Solution:
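Solution sketch: each of the 10 balls is equally likely, so p(x, y) is the number of balls labelled (x, y) divided by 10.
(1) p(1, 1) = p(2, 1) = p(1, 2) = 1/10, p(3, 1) = p(2, 2) = 2/10, p(3, 2) = 3/10, and p(x, y) = 0 otherwise.
(2) Summing over the other variable: pX(1) = 2/10, pX(2) = 3/10, pX(3) = 5/10; pY(1) = 4/10, pY(2) = 6/10.
(3) E(X) = 1(2/10) + 2(3/10) + 3(5/10) = 2.3 and E(Y) = 1(4/10) + 2(6/10) = 1.6.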
Definition
Given two random variables X and Y, suppose there exists a nonnegative function f(x, y), defined for all real x and y, such that for every set C of pairs of real numbers
P{(X, Y) ∈ C} = ∫∫_{(x,y)∈C} f(x, y) dx dy.
Then X and Y are said to be jointly continuous random variables, and the function f(x, y) is called the joint probability density function of X and Y.
Note 5.7: Given any two sets of real numbers A and B, define C = {(x, y) : x ∈ A, y ∈ B}; then
P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy.
Note 5.8: The joint cumulative distribution function of the continuous random variables X and Y can be expressed as
F(x, y) = P{X ≤ x, Y ≤ y} = ∫_{−∞}^{x} ∫_{−∞}^{y} f(s, t) dt ds.
Note 5.9: The joint probability density function of X and Y can be derived from the joint cumulative distribution function F(x, y) such that
f(x, y) = ∂²F(x, y)/∂x∂y.
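A short symbolic check of this relation; the CDF below, that of two independent Exp(1) random variables, is an illustrative assumption.

```python
# Sketch: recover f(x, y) from F(x, y) by mixed partial differentiation (Note 5.9).
import sympy as sp

x, y = sp.symbols("x y", positive=True)
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))  # joint CDF of independent Exp(1)s on x, y > 0
f = sp.diff(F, x, y)                     # f(x, y) = d^2 F / (dx dy)
print(sp.simplify(f))                    # e^(-x) e^(-y), displayed as exp(-x - y)
```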
Note 5.10: The joint probability density function of X and Y satisfies the following conditions:
1. f(x, y) ≥ 0 for every possible pair (x, y).
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
Note 5.11: The marginal probability density functions for the
univariate random variables X and Y can be derived from the
joint probability density function of X and Y such that
fX(x) = ∫_{−∞}^{∞} f(x, y) dy;
fY(y) = ∫_{−∞}^{∞} f(x, y) dx.
Expected Value of g(X, Y)
Let f(x, y) be the joint probability density function of the continuous random variables X and Y. Then the expected value of a real-valued function g(X, Y) is given by
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy.
Example 5.2
If f(x, y) = 6x²y for 0 < x, y < 1, and 0 otherwise, find (1) P{0 < X < 3/4, 1/3 < Y < 3}, (2) F(x, y), (3) fX(x) and fY(y), and (4) E(X) and E(Y).
Solution:
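Solution sketch:
(1) Since f vanishes outside 0 < y < 1, P{0 < X < 3/4, 1/3 < Y < 3} = ∫_{1/3}^{1} ∫_{0}^{3/4} 6x²y dx dy = [2x³]_{0}^{3/4} · [y²/2]_{1/3}^{1} = (27/32)(4/9) = 3/8.
(2) F(x, y) = ∫_{0}^{x} ∫_{0}^{y} 6s²t dt ds = x³y² for 0 < x, y < 1; on the boundary regions, F(x, y) = x³ for 0 < x < 1 and y ≥ 1, F(x, y) = y² for x ≥ 1 and 0 < y < 1, F(x, y) = 1 for x, y ≥ 1, and F(x, y) = 0 if x ≤ 0 or y ≤ 0.
(3) fX(x) = ∫_{0}^{1} 6x²y dy = 3x² for 0 < x < 1; fY(y) = ∫_{0}^{1} 6x²y dx = 2y for 0 < y < 1.
(4) E(X) = ∫_{0}^{1} x · 3x² dx = 3/4 and E(Y) = ∫_{0}^{1} y · 2y dy = 2/3.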
♦ Independent Random Variables
Definition
The random variables X and Y are said to be independent if, for any two sets of real numbers A and B,
P{X ∈ A, Y ∈ B} = P{X ∈ A} · P{Y ∈ B}.
Note 5.12: The random variables X and Y are independent if, and only if,
F(x, y) = FX(x) FY(y) for all x and y,
where FX(x) and FY(y) are the marginal distribution functions of X and Y, respectively.
Proof: (exercise)
Note 5.13: The discrete random variables X and Y are independent if, and only if,
p(x, y) = pX(x) pY(y) for all x and y,
where pX(x) and pY(y) are the marginal probability mass functions of X and Y, respectively.
Proof:
Example 5.3
The random variables X and Y in Example 5.1 are not independent.
Solution:
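Solution sketch: from Example 5.1, p(1, 1) = 1/10, while pX(1) pY(1) = (2/10)(4/10) = 8/100 ≠ 1/10. Since the joint pmf fails to factor at (1, 1), X and Y are not independent.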
Note 5.14: The continuous random variables X and Y are independent if, and only if,
f(x, y) = fX(x) fY(y) for all x and y,
where fX(x) and fY(y) are the marginal probability density functions of X and Y, respectively.
Proof: (exercise)
Example 5.4
The random variables X and Y in Example 5.2 are independent.
Solution:
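Solution sketch: for 0 < x, y < 1, f(x, y) = 6x²y = (3x²)(2y) = fX(x) fY(y), using the marginals found in Example 5.2, and both sides vanish elsewhere; hence X and Y are independent.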
Proposition 5.1
Two random variables X and Y are independent if, and only if, there exist two functions g(x) and h(y) such that
1. (discrete case) the joint probability mass function can be expressed as p(x, y) = g(x)h(y) for every x ∈ R, y ∈ R;
2. (continuous case) the joint probability density function can be expressed as f(x, y) = g(x)h(y) for every x ∈ R, y ∈ R.
Proof:
♦ Sums of Independent Random Variables
Proposition 5.2
Suppose that X and Y are two independent continuous random variables having probability density functions fX(x) and fY(y), respectively. Then the probability density function of the random variable Z = X + Y is given by
fZ(z) = ∫_{−∞}^{∞} fX(z − y) fY(y) dy.
Note: fZ is called the convolution of the density functions fX and fY.
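A numerical sketch of the convolution formula; taking both densities to be Exp(1) (so λ = 1, an illustrative choice) also previews Corollary 5.2 below, since the result should match the Gamma(2, 1) density.

```python
# Sketch: numerically convolve two Exp(1) densities and compare with Gamma(2, 1).
import numpy as np
from scipy.stats import gamma

dz = 0.001
z = np.arange(0.0, 20.0, dz)
f_exp = np.exp(-z)  # Exp(1) density on z >= 0

# f_Z(z) = integral of f_X(z - y) f_Y(y) dy, approximated by a discrete convolution
f_Z = np.convolve(f_exp, f_exp)[: len(z)] * dz

# Gamma(2, 1) has density z e^(-z); compare the two at z = 1
print(f_Z[1000], gamma(a=2, scale=1.0).pdf(z[1000]))  # both ~ e^(-1) ~ 0.368
```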
Corollary 5.1
If X ∼ Poi(λ1) and Y ∼ Poi(λ2) are two independent random variables, then Z = X + Y ∼ Poi(λ1 + λ2).
Proof:
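A sketch of the standard argument, with Z = X + Y: for n = 0, 1, 2, . . . ,
P{Z = n} = ∑_{k=0}^{n} P{X = k} P{Y = n − k}
= ∑_{k=0}^{n} e^{−λ1} (λ1^k / k!) · e^{−λ2} (λ2^{n−k} / (n − k)!)
= e^{−(λ1+λ2)} (1/n!) ∑_{k=0}^{n} C(n, k) λ1^k λ2^{n−k}
= e^{−(λ1+λ2)} (λ1 + λ2)^n / n!,
by the binomial theorem, which is the Poi(λ1 + λ2) probability mass function.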
Corollary 5.2
If X ∼ Exp(λ) and Y ∼ Exp(λ) are two independent random variables, then Z = X + Y ∼ Gamma(2, λ).
Proof:
Corollary 5.3
If X ∼ Gamma(α1, λ) and Y ∼ Gamma(α2, λ) are two independent random variables, then Z = X + Y ∼ Gamma(α1 + α2, λ).
Proof:
Corollary 5.4
If X ∼ N(µ1, σ1²) and Y ∼ N(µ2, σ2²) are two independent random variables, then Z = X + Y ∼ N(µ1 + µ2, σ1² + σ2²).
Proof:
Corollary 5.5
If X1, . . . , Xn are independent random variables, where Xi follows a normal distribution with mean µi and variance σi² for i = 1, . . . , n, then
∑_{i=1}^{n} Xi ∼ N(∑_{i=1}^{n} µi, ∑_{i=1}^{n} σi²).
Proof: (exercise)
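A quick Monte Carlo sanity check of Corollary 5.5; the means and standard deviations below are arbitrary illustrative choices.

```python
# Sketch: simulate a sum of independent normals and check its mean and variance.
import numpy as np

rng = np.random.default_rng(0)
mus = np.array([1.0, -2.0, 0.5])
sigmas = np.array([1.0, 2.0, 0.5])

# 500,000 draws of X_1 + X_2 + X_3 with X_i ~ N(mu_i, sigma_i^2)
totals = rng.normal(mus, sigmas, size=(500_000, 3)).sum(axis=1)

print(totals.mean(), mus.sum())          # both ~ -0.5
print(totals.var(), (sigmas**2).sum())   # both ~ 5.25
```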
♦ Conditional Distribution
DISCRETE CASE
Definition
If X and Y are jointly discrete random variables having a joint probability mass function p(x, y), then the conditional probability mass function of X given Y = y is defined to be
pX|Y(x|y) = p(x, y) / pY(y),
for all y such that pY(y) > 0, where pY(y) is the marginal probability mass function of Y.
Note 5.15: Suppose that pX|Y(x|y) is the conditional probability mass function of X given Y = y; then the conditional cumulative distribution function of X given Y = y is
FX|Y(x|y) = P{X ≤ x | Y = y} = ∑_{t≤x} pX|Y(t|y).
Note 5.16: If X and Y are independent discrete random
variables, then
pX|Y(x|y) = pX(x) and pY|X(y|x) = pY(y).
Proof:
Example 5.5
Find pX|Y(x|y) for the random variables given in Example 5.1.
Solution:
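Solution sketch: with pY(1) = 4/10 and pY(2) = 6/10 from Example 5.1,
pX|Y(1|1) = 1/4, pX|Y(2|1) = 1/4, pX|Y(3|1) = 1/2;
pX|Y(1|2) = 1/6, pX|Y(2|2) = 1/3, pX|Y(3|2) = 1/2.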
CONTINUOUS CASE
Definition
If X and Y are jointly continuous random variables having a joint probability density function f(x, y), then the conditional probability density function of X given Y = y is defined to be
fX|Y(x|y) = f(x, y) / fY(y),
for all y such that fY(y) > 0, where fY(y) is the marginal probability density function of Y.
Note 5.19: Suppose that fX|Y(x|y) is the conditional probability density function of X given Y = y; then the conditional cumulative distribution function of X given Y = y is
FX|Y(x|y) = P{X ≤ x | Y = y} = ∫_{−∞}^{x} fX|Y(t|y) dt.
Note 5.20: If X and Y are independent continuous random
variables, then
fX|Y(x|y) = fX(x) and fY|X(y|x) = fY(y).
Proof:
Example 5.6
Find fX|Y(x|y) for the random variables given in Example 5.2.
Solution:
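Solution sketch: from Example 5.2, fY(y) = 2y for 0 < y < 1, so for such y,
fX|Y(x|y) = 6x²y / (2y) = 3x², 0 < x < 1,
which equals fX(x), as expected since X and Y were shown to be independent in Example 5.4.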