Math/Stat 425
Joint Distributions of Several Random Variables
Motivation: There are many situations which involve the presence of several random variables,
and we are interested in their joint behavior. For example:
(i) A meteorological station may record the wind speed and direction, the air pressure,
and the air temperature.
(ii) Your physician may record your height, weight, blood pressure, cholesterol level
and more.
Case of several discrete random variables:
Def: For discrete X and Y , the function
p(x, y) = P ({X = x} ∩ {Y = y}) = P (X = x, Y = y)
is called the joint probability mass function of X and Y .
Similarly, for discrete X, Y, V , the function
p(x, y, v) = P ({X = x} ∩ {Y = y} ∩ {V = v}),
is called the joint probability mass function of X, Y, V ; similarly the definition extends to
the case of more than three random variables.
Ex: Two fair dice are rolled, yielding the scores X and Y . Then
p(x, y) = 1/36,  1 ≤ x, y ≤ 6.
Ex.: Suppose a coin shows a head with probability p and tail with probability q = 1 − p.
Let X be the number of flips until the first head, and Y the number of flips until the first
tail. Then
p(x, y) = P(X = x, Y = y) =
    p^{y−1} q,   whenever x = 1 and y = 2, 3, . . .
    p q^{x−1},   whenever y = 1 and x = 2, 3, . . .
    0,           otherwise
is the (joint) probability mass function of X, Y .
Thus we can compute, for example,
P(X + Y ≤ 4) = p(1, 2) + p(2, 1) + p(1, 3) + p(3, 1) = pq + qp + p²q + q²p = pq(2 + p + q) = 3pq, since p + q = 1.
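As a quick sanity check (a sketch added to these notes, assuming a standard Python interpreter; the value p = 0.3 is an arbitrary illustration), one can tabulate the pmf on a truncated grid and confirm both the total mass and P(X + Y ≤ 4) = 3pq:

```python
# Tabulate the joint pmf of the coin-flip example on a truncated grid and
# check that the total mass is ~1 and that P(X + Y <= 4) = 3pq.
p = 0.3          # P(head); any value in (0, 1) works
q = 1.0 - p      # P(tail)
N = 200          # truncation point for the infinite ranges

def pmf(x, y):
    if x == 1 and y >= 2:
        return p ** (y - 1) * q
    if y == 1 and x >= 2:
        return p * q ** (x - 1)
    return 0.0

grid = [(x, y) for x in range(1, N) for y in range(1, N)]
print(sum(pmf(x, y) for x, y in grid))                # ~1.0
print(sum(pmf(x, y) for x, y in grid if x + y <= 4))  # 0.63
print(3 * p * q)                                      # 0.63
```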
In general, the following rule applies:
Let X and Y have joint pmf p(x, y), and let C ⊂ R²; then
P((X, Y) ∈ C) = Σ_{(x_i, y_i) ∈ C} p(x_i, y_i),
where the sum is over all possible points (x_i, y_i) (of the random pair (X, Y)) which belong
to the two-dimensional (Borel) set C.
Ex: Let X and Y have the joint probability mass function given by:
p(x, y) = c (1/2)^x (1/3)^y,  x, y ∈ N.
Find (i) the value of c, (ii) P (X > Y ), (iii) P (X = 2Y ).
Solution: For (i):
1 = Σ_{x=1}^∞ Σ_{y=1}^∞ p(x, y) = c Σ_{x=1}^∞ (1/2)^x × Σ_{y=1}^∞ (1/3)^y = c × (1/2)/(1 − 1/2) × (1/3)/(1 − 1/3) = c × 1 × 1/2 = c/2,
thus c = 2 and p(x, y) = 2 (1/2)^x (1/3)^y.
For (ii),
P(X > Y) = Σ_{y=1}^∞ Σ_{x=y+1}^∞ 2 (1/2)^x (1/3)^y = 2 Σ_{y=1}^∞ [Σ_{u=0}^∞ (1/2)^{u+y+1}] (1/3)^y,
where we put u = x − y − 1. Thus,
P(X > Y) = 2 Σ_{y=1}^∞ [Σ_{u=0}^∞ (1/2)^u] (1/2)^{y+1} (1/3)^y = 2 Σ_{y=1}^∞ (1/(1 − 1/2)) (1/2)^{y+1} (1/3)^y
= 2 × (1/6) Σ_{y=1}^∞ (1/6)^{y−1} = (2 × 1/6)/(1 − 1/6) = 2/5.
For (iii),
P(X = 2Y) = Σ_{y=1}^∞ p(2y, y) = 2 Σ_{y=1}^∞ (1/2)^{2y} (1/3)^y = 2 Σ_{y=1}^∞ (1/12)^y = 2 × (1/12)/(1 − 1/12) = 2 × 1/11 = 2/11.
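A numerical check of all three answers (an added sketch, not from the original notes; plain Python, with the infinite series truncated at 60 terms):

```python
# Confirm c = 2, P(X > Y) = 2/5 and P(X = 2Y) = 2/11 by truncating the
# double series for p(x, y) = 2 (1/2)^x (1/3)^y.
N = 60  # terms beyond this are negligibly small

def pmf(x, y):
    return 2 * 0.5 ** x * (1 / 3) ** y

grid = [(x, y) for x in range(1, N) for y in range(1, N)]
print(sum(pmf(x, y) for x, y in grid))                # ~1.0, so c = 2
print(sum(pmf(x, y) for x, y in grid if x > y))       # ~0.4    = 2/5
print(sum(pmf(x, y) for x, y in grid if x == 2 * y))  # ~0.1818 = 2/11
```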
Case of several continuous random variables:
Def. The random variables X and Y are said to be jointly continuous with joint density
f (x, y), if for all real a < b and c < d,
P(a < X < b, c < Y < d) = ∫_c^d ∫_a^b f(x, y) dx dy,
where naturally we must have that:
∫_{−∞}^∞ ∫_{−∞}^∞ f(x, y) dx dy = 1  and  f(x, y) ≥ 0 for all x, y ∈ R.
In general, for C ⊂ R²,
P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy,
whenever the above integral exists.
Ex. Let X and Y have a joint density
f(x, y) = c(x + y),  0 ≤ x, y ≤ 1.
Find (i) the value of c, (ii) P(X > Y), (iii) P(X < Y²).
Solution: For (i),
1 = c ∫_0^1 ∫_0^1 (x + y) dx dy = c ∫_0^1 (∫_0^1 x dx) dy + c ∫_0^1 (∫_0^1 dx) y dy = c/2 + c/2 = c,
i.e. c = 1 and f (x, y) = (x + y), for 0 ≤ x, y ≤ 1.
For (ii),
P(X > Y) = ∫_0^1 (∫_y^1 (x + y) dx) dy = ∫_0^1 (1/2 − y²/2 + y − y²) dy
= ∫_0^1 (1/2 + y − (3/2) y²) dy = 1/2 + 1/2 − 1/2 = 1/2.
For (iii),
P(X < Y²) = ∫_0^1 (∫_0^{y²} (x + y) dx) dy = ∫_0^1 (y⁴/2 + y³) dy = 1/10 + 1/4 = (2 + 5)/20 = 7/20.
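These three integrals can be verified numerically (an added sketch assuming SciPy is available; dblquad is SciPy's standard double-integration routine):

```python
# Check c = 1, P(X > Y) = 1/2 and P(X < Y^2) = 7/20 for f(x, y) = x + y
# on the unit square via numerical double integration.
from scipy.integrate import dblquad

# dblquad(func, a, b, gfun, hfun) integrates func(inner, outer), where the
# outer variable runs over [a, b] and the inner limits may depend on it.
f = lambda y, x: x + y

mass, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)  # total mass
p_gt, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: x)  # region y < x
p_sq, _ = dblquad(lambda x, y: x + y, 0, 1,
                  lambda y: 0, lambda y: y ** 2)      # region x < y^2

print(mass)  # ~1.0, so c = 1
print(p_gt)  # ~0.5
print(p_sq)  # ~0.35 = 7/20
```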
Marginal Distributions
Discrete Case:
Def. Suppose we know the joint pmf p(x, y) of the random variables X, Y. Then one can
obtain the pmf pX of the random variable X:
pX(x) = P(X = x) = Σ_y P({X = x} ∩ {Y = y}) = Σ_y p(x, y),
where the sum is taken over all possible values y of Y. The distribution of X (given by pX) is
called the marginal distribution of X.
Similarly, the marginal distribution of Y is given by its pmf pY:
pY(y) = Σ_x p(x, y),
where the sum is over all possible values x of X.
Ex. Suppose X and Y have a joint distribution given by:
p(x, y) = 2 (1/2)^x (1/3)^y,  x, y ∈ N.
Then the marginal distribution of X is given by its pmf pX(x):
pX(x) = Σ_{y=1}^∞ 2 (1/2)^x (1/3)^y = 2 (1/2)^x × (1/3)/(1 − 1/3) = (1/2)^x,  x ∈ N.
Thus, for example,
P(X ≤ 3) = pX(1) + pX(2) + pX(3) = 1/2 + 1/4 + 1/8 = 7/8.
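A quick check in plain Python (an added sketch; the series over y is truncated at 60 terms):

```python
# Sum out y numerically to recover pX(x) = (1/2)^x, then P(X <= 3) = 7/8.
N = 60

def pX(x):
    return sum(2 * 0.5 ** x * (1 / 3) ** y for y in range(1, N))

print([round(pX(x), 6) for x in (1, 2, 3)])  # [0.5, 0.25, 0.125]
print(pX(1) + pX(2) + pX(3))                 # 0.875 = 7/8
```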
Continuous Case: If X and Y have a joint density f (x, y) then X has a (marginal) density
fX (x):
fX(x) = ∫_{−∞}^∞ f(x, y) dy
and Y has a (marginal) density fY (y):
fY(y) = ∫_{−∞}^∞ f(x, y) dx.
Ex. Let X and Y have a joint density
f (x, y) = 3(x + y), where 0 ≤ x + y ≤ 1 and x, y > 0.
Find P (X < 0.2).
Solution: The density of X is fX (x) given by:
fX(x) = ∫_0^{1−x} 3(x + y) dy = 3x(1 − x) + (3/2)(1 − x)²
= (3/2)(1 − x)(1 − x + 2x) = (3/2)(1 − x²),  where 0 ≤ x ≤ 1.
Thus,
P(X < 0.2) = ∫_0^{0.2} (3/2)(1 − x²) dx = (3/2) [x − x³/3]_0^{0.2} = 0.3 − 0.2³/2 = 0.296.
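As a numerical cross-check (an added sketch assuming SciPy), one can marginalize f over y by quadrature and then integrate the result:

```python
# Marginalize f(x, y) = 3(x + y) over y numerically and check
# P(X < 0.2) = 0.296.
from scipy.integrate import quad

def fX(x):
    # integrate the joint density over 0 < y < 1 - x
    return quad(lambda y: 3 * (x + y), 0, 1 - x)[0]

print(fX(0.5), 1.5 * (1 - 0.5 ** 2))  # both 1.125, matching (3/2)(1 - x^2)
print(quad(fX, 0, 0.2)[0])            # ~0.296
```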
Independence of random variables
Def. The random variables X1, . . . , Xn are independent if and only if, for all (measurable)
sets A1, . . . , An,
P(X1 ∈ A1, . . . , Xn ∈ An) = Π_{i=1}^n P(Xi ∈ Ai).
In particular, for discrete X and Y , the random variables X and Y are independent if and
only if their probability mass functions satisfy:
pX,Y (x, y) = pX (x)pY (y) for all x, y ∈ R.
If X and Y are jointly continuous random variables, then X and Y are independent if and
only if their densities satisfy:
fX,Y (x, y) = fX (x)fY (y) for all x, y ∈ R.
Ex. For X, Y with the joint density
fX,Y(x, y) = 4 e^{−2(x+y)},  x, y > 0,
check whether X and Y are independent.
The marginal density fX(x) is given by:
fX(x) = ∫_0^∞ 4 e^{−2(x+y)} dy = 2 e^{−2x} ∫_0^∞ 2 e^{−2y} dy = 2 e^{−2x},  x > 0.
Similarly, the marginal density fY (y) is given by:
fY(y) = ∫_0^∞ 4 e^{−2(x+y)} dx = 2 e^{−2y},  y > 0.
Thus, we obtain that
fX,Y(x, y) = 4 e^{−2(x+y)} = 2 e^{−2x} × 2 e^{−2y} = fX(x) fY(y), for all x, y ∈ R.
Therefore, X and Y are independent Exp(λ = 2) random variables.
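A numerical spot-check of this factorization (an added sketch assuming SciPy; the evaluation points are arbitrary):

```python
# Recover fX by numerical integration and spot-check the factorization
# f(x, y) = fX(x) * fY(y) that establishes independence.
import math
from scipy.integrate import quad

f = lambda x, y: 4 * math.exp(-2 * (x + y))
fX = lambda x: quad(lambda y: f(x, y), 0, math.inf)[0]

x0, y0 = 0.4, 1.3
print(fX(x0), 2 * math.exp(-2 * x0))  # marginal matches the Exp(2) density
print(f(x0, y0), fX(x0) * fX(y0))     # joint equals product of marginals
```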
Ex.: If X, Y have a joint density
f (x, y) = 3(x + y), where 0 ≤ x + y ≤ 1 and x, y > 0,
then
fX(x) = (3/2)(1 − x²),  0 ≤ x ≤ 1,
and
fY(y) = (3/2)(1 − y²),  0 ≤ y ≤ 1,
but
fX,Y(x, y) = 3(x + y) ≠ (3/2)(1 − x²) × (3/2)(1 − y²) = fX(x) fY(y),
i.e. random variables X and Y are identically distributed (have the same distribution)
but are NOT independent.
Sums of Independent Random Variables
Ex. Suppose X and Y are independent Exp(λ) random variables. Find the density of
V =X +Y.
Solution: First note that independence implies that the joint density of X and Y is given
by:
fX,Y(x, y) = fX(x) fY(y) = λ² e^{−λx−λy},  x, y > 0.
FV(v) = P(X + Y ≤ v) = P((X, Y) ∈ C) = ∫∫_C f(x, y) dx dy = ∫_0^v ∫_0^{v−x} λ² e^{−λx−λy} dy dx,
where
C = {(x, y) ∈ R² : x > 0, y > 0, x + y ≤ v}.
Thus,
FV(v) = ∫_0^v λ e^{−λx} (∫_0^{v−x} λ e^{−λy} dy) dx = ∫_0^v λ e^{−λx} (1 − e^{−λ(v−x)}) dx
= 1 − e^{−λv} − ∫_0^v λ e^{−λx−λv+λx} dx = 1 − e^{−λv} − λ v e^{−λv},  v > 0.
Therefore, by differentiating the above equality we obtain that
fV(v) = λ e^{−λv} + λ² v e^{−λv} − λ e^{−λv} = λ² v e^{−λv},  v > 0.
That is, V = X + Y is Gamma(α = 2, λ).
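A simulation check (an added sketch assuming NumPy; λ = 1.5 and the seed are arbitrary choices) compares the sample moments of X + Y against the Gamma(2, λ) mean 2/λ and variance 2/λ²:

```python
# Simulate V = X + Y for independent Exp(lam) draws; Gamma(2, lam) has
# mean 2/lam and variance 2/lam^2, so the sample moments should match.
import numpy as np

rng = np.random.default_rng(0)
lam = 1.5
n = 200_000
v = rng.exponential(1 / lam, n) + rng.exponential(1 / lam, n)

print(v.mean(), 2 / lam)      # ~1.3333
print(v.var(), 2 / lam ** 2)  # ~0.8889
```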
In general, the following formula holds:
Convolution Formula: If X and Y are independent continuous random variables with
densities fX (x) and fY (y) respectively, then the density of V = X + Y has the following
form:
fV(v) = ∫_{−∞}^∞ fX(x) fY(v − x) dx.
Proof:
FV(v) = P(X + Y ≤ v) = ∫∫_C fX(x) fY(y) dy dx = ∫_{−∞}^∞ ∫_{−∞}^{v−x} fX(x) fY(y) dy dx,
where
C = {(x, y) ∈ R² : x + y ≤ v}.
Thus,
FV(v) = ∫_{−∞}^∞ fX(x) (∫_{−∞}^{v−x} fY(y) dy) dx = ∫_{−∞}^∞ fX(x) (∫_{−∞}^v fY(u − x) du) dx
= ∫_{−∞}^v (∫_{−∞}^∞ fX(x) fY(u − x) dx) du,
where we put u = y + x.
Thus,
fV(v) = (d/dv) FV(v) = ∫_{−∞}^∞ fX(x) fY(v − x) dx.  □
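To see the formula in action on a different pair of distributions, the following sketch (an addition, assuming SciPy) convolves two Uniform(0, 1) densities numerically; the result should be the triangular density of the sum:

```python
# Apply the convolution formula to two independent Uniform(0, 1) variables;
# the sum has the triangular density: v on [0, 1] and 2 - v on [1, 2].
from scipy.integrate import quad

fU = lambda t: 1.0 if 0.0 <= t <= 1.0 else 0.0

def fV(v):
    # fU(x) * fU(v - x) is nonzero only for max(0, v-1) < x < min(1, v)
    lo, hi = max(0.0, v - 1.0), min(1.0, v)
    if lo >= hi:
        return 0.0
    return quad(lambda x: fU(x) * fU(v - x), lo, hi)[0]

for v in (0.5, 1.0, 1.5):
    print(v, fV(v))  # ~0.5, ~1.0, ~0.5
```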
Ex.: For X and Y independent Exp(λ), and V = X + Y, applying the above convolution
formula,
fV(v) = ∫_{−∞}^∞ fX(x) fY(v − x) dx,
where
fX(x) = 0 for x ≤ 0, and fY(v − x) = 0 for v − x ≤ 0,
i.e. the integrand equals zero if x ≤ 0 or x ≥ v. Therefore,
fV(v) = ∫_0^v λ e^{−λx} λ e^{−λ(v−x)} dx = ∫_0^v λ² e^{−λv} dx = λ² v e^{−λv},  v > 0.