Lecture 29
Agenda
1. Independence of random variables
2. Use of independence in relation to Mgf
Independence of random variables
We recall that two events A and B are said to be independent if
P (A ∩ B) = P (A)P (B)
i.e. (provided P (B) > 0)
P (A|B) = P (A),
i.e. the information that B has happened does not affect the probability of
A in any way.
We also recall that for two discrete random variables X and Y , we said
that X and Y are independent if
P (X = x, Y = y) = P (X = x)P (Y = y)
∀x ∈ Range(X), y ∈ Range(Y ). This is also the same as saying
P (Y = y|X = x) = P (Y = y)
∀x ∈ Range(X), y ∈ Range(Y ). We now turn our attention to continuous
random variables.
Definition 1. Let X and Y be two continuous random variables. We say X
and Y are independent if
fX,Y (x, y) = fX (x) × fY (y)
∀x, y ∈ R.
We can verify that the above definition is the same as saying
fY |X=x (y) = fY (y)
∀x such that fX (x) > 0 and ∀y ∈ R.
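As a quick illustration (a minimal numerical sketch, not part of the lecture; the density 4xy is an assumed example), one can check the factorization in Definition 1 for the joint density fX,Y(x, y) = 4xy on [0, 1]²: computing both marginals by integration shows that their product recovers the joint density, so X and Y are independent.

```python
# A minimal numerical sketch (assumed example): for the joint density
# f_{X,Y}(x, y) = 4xy on [0, 1]^2, compute both marginals by integration
# and check that their product recovers the joint density, i.e. X and Y
# are independent in the sense of Definition 1.
from scipy.integrate import quad

def f_joint(x, y):
    return 4.0 * x * y  # joint density on the unit square

def f_X(x):
    # marginal of X: integrate the joint density over y
    return quad(lambda y: f_joint(x, y), 0.0, 1.0)[0]

def f_Y(y):
    # marginal of Y: integrate the joint density over x
    return quad(lambda x: f_joint(x, y), 0.0, 1.0)[0]

for (x, y) in [(0.2, 0.7), (0.5, 0.5), (0.9, 0.1)]:
    assert abs(f_joint(x, y) - f_X(x) * f_Y(y)) < 1e-7
print("f_X,Y(x, y) = f_X(x) f_Y(y) at all test points")
```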
We also recall that for two discrete random variables X and Y ,
if X and Y are independent
then
E [g(X)h(Y )] = E[g(X)] × E[h(Y )]
for any two functions g() and h().
A similar result holds for continuous random variables.
Lemma 1. Let X and Y be two continuous random variables.
If X and Y are independent
then
E [g(X)h(Y )] = E[g(X)] × E[h(Y )]
for any two functions g() and h() for which the expectations exist.
Proof.
E[g(X)h(Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x)h(y) fX,Y(x, y) dx dy
            = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x)h(y) fX(x) fY(y) dx dy    (by independence)
            = ∫_{−∞}^{∞} fY(y)h(y) ( ∫_{−∞}^{∞} g(x)fX(x) dx ) dy
            = ∫_{−∞}^{∞} fY(y)h(y) E[g(X)] dy
            = E[g(X)] ∫_{−∞}^{∞} fY(y)h(y) dy
            = E[g(X)] E[h(Y)]
Thus the same result holds for continuous random variables as well. But
this is not a coincidence. If you take a course in “Advanced Probability”,
you will prove the following theorem.
Theorem 1. If {X1 , X2 , . . . , Xk } are k independent random variables,
and f1 (), f2 (), . . . , fk () are any k functions,
then
E[f1 (X1 )f2 (X2 ) . . . fk (Xk )] = E[f1 (X1 )] × E[f2 (X2 )] × . . . × E[fk (Xk )]
Note :: We haven’t said whether the k random variables are discrete,
continuous, or mixed, because the theorem holds in every case. Also, instead
of taking two random variables, we have taken k.
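To see Theorem 1 in action numerically, here is a hedged Monte Carlo sketch (an illustrative example, not part of the notes; the distributions and functions are arbitrary choices): with three independent random variables, the sample average of f1(X1)f2(X2)f3(X3) should agree with the product of the three individual sample averages, up to sampling error.

```python
# A Monte Carlo sketch of Theorem 1 with k = 3 (arbitrary choices of
# distributions and functions): the sample mean of the product should be
# close to the product of the sample means.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.normal(size=n)            # X1 ~ Normal(0, 1)
x2 = rng.exponential(size=n)       # X2 ~ Exponential(1)
x3 = rng.integers(0, 2, size=n)    # X3 ~ Bernoulli(1/2)

f1, f2, f3 = np.cos, np.sqrt, lambda x: x + 1

lhs = np.mean(f1(x1) * f2(x2) * f3(x3))                     # E[f1(X1)f2(X2)f3(X3)]
rhs = np.mean(f1(x1)) * np.mean(f2(x2)) * np.mean(f3(x3))   # product of expectations
print(lhs, rhs)   # the two numbers agree up to Monte Carlo error
```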
Use of independence in relation to Mgf
Let {X1 , X2 , . . . , Xk } be k independent random variables.
Define Y = X1 + X2 + . . . + Xk and choose any t ∈ R. Then
MY(t) = E[e^{tY}]
      = E[e^{t(X1 + X2 + ... + Xk)}]
      = E[e^{tX1} e^{tX2} · · · e^{tXk}]
Now we note that {X1, X2, . . . , Xk} are independent, and if we take
f1(x1) = e^{tx1}
f2(x2) = e^{tx2}
...
fk(xk) = e^{txk}
we get, by Theorem 1,
E[e^{tX1} e^{tX2} · · · e^{tXk}] = E[e^{tX1}] × E[e^{tX2}] × · · · × E[e^{tXk}].
Thus we get the following theorem.
Theorem 2. If {X1 , X2 , . . . , Xk } are k independent random variables
then
MX1 +X2 +...+Xk (t) = MX1 (t) × MX2 (t) × . . . × MXk (t)
We shall use the above theorem to derive some results.
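Before those examples, here is a quick numerical sanity check of Theorem 2 (an illustrative sketch, not part of the notes; the choice of a Normal(0, 1) and a Uniform(0, 1), with mgfs e^{t²/2} and (e^t − 1)/t, is an assumption for the demonstration): we estimate MY(t) for the sum by the sample mean of e^{tY} and compare it with the product of the two known mgfs.

```python
# A Monte Carlo sketch of Theorem 2 (assumed example: X1 ~ Normal(0, 1)
# and X2 ~ Uniform(0, 1), whose mgfs are e^{t^2/2} and (e^t - 1)/t).
import numpy as np

rng = np.random.default_rng(0)
n, t = 1_000_000, 0.7
y = rng.normal(size=n) + rng.uniform(size=n)   # Y = X1 + X2, independent samples

empirical_mgf = np.mean(np.exp(t * y))         # estimate of M_Y(t) = E[e^{tY}]
product_of_mgfs = np.exp(t**2 / 2) * (np.exp(t) - 1) / t
print(empirical_mgf, product_of_mgfs)          # agree up to sampling error
```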
Mgf of Binomial
Let X1 , . . . , Xn be independent random variables such that for all i = 1, . . . , n
Xi ∼ Bernoulli(p).
Now if we define X = X1 + X2 + . . . + Xn then,
X ∼ Binomial(n, p)
Since {X1 , X2 , . . . , Xn } are independent,
MX (t) = MX1 (t) × MX2 (t) × . . . × MXn (t)
For each i = 1, 2, . . . , n,
MXi(t) = E[e^{tXi}]
       = e^{t·0} × (1 − p) + e^{t·1} × p
       = 1 − p + pe^t.
Thus
MX(t) = (1 − p + pe^t)^n,
and we have obtained the Mgf of the Binomial without doing a lot of algebra.
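As a sanity check (an assumed example, not in the notes; the values n = 10, p = 0.3, t = 0.5 are arbitrary), one can compute E[e^{tX}] directly from the Binomial pmf and compare it with this closed form.

```python
# Compare E[e^{tX}] computed from the Binomial(n, p) pmf with the closed
# form (1 - p + p e^t)^n derived above (arbitrary n, p, t).
import numpy as np
from scipy.stats import binom

n, p, t = 10, 0.3, 0.5
ks = np.arange(n + 1)
mgf_from_pmf = np.sum(np.exp(t * ks) * binom.pmf(ks, n, p))  # sum_k e^{tk} P(X = k)
mgf_closed_form = (1 - p + p * np.exp(t)) ** n
print(mgf_from_pmf, mgf_closed_form)   # equal up to floating-point error
```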
Sum of two independent Gamma random variables
Let’s recall that if,
X ∼ Gamma(α, β)
then for t < 1/β,
MX(t) = 1 / (1 − βt)^α.
This was done in the HW solutions.
Now let,
X1 ∼ Gamma(α1 , β)
X2 ∼ Gamma(α2 , β)
X1 ⊥⊥ X2
(i.e. X1 and X2 are independent).
Define Y = X1 + X2. Then, for t < 1/β,
MY(t) = MX1(t) MX2(t)
      = 1/(1 − βt)^{α1} × 1/(1 − βt)^{α2}
      = 1/(1 − βt)^{α1 + α2}.
We note that the mgf of Y matches the mgf of Gamma(α1 + α2, β),
and thus, by the Uniqueness Theorem from Lecture 25,
Y ∼ Gamma(α1 + α2 , β).
Thus we have the following result.
Lemma 2. If
X1 ∼ Gamma(α1 , β)
X2 ∼ Gamma(α2 , β)
X1 ⊥⊥ X2
then,
X1 + X2 ∼ Gamma(α1 + α2 , β)
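A simulation can make Lemma 2 concrete (an illustrative sketch, not part of the notes; the parameter values are arbitrary). Note that β here is the scale parameter, matching the mgf 1/(1 − βt)^α above, and numpy's gamma sampler likewise takes (shape, scale); we compare samples of X1 + X2 against Gamma(α1 + α2, β) with a Kolmogorov–Smirnov test.

```python
# Simulation check of Lemma 2: samples of X1 + X2 should be statistically
# indistinguishable from Gamma(alpha1 + alpha2, beta), beta = scale.
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(1)
a1, a2, beta, n = 2.0, 3.5, 0.8, 200_000
y = rng.gamma(a1, beta, size=n) + rng.gamma(a2, beta, size=n)  # X1 + X2

# KS test against Gamma(a1 + a2, beta); a large p-value is consistent
# with Lemma 2.
print(kstest(y, gamma(a=a1 + a2, scale=beta).cdf))
```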
Homework ::
To be given later