Lecture 6
Functions of Random Variables
To determine the probability distribution for a function of n random variables Y_1, Y_2, ..., Y_n, we must find the joint probability distribution for the random variables themselves.
From now on we will assume that populations are large in
comparison to the sample size and that the random variables are
independent.
How to find the probability distribution of a function of r.v’s?
Method of Distribution Functions:
Suppose Y is a continuous random variable with density f(y) and U = h(Y) is a function of Y. Then to find F_U(u) = P(U ≤ u) we integrate f(y) over the region {y : h(y) ≤ u}. Then f_U(u) is found by differentiating F_U(u), i.e. f_U(u) = dF_U(u)/du.
Example: Let Y have a given density f(y). Find the density function of U = h(Y).
Solution:
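A quick illustration of how the calculation runs (a minimal sketch, assuming an illustrative density Y ~ Unif(0,1) and an illustrative transformation U = Y², neither necessarily those of the example above): for 0 ≤ u ≤ 1,
F_U(u) = P(Y² ≤ u) = P(Y ≤ √u) = √u,
so f_U(u) = dF_U(u)/du = 1/(2√u) for 0 < u < 1, and f_U(u) = 0 otherwise.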
Bivariate case: Let Y_1 and Y_2 be random variables with joint density f(y_1, y_2), and let U = h(Y_1, Y_2). Then for every point (y_1, y_2) there is one and only one value of U.
Example: Suppose Y_1 and Y_2 have joint density function f(y_1, y_2). Given U = h(Y_1, Y_2), find f_U(u) and E(U).
Solution:
Example: Let Y_1, Y_2 denote a random sample of size n = 2 from Unif(0,1). Given U = Y_1 + Y_2, find f_U(u).
Solution:
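One way to carry out the calculation (a sketch): for 0 ≤ u ≤ 1, the region {y_1 + y_2 ≤ u} inside the unit square is a triangle with area u²/2, so F_U(u) = u²/2; for 1 < u ≤ 2, the complementary region {y_1 + y_2 > u} is a triangle with area (2 − u)²/2, so F_U(u) = 1 − (2 − u)²/2. Differentiating gives the triangular density
f_U(u) = u for 0 ≤ u ≤ 1, f_U(u) = 2 − u for 1 < u ≤ 2, and f_U(u) = 0 otherwise.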
Summary: Let U be a function of the random variables Y_1, Y_2, ..., Y_n.
1. Find the region {U ≤ u} in the (y_1, y_2, ..., y_n) space;
2. Find F_U(u) = P(U ≤ u) by integrating f(y_1, y_2, ..., y_n) over the region {U ≤ u};
3. Find f_U(u) by differentiating F_U(u), i.e. f_U(u) = dF_U(u)/du.
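These steps are easy to check numerically. The following is a minimal simulation sketch, assuming Python with numpy is available; it compares the empirical CDF of U = Y_1 + Y_2 for a sample of size 2 from Unif(0,1) against the F_U(u) derived above.

import numpy as np

rng = np.random.default_rng(0)
n_sim = 200_000
y1 = rng.uniform(0.0, 1.0, n_sim)
y2 = rng.uniform(0.0, 1.0, n_sim)
u = y1 + y2

def F_U(x):
    # Analytic CDF derived above: x^2/2 on [0, 1], 1 - (2 - x)^2/2 on (1, 2]
    x = np.asarray(x, dtype=float)
    return np.where(x <= 1.0, x**2 / 2.0, 1.0 - (2.0 - x)**2 / 2.0)

grid = np.linspace(0.05, 1.95, 20)
empirical = np.array([(u <= g).mean() for g in grid])
print(np.max(np.abs(empirical - F_U(grid))))   # should be close to 0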
Example: Let Y be a continuous random variable with density f(y), positive over a given interval and 0 elsewhere, and let U = h(Y). Find f_U(u).
Example: (#6.7) Let Z ~ N(0,1) and let U be a given function of Z. Find f_U(u).
Solution:
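As an illustration of how such a calculation typically goes (a sketch assuming, purely for illustration, that the function of interest is U = Z²): for u > 0,
F_U(u) = P(Z² ≤ u) = P(−√u ≤ Z ≤ √u) = 2Φ(√u) − 1,
where Φ and φ denote the standard normal CDF and density. Differentiating,
f_U(u) = φ(√u)/√u = (1/√(2πu)) e^(−u/2) for u > 0,
which is the χ²(1) density.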
Method of Transformations:
Let h(y) be an increasing function of y, i.e. h(y_1) < h(y_2) whenever y_1 < y_2, and suppose that U = h(Y), where Y has density f_Y(y). Then h⁻¹(u) is an increasing function of u, i.e. h⁻¹(u_1) < h⁻¹(u_2) whenever u_1 < u_2. Then
f_U(u) = f_Y(h⁻¹(u)) · d[h⁻¹(u)]/du.
Note:
If h(y) is a decreasing function of y, then h⁻¹(u) is a decreasing function of u, i.e. h⁻¹(u_1) > h⁻¹(u_2) whenever u_1 < u_2, and
f_U(u) = f_Y(h⁻¹(u)) · |d[h⁻¹(u)]/du|.
Example: Let Y have a given density f(y). Find the density of U = h(Y) using the method of transformations.
Solution:
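A quick sketch with an assumed density and transformation (f(y) = 2y for 0 < y < 1 and U = Y², chosen only for illustration): h(y) = y² is increasing on (0, 1), with inverse h⁻¹(u) = √u and d[h⁻¹(u)]/du = 1/(2√u). Hence
f_U(u) = f_Y(√u) · 1/(2√u) = 2√u · 1/(2√u) = 1 for 0 < u < 1,
so U ~ Unif(0,1).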
Remark: To use this method, h(y) must be either increasing or decreasing for all y such that f_Y(y) > 0. The set of points {y : f_Y(y) > 0} is called the support of the density f_Y(y).
Summary: Let U = h(Y), where h(y) is either an increasing or decreasing function of y for all y such that f_Y(y) > 0.
1. Find the inverse function, y = h⁻¹(u);
2. Evaluate d[h⁻¹(u)]/du;
3. Find f_U(u) by f_U(u) = f_Y(h⁻¹(u)) · |d[h⁻¹(u)]/du|.
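The three steps can also be carried out symbolically. Below is a minimal sketch assuming Python with sympy, an illustrative density f_Y(y) = e^(−y) for y > 0, and an illustrative increasing transformation U = √Y (none of these are from the lecture).

import sympy as sp

y, u = sp.symbols("y u", positive=True)
f_Y = sp.exp(-y)        # assumed density of Y (Exp(1)), for illustration only
h = sp.sqrt(y)          # assumed increasing transformation U = h(Y) = sqrt(Y)

# Step 1: inverse function y = h^-1(u)
y_inv = sp.solve(sp.Eq(h, u), y)[0]      # u**2

# Step 2: derivative of the inverse
dy_du = sp.diff(y_inv, u)                # 2*u

# Step 3: f_U(u) = f_Y(h^-1(u)) * |d h^-1(u)/du|
f_U = sp.simplify(f_Y.subs(y, y_inv) * sp.Abs(dy_du))
print(f_U)                               # 2*u*exp(-u**2), for u > 0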
Example (Bivariate case): (#6.31) Suppose Y_1 and Y_2 have a given joint density f(y_1, y_2). Given U = h(Y_1, Y_2), find f_U(u).
Solution:
Method of Moment-Generating Functions:
Theorem: Let m_X(t) and m_Y(t) denote the mgf's of random variables X and Y, respectively. If both mgf's exist and m_X(t) = m_Y(t) for all values of t, then X and Y have the same probability distribution.
Proof: omitted.
Example: Z ~ N(0,1). Given U = h(Z), find the distribution of U.
Solution:
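A sketch of how the mgf route typically goes (again assuming, for illustration only, U = Z²):
m_U(t) = E(e^(tZ²)) = ∫ (1/√(2π)) e^(tz² − z²/2) dz = (1 − 2t)^(−1/2) for t < 1/2,
where the integral runs over the whole real line and is evaluated by recognizing a normal kernel with variance 1/(1 − 2t). This is the mgf of a gamma distribution with α = 1/2 and β = 2, i.e. of χ²(1), so by the uniqueness theorem U = Z² ~ χ²(1), in agreement with the distribution-function sketch above.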
Theorem: Let Y_1, Y_2, ..., Y_n be independent random variables with mgf's m_{Y_1}(t), m_{Y_2}(t), ..., m_{Y_n}(t), respectively. If U = Y_1 + Y_2 + ... + Y_n, then
m_U(t) = m_{Y_1}(t) · m_{Y_2}(t) ··· m_{Y_n}(t).
Proof:
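A one-line sketch of the argument: m_U(t) = E(e^(t(Y_1 + ... + Y_n))) = E(e^(tY_1) ··· e^(tY_n)) = E(e^(tY_1)) ··· E(e^(tY_n)) = m_{Y_1}(t) ··· m_{Y_n}(t), where the middle equality uses independence.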
Theorem: Let Y_i ~ N(μ_i, σ_i²), i = 1, 2, ..., n, be independent random variables, and let a_1, a_2, ..., a_n be constants. If U = ∑ a_i Y_i, then U is normally distributed with mean ∑ a_i μ_i and variance ∑ a_i² σ_i².
Proof:
m_U(t) = E(e^(t ∑ a_i Y_i)) = ∏ E(e^((a_i t) Y_i)) = ∏ m_{Y_i}(a_i t) = ∏ exp(a_i μ_i t + a_i² σ_i² t²/2) = exp(t ∑ a_i μ_i + (t²/2) ∑ a_i² σ_i²),
which is the mgf of a normal distribution with mean ∑ a_i μ_i and variance ∑ a_i² σ_i².
Theorem: Let Y_i ~ N(μ_i, σ_i²), i = 1, 2, ..., n, be independent random variables, and define Z_i = (Y_i − μ_i)/σ_i. Then ∑ Z_i² ~ χ²(n).
Proof:
Each Z_i ~ N(0,1), so Z_i² ~ χ²(1) with mgf (1 − 2t)^(−1/2). By independence, the mgf of ∑ Z_i² is the product of the individual mgf's, namely (1 − 2t)^(−n/2), which is the mgf of a χ²(n) distribution.
Example: (#6.50) Let Y ~ Bin(n, p). Show that n – Y ~ Bin(n, 1-p).
Solution:
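A sketch of the mgf argument: the mgf of Y ~ Bin(n, p) is m_Y(t) = (q + pe^t)^n with q = 1 − p, so
m_{n−Y}(t) = E(e^(t(n−Y))) = e^(tn) m_Y(−t) = e^(tn) (q + pe^(−t))^n = (qe^t + p)^n = (p + (1 − p)e^t)^n,
which is the mgf of a Bin(n, 1 − p) random variable; by the uniqueness theorem, n − Y ~ Bin(n, 1 − p).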
Summary: Let U be a function of the random variables Y_1, Y_2, ..., Y_n.
1. Find the mgf for U, m_U(t).
2. Compare m_U(t) with other well-known mgf's. If m_U(t) = m_V(t) for all t, then U and V have identical distributions.