Functions of Random Variables
Notes of STAT 6205 by Dr. Fan
Overview
• Chapter 5
• Functions of One random variable
o General: distribution function approach
o Change-of-variable approach
• Functions of Two random variables
o Change-of-variable approach
• Functions of Independent random variables
• Order statistics
• The Moment Generating Function approach
• Random functions associated with normal distributions
o Student’s t-distribution
• The Central Limit Theorem
o Normal approximation of binomial distribution
• (Section 10.5) Chebyshev’s Inequality and convergence
in probability
General Method:
Distribution Function Approach
• Goal: to find the distribution of Y=h(X)
• When: the pdf of X, f(x) is known
• Then the cdf of Y, G(y) is:
G(y) = P[Y \le y] = P[h(X) \le y] = \int_{\{x:\, h(x) \le y\}} f(x)\, dx
• And the pdf of Y, g(y)=G’(y)
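As a quick numerical illustration of this approach, the sketch below compares an empirical cdf of Y = h(X) with the analytically derived one. It is a minimal sketch using the X ~ U(0,10), Y = X^3 exercise from the next slide; the comparison value follows from G(y) = P(X <= y^(1/3)) = y^(1/3)/10 on 0 < y < 1000, and all variable names are illustrative.

    # Monte Carlo check of a cdf obtained by the distribution-function approach.
    # Assumed example: X ~ Uniform(0, 10), Y = X**3, with G(y) = y**(1/3) / 10 on (0, 1000).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 10.0, size=100_000)
    y = x**3

    for y0 in [1.0, 125.0, 729.0]:
        empirical = (y <= y0).mean()        # P(Y <= y0) estimated from the sample
        theoretical = y0 ** (1 / 3) / 10    # G(y0) from the distribution-function approach
        print(y0, round(empirical, 4), round(theoretical, 4))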
Examples/Exercises
• Let X~U(0,10) and Y=X^3. Find the cdf and pdf of Y
• Let X~Exp(mu=2) and Y = e^X. Find the cdf and
pdf of Y
• Let X~Gamma(a,b) and X=log(Y). Find the pdf of Y
(Loggamma distribution)
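For the first exercise, a worked sketch of the distribution-function approach: since f(x) = 1/10 on 0 < x < 10,

G(y) = P(X^3 \le y) = P(X \le y^{1/3}) = \frac{y^{1/3}}{10}, \quad 0 < y < 1000,

so the pdf is g(y) = G'(y) = \frac{1}{30} y^{-2/3} for 0 < y < 1000.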
Change of Variable Approach
• When: the pdf of X is known and Y=h(X) is a
monotonic function (i.e. its inverse function exists: X
= v(Y))
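For reference, the standard statement of this approach: if Y = h(X) with h strictly monotonic on the support of X and inverse x = v(y), then

g(y) = f(v(y))\, |v'(y)|

for y in the image of the support of X under h.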
Examples/Exercises
Let Y=(1-X)^3 and find its pdf g(y)
• Problem 1: f(x)=x/2, 0<x<2
• Problem 2: f(x)=3(1-x)^2, 0<x<1 (a worked sketch follows this list)
• Problem 3: verify that the g obtained in Problem 2 is
a proper pdf
• Problem 4: revisit the earlier distribution-function-approach examples
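A worked sketch for Problem 2: here f(x) = 3(1-x)^2 on 0 < x < 1 and y = (1-x)^3, so the inverse is x = v(y) = 1 - y^{1/3} with v'(y) = -\tfrac{1}{3} y^{-2/3}. Then

g(y) = f(1 - y^{1/3})\, |v'(y)| = 3\,(y^{1/3})^2 \cdot \tfrac{1}{3} y^{-2/3} = 1, \quad 0 < y < 1,

so Y is uniform on (0, 1), which also settles Problem 3: g integrates to 1 over (0, 1).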
Transformations of Two
Random Variables
• Let f(x1,x2) be the joint pdf of X1,X2
• Let Y1=u1(X1,X2) and Y2=u2(X1,X2)
• where u1, u2 have inverse functions, that is, X1=v1(Y1,Y2) and
X2=v2(Y1,Y2)
• Goal: find the joint pdf of Y1,Y2, g(y1,y2)
g(y_1, y_2) = |J|\, f[v_1(y_1, y_2),\, v_2(y_1, y_2)] \quad \text{for } (y_1, y_2) \text{ in the support of } (Y_1, Y_2),

and the Jacobian is

J = \begin{vmatrix} \partial x_1 / \partial y_1 & \partial x_1 / \partial y_2 \\ \partial x_2 / \partial y_1 & \partial x_2 / \partial y_2 \end{vmatrix}.
Examples/Exercises
1. f(x1,x2)=2 where 0<x1<x2<1;
Y1=X1/X2 and Y2=X2 (a worked sketch follows this list)
2. X1, X2 are independent exp(1) variables;
Y1=X1-X2 and Y2=X1+X2
3. Reading: Examples 5.2-3, 4
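A worked sketch for Example 1 above: the inverse transformation is x_1 = v_1(y_1, y_2) = y_1 y_2 and x_2 = v_2(y_1, y_2) = y_2, so

J = \begin{vmatrix} y_2 & y_1 \\ 0 & 1 \end{vmatrix} = y_2,

and g(y_1, y_2) = |J|\, f(y_1 y_2, y_2) = 2 y_2 on 0 < y_1 < 1, 0 < y_2 < 1 (the image of 0 < x_1 < x_2 < 1). In particular, Y_1 and Y_2 are independent, with Y_1 \sim U(0, 1) and Y_2 having pdf 2 y_2 on (0, 1).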
Independent Random Variables
• Let X1, X2, …,Xn be independent random variables
• Joint pmf (or pdf) of X1, X2, …, Xn:
f(x1,x2,…,xn)=f1(x1)f2(x2)…fn(xn)
• Random sample from a distribution f(x):
X1, X2, … Xn are independent and identically
distributed; f(x1,x2,…,xn)=f(x1)f(x2)…f(xn)
Examples/Exercises
• Let X1, X2, …, Xn be a random sample from
Exp(0.5). Find the joint p.d.f. of this sample.
• Exercise: What is the probability of seeing at least
one Xi less than one? Exactly one less than one?
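A worked sketch, taking the parameter of Exp(0.5) to be the mean (as in the Exp(mu) notation used earlier), so each X_i has pdf f(x) = 2 e^{-2x}, x > 0:

f(x_1, \dots, x_n) = \prod_{i=1}^{n} 2 e^{-2x_i} = 2^n e^{-2\sum_i x_i}, \quad x_i > 0.

With p = P(X_i < 1) = 1 - e^{-2}, independence gives P(\text{at least one } X_i < 1) = 1 - (e^{-2})^n = 1 - e^{-2n} and P(\text{exactly one } X_i < 1) = n\,(1 - e^{-2})\,e^{-2(n-1)}.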
Functions of Independent R. V.s
Theorem 5.3-2
Let X1, X2, …, Xn be independent r. v.s. Then:
E[u_1(X_1)\, u_2(X_2) \cdots u_n(X_n)] = \prod_{i=1}^{n} E[u_i(X_i)]

Theorem 5.3-3 (page 238)
Examples/Exercises
• Given a random sample of size n from a distribution
with mean mu and SD sigma, find the mean and
variance of the sample mean
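A worked sketch: writing \bar X = \tfrac{1}{n}\sum_{i=1}^n X_i,

E[\bar X] = \frac{1}{n}\sum_{i=1}^n E[X_i] = \mu, \qquad \operatorname{Var}(\bar X) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(X_i) = \frac{\sigma^2}{n},

where the variance step uses the independence of the X_i.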
Moment Generating Function
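For reference, the standard facts this approach relies on: the m.g.f. of X is M_X(t) = E[e^{tX}] (when it exists for t in an open interval around 0); a distribution is uniquely determined by its m.g.f.; and for independent X_1, \dots, X_n,

M_{X_1 + \cdots + X_n}(t) = \prod_{i=1}^{n} M_{X_i}(t).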
Examples/Exercises
• Example: Prove that the sum of i.i.d. Ber(p) r.v.s is a
Bin(n, p) r. v.
• Exercise: Prove that the sum of i.i.d. Exp(mu) r. v.s is
a Gamma(a=n, b=0.5) r. v.
1) What is the m.g.f. of Exp(mu)?
2) What is the m.g.f. of Gamma(a,b)?
3) Use these m.g.f.s to prove the result
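A worked sketch for the Bernoulli example: M_{X_i}(t) = 1 - p + p e^t for X_i \sim Ber(p), so for independent X_1, \dots, X_n,

M_{\sum X_i}(t) = (1 - p + p e^t)^n,

which is the Bin(n, p) m.g.f.; by uniqueness, \sum X_i \sim Bin(n, p). For the exercise, in the mean/scale parametrization, M_{Exp(\mu)}(t) = (1 - \mu t)^{-1} for t < 1/\mu and M_{Gamma(a, b)}(t) = (1 - b t)^{-a}, so the sum of n i.i.d. Exp(\mu) r. v.s has m.g.f. (1 - \mu t)^{-n}, a Gamma r. v. with a = n and scale b = \mu.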
Random Variables Assoc. With
Normal Distributions
Theorem 1:
The distribution of the sum of i.i.d. normal r. v.s is
also normal
Theorem 2:
The distribution of the sum of independent normal r. v.s is
also normal
Theorem 3:
The distribution of the average of independent normal r. v.s is
also normal
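These are special cases of the general result for linear combinations of independent normals: if X_1, \dots, X_n are independent with X_i \sim N(\mu_i, \sigma_i^2) and c_1, \dots, c_n are constants, then

\sum_{i=1}^{n} c_i X_i \sim N\!\left(\sum_{i=1}^{n} c_i \mu_i,\; \sum_{i=1}^{n} c_i^2 \sigma_i^2\right).

In particular, for a random sample from N(\mu, \sigma^2), \bar X \sim N(\mu, \sigma^2/n).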
Student’s t-distribution
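For reference, the standard definition: if Z \sim N(0, 1) and U \sim \chi^2(r) are independent, then

T = \frac{Z}{\sqrt{U/r}} \sim t(r),

a t distribution with r degrees of freedom.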
Proof:
1) Show S^2 and X-bar are independent
2) Use the m.g.f. to prove the distribution is chi-square
Example: Show that the one-sample t test statistic is t-distributed with (n-1) degrees of freedom
T = \frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t(n-1)
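A sketch of the argument: for a random sample from N(\mu, \sigma^2), Z = \dfrac{\bar X - \mu}{\sigma/\sqrt{n}} \sim N(0, 1) and U = \dfrac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1), with Z and U independent (since \bar X and S^2 are independent). Then

T = \frac{\bar X - \mu}{S/\sqrt{n}} = \frac{Z}{\sqrt{U/(n-1)}} \sim t(n-1)

by the definition of the t distribution above.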
Features of t distribution t(r)
• Shape:
Bell-shaped
• Center and Spread:
mean=0 if r > 1
variance =r/(r-2) if r > 2
(undefined otherwise)
• M.G.F. does not exist
• Asymptotic distribution: (see the simulation sketch below)
As the d.f. r goes to infinity, t(r) approaches N(0,1)
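A minimal simulation sketch of this limit (the sample size and values of r are illustrative); it compares a tail probability of t(r) with the N(0, 1) value P(Z > 2) ≈ 0.0228.

    # Simulation sketch: as r grows, t(r) tail probabilities approach N(0, 1) values.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    normal_tail = 1 - stats.norm.cdf(2)          # P(Z > 2) for Z ~ N(0, 1)
    for r in [2, 5, 30, 200]:
        t_sample = rng.standard_t(df=r, size=100_000)
        print(r, round((t_sample > 2).mean(), 4), round(normal_tail, 4))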
Central Limit Theorem
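The statement used here, in its standard form: if X_1, X_2, \dots, X_n are i.i.d. with mean \mu and finite variance \sigma^2, then

\frac{\bar X - \mu}{\sigma/\sqrt{n}} \;\xrightarrow{d}\; N(0, 1) \quad \text{as } n \to \infty,

so for large n, \bar X is approximately N(\mu, \sigma^2/n) and \sum_i X_i is approximately N(n\mu, n\sigma^2).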
Examples/Exercises
• Illustration: Bin(n, p) goes to Normal as n goes to infinity
[Aplia: STAT 1000 homework 4 Q3]
• Problem: Let X-bar be the mean of a random sample of
n=25 currents in a strip of wire in which each
measurement has a mean of 15 and a variance of 4.
Estimate the probability of X-bar falling between 14.4
and 15.6.
• Problem: Suppose BART wants to perform some quality
control. They know the waiting time for one person at a BART
station is U(10,30). In a random sample of 30 people,
what is the (approximate) probability that the average
waiting time is more than 22 minutes? Recall the mean
and variance for U(10,30) are 20 and 33.33, respectively.
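Worked sketches for the two problems above (normal-table values are rounded). For the currents, \bar X is approximately N(15, 4/25), so

P(14.4 \le \bar X \le 15.6) \approx P(-1.5 \le Z \le 1.5) = 2\Phi(1.5) - 1 \approx 0.866.

For BART, \bar X is approximately N(20, 33.33/30), so

P(\bar X > 22) \approx P\!\left(Z > \frac{22 - 20}{\sqrt{33.33/30}}\right) \approx P(Z > 1.90) \approx 0.029.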
Chebyshev’s Inequality
If the r. v. X has mean mu and variance sigma^2, then for
every k > 1,

P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}

Q: how can this inequality be used to set up a lower bound
for P(|X - mu| < k sigma)?
Example: Use this inequality to find a lower bound for
the probability that X is no more than 2 S.D. from the
mean. Is the lower bound close to the exact
probability if X ~ N(mu, sigma^2)?
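A worked sketch: since P(|X - \mu| < k\sigma) = 1 - P(|X - \mu| \ge k\sigma) \ge 1 - 1/k^2, taking k = 2 gives

P(|X - \mu| < 2\sigma) \ge 1 - \frac{1}{4} = 0.75,

while the exact probability for X \sim N(\mu, \sigma^2) is P(|Z| < 2) \approx 0.9545, so the Chebyshev bound holds but is quite conservative.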
Example: Tossing a Coin
If we want to estimate p, the chance of heads for a
given coin, how many times should we toss it in order to
get a sufficiently accurate estimate?
Let Y be the # of heads on n flips; the sample estimate of
p is p-hat = Y/n. Use Chebyshev’s Inequality to find
the required sample size n.
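One common way to set this up (the accuracy targets below are illustrative): \hat p has mean p and variance p(1-p)/n \le \tfrac{1}{4n}, so Chebyshev's Inequality gives, for any \varepsilon > 0,

P(|\hat p - p| \ge \varepsilon) \le \frac{p(1-p)}{n\varepsilon^2} \le \frac{1}{4 n \varepsilon^2}.

Requiring this bound to be at most \alpha leads to n \ge \frac{1}{4\alpha\varepsilon^2}; for example, \varepsilon = 0.05 and \alpha = 0.05 give n \ge 2000.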
(Weak) Law of Large Numbers
Let X1, X2, …, Xn be i.i.d. r. v.s with finite mean mu and
finite S.D. sigma. Then X-bar converges to mu in probability.
Proof. By Chebyshev’s Inequality (sketch below).
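A sketch: \bar X has mean \mu and variance \sigma^2/n, so for any \varepsilon > 0 Chebyshev's Inequality gives

P(|\bar X - \mu| \ge \varepsilon) \le \frac{\sigma^2}{n\varepsilon^2} \;\to\; 0 \quad \text{as } n \to \infty,

which is exactly convergence of \bar X to \mu in probability.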