AMS 572 Lecture Notes #1
Sep. 4th, 2009
Review Probability:
eg. Suppose each child's birth will result in either a boy or a girl with equal probability. For a
randomly selected family with 2 children, what is the chance that the chosen family has 1) 2
boys? 2) 2 girls? 3) a boy and a girl?
Solution: 25%; 25%; 50%
$P(B \text{ and } B) = P(B_1 \cap B_2) = P(B_1) \cdot P(B_2) = 0.5 \cdot 0.5 = 0.25$
$P(\text{boy and girl}) = P(B_1 G_2 \text{ or } B_2 G_1) = P((B_1 \cap G_2) \cup (B_2 \cap G_1)) = 0.25 + 0.25 = 0.5$
Binomial Experiment:
1) It consists of n trials
2) Each trial results in 1 of 2 possible outcomes, “S” or “F”
3) The probability of getting a certain outcome, say "S", remains the same from trial to
trial, say P("S") = p
4) The trials are independent; that is, the outcomes of previous trials do not
affect the outcomes of the upcoming trials
Eg. 1
n=2, let "S"=B, P(B)=0.5
Let X denote the total # of "S" among the n trials; then X ~ B(n, p),
and the probability mass function (pmf) is
$P(X = x) = \binom{n}{x} p^x (1-p)^{n-x}$, x = 0, 1, ..., n
$\sum_{x=0}^{n} P(X = x) = 1$
Eg. 1 (continued)
n=2, p=0.5, S=G, birth=trial
Answer: Let X denote the total # of girls from the 2 births. Then X ~ B(n=2, p=0.5)
1) P(2 boys) = P(X=0) = $\binom{2}{0} 0.5^0 (1-0.5)^2$ = .25
2) P(2 girls) = P(X=2) = $\binom{2}{2} 0.5^2 (1-0.5)^0$ = .25
3) P(1 boy and 1 girl) = P(X=1) = $\binom{2}{1} 0.5^1 (1-0.5)^1$ = .5
4) What is the probability of having at least 1 boy?
P(at least 1 boy) = P(X $\le$ 1) = P(X=1) + P(X=0) = .5 + .25 = .75
Eg 2. An exam consists of 10 multiple-choice questions. Each question has 4 possible choices;
only 1 is correct. Jeff did not study for the exam, so he just guesses at the right answer for
each question (pure guess, not an educated guess). What is his chance of passing the exam,
that is, of getting at least 6 correct answers?
Answer: Yes, this is a binomial experiment with n=10, p=0.25, "S" = chose the right answer for
a question.
Let X be the total # of "S"
P(pass) = P(X $\ge$ 6) = P(X=6 or X=7 or X=8 or X=9 or X=10)
= P(X=6) + P(X=7) + P(X=8) + P(X=9) + P(X=10)
= $\binom{10}{6} 0.25^6 (1-0.25)^4 + \cdots$
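Rather than writing out all five terms, the sum can be computed directly; a short sketch:

```python
from math import comb

n, p = 10, 0.25
# P(X >= 6) = sum over x = 6..10 of C(10, x) * 0.25^x * 0.75^(10-x)
p_pass = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(6, n + 1))
print(round(p_pass, 4))  # 0.0197
```

So pure guessing passes the exam only about 2% of the time.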
Sep. 11th, 2009
1. Binomial distribution
2. Normal distribution
pdf: $f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
If $X \sim N(\mu, \sigma^2)$, then $P(X < 100) = \int_{-\infty}^{100} f(x)\,dx$
Eg. μ=14, $\sigma^2 = 16 \Rightarrow \sigma = 4$
$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi} \cdot 4}\, e^{-\frac{(x-14)^2}{32}}\,dx = 1$
$P(X < 100) = P(X \le 100) = \int_{-\infty}^{100} \frac{1}{\sqrt{2\pi} \cdot 4}\, e^{-\frac{(x-14)^2}{32}}\,dx$
$P(X < 14) = P(X \le 14) = \int_{-\infty}^{14} \frac{1}{\sqrt{2\pi} \cdot 4}\, e^{-\frac{(x-14)^2}{32}}\,dx = .5$
Transformation to the standard normal R.V.:
$X \sim N(\mu, \sigma^2) \Rightarrow Z = \frac{X - \mu}{\sigma} \sim N(0,1)$
Eg. $X \sim N(\mu = 14, \sigma^2 = 16)$
$P(X < 10) = P\left(\frac{X - 14}{4} < \frac{10 - 14}{4}\right) = P(Z < -1) = .1587$
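The standard normal probability here can be evaluated without a table via the error function in Python's `math` module; a minimal sketch (the helper name `norm_cdf` is ours):

```python
from math import erf, sqrt

def norm_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    z = (x - mu) / sigma
    return 0.5 * (1 + erf(z / sqrt(2)))

# P(X < 10) for X ~ N(14, 16) equals P(Z < -1):
print(round(norm_cdf(10, mu=14, sigma=4), 4))  # 0.1587
```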
General Terminology:
If X is a continuous R.V. with p.d.f. f(x), its cumulative probability distribution
function is:
$F(x) = P(X \le x) = \int_{-\infty}^{x} f(v)\,dv$   [so that $f(x) = \frac{d}{dx} F(x)$]
Definition: Moment Generating Function (m.g.f.)
Let X be a continuous R.V. with p.d.f. f(x); then its m.g.f. is
$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$, where t is a variable in a small neighborhood of 0.
And if g(X) is a function of X, then the mathematical expectation of g(X) is defined as:
$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$
Special cases:
1) Mean: $\mu = E(X) = \int_{-\infty}^{\infty} x \cdot f(x)\,dx$
2) Variance: $\sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 \cdot f(x)\,dx = E(X^2) - [E(X)]^2$
Under regularity conditions, there is a 1-1 correspondence between f(x) and $M_X(t)$.
Eg. If $X \sim N(\mu, \sigma^2)$, what is $M_X(t)$?
Solution:
$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} \exp(xt)\, \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx$
$= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{x^2 - 2\mu x + \mu^2 - 2\sigma^2 x t}{2\sigma^2}\right) dx$
$= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{x^2 - 2(\mu + \sigma^2 t)x + \mu^2}{2\sigma^2}\right) dx$
$= \exp\left(\mu t + \frac{1}{2}\sigma^2 t^2\right) \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{[x - (\mu + \sigma^2 t)]^2}{2\sigma^2}\right) dx$
$= \exp\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)$
Note:
$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{[x - (\mu + \sigma^2 t)]^2}{2\sigma^2}\right) dx = 1$, since the integrand is the p.d.f. of $N(\mu + \sigma^2 t, \sigma^2)$.
e.g. If $X \sim N(\mu = 2, \sigma^2 = 5)$, then the m.g.f. of X is: $M_X(t) = \exp(2t + 2.5t^2)$
e.g. If $Z = \frac{X - \mu}{\sigma} \sim N(0,1)$, then the m.g.f. of Z is: $M_Z(t) = e^{t^2/2}$
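The closed form can be sanity-checked against a direct numerical evaluation of the defining integral $E(e^{tX})$; a rough sketch using a trapezoidal sum (the function name, step count, and integration window are our choices):

```python
from math import exp, pi, sqrt

def mgf_numeric(t, mu, sigma2, steps=50000):
    """Approximate E(e^{tX}) for X ~ N(mu, sigma2) by a trapezoidal sum."""
    sigma = sqrt(sigma2)
    # The integrand is a shifted normal density times a constant, so it is
    # concentrated around mu + sigma^2 * t; integrate +/- 12 sigma around it.
    center = mu + sigma2 * t
    a, b = center - 12 * sigma, center + 12 * sigma
    h = (b - a) / steps
    total = 0.0
    for i in range(steps + 1):
        x = a + i * h
        fx = exp(t * x) * exp(-(x - mu) ** 2 / (2 * sigma2)) / (sqrt(2 * pi) * sigma)
        total += fx if 0 < i < steps else fx / 2  # half-weight endpoints
    return total * h

t, mu, sigma2 = 0.3, 2.0, 5.0
closed_form = exp(mu * t + 0.5 * sigma2 * t * t)  # exp(2t + 2.5 t^2)
```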

e.g. If $X_1, X_2, \ldots, X_n \overset{i.i.d.}{\sim} N(\mu, \sigma^2)$, then $\bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$.
Proof: $M_{\bar{X}}(t) = E(e^{t\bar{X}}) = E\left(e^{t \frac{X_1 + X_2 + \cdots + X_n}{n}}\right) = E\left(e^{t^*(X_1 + \cdots + X_n)}\right)$, where $t^* = t/n$
$= M_{X_1 + \cdots + X_n}(t^*) = M_{X_1}(t^*) \cdots M_{X_n}(t^*)$ (by independence)
$= \left(e^{\mu t^* + \frac{1}{2}\sigma^2 t^{*2}}\right)^n = e^{n\mu \frac{t}{n} + \frac{1}{2} n \sigma^2 \left(\frac{t}{n}\right)^2} = e^{\mu t + \frac{1}{2} \frac{\sigma^2}{n} t^2}$
$\Rightarrow \bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$.
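The conclusion that $\bar{X} \sim N(\mu, \sigma^2/n)$ can also be checked by simulation; a small Monte Carlo sketch (the parameter values and seed are arbitrary choices):

```python
import random
import statistics

random.seed(572)  # for reproducibility
mu, sigma, n, reps = 14.0, 4.0, 16, 20000

# Draw `reps` samples of size n from N(mu, sigma^2) and record each sample mean.
xbars = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

# Theory says Xbar ~ N(mu, sigma^2 / n) = N(14, 1):
print(round(statistics.fmean(xbars), 2), round(statistics.variance(xbars), 2))
```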
Sep. 14th, 2009
1. Review: Binomial & Normal distribution
Normal distribution: $X \sim N(\mu, \sigma^2)$
p.d.f.: $f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
m.g.f.: $M_X(t) = \exp\left(\mu t + \frac{1}{2}\sigma^2 t^2\right)$
Standard normal: $Z = \frac{X - \mu}{\sigma} \sim N(0,1)$, then
$P(Z \le 1.96) = 0.975 = 1 - 0.025 \Rightarrow z_{0.025} = 1.96$
$P(Z \le 1.645) = 0.95 = 1 - 0.05 \Rightarrow z_{0.05} = 1.645$
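These quantiles can be recovered numerically by inverting the N(0,1) c.d.f.; a simple bisection sketch (the helper name `z_upper` is ours):

```python
from math import erf, sqrt

def z_upper(alpha):
    """Find z_alpha with P(Z > z_alpha) = alpha by bisecting the N(0,1) cdf."""
    target = 1 - alpha
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * (1 + erf(mid / sqrt(2))) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(z_upper(0.025), 2), round(z_upper(0.05), 3))  # 1.96 1.645
```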
2. A “good” random sample:
A simple random sample: everyone in the population has the same chance of being
selected.
3. Sampling distribution of the mean.
Let $X_1, X_2, \ldots, X_n$ denote the heights of n randomly selected subjects from the adult
U.S. male population. Furthermore, suppose the population distribution is $N(\mu, \sigma^2)$:
$X_1, X_2, \ldots, X_n \overset{i.i.d.}{\sim} N(\mu, \sigma^2)$, where i.i.d. means independent, identically distributed
- n is the sample size
- $X_1, X_2, \ldots, X_n$ is a random sample
- Sample mean: $\bar{X} = \frac{\sum_{i=1}^{n} X_i}{n}$, $E(\bar{X}) = \mu$
- Sample variance: $S^2 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n-1}$, $E(S^2) = \sigma^2$
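Both statistics are built into Python's `statistics` module; a minimal sketch (the data values are made up for illustration):

```python
import statistics

heights = [70.1, 68.3, 72.0, 69.5, 71.1]  # made-up heights in inches

xbar = statistics.fmean(heights)   # sum(X_i) / n
s2 = statistics.variance(heights)  # sum((X_i - Xbar)^2) / (n - 1)
```

Note that `statistics.variance` uses the n-1 divisor, matching the definition of $S^2$ above.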
Goal: To estimate the population mean μ. → Confidence interval for μ
e.g. We are 99% sure that μ will be in the following interval: [5′7″, 5′8″]
4. Distribution of the sample mean when the population is normal.
Let $X_i \overset{i.i.d.}{\sim} N(\mu, \sigma^2),\ i = 1, \ldots, n \Rightarrow \bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$
5. Confidence interval for μ when the population is normal and the
population variance $\sigma^2$ is known.
Pivotal Quantity (P.Q.): $Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1)$
$1 - \alpha = P(-z_{\alpha/2} \le Z \le z_{\alpha/2})$, where $Z \sim N(0,1)$
$= P\left(-z_{\alpha/2} \le \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \le z_{\alpha/2}\right)$
$= P\left(-z_{\alpha/2} \frac{\sigma}{\sqrt{n}} \le \bar{X} - \mu \le z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right)$
$= P\left(-z_{\alpha/2} \frac{\sigma}{\sqrt{n}} \le \mu - \bar{X} \le z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right)$
$= P\left(\bar{X} - z_{\alpha/2} \frac{\sigma}{\sqrt{n}} \le \mu \le \bar{X} + z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right)$
$\Rightarrow$ The 100(1-α)% C.I. for μ is $\left[\bar{X} - z_{\alpha/2} \frac{\sigma}{\sqrt{n}},\; \bar{X} + z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right]$
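The interval formula translates directly into code; a minimal sketch (the function name `z_interval` is ours, and $z_{0.025} = 1.96$ is passed in by hand):

```python
from math import sqrt

def z_interval(xbar, sigma, n, z_half_alpha):
    """100(1-alpha)% CI for mu with sigma known: xbar +/- z_{alpha/2}*sigma/sqrt(n)."""
    margin = z_half_alpha * sigma / sqrt(n)
    return xbar - margin, xbar + margin

# 95% CI (z_{0.025} = 1.96) with xbar = 14, sigma = 4, n = 16:
lo, hi = z_interval(14, 4, 16, 1.96)
print(round(lo, 2), round(hi, 2))  # 12.04 15.96
```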
Sep. 17th, 2009
1. Confidence interval for 1 population mean μ when the population
is normal and the population variance $\sigma^2$ is known.
1) Point estimator for μ: $\bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$
2) Pivotal quantity for μ: $Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1)$
3) Confidence interval (C.I.):
$1 - \alpha = P(-z_{\alpha/2} \le Z \le z_{\alpha/2}) = P\left(\bar{X} - z_{\alpha/2} \frac{\sigma}{\sqrt{n}} \le \mu \le \bar{X} + z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right)$
The 100(1-α)% C.I. for μ is $\left[\bar{X} - z_{\alpha/2} \frac{\sigma}{\sqrt{n}},\; \bar{X} + z_{\alpha/2} \frac{\sigma}{\sqrt{n}}\right]$
2. Review: Point estimator
1) Maximum likelihood estimator (M.L.E.)
e.g.1 We have a random sample of size n from a normal population.
That is, $X_i \overset{i.i.d.}{\sim} N(\mu, \sigma^2),\ i = 1, \ldots, n$
Suppose $\sigma^2$ is known. How can we derive the MLE for μ?
a) Likelihood function:
$L = f(X_1, X_2, \ldots, X_n) = f(X_1) f(X_2) \cdots f(X_n) = \prod_{i=1}^{n} f(X_i) = (2\pi\sigma^2)^{-\frac{n}{2}} \exp\left(-\frac{\sum_{i=1}^{n}(x_i - \mu)^2}{2\sigma^2}\right)$
b) Log likelihood:
$l = \ln L = -\frac{n}{2} \ln(2\pi\sigma^2) - \frac{\sum_{i=1}^{n}(x_i - \mu)^2}{2\sigma^2}$
c) $\frac{\partial l}{\partial \mu} = \frac{\sum_{i=1}^{n}(x_i - \mu)}{\sigma^2} = 0 \Rightarrow \hat{\mu} = \bar{X}$
If $\sigma^2$ is unknown, and we also want the MLE for $\sigma^2$:
$\frac{\partial l}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{\sum_{i=1}^{n}(x_i - \mu)^2}{2\sigma^4} = 0 \Rightarrow \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \bar{X})^2$
e.g.2 We have a random sample of size n from a Bernoulli population.
That is, $X_i \overset{i.i.d.}{\sim} \text{Bernoulli}(p),\ i = 1, \ldots, n$, and $f(x_i) = p^{x_i}(1-p)^{1-x_i},\ x_i = 0, 1$
How can we derive the MLE for p?
a) $L = f(X_1, X_2, \ldots, X_n) = \prod_{i=1}^{n} f(X_i) = \prod_{i=1}^{n} p^{X_i}(1-p)^{1-X_i} = p^{\sum_{i=1}^{n} X_i} (1-p)^{n - \sum_{i=1}^{n} X_i}$
b) $l = \ln L = \left(\sum_{i=1}^{n} X_i\right) \ln p + \left(n - \sum_{i=1}^{n} X_i\right) \ln(1-p)$
c) $\frac{dl}{dp} = \frac{\sum_{i=1}^{n} X_i}{p} - \frac{n - \sum_{i=1}^{n} X_i}{1-p} = 0$
$\Rightarrow (1-p) \sum_{i=1}^{n} X_i - p\left(n - \sum_{i=1}^{n} X_i\right) = 0 \Rightarrow \hat{p} = \frac{\sum_{i=1}^{n} X_i}{n} = \bar{X}$
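The closed form $\hat{p} = \bar{X}$ can be checked against the log-likelihood itself; a small sketch with a made-up 0/1 sample:

```python
from math import log

data = [1, 0, 1, 1, 0, 1, 0, 1]  # made-up 0/1 sample: sum = 5, n = 8

def loglik(p):
    """Log-likelihood l(p) = (sum X_i) ln p + (n - sum X_i) ln(1 - p)."""
    s, n = sum(data), len(data)
    return s * log(p) + (n - s) * log(1 - p)

p_hat = sum(data) / len(data)  # closed-form MLE: 5/8 = 0.625
# Scan a fine grid of p values; the best grid point matches p_hat:
best = max(range(1, 1000), key=lambda i: loglik(i / 1000)) / 1000
```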
2) Method of Moment Estimators (MOME)
e.g.3 We have a random sample of size n from a Bernoulli population.
That is, $X_i \overset{i.i.d.}{\sim} \text{Bernoulli}(p),\ i = 1, \ldots, n$, and $f(x_i) = p^{x_i}(1-p)^{1-x_i},\ x_i = 0, 1$
How can we derive the MOME for p?
1st population moment: $E(X) = p$; 1st sample moment: $\frac{1}{n}\sum_{i=1}^{n} X_i$
Let $E(X) = \frac{1}{n}\sum_{i=1}^{n} X_i$; then $\hat{p} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}$
e.g.4 We have a random sample of size n from a normal population.
That is, $X_i \overset{i.i.d.}{\sim} N(\mu, \sigma^2),\ i = 1, \ldots, n$; find the MOME for μ and $\sigma^2$.
1st population moment: $E(X) = \mu$; 1st sample moment: $\frac{1}{n}\sum_{i=1}^{n} X_i \Rightarrow \hat{\mu} = \bar{X}$
2nd population moment: $E(X^2) = \mu^2 + \sigma^2$; 2nd sample moment: $\frac{1}{n}\sum_{i=1}^{n} X_i^2 \Rightarrow \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - (\bar{X})^2$
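The MOME formulas translate directly into code; a minimal sketch (the helper name `normal_mome` is ours). For the normal family the MOME for $\sigma^2$ coincides with the MLE:

```python
import statistics

def normal_mome(xs):
    """Method-of-moments estimates: mu_hat = Xbar, sigma2_hat = mean(X^2) - Xbar^2."""
    m1 = statistics.fmean(xs)
    m2 = statistics.fmean(x * x for x in xs)
    return m1, m2 - m1 ** 2

mu_hat, sigma2_hat = normal_mome([1, 2, 3, 4])
print(mu_hat, sigma2_hat)  # 2.5 1.25
```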
3. Pivotal Quantity
Definition: A pivotal quantity is a function of the sample and the parameter of interest
whose distribution is known entirely (it does not depend on any unknown parameters).
e.g. Recall the inference on μ when we have a normal population and $\sigma^2$ is known.
The P.Q. is $Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1)$
To find a P.Q.:
a) Start from the point estimator: $\bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$
b) $W = \bar{X} - \mu \sim N\left(0, \frac{\sigma^2}{n}\right)$ is a P.Q.
c) $Z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1)$ is also a P.Q.