Introduction to Stochastic Models
GSLM 54100
Outline

• random variables
• discrete: Bernoulli, binomial, geometric, Poisson
• continuous: uniform, exponential
• jointly distributed random variables
• independent random variables
• variance and covariance
• two useful ideas
Random Variable

• a real-valued function defined on the sample space Ω
• example
  ◦ N = the number shown by a throw of a die
  ◦ X = 2N − 4.5

  N:   1     2     3     4     5     6
  X:  −2.5  −0.5   1.5   3.5   5.5   7.5
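To make the function-on-Ω view concrete, here is a minimal Python sketch (an added illustration, not part of the slides) that tabulates X = 2N − 4.5 over the six outcomes:

```python
# Minimal sketch: a random variable is a real-valued function on the
# sample space. N is the face shown by a fair die; X = 2N - 4.5.
sample_space = [1, 2, 3, 4, 5, 6]

def X(n):
    return 2 * n - 4.5

for n in sample_space:
    print(n, X(n))   # 1 -2.5, 2 -0.5, ..., 6 7.5
```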
Events from Random Variables

• events generated by random variables
• {ω | X(ω) = x}, written {X = x}, is an event
• similarly, {X > x}, {X ≥ x}, {X < x}, {X ≤ x} are events, so P(X > x), P(X ≥ x), P(X < x), P(X ≤ x) are well defined
Random Variable

• E(Y) = x1·P(Y = x1) + x2·P(Y = x2) + x3·P(Y = x3) + …
       = x1·P(ω1) + x2·P(ω2) + x3·P(ω3) + …
  for Y with Y(ωi) = xi
• note the process: to find P(Y = xi), we need to trace the source of randomness in
  {Y = xi} = {ω | Y(ω) = xi}
• understanding this equivalence is an art that involves logic, not mathematics
The Expected Value of a Discrete Random Variable

• discrete random variable X
• probability mass function {pn}, pn = P(X = n)
• E(X) = Σn n·pn
• n here can be any real number, e.g., e or −π
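A quick numeric check of the definition (an added Python sketch, not part of the notes), using the fair-die pmf from the earlier example:

```python
# Sketch: E(X) = sum_n n * p_n for a discrete pmf {p_n}.
pmf = {n: 1 / 6 for n in range(1, 7)}   # fair die: p_n = P(X = n) = 1/6
expected = sum(n * p for n, p in pmf.items())
print(expected)                          # 3.5
```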
The Expected Value of a Continuous Random Variable

• continuous random variable X
• density function f(x)
• P(X = x) = 0
• P(X ∈ [x, x+δ]) = ∫ from x to x+δ of f(s) ds ≈ δ·f(x) for small δ
• E(X) = ∫ s·f(s) ds
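The integral can be sanity-checked numerically. The sketch below (illustrative only) approximates E(X) for X ~ uniform[0, 1], where f(s) = 1 and the mean should be 0.5:

```python
# Sketch: approximate E(X) = ∫ s f(s) ds by a Riemann sum
# for X ~ uniform[0, 1], whose density is f(s) = 1 on (0, 1).
delta = 1e-4
approx = sum((k * delta) * 1.0 * delta for k in range(int(1 / delta)))
print(approx)   # ≈ 0.5 = (a + b)/2
```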
Distributions Discussed

• discrete: Bernoulli, binomial, geometric, Poisson
• continuous: uniform, exponential
Bernoulli Random Variable

• X ~ Bern(p)
• p0 = P(X = 0) = 1−p & p1 = P(X = 1) = p
• suitable for classifying an item into one of two categories, i.e., an indicator variable
  ◦ e.g., a product being defective (type 1, category A, etc.) with probability p, and conforming (type 2, category B, etc.) otherwise
• E(X) = p
• V(X) = E[X − E(X)]² = E(X²) − E²(X) = p(1−p)
Binomial Random Variable

• X ~ Bin(n, p)
• n items, each being defective w.p. p and conforming o.w., independently of the status of the other items
• X = the total number of defective items
Binomial Random Variable

• X ~ Bin(n, p)
• P(X = k) = C(n, k)·p^k·(1−p)^(n−k), k = 0, 1, …, n
• simple methods to show that E(X) = np and V(X) = np(1−p) come later
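Pending the simple methods promised above, the claimed mean and variance can be checked directly from the pmf; a small Python sketch (illustrative, with arbitrary n and p):

```python
from math import comb

# Sketch: check E(X) = np and V(X) = np(1-p) from the Bin(n, p) pmf.
n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
mean = sum(k * q for k, q in pmf.items())
var = sum(k * k * q for k, q in pmf.items()) - mean**2
print(mean, n * p)             # 3.0 3.0 (within rounding)
print(var, n * p * (1 - p))    # 2.1 2.1
```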
Geometric Random Variable

• X ~ Geo(p)
• X = the number of flips to get the first head, given that a head appears with probability p, 0 < p < 1
• pk = (1−p)^(k−1)·p, k = 1, 2, …; pk = 0 o.w.
• simple methods to show E(X) = 1/p and V(X) = (1−p)/p² come later
• useful identity: 1/(1−q)² = 1 + 2q + 3q² + 4q³ + …
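The identity on the slide already yields the mean; a worked derivation, added here for completeness:

```latex
E(X) = \sum_{k\ge 1} k(1-p)^{k-1}p
     = p \sum_{k\ge 1} k q^{k-1} \qquad (q = 1-p)
     = \frac{p}{(1-q)^2} = \frac{p}{p^2} = \frac{1}{p}.
```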
Poisson Random Variable

• X ~ Poisson(λ) if pk = e^(−λ)·λ^k/k! for k = 0, 1, 2, …
• the limit of Bin(n, p) with np = λ while p → 0 and n → ∞
  ◦ i.e., a binomial random variable with n large and each item being type 1 with small probability p
• E(X) = λ and V(X) = λ
• useful limits and series:
  ◦ lim_(n→∞) (1 − 1/n)^n = e^(−1)
  ◦ lim_(n→∞) (1 − λ/n)^n = e^(−λ)
  ◦ e^x = Σ_(m=0)^∞ x^m/m!
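The limit claim can be eyeballed numerically; a Python sketch (illustrative λ and n) comparing the Bin(n, λ/n) pmf with the Poisson(λ) pmf:

```python
from math import comb, exp, factorial

# Sketch: Poisson(lam) approximates Bin(n, lam/n) for large n.
lam, n = 2.0, 1000
p = lam / n
for k in range(6):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(k, round(binom, 5), round(poisson, 5))   # nearly equal
```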
Uniform Random Variable

• X ~ uniform[a, b]
• density function f(x) = 1/(b−a), x ∈ (a, b)
• E(X) = (a+b)/2 and V(X) = (b−a)²/12
Exponential Random Variable

• X ~ exp(λ)
• density function f(x) = λe^(−λx), x > 0; f(x) = 0 o.w.
• E(X) = 1/λ, and V(X) = 1/λ²
• cumulative distribution function F(x) = 1 − e^(−λx), for x > 0
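For reference, the stated mean follows by integration by parts (a standard derivation, added here):

```latex
E(X) = \int_0^\infty s\,\lambda e^{-\lambda s}\,ds
     = \Bigl[-s e^{-\lambda s}\Bigr]_0^\infty + \int_0^\infty e^{-\lambda s}\,ds
     = 0 + \frac{1}{\lambda}.
```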
Jointly Distributed Random Variables

• the joint cumulative probability distribution function of X and Y
  ◦ F(a, b) = P(X ≤ a, Y ≤ b), −∞ < a, b < ∞
• discrete: joint probability mass function p(x, y) = P(X = x, Y = y)
• continuous: joint probability density function f(x, y), with
  P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy
Some Properties of E(·)

• E[aX + bY] = aE[X] + bE[Y]
• E[X + Y] = E[X] + E[Y]
• E[g(X)] = Σx g(x)·p(x) for discrete X;
  E[g(X)] = ∫ g(x)·f(x) dx for continuous X
• E[g(X, Y)] = Σy Σx g(x, y)·p(x, y) for discrete X and Y;
  E[g(X, Y)] = ∫∫ g(x, y)·f(x, y) dx dy for continuous X and Y
Meaning of E(·)

• three different meanings of E(·) in E[X + Y] = E[X] + E[Y]
• example (context from Ex #1 of WS #5): how to find E(X+Y)? E(X)? E(Y)?

  p(x, y)   y = 1   y = 2   y = 3
  x = 1       0      1/8     1/8
  x = 2      1/4     1/4      0
  x = 3      1/8      0      1/8
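A direct computation from the table, as an added Python sketch; it assumes the rows of the table index x and the columns index y (the extraction makes the orientation ambiguous; swapping the axes exchanges E(X) and E(Y) but leaves E(X+Y) unchanged):

```python
from fractions import Fraction as F

# Sketch: E(X), E(Y), E(X+Y) from the joint pmf above (rows = x, cols = y).
p = {(1, 1): F(0),    (1, 2): F(1, 8), (1, 3): F(1, 8),
     (2, 1): F(1, 4), (2, 2): F(1, 4), (2, 3): F(0),
     (3, 1): F(1, 8), (3, 2): F(0),    (3, 3): F(1, 8)}

EX  = sum(x * q for (x, y), q in p.items())        # 2
EY  = sum(y * q for (x, y), q in p.items())        # 15/8
EXY = sum((x + y) * q for (x, y), q in p.items())  # 31/8
print(EX, EY, EXY, EXY == EX + EY)                 # ... True
```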
Independent Random Variables

• events A and B being independent: P(A|B) = P(A) ⇔ P(AB) = P(A)P(B)
• recall: {ω | X(ω) = x} is an event; similarly, {X > x}, {X ≥ x}, {X < x}, {X ≤ x} are events
Independent Random Variables

• two random variables X and Y being independent ⇔ all events generated by X and by Y being independent
• discrete X and Y: P(X = x, Y = y) = P(X = x)·P(Y = y) for all x, y
• continuous X and Y: fX,Y(x, y) = fX(x)·fY(y) for all x, y
• any X and Y: FX,Y(x, y) = FX(x)·FY(y) for all x, y
Independent Random Variables

• (Ex #4(a) of WS #5) Let X be equally likely to be 1, 2, and 3; Y = X+3 and Z = 2X−1. (a) Argue that Y and Z are dependent (see the sketch below).
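One way to argue this (a sketch of the idea, not necessarily the worksheet's intended solution): exhibit events generated by Y and Z whose joint probability is not the product of the marginals.

```python
from fractions import Fraction as F

# Y = X + 3 and Z = 2X - 1 are functions of the same X, so fixing one
# pins down the other.
# P(Y = 4) = P(X = 1) = 1/3 and P(Z = 5) = P(X = 3) = 1/3, but
# {Y = 4, Z = 5} needs X = 1 and X = 3 simultaneously: probability 0.
p_joint = sum(F(1, 3) for x in [1, 2, 3] if x + 3 == 4 and 2 * x - 1 == 5)
p_prod = F(1, 3) * F(1, 3)
print(p_joint, p_prod, p_joint == p_prod)   # 0 1/9 False => dependent
```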
Independent Random Variables

• Example 1.9.3 of notes Sample_space_2.pdf
• flipping 2 coins independently
  ◦ T = the number of tails in the 2 flips
  ◦ H = the number of heads in the 2 flips
  ◦ Hi = the number of heads in the ith flip, i = 1, 2
• H1 ⊥ H2? H1 ⊥ H? H ⊥ T? (checked by enumeration below)
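The three questions can be settled by brute-force enumeration of the four equally likely outcomes; an added Python sketch:

```python
from itertools import product
from fractions import Fraction as F

# Sketch: two independent fair coins; check independence of pairs of
# the counting variables H1, H2, H, T over all 4 outcomes.
outcomes = list(product("HT", repeat=2))

def P(event):
    return sum(F(1, 4) for w in outcomes if event(w))

H1 = lambda w: int(w[0] == "H")
H2 = lambda w: int(w[1] == "H")
H  = lambda w: H1(w) + H2(w)   # total heads
T  = lambda w: 2 - H(w)        # total tails

# H1 vs H2: independent (the flips are independent by construction)
print(P(lambda w: H1(w) == 1 and H2(w) == 1),
      P(lambda w: H1(w) == 1) * P(lambda w: H2(w) == 1))   # 1/4 1/4
# H1 vs H: dependent
print(P(lambda w: H1(w) == 1 and H(w) == 2),
      P(lambda w: H1(w) == 1) * P(lambda w: H(w) == 2))    # 1/4 1/8
# H vs T: dependent (T = 2 - H)
print(P(lambda w: H(w) == 2 and T(w) == 0),
      P(lambda w: H(w) == 2) * P(lambda w: T(w) == 0))     # 1/4 1/16
```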
Proposition 2.3

• E[g(X)h(Y)] = E[g(X)]·E[h(Y)] for independent X, Y
  ◦ different meanings of E(·)
• Ex #7 of WS #5 (functions of independent random variables)
  ◦ let X and Y be independent and identically distributed (i.i.d.) random variables equally likely to be 1, 2, and 3
  ◦ Z = XY
  ◦ E(X) = ? E(Y) = ? distribution of Z? E(Z) = E(X)E(Y)?
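Answering the Ex #7 questions by enumeration (a sketch, not the worksheet's write-up): E(X) = E(Y) = 2, and multiplying out the nine equally likely pairs gives E(Z) = 4 = E(X)E(Y), as Proposition 2.3 predicts.

```python
from fractions import Fraction as F
from itertools import product

# Sketch: X, Y i.i.d. uniform on {1, 2, 3}; Z = XY.
pairs = list(product([1, 2, 3], repeat=2))   # 9 equally likely pairs

EX = F(1 + 2 + 3, 3)                         # 2
EZ = sum(F(x * y, 9) for x, y in pairs)      # 4
print(EX, EZ, EZ == EX * EX)                 # 2 4 True

dist = {}                                    # the distribution of Z
for x, y in pairs:
    dist[x * y] = dist.get(x * y, F(0)) + F(1, 9)
print(dist)   # {1: 1/9, 2: 2/9, 3: 2/9, 4: 1/9, 6: 2/9, 9: 1/9}
```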
Variance and Covariance (Ross, pp. 52-53)

• Cov(X, Y) = E(XY) − E(X)E(Y)
• Cov(X, X) = Var(X)
• Cov(X, Y) = Cov(Y, X)
• Cov(cX, Y) = c·Cov(X, Y)
• Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)
• Cov(Σi Xi, Σj Yj) = Σi Σj Cov(Xi, Yj)
• Var(Σ_(i=1)^n Xi) = Σ_(i=1)^n Var(Xi) + 2·Σ_(1≤i<j≤n) Cov(Xi, Xj)
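The last identity can be exercised on the joint table from the E(·) example earlier (an added Python sketch reusing that table, under the same rows = x, columns = y assumption):

```python
from fractions import Fraction as F

# Sketch: verify Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
p = {(1, 1): F(0),    (1, 2): F(1, 8), (1, 3): F(1, 8),
     (2, 1): F(1, 4), (2, 2): F(1, 4), (2, 3): F(0),
     (3, 1): F(1, 8), (3, 2): F(0),    (3, 3): F(1, 8)}

def E(g):
    return sum(g(x, y) * q for (x, y), q in p.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VX = E(lambda x, y: x * x) - EX**2
VY = E(lambda x, y: y * y) - EY**2
cov = E(lambda x, y: x * y) - EX * EY
VS = E(lambda x, y: (x + y)**2) - (EX + EY)**2
print(VS, VX + VY + 2 * cov, VS == VX + VY + 2 * cov)   # ... True
```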
Two Useful Ideas

• for X = X1 + … + Xn, E(X) = E(X1) + … + E(Xn), no matter whether the Xi are independent or not
• for a prize randomly assigned to one of n lottery tickets, the probability of winning the prize = 1/n for all tickets
  ◦ the order of buying a ticket does not change the probability of winning
Applications of the Two Ideas

• the following are interesting applications
  ◦ mean of Bin(n, p) (Ex #7(b) of WS #8)
  ◦ variance of Bin(n, p) (Ex #8(b) of WS #8)
  ◦ the probability of winning a lottery (Ex #3(b) of WS #9)
  ◦ mean of a hypergeometric random variable (Ex #4 of WS #9)
  ◦ mean and variance of the random number of matches (Ex #5 of WS #9)
Mean of Bin(n, p) (Ex #7(b) of WS #8)

• Let X ~ Bin(n, p). Find E(X) from E(I1 + … + In).
Variance of Bin(n, p) (Ex #8(b) of WS #8)

• Let X ~ Bin(n, p). Find V(X) from V(I1 + … + In).
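Both worksheet answers follow from the indicator decomposition (a sketch of the standard argument): write X = I1 + … + In with the Ii independent Bern(p), so that

```latex
E(X) = \sum_{i=1}^{n} E(I_i) = np, \qquad
V(X) = \sum_{i=1}^{n} V(I_i) + 2\sum_{1\le i<j\le n}\operatorname{Cov}(I_i, I_j)
     = np(1-p) + 0,
```

where the covariances vanish because the Ii are independent.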
Probability of Winning a Lottery (Ex #3(b) & (c) of WS #9)

• a grand prize among n lottery tickets
• (b) Let n ≥ 3. Find the probability that the third person who buys a ticket wins the grand prize.
• (c) Let Ii = 1 if the ith person who buys a ticket wins the grand prize, and Ii = 0 otherwise, 1 ≤ i ≤ n.
  ◦ (i) Show that all Ii have the same (marginal) distribution.
  ◦ (ii) Find Cov(Ii, Ij) for i ≠ j.
  ◦ (iii) Verify Var(Σ_(i=1)^n Ii) = Σ_(i=1)^n Var(Ii) + 2·Σ_(1≤i<j≤n) Cov(Ii, Ij).
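For part (c)(iii), the pieces combine as follows (a sketch of the verification; Σi Ii = 1 because exactly one ticket wins, so the variance of the sum must be 0):

```latex
P(I_i = 1) = \frac{1}{n}, \quad
\operatorname{Var}(I_i) = \frac{1}{n}\Bigl(1-\frac{1}{n}\Bigr), \quad
\operatorname{Cov}(I_i, I_j) = 0 - \frac{1}{n^2} \;\;(i \ne j),
\qquad
\operatorname{Var}\Bigl(\sum_i I_i\Bigr)
 = n\cdot\frac{1}{n}\Bigl(1-\frac{1}{n}\Bigr)
 + 2\binom{n}{2}\Bigl(-\frac{1}{n^2}\Bigr)
 = \Bigl(1-\frac{1}{n}\Bigr) - \frac{n-1}{n} = 0.
```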
Hypergeometric in the Context of Ex #4 of WS #9

• 3 balls are randomly picked from 2 white & 3 black balls
• X = the total number of white balls picked
• P(X = 0) = C(2,0)·C(3,3)/C(5,3) = 1/10
• P(X = 1) = C(2,1)·C(3,2)/C(5,3) = 3/5
• P(X = 2) = C(2,2)·C(3,1)/C(5,3) = 3/10
• E(X) = 6/5
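The pmf and mean above can be reproduced with math.comb (an added Python sketch):

```python
from math import comb
from fractions import Fraction as F

# Sketch: hypergeometric pmf for drawing 3 balls from 2 white + 3 black.
pmf = {k: F(comb(2, k) * comb(3, 3 - k), comb(5, 3)) for k in range(3)}
print(pmf)                                 # {0: 1/10, 1: 3/5, 2: 3/10}
print(sum(k * q for k, q in pmf.items()))  # 6/5
```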
Hypergeometric in the Context of Ex #4 of WS #9

• Ex #4(c). Assume that the three picked balls are put in bins 1, 2, and 3 in the order of being picked.
  ◦ (i) Find P(bin i contains a white ball), i = 1, 2, & 3.
  ◦ (ii) Define Bi = 1 if the ball in bin i is white, i = 1, 2, and 3. Find E(X) by relating X to B1, B2, and B3.
Hypergeometric in the Context of Ex #4 of WS #9

• Ex #4(d). Arbitrarily label the white balls as 1 and 2.
  ◦ (i) Find P(white ball 1 is put in a bin); find P(white ball 2 is put in a bin).
  ◦ (ii) Define Wi = 1 if white ball i is put in a bin, i = 1, 2. Find E(X) by relating X to W1 and W2.
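Both decompositions reach the same mean (a sketch of the symmetry argument, added here: each bin is equally likely to hold any of the 5 balls, so P(bin i white) = 2/5, and each white ball is among the 3 picked with probability 3/5):

```latex
X = B_1 + B_2 + B_3:\quad E(X) = 3\cdot\tfrac{2}{5} = \tfrac{6}{5};
\qquad
X = W_1 + W_2:\quad E(X) = 2\cdot\tfrac{3}{5} = \tfrac{6}{5}.
```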
Mean and Variance of the Random Number of Matches (Ex #5 of WS #9)

• gift exchange among n participants
• X = the total # of participants who get back their own gifts
• (a) Find P(the ith participant gets back his own gift).
• (b) Let Ii = 1 if the ith participant gets back his own gift, and Ii = 0 otherwise, 1 ≤ i ≤ n. Relate X to I1, …, In.
• (c) Find E(X) from (b).
• (d) Find Cov(Ii, Ij) for i ≠ j.
• (e) Find V(X).
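A simulation (not the worksheet's intended method) suggests the classical answers E(X) = 1 and V(X) = 1:

```python
import random

# Sketch: matching problem. A uniformly random permutation assigns gifts;
# count participants whose own gift comes back to them.
def matches(n):
    perm = list(range(n))
    random.shuffle(perm)
    return sum(perm[i] == i for i in range(n))

n, trials = 10, 100_000
xs = [matches(n) for _ in range(trials)]
mean = sum(xs) / trials
var = sum(x * x for x in xs) / trials - mean**2
print(mean, var)   # both close to 1
```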
Example 1.11 of Ross
Chapter 2

• material to read: pages 21 to 59 (up to Section 2.5.3)
• examples highlighted: Examples 2.3, 2.5, 2.17, 2.18, 2.19, 2.20, 2.21, 2.30, 2.31, 2.32, 2.34, 2.35, 2.36, 2.37
• sections and material highlighted: 2.2.1, 2.2.2, 2.2.3, 2.2.4, 2.3.1, 2.3.2, 2.3.3, 2.4.3, Proposition 2.1, Corollary 2.2, 2.5.1, 2.5.2, Proposition 2.3, 2.5.3, Properties of Covariance
Chapter 2

• Exercises #5, #11, #20, #23, #29, #37, #42, #43, #44, #45, #46, #51, #71, #72