Probability and Stochastic Processes
References:
Wolff, Stochastic Modeling and the Theory of Queues, Chapter 1
Altiok, Performance Analysis of Manufacturing Systems, Chapter 2
Basic Probability
• Envision an experiment for which the result is unknown. The
collection of all possible outcomes is called the sample space. A set of
outcomes, or subset of the sample space, is called an event.
• A probability space is a three-tuple (Ω, ℱ, Pr), where Ω is a sample space, ℱ is a collection of events from the sample space, and Pr is a probability law that assigns a number to each event in ℱ. For any events A and B, Pr must satisfy:
  – Pr(Ω) = 1
  – Pr(A) ≥ 0
  – Pr(Aᶜ) = 1 − Pr(A)
  – Pr(A ∪ B) = Pr(A) + Pr(B), if A ∩ B = ∅.
• If A and B are events in ℱ with Pr(B) ≠ 0, the conditional probability of A given B is

  Pr(A | B) = Pr(A ∩ B) / Pr(B)
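A minimal Python check of this formula; the single-die example (A = even outcome, B = outcome of at least 3) is our own illustration, not from the slides.

```python
from fractions import Fraction

# Sample space for one fair die roll; each outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}
A = {x for x in omega if x % 2 == 0}   # A: outcome is even
B = {x for x in omega if x >= 3}       # B: outcome is at least 3

def pr(event):
    """Probability of an event under the uniform law on omega."""
    return Fraction(len(event), len(omega))

# Conditional probability by the definition Pr(A|B) = Pr(A and B) / Pr(B).
print(pr(A & B) / pr(B))   # 1/2: of the outcomes {3,4,5,6}, two are even
```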
Random Variables
A random variable is “a number that you don’t know… yet”
Sam Savage, Stanford University
• Discrete vs. Continuous
• Cumulative distribution function
• Density function
• Probability distribution (mass) function
• Joint distributions
• Conditional distributions
• Functions of random variables
• Moments of random variables
• Transforms and generating functions
Functions of Random Variables
• Often we’re interested in some combination of r.v.’s
– Sum of the first k interarrival times = time of the kth arrival
– Minimum of service times for parallel servers = time until next
departure
• If X = min(Y, Z), then X > x if and only if Y > x and Z > x
  – therefore, Pr(X ≤ x) = 1 − Pr(Y > x, Z > x)
  – and if Y and Z are independent, Pr(X ≤ x) = 1 − Pr(Y > x) Pr(Z > x)
• If X = max(Y, Z), then Pr(X ≤ x) = Pr(Y ≤ x, Z ≤ x)
• If X = Y + Z, its distribution is the convolution of the distributions of Y and Z. Find it by conditioning.
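A small simulation sketch of the min rule, with assumed rates for Y and Z (not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Illustrative, assumed rates: Y ~ exp(rate 1), Z ~ exp(rate 2).
y = rng.exponential(scale=1.0, size=n)
z = rng.exponential(scale=0.5, size=n)
x = 0.3

# Empirical Pr(min(Y,Z) <= x) versus 1 - Pr(Y > x) Pr(Z > x).
empirical = np.mean(np.minimum(y, z) <= x)
theory = 1 - np.exp(-1.0 * x) * np.exp(-2.0 * x)
print(f"empirical {empirical:.4f} vs theory {theory:.4f}")
```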
Conditioning (Wolff)
• Frequently, the conditional distribution of Y given X is
easier to find than the distribution of Y alone. If so,
evaluate probabilities about Y using the conditional
distribution along with the marginal distribution of X:
Pr Y  A 

 Pr  Y  A X  x  f  x  dx
X

– Example: Draw 2 balls simultaneously from urn containing four
balls numbered 1, 2, 3 and 4. X = number on the first ball, Y =
number on the second ball, Z = XY. What is Pr(Z > 5)?
– Key: Maybe easier to evaluate Z if X is known
4
Pr  Z  5    Pr  Z  5 X  x  Pr  X  x 
x 1
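The urn example can be checked by brute-force enumeration; a short Python sketch:

```python
from fractions import Fraction
from itertools import permutations

# All ordered draws of 2 balls without replacement from {1,2,3,4} are
# equally likely; X = first number, Y = second number, Z = X*Y.
outcomes = list(permutations([1, 2, 3, 4], 2))
print(Fraction(sum(x * y > 5 for x, y in outcomes), len(outcomes)))  # 1/2

# The same answer by conditioning on X, as on the slide:
total = Fraction(0)
for x in [1, 2, 3, 4]:
    others = [y for y in [1, 2, 3, 4] if y != x]
    pr_given_x = Fraction(sum(x * y > 5 for y in others), len(others))
    total += pr_given_x * Fraction(1, 4)   # Pr(X = x) = 1/4
print(total)  # 1/2 again
```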
Convolution
• Let X = Y + Z. Conditioning on Z:

  Pr(X ≤ x | Z = z) = Pr(Y + Z ≤ x | Z = z) = Pr(Y ≤ x − z | Z = z)

  F_X(x) = ∫_{−∞}^{∞} Pr(Y ≤ x − z | Z = z) f_Z(z) dz = ∫_{−∞}^{∞} ∫_{−∞}^{x−z} f_{Y|Z}(y | Z = z) f_Z(z) dy dz

• If Y and Z are independent,

  F_X(x) = ∫_{−∞}^{∞} ∫_{−∞}^{x−z} f_Y(y) f_Z(z) dy dz

  – Example: Poisson
  – Note: the above is the cdf. To get the density, differentiate:

  f_X(x) = d/dx F_X(x) = ∫_{−∞}^{∞} [d/dx ∫_{−∞}^{x−z} f_Y(y) dy] f_Z(z) dz = ∫_{−∞}^{∞} f_Z(z) f_Y(x − z) dz
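A numeric sketch of the density formula, assuming Y and Z are independent exponentials with rate 1; their sum should match the Erlang-2 density λ²x e^{−λx} (see the Gamma slide later):

```python
import numpy as np

# Y and Z independent exponentials with rate 1 (our assumed example);
# their sum should have the Erlang-2 density lam^2 * x * exp(-lam*x).
lam, dx = 1.0, 0.001
x = np.arange(0, 20, dx)
f = lam * np.exp(-lam * x)                # common density of Y and Z

# f_X(x) = integral of f_Z(z) f_Y(x - z) dz, as a discrete convolution.
f_sum = np.convolve(f, f)[: len(x)] * dx

erlang2 = lam**2 * x * np.exp(-lam * x)
print(np.max(np.abs(f_sum - erlang2)))    # small discretization error
```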
Moments of Random Variables

• Expectation = “average”:

  E[X] = ∫_{−∞}^{∞} x f_X(x) dx  or  Σ_x x Pr(X = x)

  E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx  or  Σ_x g(x) Pr(X = x)

• Variance = “volatility”:

  Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²

• Standard deviation = √Var(X)
• Coefficient of variation: Cv_X = √Var(X) / E[X]
  Squared coefficient of variation (s.c.v.): Cv²_X = Var(X) / E²[X]
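A quick numeric sketch (our illustration) comparing sample moments with these definitions, for an exponential with rate 2 so that E[X] = 0.5 and Cv² = 1:

```python
import numpy as np

rng = np.random.default_rng(1)
# Exponential with rate 2 (scale 0.5): E[X] = 0.5, Var = 0.25, Cv^2 = 1.
x = rng.exponential(scale=0.5, size=500_000)

mean = x.mean()
var = np.mean((x - mean) ** 2)     # Var(X) = E[(X - E[X])^2]
scv = var / mean**2                # squared coefficient of variation
print(f"E[X]={mean:.3f}  Var={var:.3f}  Cv^2={scv:.3f}")
```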
Linear Functions of Random Variables
• Covariance: Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]
• Correlation: ρ_XY = Cov(X, Y) / √(Var(X) Var(Y))

If X and Y are independent, then Cov(X, Y) = ρ_XY = 0.

E[X + Y] = E[X] + E[Y]
E[aX] = a E[X]
Var(aX) = a² Var(X)
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
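A small check (our illustration) of the last identity, with deliberately correlated samples:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=300_000)
y = 0.6 * x + rng.normal(size=300_000)   # correlated with x by construction

cov = np.mean((x - x.mean()) * (y - y.mean()))   # E[XY] - E[X]E[Y]
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * cov
print(f"{lhs:.4f} vs {rhs:.4f}")   # equal: the identity holds for sample moments too
```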
Transforms and Generating Functions
• Moment-generating function:

  M*(θ) = E[e^{θX}] = ∫_{−∞}^{∞} e^{θx} f_X(x) dx

  E[X^k] = d^k E[e^{θX}] / dθ^k  evaluated at θ = 0

• Laplace transform (nonnegative r.v.):

  E[e^{−sX}] = ∫_0^{∞} e^{−sx} f_X(x) dx,  s ≥ 0

  E[X^k] = (−1)^k d^k E[e^{−sX}] / ds^k  evaluated at s = 0

• Generating function (z-transform):

  Let N be a nonnegative integer random variable; P_n = Pr(N = n), n = 0, 1, 2, …

  G(z) = Σ_{n=0}^{∞} P_n z^n = E[z^N],  |z| ≤ 1

  E[N] = dG(z)/dz at z = 1,  E[N²] − E[N] = d²G(z)/dz² at z = 1
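As a sketch of extracting moments from a z-transform, here the Poisson generating function G(z) = e^{λ(z−1)} (a standard fact, not derived on these slides) is differentiated with sympy:

```python
import sympy as sp

z, lam = sp.symbols('z lam', positive=True)
G = sp.exp(lam * (z - 1))                  # z-transform of Poisson(lam)

EN = sp.diff(G, z).subs(z, 1)              # E[N] = G'(1)
fact2 = sp.diff(G, z, 2).subs(z, 1)        # E[N(N-1)] = G''(1)
var = sp.simplify(fact2 + EN - EN**2)      # Var(N)
print(EN, var)                             # lam, lam
```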
Special Distributions
• Discrete
  – Bernoulli
  – Binomial
  – Geometric
  – Poisson
• Continuous
  – Uniform
  – Exponential
  – Gamma
  – Normal
Bernoulli Distribution
“Single coin flip”: p = Pr(success)
N = 1 if success, 0 otherwise

Pr(N = n) = p if n = 1;  1 − p if n = 0
E[N] = p
Var(N) = p(1 − p)
Cv²_N = (1 − p)/p
M*(θ) = 1 − p + p e^θ
Binomial Distribution
“n independent coin flips”: p = Pr(success)
N = # of successes

Pr(N = k) = (n choose k) p^k (1 − p)^{n−k},  k = 0, 1, …, n
E[N] = np
Var(N) = np(1 − p)
Cv²_N = (1 − p)/(np)
M*(θ) = (1 − p + p e^θ)^n
Geometric Distribution
“independent coin flips” p = Pr(success)
N = # of flips until (including) first success
Pr  N  k   1  p 
k 1
p, k  1, 2,...
EN  1 p
Var  N   1  p  p 2
CvN2  1  p
Memoryless property: Have flipped k times without success;
Pr  N  k  n N  k   1  p 
n 1
Chapter 0
p (still geometric)
13
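A simulation sketch of the memoryless property (p and k are our assumptions): conditional on N > k, the residual count N − k should again be geometric with the same p.

```python
import numpy as np

rng = np.random.default_rng(3)
p, k = 0.3, 4
n = rng.geometric(p, size=1_000_000)   # number of flips until first success

tail = n[n > k] - k                    # residual flips, given k failures so far
print(tail.mean(), 1 / p)             # both about 1/p: still geometric
print(np.mean(tail == 1), p)          # Pr(residual = 1) is about p
```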
z-Transform for Geometric Distribution
Given P_n = (1 − p)^{n−1} p, n = 1, 2, …, find G(z) = Σ_{n≥0} P_n z^n.

G(z) = Σ_{n=1}^{∞} (1 − p)^{n−1} p z^n = pz Σ_{n=1}^{∞} [(1 − p)z]^{n−1} = pz Σ_{n=0}^{∞} [(1 − p)z]^n = pz / (1 − (1 − p)z),

using the geometric series Σ_{n=0}^{∞} a^n = 1/(1 − a) for |a| < 1.

Then

E[N] = dG(z)/dz at z = 1 = p / (1 − (1 − p)z)² |_{z=1} = 1/p

d²G(z)/dz² at z = 1 = E[N²] − E[N] = 2(1 − p)/p²,  so  E[N²] = (2 − p)/p²

and

Var(N) = E[N²] − (E[N])² = (1 − p)/p²
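The same derivation can be verified symbolically; a short sympy sketch:

```python
import sympy as sp

z, p = sp.symbols('z p', positive=True)
G = p * z / (1 - (1 - p) * z)                        # derived above

EN = sp.simplify(sp.diff(G, z).subs(z, 1))           # 1/p
fact2 = sp.simplify(sp.diff(G, z, 2).subs(z, 1))     # 2(1-p)/p^2
var = sp.simplify(fact2 + EN - EN**2)                # (1-p)/p^2
print(EN, fact2, var)
```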
Poisson Distribution
“Occurrence of rare events”: λ = average rate of occurrence per period;
N = # of events in an arbitrary period

Pr(N = k) = λ^k e^{−λ} / k!,  k = 0, 1, 2, …
E[N] = λ
Var(N) = λ
Cv²_N = 1/λ
Uniform Distribution
X is equally likely to fall anywhere within interval (a,b)
f_X(x) = 1/(b − a),  a ≤ x ≤ b
E[X] = (a + b)/2
Var(X) = (b − a)²/12
Cv²_X = (b − a)² / (3(a + b)²)

[Figure: constant density of height 1/(b − a) over the interval (a, b)]
Exponential Distribution
X is nonnegative, with density highest near 0:

f_X(x) = λ e^{−λx},  x ≥ 0
F_X(x) = 1 − e^{−λx},  x ≥ 0
E[X] = 1/λ
Var(X) = 1/λ²
Cv²_X = 1

Also memoryless; more on this later…
Gamma Distribution
X is nonnegative; by varying the parameter b we get a variety of shapes:

f_X(x) = λ^b x^{b−1} e^{−λx} / Γ(b),  x ≥ 0,  where Γ(b) = ∫_0^{∞} x^{b−1} e^{−x} dx for b > 0
E[X] = b/λ
Var(X) = b/λ²
Cv²_X = 1/b

When b is an integer k, this is called the Erlang-k distribution, and Γ(k) = (k − 1)!. Erlang-1 is the same as exponential.
Normal Distribution
X follows a “bell-shaped” density function:

f_X(x) = (1 / (σ√(2π))) e^{−(x−μ)² / (2σ²)},  −∞ < x < ∞
E[X] = μ
Var(X) = σ²
From the central limit theorem, the distribution of the sum of
independent and identically distributed random variables
approaches a normal distribution as the number of summed
random variables goes to infinity.
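A quick simulation sketch of the central limit theorem (the sample sizes are our choice): standardized sums of i.i.d. uniforms already behave almost normally for moderate n.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 30, 200_000
u = rng.uniform(size=(reps, n))      # i.i.d. uniforms: mean 1/2, variance 1/12

s = (u.sum(axis=1) - n / 2) / np.sqrt(n / 12)   # standardized sums
print(np.mean(np.abs(s) <= 1.96))    # about 0.95, as for a standard normal
```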
m.g.f.’s of Exponential and Erlang
If X is exponential and Y is Erlang-k,
M*_X(θ) = λ / (λ − θ)  and  M*_Y(θ) = (λ / (λ − θ))^k
Fact: The mgf of a sum of independent r.v.’s equals the
product of the individual mgf’s.
Therefore, the sum of k independent exponential r.v.’s (with
the same rate ) follows an Erlang-k distribution.
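A simulation sketch of this closing fact (k and λ are our assumptions): a sum of k independent exponentials should show the Erlang-k mean k/λ and variance k/λ².

```python
import numpy as np

rng = np.random.default_rng(5)
k, lam, reps = 4, 2.0, 200_000
w = rng.exponential(scale=1 / lam, size=(reps, k)).sum(axis=1)

print(w.mean(), k / lam)      # about 2.0 = Erlang-k mean
print(w.var(), k / lam**2)    # about 1.0 = Erlang-k variance
```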
Stochastic Processes
A stochastic process is a random variable that changes over time,
or a sequence of numbers that you don’t know yet.
• Poisson process
• Continuous time Markov chains
Stochastic Processes
Set of random variables, or observations of the same random variable over time: {X_t, t ≥ 0} (continuous-parameter) or {X_n, n = 0, 1, …} (discrete-parameter).
X_t may be either discrete-valued or continuous-valued.
A counting process is a discrete-valued, continuous-parameter stochastic process that increases by one each time some event occurs. The value of the process at time t is the number of events that have occurred up to (and including) time t.
Poisson Process
Let  X  t  , t  0 be a stochastic process where X(t) is the
number of events (arrivals) up to time t. Assume X(0)=0 and
(i) Pr(arrival occurs between t and t+t) = t  o  t  ,
where o(t) is some quantity such that limt 0 o  t  / t  0
(ii) Pr(more than one arrival between t and t+t) = o(t)
(iii) If t < u < v < w, then X(w) – X(v) is independent of X(u) –
X(t).
Let pn(t) = P(n arrivals occur during the interval (0,t). Then …
n
 t
e  t 
pn  t  
,n  0
n!
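A simulation sketch of this result (the rate and horizon are our assumptions): generate exponential interarrival times (anticipating the next slide), count arrivals in (0, t], and compare with the Poisson mean and variance λt.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t, reps = 3.0, 2.0, 100_000

# Exponential interarrival times; count the arrivals that land in (0, t].
gaps = rng.exponential(scale=1 / lam, size=(reps, 40))  # 40 >> lam*t
counts = (gaps.cumsum(axis=1) <= t).sum(axis=1)

print(counts.mean(), counts.var(), lam * t)   # both moments near 6.0
```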
Poisson Process and Exponential Dist’n
Let T be the time between arrivals. Pr(T > t) = Pr(there are no arrivals in (0, t)) = p_0(t) = e^{−λt}.

Therefore, F_T(t) = Pr(T ≤ t) = 1 − e^{−λt}, t ≥ 0, and f_T(t) = λ e^{−λt}, t ≥ 0;

that is, the time between arrivals follows an exponential distribution with parameter λ = the arrival rate.
The converse is also true; if interarrival times are exponential,
then the number of arrivals up to time t follows a Poisson
distribution with mean and variance equal to t.
When are Poisson arrivals reasonable?
1. The Poisson distribution can be seen as a limit of the binomial distribution as n → ∞, p → 0 with λ = np held constant:
   – many potential customers deciding independently about arriving (arrival = “success”),
   – each with a small probability of arriving in any particular time interval.
2. The conditions given above: the probability of an arrival in a small interval is approximately proportional to the length of the interval – no bulk arrivals.
3. The amount of time since the last arrival gives no indication of the amount of time until the next arrival (exponential – memoryless).
More Exponential Distribution Facts
1. Suppose T1 and T2 are independent with T1 ~ exp(λ1), T2 ~ exp(λ2). Then

   Pr(T1 < T2) = λ1 / (λ1 + λ2)

2. Suppose (T1, T2, …, Tn) are independent with Ti ~ exp(λi). Let Y = min(T1, T2, …, Tn). Then Y ~ exp(λ1 + λ2 + … + λn).

3. Suppose (T1, T2, …, Tk) are independent with Ti ~ exp(λ). Let W = T1 + T2 + … + Tk. Then W has an Erlang-k distribution with density function

   f_W(w) = λ(λw)^{k−1} e^{−λw} / (k − 1)!,  w ≥ 0

   with E[W] = k/λ and Var(W) = k/λ².
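A simulation sketch of facts 1 and 2 (the rates are our assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
l1, l2, reps = 1.0, 3.0, 500_000
t1 = rng.exponential(scale=1 / l1, size=reps)
t2 = rng.exponential(scale=1 / l2, size=reps)

print(np.mean(t1 < t2), l1 / (l1 + l2))          # fact 1: about 0.25
print(np.minimum(t1, t2).mean(), 1 / (l1 + l2))  # fact 2: mean of exp(l1+l2)
```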
Continuous Time Markov Chains
A stochastic process {X(t), t ≥ 0} with possible values (state space) S = {0, 1, 2, …} is a CTMC if

Pr(X(u + t) = j | X(s), s ≤ u) = Pr(X(u + t) = j | X(u))

“The future is independent of the past given the present”

Define p_ij(t) = Pr(X(u + t) = j | X(u) = i)  (note: independent of u). Then

0 ≤ p_ij(t) ≤ 1,  Σ_j p_ij(t) = 1
CTMC Another Way
1. Each time X(t) enters state j, the sojourn time is exponentially distributed with mean 1/q_j.
2. When the process leaves state i, it goes to state j ≠ i with probability p_ij, where p_ii = 0, 0 ≤ p_ij ≤ 1, Σ_j p_ij = 1.

Let P(t) = [p_ij(t)], where P(0) = I. Then

π_j(t) = Pr(X(t) = j) = Σ_i p_ij(t) π_i(0)
CTMC Infinitesimal Generator
The time it takes the process to go from state i to state j is T_ij ~ exp(q_ij). Then q_ij is the rate of transition from state i to state j, and q_i = Σ_j q_ij.

The infinitesimal generator is

        ⎡ −q0      q01      q02     … ⎤   ⎡ −q0        q0·p01    q0·p02   … ⎤
    Q = ⎢  q10     −q1      q12     … ⎥ = ⎢  q1·p10    −q1       q1·p12   … ⎥
        ⎢  q20      q21     −q2     … ⎥   ⎢  q2·p20     q2·p21   −q2      … ⎥
        ⎣   ⋮        ⋮        ⋮      ⋱ ⎦   ⎣   ⋮          ⋮         ⋮       ⋱ ⎦
Long Run (Steady State) Probabilities
Let lim_{t→∞} p_ij(t) = π_j.

• Under certain conditions these limiting probabilities can be shown to exist and are independent of the starting state;
• they represent the long-run proportions of time that the process spends in each state;
• they are also the steady-state probabilities that the process will be found in each state.

Then πQ = 0 with Σ_i π_i = 1, or, equivalently,

q_j π_j = Σ_{i≠j} q_i p_ij π_i  for all j = 0, 1, 2, …

(rate out of j = rate into j)
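A numeric sketch of solving for the steady state; the three-state generator below is a made-up example, not from the slides.

```python
import numpy as np

# Hypothetical 3-state generator (made-up rates); each row sums to zero.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 1.0,  1.0, -2.0]])

# pi Q = 0 and sum(pi) = 1: transpose the system and replace one
# (redundant) balance equation with the normalization row.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)

print(pi)        # long-run state probabilities
print(pi @ Q)    # approximately the zero vector
```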
Phase-Type Distributions
• Erlang distribution
• Hyperexponential distribution
• Coxian (mixture of generalized Erlang) distributions