STA 348
Introduction to Stochastic Processes
Lecture 1
Adminis-trivia

- Instructor: Sotirios Damouras
  - Pronounced Sho-tee-ree-os or Sam
- Contact Info:
  - email: [email protected]
  - Office hours: SE/DV 4062, every Mon 2-4pm and Tues 4-5pm, or by appointment (email)
- Course web page:
  - https://portal.utoronto.ca/ (UofT Portal)
  - All course material (outline, lecture slides, assignments & solutions) posted on portal
Outline

- Textbook:
  - Introduction to Probability Models, 10th Ed., by Sheldon M. Ross (in bookstore)
  - Cover (parts of) § 1-8, & extra topics if time permits
- Evaluation:
  - 9 Weekly Assignments, best 8/9 worth 20%
    - Due @ start of tutorial, NO late submissions
  - 2 Term Tests, worth 20% each
    - NO make-up tests. Weight shifted to Final exam with UofT Medical note AND absence declaration on ROSI
  - Final Exam, worth 40%
Important Dates

Month | LEC (TUE 2-4pm @ IB 220) | LEC (THU 3-4pm @ IB 200) | TUT (FRI 3-4pm @ IB 200)
------|--------------------------|--------------------------|-------------------------
Sept  | 6                        | 8                        | 9  (no tutorial)
Sept  | 13                       | 15                       | 16 (assign 1 due)
Sept  | 20                       | 22                       | 23 (assign 2 due)
Sept  | 27                       | 29                       | 30 (assign 3 due)
Oct   | 4                        | 6                        | 7  (assign 4 due)
Oct   | 11 (Midterm 1)           | 13                       | 14 (no tutorial)
Oct   | 18                       | 20                       | 21 (assign 5 due)
Oct   | 25                       | 27                       | 28 (assign 6 due)
Nov   | 1                        | 3                        | 4  (assign 7 due)
Nov   | 8  (Midterm 2)           | 10                       | 11 (no tutorial)
Nov   | 15                       | 17                       | 18 (assign 8 due)
Nov   | 22                       | 24                       | 25 (assign 9 due)
What Is This Course About?

- Modeling & analyzing the behavior of a collection of dependent random variables (RV's)
  - (X1, X2, ...) = {Xt}t=1,2,... is a Stochastic Process
- How is this different from Statistics?
  - Statistics: X1, X2, ... is an independent random sample from some distribution/population
  - Stochastic Processes: {Xt}t=1,2,... is a collection of dependent RV's, describing a random process at different points t = 1, 2, ... in time or space
    - E.g. a country's population at year t = 1, 2, ...
Example 1

- Gamble $10 in Roulette (betting on red/black) till you double it or lose it
  - If you win a bet on red/black, you double the bet amount
  - P(winning an individual bet) = 18/(36+2) = 0.4737
- Which is the best strategy for maximizing the chance of doubling your money (reaching $20)?
  A. Bet $10 all at once
  B. Bet $1 at a time
  C. It doesn't matter
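A rough Monte Carlo sketch of the two strategies (illustrative Python, not from the slides; it assumes the win probability 18/38 above and treats strategy B as a $1-step gambler's-ruin walk between $0 and $20):

```python
import random

P_WIN = 18 / 38  # probability of winning a single red/black bet

def bold_play(trials=100_000):
    # Strategy A: stake the full $10 once; you reach $20 iff that single bet wins.
    return sum(random.random() < P_WIN for _ in range(trials)) / trials

def timid_play(trials=100_000):
    # Strategy B: bet $1 at a time until the bankroll hits $0 or $20.
    wins = 0
    for _ in range(trials):
        bankroll = 10
        while 0 < bankroll < 20:
            bankroll += 1 if random.random() < P_WIN else -1
        wins += (bankroll == 20)
    return wins / trials

print("A (bet $10 at once):", bold_play())    # close to 18/38, about 0.474
print("B (bet $1 at a time):", timid_play())  # noticeably smaller, around 0.26
```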
Example 2

- You are tossing a fair coin, i.e. P(Heads) = P(Tails) = 1/2, and counting the # of tosses till one of two patterns occurs
  - Pattern 1 = (H,H) & Pattern 2 = (H,T)
- Which pattern appears first on average?
  A. Pattern 1
  B. Pattern 2
  C. Both are equally likely
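A quick simulation sketch (illustrative Python, not from the slides) comparing the average number of tosses until each pattern first appears:

```python
import random

def mean_wait(pattern, trials=100_000):
    # Average number of fair-coin tosses until `pattern` first shows up.
    total = 0
    for _ in range(trials):
        tosses = ""
        while not tosses.endswith(pattern):
            tosses += random.choice("HT")
        total += len(tosses)
    return total / trials

print("average # of tosses until HH:", mean_wait("HH"))  # around 6
print("average # of tosses until HT:", mean_wait("HT"))  # around 4
```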
Example 3

- A type of bacterium reproduces in the following way:
  - With prob. 1/2 it splits into 2 identical copies
  - With prob. 1/2 it dies before dividing
- If you place 10 such bacteria on a Petri dish, what happens to their (long-run) population?
  A. It will certainly survive indefinitely
  B. It will certainly die out eventually
  C. It can either survive or die (w/ some prob's)
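A simulation sketch of this population (illustrative Python, not from the slides); this is a branching process, and the run is capped at a finite number of generations since extinction can take a long time in this borderline case:

```python
import random

def dies_out(n0=10, max_gen=1_000):
    # One run of the population: each bacterium independently leaves
    # 2 offspring (prob 1/2) or 0 offspring (prob 1/2) per generation.
    pop = n0
    for _ in range(max_gen):
        if pop == 0:
            return True
        pop = sum(2 * (random.random() < 0.5) for _ in range(pop))
    return False  # still alive after max_gen generations

trials = 1_000
extinct = sum(dies_out() for _ in range(trials))
print("fraction extinct within 1,000 generations:", extinct / trials)  # close to 1
```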
Example 4

- Consider a service queue (e.g. airport security)
  - People arrive at rate λ, and
  - People get served at rate μ (λ < μ)
- If rate λ doubles, how should rate μ change so that the mean time a person stays in the system (wait + service time) stays the same?
  A. μ should double
  B. μ should less than double
  C. μ should more than double
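A numeric sketch (illustrative Python, not from the slides), assuming the standard M/M/1 queue result, covered later in the course, that the mean time in the system is W = 1/(μ − λ):

```python
# Illustrative rates with lam < mu; W = 1 / (mu - lam) is the assumed mean time in system.
lam, mu = 1.0, 1.5
W = 1 / (mu - lam)

lam2 = 2 * lam               # arrival rate doubles
mu2 = lam2 + (mu - lam)      # keep mu - lam (hence W) unchanged
print("old W:", W, "new W:", 1 / (mu2 - lam2))   # identical
print("mu grew by a factor of", mu2 / mu)        # 2.5 / 1.5 = 1.67, i.e. less than double
```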
Stochastic Processes

- How to analyze a collection of RV's, e.g. {Xt}t=1,2,... with joint pdf f_{1,2,...}(x1, x2, ...)?
  - If RV's are independent, work with marginals (as in Stats):
      f_{1,2,...}(x1, x2, ...) = f_1(x1) · f_2(x2) · ...
  - If RV's are dependent, work with conditionals:
      f_{1,2,...}(x1, x2, ...) = f_{1|2,...}(x1 | x2, ...) · f_{2|3,...}(x2 | x3, ...) · ...
- Stochastic Processes mostly deal with various "types" of conditional dependence
  - But first, need to brush up our Probability Theory
Experiments & Events

- Experiment: process with random result
- Outcome: elementary result of experiment
- Sample Space (S): set of all outcomes
- Event: an arbitrary collection of outcomes
  - Events are subsets of S, denoted by capital letters
  - E.g. Rolling a 6-sided die, E = {even roll} = {2, 4, 6}
- Venn Diagram: [sample space S containing the outcomes 1-6, with event E circling 2, 4, 6]
Combining Events

- Union: A ∪ B = {A or B}
    ∪_{i=1}^n Ai = A1 ∪ A2 ∪ ... ∪ An
- Intersection: A ∩ B = AB = {A and B}
    ∩_{i=1}^n Ai = A1 ∩ A2 ∩ ... ∩ An
- Complement: A^c = {not A}
- [Venn diagrams shading A ∪ B, A ∩ B, and A^c]
De Morgan's Laws

- (A ∪ B)^c = A^c ∩ B^c
- (A ∩ B)^c = A^c ∪ B^c
- More generally:
    (∪_{i=1}^n Ai)^c = A1^c ∩ A2^c ∩ ... ∩ An^c
    (∩_{i=1}^n Ai)^c = A1^c ∪ A2^c ∪ ... ∪ An^c
- [Venn diagrams shading (A ∪ B)^c and (A ∩ B)^c]
Probabilities

- Consider an experiment with sample space S. A probability (measure) is a function P(·) that assigns numbers P(A) to events A ⊂ S, so that:
  1. P(A) ≥ 0
  2. P(S) = 1
  3. If A1, A2, A3, ... are mutually exclusive events, then P(∪_{i=1}^∞ Ai) = Σ_{i=1}^∞ P(Ai)
- Events A1, A2, ... are mutually exclusive if Ai ∩ Aj = ∅ for all i ≠ j
Conditional Probability & Independence

- Conditional Probability: P(A|B) is the probability of event A given that event B has occurred
    P(A|B) = P(A ∩ B) / P(B), for P(B) > 0
- Independence: Events A, B are independent if
    P(A ∩ B) = P(A) P(B), equivalently P(A|B) = P(A) and P(B|A) = P(B)
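A quick sketch with illustrative events (not from the slides): one roll of a fair die, with A = {even roll} and B = {roll ≤ 4}:

```python
from fractions import Fraction

S = set(range(1, 7))   # sample space of one fair die roll
A = {2, 4, 6}          # even roll
B = {1, 2, 3, 4}       # roll at most 4

def P(event):
    return Fraction(len(event & S), len(S))

print(P(A & B) / P(B) == P(A))    # P(A|B) = P(A): True, so A and B are independent
print(P(A & B) == P(A) * P(B))    # equivalent product condition: also True
```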
Mutual Independence

- Generalization to n ≥ 2 events: a finite collection of events {A1, A2, ..., An} is called (mutually) independent if for any sub-collection {Ak1, Ak2, ..., Akm}:
    P(∩_{i=1}^m Aki) = Π_{i=1}^m P(Aki)
- Pairwise indep. does not imply mutual indep.:
    P(Ai ∩ Aj) = P(Ai) P(Aj) for all i ≠ j does NOT imply that {A1, A2, ..., An} are mutually independent
Example

- Consider flipping two fair coins, so S = {(H,H), (H,T), (T,H), (T,T)}, & define events
    A = {(H,H), (H,T)}, B = {(H,H), (T,H)}, C = {(H,H), (T,T)}
- Are A, B, and C pairwise independent?
- Are A, B, and C mutually independent?
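A small enumeration sketch of this example (illustrative Python): the three pairwise product conditions hold, while the three-way condition fails.

```python
from fractions import Fraction
from itertools import product

S = set(product("HT", repeat=2))     # {(H,H), (H,T), (T,H), (T,T)}
A = {("H", "H"), ("H", "T")}
B = {("H", "H"), ("T", "H")}
C = {("H", "H"), ("T", "T")}

def P(event):
    return Fraction(len(event), len(S))

pairs = [(A, B), (A, C), (B, C)]
print(all(P(X & Y) == P(X) * P(Y) for X, Y in pairs))   # True: pairwise independent
print(P(A & B & C) == P(A) * P(B) * P(C))               # False: 1/4 vs 1/8, not mutual
```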
Rules of Probability

- Complement Rule: P(A^c) = 1 - P(A)
- Addition Rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
  - If A, B mutually exclusive, then P(A ∪ B) = P(A) + P(B)
- Multiplication Rule: P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A)
  - If A, B independent, then P(A ∩ B) = P(A) P(B)
- [Venn diagrams: overlapping events A, B; disjoint events A, B]
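A sketch with illustrative events (not from the slides), verifying the addition and multiplication rules by enumerating two fair dice, with A = {first die shows 6} and B = {second die shows 6}:

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))   # all 36 outcomes of two dice
A = {s for s in S if s[0] == 6}
B = {s for s in S if s[1] == 6}

def P(event):
    return Fraction(len(event), len(S))

print(P(A | B) == P(A) + P(B) - P(A & B))   # addition rule: 11/36 on both sides
print(P(A & B) == P(A) * P(B))              # A, B independent: 1/36 = (1/6)(1/6)
```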
Rules of Probability

- Generalizations for n ≥ 2 events: for any finite collection of events {A1, A2, ..., An}
- Addition Rule:
    P(∪_{i=1}^n Ai) = Σ_i P(Ai) - Σ_{i<j} P(Ai ∩ Aj) + Σ_{i<j<k} P(Ai ∩ Aj ∩ Ak) - ... + (-1)^{n-1} P(∩_{i=1}^n Ai)
- Multiplication Rule:
    P(∩_{i=1}^n Ai) = P(A1) · P(A2 | A1) · P(A3 | A1 ∩ A2) · ... · P(An | A1 ∩ A2 ∩ ... ∩ A_{n-1})

Law of Total Probability

- A partition of S is a finite set of events {B1, B2, ..., Bn} such that:
    Bi ∩ Bj = ∅ for i ≠ j, and ∪_{i=1}^n Bi = S
- For any event A and partition {B1, B2, ..., Bn}:
    P(A) = Σ_{i=1}^n P(A ∩ Bi) = Σ_{i=1}^n P(Bi) P(A|Bi)
  - From the addition rule, since:
      A = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bn), and (A ∩ Bi) ∩ (A ∩ Bj) = ∅ for i ≠ j
- [Venn diagram: S partitioned into B1, B2, B3, with event A cutting across the pieces]
Bayes' Formula

- Let {B1, B2, ..., Bn} be a partition of S such that P(Bi) > 0, for i = 1, 2, ..., n. Then, for any event A:
    P(Bj|A) = P(Bj) P(A|Bj) / Σ_{i=1}^n P(Bi) P(A|Bi)
- For n = 2:
    P(B|A) = P(B) P(A|B) / [ P(B) P(A|B) + P(B^c) P(A|B^c) ]
- Method for revising event Bj's probability, given information on the occurrence of another event A
  - Know: P(Bj) prior probability, and P(A|Bi), i = 1, ..., n
  - Want: P(Bj|A) posterior probability
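A numeric sketch of the n = 2 case with made-up illustrative numbers (B = {person has a condition}, A = {diagnostic test is positive}):

```python
prior = 0.01               # P(B)
p_pos_given_B = 0.95       # P(A | B)
p_pos_given_notB = 0.05    # P(A | B^c)

# denominator P(A) via the law of total probability
p_A = prior * p_pos_given_B + (1 - prior) * p_pos_given_notB

posterior = prior * p_pos_given_B / p_A    # Bayes' formula: P(B | A)
print(round(posterior, 4))                 # about 0.161: the prior 0.01 is revised upward
```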
Counting Rules

- Permutation Rule: number of permutations of r objects, selected w/o repeats from n objects:
    P^n_r = n · (n-1) · ... · (n-r+1) = n! / (n-r)!, where 0 ≤ r ≤ n
- Combination Rule: number of combinations of r objects, selected w/o repeats from n objects:
    C^n_r = (n choose r) = P^n_r / r! = n! / [r! (n-r)!], where 0 ≤ r ≤ n
- Binomial Theorem:
    (x + y)^n = Σ_{i=0}^n (n choose i) x^i y^{n-i}
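A sketch checking these formulas against Python's standard library (math.perm and math.comb, available in Python 3.8+):

```python
import math

n, r = 10, 3
print(math.perm(n, r) == math.factorial(n) // math.factorial(n - r))   # True (720)
print(math.comb(n, r) == math.perm(n, r) // math.factorial(r))         # True (120)

# Binomial theorem, checked numerically at x = 2, y = 3
x, y = 2, 3
lhs = (x + y) ** n
rhs = sum(math.comb(n, i) * x**i * y**(n - i) for i in range(n + 1))
print(lhs == rhs)                                                      # True
```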
Example

- (Matching Problem: §1-Q32) At a party, n people get drunk & on their way out they grab a coat at random. What is the probability that nobody got their own coat?
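The worked solution is not included in the transcript. The standard inclusion-exclusion answer is P(no match) = Σ_{k=0}^{n} (-1)^k / k!, which approaches 1/e; a quick simulation sketch (illustrative Python) agrees:

```python
import math
import random

def p_no_match_exact(n):
    # inclusion-exclusion over the events {person i grabs their own coat}
    return sum((-1) ** k / math.factorial(k) for k in range(n + 1))

def p_no_match_sim(n, trials=100_000):
    count = 0
    for _ in range(trials):
        coats = list(range(n))
        random.shuffle(coats)                         # coat grabbed by each person
        count += all(coats[i] != i for i in range(n))
    return count / trials

n = 10
print(p_no_match_exact(n), p_no_match_sim(n), 1 / math.e)   # all about 0.3679
```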
Example

- n points are randomly drawn on a circle. What is the probability that all points lie in a semi-circle?
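The worked solution is not included in the transcript; the known answer is n / 2^(n-1). A simulation sketch (illustrative Python), using the fact that all n points lie in some semicircle exactly when the largest gap between consecutive points (as angles) is at least π:

```python
import math
import random

def p_semicircle(n, trials=100_000):
    count = 0
    for _ in range(trials):
        angles = sorted(random.uniform(0, 2 * math.pi) for _ in range(n))
        gaps = [b - a for a, b in zip(angles, angles[1:])]
        gaps.append(2 * math.pi - angles[-1] + angles[0])   # wrap-around gap
        count += max(gaps) >= math.pi
    return count / trials

n = 4
print(p_semicircle(n), n / 2 ** (n - 1))   # both about 0.5 for n = 4
```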