Dr. Wissam Fawaz

Random variables (r.v.)

Random variable
- Definition: a variable that takes on various values in a random way
- Examples
  - Number of items to check out at a supermarket
    - Discrete (takes on a countable number of values)
  - End-to-end delay in a communication network
    - Continuous (range: [propagation delay, +infinity))
- Described through
  - Probability mass function (pmf)
  - Probability density function (pdf)

Probability distribution functions

The behavior of
- a discrete random variable
  - is captured by a probability mass function
  - Example: # of items to check out at a supermarket (X)
    - P[X = i], i = 1, 2, ..., 200 (a certain upper bound)
- a continuous random variable
  - is captured by a probability density function
  - Example: end-to-end delay in a communication network (D)
    - f_D(d)

Probability theory

Experiments
- Example 1: toss a coin 3 times; each toss comes up H or T, each with probability 1/2
- [Tree diagram: the three tosses branch into the 8 equally likely outcomes of the sample space]
- Sample space: {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
- Possible r.v.: X = # of heads (a discrete r.v.)

Example 1 continued

X maps the sample space onto {0, 1, 2, 3}
- What is the probability distribution of X?
- Building on the following observation
  - P(X = i) = (# of feasible sample points) / (total # of sample points)
- P(X = 0) = 1/8; P(X = 1) = 3/8
- P(X = 2) = 3/8; P(X = 3) = 1/8
- Check: \sum_i P(X = i) = 1

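A minimal sketch (not part of the slides) that reproduces this pmf by enumerating the 8 sample points with Python's standard library:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space of 3 fair coin tosses: 2**3 = 8 equally likely outcomes.
sample_space = list(product("HT", repeat=3))

# X = number of heads in each outcome.
counts = Counter(outcome.count("H") for outcome in sample_space)

# P(X = i) = (# of feasible sample points) / (total # of sample points)
pmf = {i: Fraction(n, len(sample_space)) for i, n in sorted(counts.items())}
for i, p in pmf.items():
    print(f"P(X = {i}) = {p}")   # 1/8, 3/8, 3/8, 1/8
print(sum(pmf.values()))         # 1, as required of any pmf
```
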
Example 2: cost functions

Cost function
- X = cost_function(sample point)
- Example (the 3 coin tosses of Example 1)
  - Each heads earns you $10
  - Each tails costs you $5
  - X = amount of money you make
  - => X takes the values {-15, 0, 15, 30} (discrete r.v.)

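A small sketch (not from the slides) of this cost function on the 3-toss sample space, assuming $10 earned per head and $5 paid per tail:

```python
from itertools import product

# Cost function on the sample space of 3 tosses: +$10 per head, -$5 per tail.
def cost(outcome):
    heads = outcome.count("H")
    tails = len(outcome) - heads
    return 10 * heads - 5 * tails

values = sorted({cost(o) for o in product("HT", repeat=3)})
print(values)  # [-15, 0, 15, 30]
```
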
Example 3: continuous r.v.

End-to-end delay in a communication network (X)
- Continuous random variable
- Belongs to a state space
  - Lower bound = propagation delay
  - Upper bound = +infinity

Discrete random variables: probability mass functions

Discrete random variables
- Classified according to their probability mass function
- I will cover
  - Binomial distribution
  - Geometric distribution
  - Poisson distribution

Binomial distribution

The binomial distribution
- is primarily associated with the tossing of a coin
- A certain number of independent trials
  - Outcome #1 (or "1") with probability p (referred to as success)
  - Outcome #2 (or "2") with probability 1-p (referred to as failure)
- Example: n trials (n = 6)
  - What is the probability that the following sequence arises?
    - 1, 2, 1, 2, 1, 2
  - Answer: Prob = p(1-p)p(1-p)p(1-p) = p^3 (1-p)^3

Binomial random variable

Suppose
- n independent trials, each resulting
  - in a "success" with probability p
  - and in a "failure" with probability (1-p)
- If X represents the number of successes in the n trials
  - => X is a binomial random variable with parameters (n, p)

Binomial distribution: example 1

Example
- Four fair coins are flipped. What is the probability that two heads and two tails are obtained?
Solution
- Let X equal the number of heads (successes)
- => X is a binomial r.v. with parameters (n = 4, p = 1/2)
- P(X = 2) = \binom{4}{2} \left(\frac{1}{2}\right)^2 \left(\frac{1}{2}\right)^2 = \frac{3}{8}

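A quick numerical check of this binomial probability (not part of the slides), using only Python's standard library:

```python
from math import comb

# P(X = i) for a binomial r.v. with parameters (n, p).
def binomial_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p)**(n - i)

print(binomial_pmf(2, n=4, p=0.5))  # 0.375 = 3/8
```
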
Binomial distribution: example 2

Example
- It is known that any item produced by a certain machine will be defective with probability 0.1, independent of any other item.
- What is the probability that, in a sample of three items, at most one will be defective?

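The slide leaves the computation to the reader; a sketch of it (not from the slides), letting X be the number of defective items, so that X is binomial with (n = 3, p = 0.1):

```python
from math import comb

# P(X <= 1) = P(X = 0) + P(X = 1) for X ~ Binomial(n = 3, p = 0.1).
n, p = 3, 0.1
prob = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(2))
print(prob)  # 0.9**3 + 3 * 0.1 * 0.9**2 = 0.972
```
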
Geometric random variable

Experiment
- Independent trials, each having probability p of being a success,
- are performed until a success occurs
- If X is the number of trials required until the first success
  - X is a geometric r.v. with parameter p
- Its probability mass function is
  - P(X = n) = (1-p)^{n-1} p,  n = 1, 2, ...

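A minimal sketch (not part of the slides) of this pmf, checking that the probabilities decay geometrically and sum to (roughly) 1:

```python
# P(X = n) = (1 - p)**(n - 1) * p for a geometric r.v. with parameter p.
def geometric_pmf(n, p):
    return (1 - p)**(n - 1) * p

p = 0.5
print([geometric_pmf(n, p) for n in range(1, 5)])      # [0.5, 0.25, 0.125, 0.0625]
print(sum(geometric_pmf(n, p) for n in range(1, 60)))  # ~1.0
```
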
Geometric r.v.: application

Time sharing
- Jobs running on a computer
  - A quantum of time is assigned to each process
  - X: represents how many times a job cycles around (gets queued) in order to use the CPU
  - => X is a geometric r.v.
Another setting: 6 tosses of a coin
- The first outcome is Heads
- How many more heads do I need before I get a tail?

Poisson distribution

The Poisson distribution is
- Associated with the observation of event occurrences
- [Timeline: event occurrences observed over an interval of T = 15 min]
- If N represents the number of events in T
  - => N/T = average number of events per minute
- Interested in answering the following question
  - How many occurrences of this event take place per minute?
- The way it has been done
  - Either 0 or 1 event occurrence per minute

Poisson distributed random variable

A Poisson random variable X
- Characterizes the number of occurrences of an event
  - Typically an arrival => X = # of arrivals per unit time
- With parameter λ (average # of arrivals per unit time)
  - P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}
- The value of λ (arrival rate)
  - \lambda = \lim_{T \to \infty} \frac{N}{T}, where N = # of arrivals observed during T

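A short sketch (not part of the slides) of this pmf; summed over all i it should give 1:

```python
from math import exp, factorial

# P(X = i) = e**(-lam) * lam**i / i! for a Poisson r.v. with rate lam.
def poisson_pmf(i, lam):
    return exp(-lam) * lam**i / factorial(i)

lam = 3.0
print(poisson_pmf(0, lam))                          # e**-3, about 0.05
print(sum(poisson_pmf(i, lam) for i in range(50)))  # ~1.0
```
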
Poisson distribution: example 1

Example
- If the number of accidents occurring on a highway per day is a Poisson r.v. with parameter λ = 3,
- what is the probability that no accidents occur today?
Solution
- P(X = 0) = e^{-3} \approx 0.05

Poisson distribution: example 2

Consider an experiment that
- counts the number of α-particles emitted in a one-second interval by one gram of radioactive material.
- If we know that, on average, 3.2 such α-particles are given off,
- what is a good approximation to the probability that no more than 2 α-particles appear?
- P\{X \le 2\} = e^{-3.2} + 3.2\, e^{-3.2} + \frac{(3.2)^2}{2} e^{-3.2} \approx 0.38

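A quick check of this sum (not part of the slides):

```python
from math import exp, factorial

# P(X <= 2) for a Poisson r.v. with lam = 3.2.
lam = 3.2
prob = sum(exp(-lam) * lam**i / factorial(i) for i in range(3))
print(prob)  # about 0.380
```
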
Binomial approximation to the Poisson distribution

[Timeline: N events observed over the interval [0, T], divided into slots of width ΔT]
- Divide the time axis into slots of width ΔT
  - small enough so that at most one arrival can occur per slot
  - n slots of width ΔT are required
- It is like creating a binomial experiment
  - Each ΔT is a trial
  - Outcome: 0 arrivals (p?) or 1 arrival ((1-p)?)

Binomial approximation to the Poisson distribution (cont'd)

With what probability are we going to have
- 1 arrival or 0 arrivals in one ΔT?
- Average arrival rate per ΔT interval:
  - 0 \cdot p + 1 \cdot (1-p) = \frac{N \cdot \Delta T}{T}
- As such
  - Pr(0 arrivals) = 1 - \frac{N \cdot \Delta T}{T}
  - Pr(1 arrival) = \frac{N \cdot \Delta T}{T}
  - Pr(X = i) = \binom{n}{i} \left(\frac{\Delta T \cdot N}{T}\right)^i \left(1 - \frac{\Delta T \cdot N}{T}\right)^{n-i}

Binomial approximation to the Poisson distribution (cont'd)

Pr(X = i) = \binom{n}{i} \left(\frac{\Delta T \cdot N}{T}\right)^i \left(1 - \frac{\Delta T \cdot N}{T}\right)^{n-i}, with \Delta T = \frac{T}{n} and \lambda = \frac{N}{T}

If you let n tend to infinity, you get

Pr(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}

(taking the observation interval as the unit of time, so that λ is the mean number of arrivals in it)

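A numerical illustration (not part of the slides) of this limit: the binomial pmf with n slots and per-slot success probability λ/n approaches the Poisson pmf as n grows, assuming a unit observation interval so the mean count is λ:

```python
from math import comb, exp, factorial

lam, i = 3.0, 2  # mean number of arrivals, and the count whose probability we compare

def binom_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p)**(n - i)

for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(i, n, lam / n))    # approaches the Poisson value below
print(exp(-lam) * lam**i / factorial(i))  # e**-3 * 9 / 2, about 0.224
```
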
Cumulative distribution

Consider a discrete r.v. X
- Taking on values from 0 to infinity
- The cumulative distribution function can be expressed as
  - F(j) = P(X <= j) = P(X = 0) + ... + P(X = j-1) + P(X = j)
- For instance
  - Suppose X has a probability mass function given by
    - P(1) = 1/2, P(2) = 1/3, P(3) = 1/6
  - The cumulative distribution function F of X is given by
    - F(j) = \begin{cases} 0, & j < 1 \\ \frac{1}{2}, & 1 \le j < 2 \\ \frac{5}{6}, & 2 \le j < 3 \\ 1, & 3 \le j \end{cases}

Residual distribution

Given by
- P(X > j) = \sum_{i > j} P(X = i)
- Since \sum_{i \ge 0} P(X = i) = 1, this is 1 - P(X \le j) = 1 - F(j)

Expectation of a discrete random variable

X is a discrete random variable
- Having a probability mass function p(x)
- => The expected value of X, E[X], is defined as
  - E[X] = \sum_x x \cdot p(x)
- E[aX + b] = a E[X] + b
  - a and b are constants
- Variance of X
  - Var(X) = E[X^2] - (E[X])^2

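A small sketch (not part of the slides) of these definitions, computing E[X], E[aX + b] and Var(X) directly from a pmf stored as a dictionary:

```python
from fractions import Fraction

def expectation(pmf, g=lambda x: x):
    # E[g(X)] = sum over x of g(x) * p(x)
    return sum(g(x) * p for x, p in pmf.items())

# Example pmf: X = # of heads in 3 fair coin tosses (Example 1 above).
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

E = expectation(pmf)
print(E)                                        # E[X] = 3/2
print(expectation(pmf, lambda x: 2 * x + 1))    # E[2X + 1] = 2*E[X] + 1 = 4
print(expectation(pmf, lambda x: x**2) - E**2)  # Var(X) = 3 - 9/4 = 3/4
```
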
Expectation: example 1

Find E[X]
- where X is the outcome when we roll a fair die
- E[X] = 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} + 6 \cdot \frac{1}{6} = \frac{7}{2}

Find Var(X)
- where X represents the outcome when we roll a fair die
- E[X^2] = 1 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 9 \cdot \frac{1}{6} + 16 \cdot \frac{1}{6} + 25 \cdot \frac{1}{6} + 36 \cdot \frac{1}{6} = \frac{91}{6}
- => Var(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{35}{12}

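A quick exact check of these two values (not part of the slides), using Fraction so no rounding is involved:

```python
from fractions import Fraction

# Fair die: each face 1..6 has probability 1/6.
faces = range(1, 7)
E = sum(Fraction(x, 6) for x in faces)
E2 = sum(Fraction(x * x, 6) for x in faces)

print(E)          # 7/2
print(E2)         # 91/6
print(E2 - E**2)  # Var(X) = 35/12
```
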
Expectation: example 2

Calculate E[X] when X is binomially distributed
- with parameters n and p

E[X] = \sum_{i=0}^{n} i\, p(i) = \sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i}
     = \sum_{i=1}^{n} \frac{i\, n!}{(n-i)!\, i!}\, p^i (1-p)^{n-i}
     = \sum_{i=1}^{n} \frac{n!}{(n-i)!\, (i-1)!}\, p^i (1-p)^{n-i}
     = np \sum_{i=1}^{n} \frac{(n-1)!}{(n-i)!\, (i-1)!}\, p^{i-1} (1-p)^{n-i}
     = np \sum_{k=0}^{n-1} \binom{n-1}{k} p^k (1-p)^{n-1-k}
     = np\, [p + (1-p)]^{n-1} = np

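A numerical spot-check of E[X] = np (not part of the slides):

```python
from math import comb

n, p = 10, 0.3
mean = sum(i * comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1))
print(mean, n * p)  # both 3.0 (up to floating-point rounding)
```
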
Expectation: example 3

Find E[X]
- of a geometric random variable X with parameter p

E[X] = \sum_{n \ge 1} n\, p\, (1-p)^{n-1} = p \sum_{n \ge 1} n\, q^{n-1}   (with q = 1-p)

=> E[X] = p \sum_{n \ge 1} \frac{d}{dq} q^n = p\, \frac{d}{dq}\left( \sum_{n \ge 1} q^n \right)
        = p\, \frac{d}{dq}\left( \frac{q}{1-q} \right) = \frac{p}{(1-q)^2} = \frac{1}{p}

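A simulation sketch (not part of the slides) matching the definition of X: count trials until the first success and compare the sample mean with 1/p:

```python
import random

def trials_until_success(p):
    # Repeat independent trials with success probability p; return the trial count.
    n = 1
    while random.random() >= p:  # failure with probability 1 - p
        n += 1
    return n

p = 0.25
samples = [trials_until_success(p) for _ in range(100_000)]
print(sum(samples) / len(samples), 1 / p)  # empirical mean close to 4.0
```
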
Expectation: example 4

Calculate E[X]
- for a Poisson random variable X with parameter λ

E[X] = \sum_{i \ge 0} \frac{i\, e^{-\lambda} \lambda^i}{i!} = \sum_{i \ge 1} \frac{e^{-\lambda} \lambda^i}{(i-1)!}
     = \lambda e^{-\lambda} \sum_{i \ge 1} \frac{\lambda^{i-1}}{(i-1)!} = \lambda e^{-\lambda} \sum_{k \ge 0} \frac{\lambda^k}{k!} = \lambda e^{-\lambda} e^{\lambda} = \lambda,

since \sum_{k \ge 0} \lambda^k / k! = e^{\lambda}.

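A truncated-sum check of E[X] = λ (not part of the slides):

```python
from math import exp, factorial

lam = 3.0
mean = sum(i * exp(-lam) * lam**i / factorial(i) for i in range(60))
print(mean, lam)  # both approximately 3.0
```
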