In the Name of the Most High
Performance Evaluation of Computer Systems
Introduction to Probabilities: Discrete Random Variables
By
Behzad Akbari
Tarbiat Modares University
Spring 2009
These slides are based on the slides of Prof. K.S. Trivedi (Duke University)
1
Random Variables

The sample space is often too large to deal with directly. Recall that in a sequence of n Bernoulli trials, if we do not need the detailed information about the actual pattern of 0's and 1's but only the number of 0's and 1's, we are able to reduce the sample space from size 2^n to size (n+1).

Such abstractions lead to the notion of a random variable.
2
Discrete Random Variables

A random variable (rv) X is a mapping (function) from the sample space S to the set of real numbers: X: S → R.

If image(X) is finite or countably infinite, X is a discrete rv.

The inverse image of a real number x is the set of all sample points that are mapped by X into x:

    X^{-1}(x) = {s ∈ S | X(s) = x}

It is easy to see that the inverse images of distinct values are disjoint and together cover S:

    X^{-1}(x1) ∩ X^{-1}(x2) = ∅ for x1 ≠ x2,  and  ∪_x X^{-1}(x) = S
3
Probability Mass Function (pmf)

A_x: the set of all sample points s such that X(s) = x:

    A_x = {s ∈ S | X(s) = x}

pmf:  p_X(x) = P(A_x) = P(X = x)
4
pmf Properties

1. 0 ≤ p_X(x) ≤ 1 for all x
2. p_X(x) = 0 for all but a finite or countably infinite set of values
3. Σ_x p_X(x) = 1

Since a discrete rv X takes a finite or countably infinite set of values x1, x2, …, the last property above can be restated as:

    Σ_i p_X(x_i) = 1
5
Distribution Function

pmf: defined for a specific rv value, i.e., p_X(x) = P(X = x).
Probability of a set A:  P(X ∈ A) = Σ_{x ∈ A} p_X(x)

Cumulative Distribution Function (CDF):

    F_X(x) = P(X ≤ x) = Σ_{t ≤ x} p_X(t)
6
Distribution Function properties

1. 0 ≤ F(x) ≤ 1
2. F is monotone non-decreasing: x1 < x2 implies F(x1) ≤ F(x2)
3. lim_{x → -∞} F(x) = 0 and lim_{x → +∞} F(x) = 1
4. F is right-continuous, with a jump of size p_X(x) at each x where p_X(x) > 0
7
Discrete Random Variables

Equivalence: probability mass function = discrete density function
(consider an integer-valued random variable)

pmf:   p_k = P(X = k)

CDF:   F(x) = Σ_{k=0}^{⌊x⌋} p_k

       p_k = F(k) - F(k-1)
8
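The pmf/CDF relation above can be checked numerically. The following is a minimal Python sketch (not from the slides) using the binomial as a concrete integer-valued rv; the function names are illustrative.

```python
import math

def binom_pmf(k, n, p):
    # p_k for a binomial rv, used here as a concrete integer-valued example
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def cdf(x, n, p):
    # F(x) = sum_{k=0}^{floor(x)} p_k
    return sum(binom_pmf(k, n, p) for k in range(0, min(math.floor(x), n) + 1))

n, p = 10, 0.3
# p_k = F(k) - F(k-1) recovers the pmf from the CDF
for k in range(n + 1):
    assert abs(binom_pmf(k, n, p) - (cdf(k, n, p) - cdf(k - 1, n, p))) < 1e-12
```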
Common discrete random variables

Constant
Uniform
Bernoulli
Binomial
Geometric
Poisson
9
Constant Random Variable

pmf:  p_X(c) = 1
CDF:  F(x) = 0 for x < c;  1 for x ≥ c

[Figure: pmf places a single mass of 1.0 at x = c; the CDF steps from 0 to 1.0 at x = c]
10
Discrete Uniform Distribution

Discrete rv X that assumes n discrete values, each with equal probability 1/n.

Discrete uniform pmf:  p_X(x_i) = 1/n,  i = 1, 2, …, n

Discrete uniform distribution function (for the values 1, 2, …, n):

    F(x) = 0 for x < 1;  ⌊x⌋/n for 1 ≤ x < n;  1 for x ≥ n
11
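A minimal Python sketch of the discrete uniform pmf and CDF (not from the slides; the fair-die choice of n = 6 and the function names are illustrative):

```python
import math

def uniform_pmf(i, n):
    # p_X(i) = 1/n for i = 1, ..., n (e.g. a fair n-sided die)
    return 1.0 / n if 1 <= i <= n else 0.0

def uniform_cdf(x, n):
    # F(x) = floor(x)/n, clipped to [0, 1]
    return min(max(math.floor(x) / n, 0.0), 1.0)

n = 6
assert abs(sum(uniform_pmf(i, n) for i in range(1, n + 1)) - 1.0) < 1e-12
assert uniform_cdf(3.5, n) == 0.5
assert uniform_cdf(0.2, n) == 0.0 and uniform_cdf(7, n) == 1.0
```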
Bernoulli Random Variable
RV
generated by a single Bernoulli trial that has a
binary valued outcome {0,1}
Such a binary valued Random variable X is called the
indicator or Bernoulli random variable so that
 Probability
mass function:
p  P( X  1)
q  1  p  P( X  0)
12
Bernoulli Distribution

CDF (p + q = 1):

    F(x) = 0 for x < 0;  q for 0 ≤ x < 1;  1 for x ≥ 1

[Figure: CDF steps from 0 to q at x = 0, and from q to 1.0 at x = 1]
13
Binomial Random Variable

Binomial rv: a fixed number n of Bernoulli trials (BTs).
RV Y_n: number of successes in n BTs.

Binomial pmf:

    b(k; n, p) = C(n, k) p^k (1-p)^{n-k},  k = 0, 1, …, n

Binomial CDF:

    B(k; n, p) = Σ_{j=0}^{k} b(j; n, p)
14
Binomial Random Variable
In
fact, the number of successes in n Bernoulli trials
can be seen as the sum of the number of successes in
each trial:
Y n  X 1 X 2  ...  X n
where Xi ’s are independent identically distributed
Bernoulli random variables.
15
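The sum-of-Bernoullis view can be checked by simulation. A minimal Python sketch (not from the slides; the parameter values, seed, and tolerance are illustrative): simulating Y_n = X_1 + … + X_n many times and comparing empirical frequencies to the closed-form binomial pmf.

```python
import math
import random

random.seed(1)
n, p, reps = 5, 0.4, 50000

def bernoulli_sum(n, p):
    # Y_n = X_1 + ... + X_n, each X_i an independent Bernoulli(p) trial
    return sum(1 for _ in range(n) if random.random() < p)

counts = [0] * (n + 1)
for _ in range(reps):
    counts[bernoulli_sum(n, p)] += 1

# empirical frequencies should approach the binomial pmf b(k; n, p)
for k in range(n + 1):
    exact = math.comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(counts[k] / reps - exact) < 0.015
```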
Binomial Random Variable: pmf

[Figure: binomial pmf p_k plotted against k]
16
Binomial Random Variable: CDF

[Figure: binomial CDF plotted for x = 0 to 10; the step function rises from 0 to 1]
17
Applications of the binomial

Reliability of a k-out-of-n system (each component has reliability R):

    R_{k-of-n} = 1 - B(k-1; n, R) = Σ_{j=k}^{n} b(j; n, R) = Σ_{j=k}^{n} C(n, j) R^j (1-R)^{n-j}

Series system (all n components must work):

    R_series = b(n; n, R) = C(n, n) R^n (1-R)^0 = R^n

Parallel system (at least one component must work):

    R_parallel = 1 - b(0; n, R) = Σ_{j=1}^{n} C(n, j) R^j (1-R)^{n-j} = 1 - (1-R)^n
18
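The three reliability formulas can be sketched directly from the binomial pmf. A minimal Python example (not from the slides; component reliability R = 0.9 and n = 5 are illustrative), checking that the series and parallel cases are the k = n and k = 1 special cases of k-out-of-n:

```python
import math

def b(j, n, R):
    # binomial pmf: probability that exactly j of n components work
    return math.comb(n, j) * R**j * (1 - R)**(n - j)

def rel_k_of_n(k, n, R):
    # system works iff at least k of the n components work
    return sum(b(j, n, R) for j in range(k, n + 1))

R, n = 0.9, 5
assert abs(rel_k_of_n(n, n, R) - R**n) < 1e-12             # series: all n must work
assert abs(rel_k_of_n(1, n, R) - (1 - (1 - R)**n)) < 1e-12  # parallel: at least one works
```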
Applications of the binomial

Transmitting an LLC frame using MAC blocks; p is the probability of correctly transmitting one block. Let p_K(k) be the pmf of the rv K, the number of LLC transmissions required to transmit all n MAC blocks correctly; then

    p_K(1) = b(n; n, p) = p^n
    p_K(2) = [1 - (1-p)^2]^n - p^n

and

    p_K(k) = [1 - (1-p)^k]^n - [1 - (1-p)^{k-1}]^n
19
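A minimal Python sketch of the LLC/MAC pmf above (not from the slides; n = 10 blocks, block success probability p = 0.8, and the truncation point are illustrative). Each block succeeds within k tries with probability 1 - (1-p)^k, so the difference of the two bracketed terms gives P(K = k):

```python
def p_K(k, n, p):
    # P(exactly k LLC transmissions needed to get all n MAC blocks through)
    # = P(every block succeeds within k tries) - P(every block succeeds within k-1 tries)
    return (1 - (1 - p)**k)**n - (1 - (1 - p)**(k - 1))**n

n, p = 10, 0.8
assert abs(p_K(1, n, p) - p**n) < 1e-12
# the pmf sums to 1 (the truncated tail beyond k = 200 is negligible here)
assert abs(sum(p_K(k, n, p) for k in range(1, 200)) - 1.0) < 1e-9
```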
Geometric Distribution

Z: the number of trials up to and including the 1st success.
In general, S may have countably infinite size.
Z has image {1, 2, 3, …}. Because of independence,

    p_Z(i) = P(Z = i) = (1-p)^{i-1} p,  i = 1, 2, 3, …
20
Geometric Distribution (contd.)

The geometric distribution is the only discrete distribution that exhibits the MEMORYLESS property: future outcomes are independent of past events.
Suppose n trials have been completed, all failures, and Y additional trials are performed before the first success, i.e. Z = n + Y, or Y = Z - n. Then

    P(Y = i | Z > n) = P(Z = n + i | Z > n) = P(Z = i)
21
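The memoryless property can be verified numerically from the pmf. A minimal Python sketch (not from the slides; p = 0.3 and n = 4 are illustrative), checking P(Z = n+i | Z > n) = P(Z = i):

```python
def geom_pmf(i, p):
    # P(Z = i): first success occurs on trial i
    return p * (1 - p)**(i - 1)

def geom_tail(n, p):
    # P(Z > n): the first n trials are all failures
    return (1 - p)**n

p, n = 0.3, 4
for i in range(1, 20):
    cond = geom_pmf(n + i, p) / geom_tail(n, p)  # P(Z = n+i | Z > n)
    assert abs(cond - geom_pmf(i, p)) < 1e-12    # memoryless: equals P(Z = i)
```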
Geometric Distribution (contd.)

Z rv: total number of trials up to and including the 1st success.
Modified geometric pmf: does not include the successful trial, i.e. Z = X + 1. Then X is a modified geometric random variable:

    p_X(i) = P(X = i) = (1-p)^i p,  i = 0, 1, 2, …
22
Applications of the geometric

The number of times the statement

    repeat S until B

is executed is geometrically distributed, assuming ….

The number of times the statement

    while B do S

is executed is modified-geometrically distributed, assuming ….
23
Negative Binomial Distribution

RV T_r: number of trials until the rth success.
Image of T_r = {r, r+1, r+2, …}. Define the events:

    A: T_r = n
    B: exactly r-1 successes in the first n-1 trials
    C: the nth trial is a success

Clearly A = B ∩ C, and since B and C are independent,

    P(T_r = n) = P(B) P(C) = C(n-1, r-1) p^{r-1} (1-p)^{n-r} · p = C(n-1, r-1) p^r (1-p)^{n-r}
24
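A minimal Python sketch of the negative binomial pmf above (not from the slides; r = 3, p = 0.5, and the truncation at 500 trials are illustrative), checking the fastest case (r straight successes) and that the pmf sums to 1:

```python
import math

def negbin_pmf(n, r, p):
    # P(T_r = n) = P(r-1 successes in n-1 trials) * P(nth trial is a success)
    return math.comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

r, p = 3, 0.5
assert abs(negbin_pmf(r, r, p) - p**r) < 1e-12    # r successes in a row
assert abs(sum(negbin_pmf(n, r, p) for n in range(r, 500)) - 1.0) < 1e-9
# r = 1 reduces to the geometric pmf p(1-p)^{n-1}
assert abs(negbin_pmf(5, 1, p) - p * (1 - p)**4) < 1e-12
```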
Poisson Random Variable

RV such as "no. of arrivals in an interval (0, t)".
In a small interval Δt, the probability of a new arrival is λΔt.
Dividing (0, t) into n such intervals gives pmf b(k; n, λt/n) and CDF B(k; n, λt/n).
What happens when n → ∞?
25
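The limit hinted at above, b(k; n, λt/n) → e^{-λt}(λt)^k / k! as n → ∞, can be seen numerically. A minimal Python sketch (not from the slides; λt = 2.0, k = 3, n = 100000, and the tolerance are illustrative):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam_t):
    return math.exp(-lam_t) * lam_t**k / math.factorial(k)

lam_t, k = 2.0, 3
# as n grows with p = lam_t/n, the binomial pmf approaches the Poisson pmf
approx = binom_pmf(k, 100000, lam_t / 100000)
assert abs(approx - poisson_pmf(k, lam_t)) < 1e-4
```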
Poisson Random Variable (contd.)

The Poisson random variable often occurs in situations such as "no. of packets (or calls) arriving in t sec." or "no. of components failing in t hours", etc.
26
Poisson Failure Model

Let N(t) be the number of (failure) events that occur in the time interval (0, t). Then a (homogeneous) Poisson model for N(t) assumes:

1. The probability mass function (pmf) of N(t) is:

    P(N(t) = k) = [(λt)^k / k!] e^{-λt},  k = 0, 1, 2, …

where λ > 0 is the expected number of event occurrences per unit time.

2. The numbers of events in two non-overlapping intervals are mutually independent.
27
Note:

For a fixed t, N(t) is a random variable (in this case a discrete random variable known as the Poisson random variable).
The family {N(t), t ≥ 0} is a stochastic process, in this case the homogeneous Poisson process.
28
Poisson Failure Model (cont.)
successive interevent times X1, X2, … in a
homogenous Poisson model, are mutually independent, and
have a common exponential distribution given by:
The
PX 1  t  1  e
lt
To
show this:
t0
P ( random
X 1  t ) variable,
 P ( N ( tN(t),
)  0with
)  ethe Poisson
the discrete
distribution, is related to the continuous random variable X1,
which has an exponential distribution
The mean interevent time is 1/l, which in this case is the
mean time to failure
 lt
Thus,
29
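The Poisson/exponential connection can be checked by simulation: generate exponential interevent times and count how many events land in (0, t). A minimal Python sketch (not from the slides; λ = 1.5, t = 2.0, the seed, sample size, and tolerance are illustrative):

```python
import random

random.seed(42)
lam, t = 1.5, 2.0

def count_events(lam, t):
    # sum exponential(lam) interevent times; count arrivals that fall in (0, t)
    total, n = 0.0, 0
    while True:
        total += random.expovariate(lam)
        if total > t:
            return n
        n += 1

samples = [count_events(lam, t) for _ in range(20000)]
mean = sum(samples) / len(samples)
# N(t) should be Poisson with mean lam * t = 3.0
assert abs(mean - lam * t) < 0.1
```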
Poisson Distribution

Probability mass function (pmf) (or discrete density function):

    p_k = P(N(t) = k) = e^{-λt} (λt)^k / k!

Distribution function (CDF):

    F(x) = Σ_{k=0}^{⌊x⌋} e^{-λt} (λt)^k / k!
30
Poisson pmf

[Figure: Poisson pmf p_k for λt = 1.0]
31
Poisson CDF

[Figure: Poisson CDF for λt = 1.0, plotted for k = 1 to 10]
32
Poisson pmf

[Figure: Poisson pmf p_k for λt = 4.0]
33
Poisson CDF

[Figure: Poisson CDF for λt = 4.0, plotted for k = 1 to 10]
34
Probability Generating Function (PGF)

Helps in dealing with operations (e.g. sums) on rv's.
Letting P(X = k) = p_k, the PGF of X is defined by

    G_X(z) = E[z^X] = Σ_{k=0}^{∞} p_k z^k

One-to-one mapping: pmf (or CDF) ↔ PGF.
See page 98 for the PGF of some common pmfs.
35
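Why the PGF helps with sums: the PGF of a sum of independent rvs is the product of their PGFs. A minimal Python sketch (not from the slides; p = 0.3, n = 4, z = 0.7, and the truncation at 200 terms are illustrative), checking that the binomial PGF equals the Bernoulli PGF raised to the nth power:

```python
import math

def pgf(pmf, z, terms=200):
    # G_X(z) = E[z^X] = sum_k p_k z^k (series truncated at `terms`)
    return sum(pmf(k) * z**k for k in range(terms))

p, n, z = 0.3, 4, 0.7
bern = lambda k: (1 - p) if k == 0 else (p if k == 1 else 0.0)
binom = lambda k: math.comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

# Bernoulli PGF is q + p z
assert abs(pgf(bern, z) - ((1 - p) + p * z)) < 1e-12
# binomial(n, p) is a sum of n iid Bernoullis, so its PGF is (q + p z)^n
assert abs(pgf(binom, z) - ((1 - p) + p * z)**n) < 1e-12
```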
Discrete Random Vectors

Examples:

    Z = X + Y (X and Y are random execution times)
    Z = min(X, Y) or Z = max(X1, X2, …, Xk)

X = (X1, X2, …, Xk) is a k-dimensional rv defined on S.
For each sample point s in S,  X(s) = (X1(s), X2(s), …, Xk(s)).
36
Discrete Random Vectors (properties)
37
Independent Discrete RVs

X and Y are independent iff the joint pmf satisfies:

    p_{X,Y}(x, y) = p_X(x) p_Y(y)  for all x, y

Mutual independence also implies:

    P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)

Pairwise independence vs. set-wide independence.
38
Discrete Convolution

Let Z = X + Y. Then, if X and Y are independent (integer-valued),

    p_Z(z) = Σ_x p_X(x) p_Y(z - x)

In general, then, the pmf of a sum of independent rvs is the convolution of their pmfs.
39
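The convolution formula above can be sketched directly. A minimal Python example (not from the slides; the two-dice example, with faces shifted to 0..5 so list indices line up with values, is illustrative):

```python
def conv_pmf(px, py):
    # p_Z(k) = sum_j p_X(j) p_Y(k - j) for Z = X + Y with X, Y independent;
    # pmfs are lists indexed by value starting at 0
    pz = [0.0] * (len(px) + len(py) - 1)
    for i, a in enumerate(px):
        for j, b in enumerate(py):
            pz[i + j] += a * b
    return pz

die = [1 / 6] * 6          # fair die on shifted values 0..5
two = conv_pmf(die, die)   # pmf of the sum of two independent dice
assert abs(sum(two) - 1.0) < 1e-12
assert abs(two[5] - 6 / 36) < 1e-12  # shifted sum 5 = a 7 on unshifted dice, 6 ways
```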