Al-Imam Mohammad Ibn Saud University
CS433 Modeling and Simulation
Lecture 04: Statistical Models
http://10.2.230.10:4040/akoubaa/cs433/
27 Oct 2008
Dr. Anis Koubâa

Goals for Today
• Understand the difference between discrete and continuous random variables.
• Review the most common statistical models.
• Understand how to determine an empirical distribution from a statistical sample.

Topics
• Discrete and continuous random variables
• Discrete probability distributions: Bernoulli, binomial, geometric, Poisson
• Continuous probability distributions: uniform, exponential, normal, Weibull, lognormal
• Empirical distributions

Discrete and Continuous Random Variables

Discrete Random Variables
X is a discrete random variable if the set of possible values of X (the sample space) is finite or countable.
Example: consider jobs arriving at a job shop, and let X be the number of jobs arriving each week.
R_X = range space of X (the set of possible values) = {0, 1, 2, ...}
p(x_i) = probability that the random variable equals x_i = P(X = x_i)
The collection of pairs (x_i, p(x_i)), i = 1, 2, ..., is called the probability distribution of X, and p(x_i) is called the probability mass function (PMF) of X.
Characteristics of the PMF: p(x_i), i = 1, 2, ..., must satisfy:
1. p(x_i) \ge 0, for all i
2. \sum_{i=1}^{\infty} p(x_i) = 1

Continuous Random Variables
X is a continuous random variable if its range space R_X is an interval or a collection of intervals.
The probability that X lies in the interval [a, b] is given by:
P(a \le X \le b) = \int_a^b f(x)\,dx
where f(x) is the probability density function (PDF).
Characteristics of the PDF: f(x) must satisfy:
1. f(x) \ge 0, for all x in R_X
2. \int_{R_X} f(x)\,dx = 1
3. f(x) = 0, if x is not in R_X
Properties:
1. P(X = x_0) = 0, because \int_{x_0}^{x_0} f(x)\,dx = 0
2. P(a \le X \le b) = P(a < X \le b) = P(a \le X < b) = P(a < X < b)

Discrete versus Continuous Random Variables
Discrete random variable:
• Finite or countable sample space, e.g. {0, 1, 2, 3}
• Probability mass function (PMF): p(x_i) = P(X = x_i), with p(x_i) \ge 0 for all i and \sum_{i=1}^{\infty} p(x_i) = 1
• CDF: P(X \le x) = \sum_{x_i \le x} p(x_i)
Continuous random variable:
• Uncountable sample space, e.g. [0, 1] or [2.1, 5.3]
• Probability density function (PDF) f(x), with f(x) \ge 0 for all x in R_X, \int_{R_X} f(x)\,dx = 1, and f(x) = 0 if x is not in R_X
• CDF: P(X \le x) = \int_{-\infty}^{x} f(t)\,dt, and P(a \le X \le b) = \int_a^b f(x)\,dx

Five Minutes Break
You are free to discuss the previous slides with your classmates, to take a short rest, or to ask questions.
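Note: the PMF and PDF characteristics above can be checked numerically. This is a minimal sketch, assuming Python with NumPy/SciPy (not part of the original slides); the Poisson and normal distributions are used only as illustrations.

```python
# Minimal sketch (Python with NumPy/SciPy assumed; not part of the slides):
# numerically checking the PMF and PDF characteristics listed above.
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Discrete case: a Poisson(5) PMF is non-negative and sums to 1.
pmf = stats.poisson(mu=5).pmf(np.arange(0, 200))   # truncate the infinite sum
print(pmf.min() >= 0)                              # True: p(x_i) >= 0
print(np.isclose(pmf.sum(), 1.0))                  # True: sum of p(x_i) = 1

# Continuous case: a standard normal PDF integrates to 1 over R_X,
# and P(X = x0) = 0 because an integral over a single point is 0.
f = stats.norm().pdf
total, _ = quad(f, -np.inf, np.inf)
print(np.isclose(total, 1.0))                      # True: integral of f(x) = 1
point, _ = quad(f, 1.0, 1.0)
print(point == 0.0)                                # True: P(X = x0) = 0
```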
Administrative Issues
• Groups formation
• Choose a "class coordinator"

Expectation
The expected value (the mean) of X is denoted by E(X). It is a measure of the central tendency.
• If X is discrete: E(X) = \sum_{\text{all } i} x_i\, p(x_i)
• If X is continuous: E(X) = \int x f(x)\,dx
The variance of X is denoted by V(X), var(X), or \sigma^2. It is a measure of the spread or variation of the possible values of X around the mean.
Definition: V(X) = E\left[(X - E(X))^2\right]
Also: V(X) = E(X^2) - [E(X)]^2
The standard deviation of X is denoted by \sigma.
Definition: the square root of V(X). It is expressed in the same units as the mean.

Example: Continuous Random Variables
Example: modeling the lifetime of a device.
Time is a continuous random variable, and random lifetimes are typically modeled with an exponential distribution. Assume that the average lifetime of the device is 2 years:
f(x) = \frac{1}{2} e^{-x/2} for x \ge 0, and f(x) = 0 otherwise
The probability that the device's life is between 2 and 3 years is:
P(2 \le X \le 3) = \frac{1}{2} \int_2^3 e^{-x/2}\,dx \approx 0.145

Example: Continuous Random Variables — Cumulative Distribution Function
The device has the CDF:
F(x) = \frac{1}{2} \int_0^x e^{-t/2}\,dt = 1 - e^{-x/2}
The probability that the device lasts for less than 2 years:
P(0 \le X \le 2) = F(2) - F(0) = F(2) = 1 - e^{-1} \approx 0.632
The probability that it lasts between 2 and 3 years:
P(2 \le X \le 3) = F(3) - F(2) = (1 - e^{-3/2}) - (1 - e^{-1}) \approx 0.145

Example: Continuous Random Variables — Expected Value and Variance
The mean life of the previous device is (integrating by parts):
E(X) = \frac{1}{2} \int_0^\infty x e^{-x/2}\,dx = \left[-x e^{-x/2}\right]_0^\infty + \int_0^\infty e^{-x/2}\,dx = 2
To compute the variance of X, we first compute E(X^2):
E(X^2) = \frac{1}{2} \int_0^\infty x^2 e^{-x/2}\,dx = \left[-x^2 e^{-x/2}\right]_0^\infty + 2\int_0^\infty x e^{-x/2}\,dx = 8
Hence, the variance and standard deviation of the device's life are:
V(X) = 8 - 2^2 = 4
\sigma = \sqrt{V(X)} = 2

Discrete Probability Distributions
• Bernoulli trials
• Binomial distribution
• Geometric distribution
• Poisson distribution
• Poisson process

Discrete Distributions
Discrete random variables are used to describe random phenomena in which only integer values can occur. In this section, we will learn about:
• Bernoulli trials and the Bernoulli distribution
• Binomial distribution
• Geometric and negative binomial distributions
• Poisson distribution

Modeling of Random Events with Two States
• Bernoulli trials
• Binomial distribution

Bernoulli Trials
In probability and statistics, a Bernoulli trial is an experiment whose outcome is random and can be either of two possible outcomes, "success" and "failure". In practice it refers to a single experiment with one of two possible outcomes. Such events can be phrased as "yes" or "no" questions:
• Did the coin land heads?
• Was the newborn child a girl?
• Were a person's eyes green?

Bernoulli Distribution
Consider an experiment consisting of n trials, each of which can be a success or a failure. Let X_j = 1 if the j-th trial is a success and X_j = 0 if it is a failure.
The Bernoulli distribution (one trial) has the PMF, for j = 1, 2, ..., n:
p_j(x_j) = p(x_j) = p if x_j = 1; 1 - p = q if x_j = 0; 0 otherwise
Expected value: E(X_j) = p
Variance: V(X_j) = \sigma^2 = p(1 - p)
Bernoulli process: the sequence of n Bernoulli trials where the trials are independent:
p(X_1, X_2, ..., X_n) = p(X_1)\,p(X_2)\cdots p(X_n)

Binomial Distribution
A binomial random variable is the number of successes in a series of n trials. Example: the number of heads occurring when a coin is tossed 50 times.
A discrete random variable X is said to follow a binomial distribution with parameters n and p, written X ~ Bi(n, p) or X ~ B(n, p), if it has the probability distribution:
P(X = k) = \binom{n}{k} p^k (1 - p)^{n-k}
where
k = 0, 1, 2, ..., n
n = 1, 2, 3, ...
p = success probability, 0 < p < 1
\binom{n}{k} = \frac{n!}{k!\,(n - k)!}
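The device-lifetime example above can be reproduced directly. This is a minimal sketch, assuming Python with SciPy (not part of the slides); an exponential with mean 2 corresponds to scale=2 in scipy.stats.expon.

```python
# Minimal sketch (Python/SciPy assumed) reproducing the device-lifetime
# example: exponential distribution with mean 2 years.
from scipy import stats

X = stats.expon(scale=2.0)

# CDF-based probabilities from the slides:
print(X.cdf(2))             # P(X <= 2) = 1 - e^{-1}   ~ 0.632
print(X.cdf(3) - X.cdf(2))  # P(2 <= X <= 3)           ~ 0.145

# Moments: E(X) = 2, E(X^2) = 8, so V(X) = 8 - 2^2 = 4 and sigma = 2.
print(X.mean())             # 2.0
print(X.var())              # 4.0
print(X.std())              # 2.0
```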
Expected value: E(X) = np
Variance: V(X) = \sigma^2 = np(1 - p)

Binomial Distribution
The trials must meet the following requirements:
• the total number of trials is fixed in advance;
• there are just two outcomes of each trial: success and failure;
• the outcomes of all the trials are statistically independent;
• all the trials have the same probability of success.

Binomial Distribution
The number of successes in n Bernoulli trials, X, has a binomial distribution:
p(X = k) = \binom{n}{k} p^k q^{n-k} for k = 0, 1, 2, ..., n, and 0 otherwise
Here \binom{n}{k} is the number of outcomes having the required number of successes and failures, and p^k q^{n-k} is the probability of a particular sequence with k successes and (n - k) failures.
The formula can be understood as follows: we want k successes (p^k) and n - k failures ((1 - p)^{n-k}). However, the k successes can occur anywhere among the n trials, and there are C(n, k) different ways of distributing k successes in a sequence of n trials.

End of Part 01

Administrative Issues
• Groups formation
• Choose a "class coordinator"

Modeling of Discrete Random Time
• Geometric distribution

Geometric Distribution
The geometric distribution represents the number X of Bernoulli trials needed to achieve the FIRST success. It is used to represent the random time (in discrete steps) until a first transition occurs.
PMF: p(X = k) = q^{k-1} p for k = 1, 2, 3, ..., and 0 otherwise
CDF: F(k) = P(X \le k) = 1 - (1 - p)^k
Expected value: E(X) = \frac{1}{p}
Variance: V(X) = \sigma^2 = \frac{1 - p}{p^2}

Negative Binomial Distribution

Negative Binomial Distribution
The negative binomial distribution is a discrete probability distribution that describes an experiment consisting of a sequence of independent trials, subject to several constraints: each trial results in success or failure, the probability of success p is constant across the experiment, and the experiment continues until a fixed number of successes has been achieved.
The number of Bernoulli trials, X, until the r-th success follows a negative binomial distribution with parameters p and r:
PMF: p(X = k) = \binom{k-1}{r-1} p^r (1 - p)^{k-r} for k = r, r+1, r+2, ..., and 0 otherwise
Expected value: E(X) = \frac{r}{p}
Variance: V(X) = \sigma^2 = \frac{r(1 - p)}{p^2}

Modeling of Random Number of Arrivals/Events
• Poisson distribution
• Poisson process

Poisson Distribution
The Poisson distribution is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time, if these events occur with a known average rate and independently of the time since the last event.
A Poisson random variable represents the count of the number of events that occur in a certain time interval or spatial area. Examples:
• the number of cars passing a fixed point in a 5-minute interval;
• the number of calls received by a switchboard during a given period of time;
• the number of messages arriving at a router in a given period of time.

Discrete Poisson Distribution
A discrete random variable X is said to follow a Poisson distribution with parameter \lambda, written X ~ Po(\lambda), if it has the probability distribution:
PMF: P(X = k) = \frac{\lambda^k}{k!} e^{-\lambda}
The PMF represents the probability that there are k arrivals in a certain period of time, where k = 0, 1, 2, ... and \lambda > 0 is called the arrival rate.

Discrete Poisson Distribution
The Poisson distribution describes many random processes quite well and is mathematically quite simple. The Poisson distribution with parameter \lambda is characterized by:
PMF: p(k) = P(X = k) = \frac{\lambda^k}{k!} e^{-\lambda} for k = 0, 1, 2, ..., and 0 otherwise
CDF: F(k) = P(X \le k) = \sum_{i=0}^{k} \frac{\lambda^i}{i!} e^{-\lambda}
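The closed-form means and variances of the discrete models above can be cross-checked numerically. This is a minimal sketch, assuming Python with SciPy (not part of the slides); note that SciPy's negative binomial uses a different convention than the lecture.

```python
# Minimal sketch (Python/SciPy assumed): the discrete models above,
# checked against their closed-form means and variances.
from scipy import stats

n, p, r = 50, 0.3, 4

binom = stats.binom(n, p)
print(binom.mean(), n * p)              # E(X) = np
print(binom.var(), n * p * (1 - p))     # V(X) = np(1-p)

# scipy's geom already counts trials up to and including the first success.
geom = stats.geom(p)
print(geom.pmf(3), (1 - p) ** 2 * p)    # q^{k-1} p with k = 3
print(geom.mean(), 1 / p)               # E(X) = 1/p

# Caution: scipy's nbinom counts FAILURES before the r-th success, so the
# number of TRIALS in the lecture's convention is r + nbinom.
nb = stats.nbinom(r, p)
print(r + nb.mean(), r / p)             # E(X) = r/p
print(nb.var(), r * (1 - p) / p ** 2)   # V(X) = r(1-p)/p^2
```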
Expected value: E(X) = \lambda
Variance: V(X) = \lambda

Discrete Poisson Distribution
The following requirements must be met for the Poisson distribution:
• the length of the observation period is fixed in advance;
• the events occur at a constant average rate;
• the numbers of events occurring in disjoint intervals are statistically independent.

Example: Poisson Distribution
The number of cars that enter a parking lot follows a Poisson distribution with a mean rate of \lambda = 20 cars/hour.
The probability of having exactly 15 cars entering the parking lot in one hour:
p(15) = P(X = 15) = \frac{20^{15}}{15!} e^{-20} \approx 0.051649
Equivalently, using the CDF: p(15) = F(15) - F(14) = 0.156513 - 0.104864 = 0.051649
The probability of having more than 3 cars entering the parking lot in one hour:
P(X > 3) = 1 - P(X \le 3) = 1 - F(3) = 1 - [p(0) + p(1) + p(2) + p(3)] \approx 0.9999967
Use Excel/MATLAB (or a similar tool) for the computations.

Example: Poisson Distribution
Probability mass function, Poisson (\lambda = 20 cars/hour):
p(X = k) = \frac{20^k}{k!} e^{-20}
Cumulative distribution function, Poisson (\lambda = 20 cars/hour):
F(k) = P(X \le k) = \sum_{i=0}^{k} \frac{20^i}{i!} e^{-20}

Five Minutes Break
You are free to discuss the previous slides with your classmates, to take a short rest, or to ask questions.

Administrative Issues
• Groups formation

Modeling of Random Number of Arrivals/Events
• Poisson distribution
• Poisson process

Poisson Process
Wikipedia: a Poisson process, named after the French mathematician Siméon-Denis Poisson (1781-1840), is a stochastic process in which events (e.g. arrivals) occur continuously and independently of one another.
Formal definition: the Poisson process is a counting function {N(t), t \ge 0}, where N(t) is the number of events that have occurred up to time t, i.e. in the interval [0, t].
Fact: the number of events between time a and time b is given by N(b) - N(a) and has a Poisson distribution.
The Poisson process is a continuous-time process: time is continuous. Its discrete-time counterpart is the Bernoulli process, a discrete-time stochastic process consisting of a sequence of independent random variables taking values over two symbols.

Examples of Using the Poisson Process
• The number of web page requests arriving at a server may be characterized by a Poisson process, except under unusual circumstances such as coordinated denial-of-service attacks.
• The number of telephone calls arriving at a switchboard, or at an automatic phone-switching system, may be characterized by a Poisson process.
• The number of raindrops falling over a wide spatial area may be characterized by a spatial Poisson process.
• The arrival of "customers" is commonly modelled as a Poisson process in the study of simple queueing systems.
• The execution of trades on a stock exchange, viewed on a tick-by-tick basis, is a Poisson process.

(Homogeneous) Poisson Process
The homogeneous Poisson process is characterized by a CONSTANT rate parameter \lambda, also known as the intensity, such that the number of events in a time interval (t, t + \Delta t] follows a Poisson distribution with parameter \lambda \Delta t.
Formally, a counting process {N(t), t \ge 0} is a (homogeneous) Poisson process with mean rate \lambda if, for \Delta t > 0 and n = 0, 1, 2, ...:
PMF: P\{N(t + \Delta t) - N(t) = n\} = P\{N(\Delta t) = n\} = \frac{(\lambda \Delta t)^n}{n!} e^{-\lambda \Delta t}
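The parking-lot computations above are suggested for Excel/MATLAB; as a hedged alternative, here is an equivalent sketch in Python with SciPy (an assumption, not part of the course materials).

```python
# Minimal sketch (Python/SciPy assumed): the parking-lot example,
# Poisson with lambda = 20 cars/hour.
from scipy import stats

X = stats.poisson(mu=20)

print(X.pmf(15))              # P(X = 15)       ~ 0.051649
print(X.cdf(15) - X.cdf(14))  # F(15) - F(14)   ~ 0.051649 (same value)
print(1 - X.cdf(3))           # P(X > 3)        ~ 0.9999967
```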
N(t + \Delta t) - N(t) describes the number of events in the time interval (t, t + \Delta t].
The mean and the variance are equal: E[N(t)] = V[N(t)] = \lambda t

(Homogeneous) Poisson Process
Properties of the Poisson process:
• Arrivals occur one at a time (no simultaneous arrivals).
• {N(t), t \ge 0} has stationary increments: N(t) - N(s) has the same distribution as N(t - s), so the number of arrivals between time s and time t is also Poisson-distributed, with mean \lambda(t - s).
• {N(t), t \ge 0} has independent increments.

Inter-Arrival Times of a Poisson Process
Inter-arrival time: the time between two consecutive arrivals. The inter-arrival times of a Poisson process are random; what is their distribution?
Consider the inter-arrival times of a Poisson process (A_1, A_2, ...), where A_i is the elapsed time between arrival i and arrival i + 1.
The first arrival occurs after time t if and only if there are no arrivals in the interval [0, t]. As a consequence:
P(A_1 > t) = P(N(t) = 0) = e^{-\lambda t}
P(A_1 \le t) = 1 - P(A_1 > t) = 1 - e^{-\lambda t}
This is the CDF of an exponential distribution: the inter-arrival times of a Poisson process are exponentially distributed and independent, with mean 1/\lambda.

Splitting and Pooling
Splitting: a Poisson process can be split into two Poisson processes, the first with probability p and the second with probability 1 - p:
N(t) = N_1(t) + N_2(t), where N_1(t) and N_2(t) are both Poisson processes, with rates \lambda p and \lambda(1 - p):
N(t) ~ Poi(\lambda) splits into N_1(t) ~ Poi(\lambda p) and N_2(t) ~ Poi(\lambda(1 - p))
Pooling: the sum of two Poisson processes is a Poisson process:
N_1(t) + N_2(t) = N(t), where N(t) is a Poisson process with rate \lambda_1 + \lambda_2:
N_1(t) ~ Poi(\lambda_1) and N_2(t) ~ Poi(\lambda_2) pool into N(t) ~ Poi(\lambda_1 + \lambda_2)

Modeling of Random Number of Arrivals/Events
• Poisson distribution
• Non-homogeneous Poisson process

Non-Homogeneous (Non-Stationary) Poisson Process (NSPP)
The non-homogeneous Poisson process is characterized by a VARIABLE rate parameter \lambda(t), the arrival rate at time t. In general, the rate parameter may change over time.
The stationary-increments property is not satisfied: there exist s, t such that N(t) - N(s) does not have the same distribution as N(t - s).
The expected number of events (e.g. arrivals) between time s and time t is:
\lambda_{s,t} = \int_s^t \lambda(u)\,du

Example: Non-Stationary Poisson Process (NSPP)
The number of cars that cross the intersection of King Fahd Road and Al-Ourouba Road is distributed according to a non-homogeneous Poisson process with rate \lambda(t) defined as follows (taking 8:00 am as t = 0):
\lambda(t) = 80 cars/min if 8:00 \le t < 9:00
\lambda(t) = 60 cars/min if 9:00 \le t < 11:00
\lambda(t) = 50 cars/min if 11:00 \le t < 15:00
\lambda(t) = 70 cars/min if 15:00 \le t < 17:00
Q1. Compute the average number of car arrivals by 11:30.
Q2. Determine the equation that gives the probability of having exactly 10000 car arrivals between 12:00 and 16:00.
Q3. What are the distribution and the average (in seconds) of the inter-arrival time of two cars between 8:00 and 9:00?

Example: Non-Stationary Poisson Process (NSPP)
Q1. Compute the average number of car arrivals by 11:30.
\lambda_{8:00,11:30} = \int_{8:00}^{11:30} \lambda(u)\,du = \int_{8:00}^{9:00} \lambda(u)\,du + \int_{9:00}^{11:00} \lambda(u)\,du + \int_{11:00}^{11:30} \lambda(u)\,du
= 80 cars/min × 60 min + 60 cars/min × 120 min + 50 cars/min × 30 min = 13500 cars
Q2. Determine the equation that gives the probability of having exactly 10000 car arrivals between 12:00 and 16:00.
We know that the number of cars between 12:00 and 16:00, i.e. N(16:00) - N(12:00), follows a Poisson distribution. Between 12:00 and 16:00, the average number of cars is:
\lambda_{12:00,16:00} = 180 min × 50 cars/min + 60 min × 70 cars/min = 13200 cars
Thus: P(N(16:00) - N(12:00) = 10000) = \frac{13200^{10000}}{10000!} e^{-13200}
Q3. What are the distribution and the average (in seconds) of the inter-arrival time of two cars between 8:00 and 9:00?
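The exponential inter-arrival result above also gives a direct way to simulate a Poisson process, and Q1 reduces to a piecewise sum. This is a minimal sketch, assuming Python with NumPy (not part of the slides); the segment encoding of the rate function is a hypothetical choice for illustration.

```python
# Minimal sketch (Python/NumPy assumed): simulate a homogeneous Poisson
# process by summing exponential inter-arrival times, and evaluate the
# NSPP mean of Q1 from the piecewise-constant rate.
import numpy as np

rng = np.random.default_rng(0)

def poisson_arrivals(lam, horizon):
    """Arrival times in [0, horizon] for a rate-lam Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam)  # inter-arrival ~ Exp(mean 1/lam)
        if t > horizon:
            return np.array(times)
        times.append(t)

arrivals = poisson_arrivals(lam=20.0, horizon=1000.0)
print(len(arrivals) / 1000.0)            # ~ 20, since E[N(t)] = lam * t

# Q1 of the NSPP example: integrate the piecewise rate from 8:00 to 11:30.
# Segments as (start_min, end_min, cars_per_min), minutes since 8:00.
segments = [(0, 60, 80), (60, 180, 60), (180, 210, 50)]
print(sum((e - s) * r for s, e, r in segments))   # 13500 cars
```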
(Homework)

Two Minutes Break
You are free to discuss the previous slides with your classmates, to take a short rest, or to ask questions.

Administrative Issues
• Groups formation

Continuous Probability Distributions
• Uniform distribution
• Exponential distribution
• Normal (Gaussian) distribution
• Weibull distribution
• Lognormal distribution

Continuous Distributions
Continuous random variables can be used to describe random phenomena in which the variable can take on any value in some interval. In this section, the distributions studied are:
• Uniform
• Exponential
• Normal
• Weibull
• Lognormal

Uniform Distribution

Continuous Uniform Distribution
The continuous uniform distribution is a family of probability distributions such that, for each member of the family, all intervals of the same length on the distribution's support are equally probable.
A random variable X is uniformly distributed on the interval [a, b], written U(a, b), if its PDF and CDF are:
PDF: f(x) = \frac{1}{b - a} for a \le x \le b, and 0 otherwise
CDF: F(x) = 0 for x < a; \frac{x - a}{b - a} for a \le x < b; 1 for x \ge b
Expected value: E(X) = \frac{a + b}{2}
Variance: V(X) = \frac{(b - a)^2}{12}

Uniform Distribution
Properties:
• P(x_1 < X \le x_2) is proportional to the length of the interval:
F(x_2) - F(x_1) = \frac{x_2 - x_1}{b - a}
• Special case: the standard uniform distribution U(0, 1), which is very useful for random-number generators in simulators (a sketch using U(0, 1) to generate exponential variates appears after the exponential slides below).

Exponential Distribution
Modeling Random Time

Exponential Distribution
The exponential distribution describes the times between events in a Poisson process, in which events occur continuously and independently at a constant average rate.
A random variable X is exponentially distributed with rate parameter \lambda > 0 (equivalently, with mean \mu = 1/\lambda) if its PDF and CDF are:
PDF: f(x) = \lambda e^{-\lambda x} = \frac{1}{\mu} e^{-x/\mu} for x \ge 0, and 0 otherwise
CDF: F(x) = \int_0^x \lambda e^{-\lambda t}\,dt = 1 - e^{-\lambda x} = 1 - e^{-x/\mu} for x \ge 0, and 0 for x < 0
Expected value: E(X) = \frac{1}{\lambda} = \mu
Variance: V(X) = \frac{1}{\lambda^2} = \mu^2

Exponential Distribution
Example with \mu = 20:
f(x) = \frac{1}{20} e^{-x/20} for x \ge 0, and 0 otherwise
F(x) = 1 - e^{-x/20} for x \ge 0, and 0 for x < 0
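Tying the U(0, 1) remark above to the exponential CDF just given: inverse-transform sampling is the standard way simulators turn uniform random numbers into exponential variates. This is a minimal sketch, assuming Python with NumPy (not part of the slides).

```python
# Minimal sketch (Python/NumPy assumed): inverse-transform sampling.
# If U ~ U(0,1) and F(x) = 1 - exp(-x/mu), then X = F^{-1}(U) = -mu*ln(1-U)
# is exponentially distributed with mean mu.
import numpy as np

rng = np.random.default_rng(1)
mu = 20.0                                # mean, as in the mu = 20 slide

u = rng.uniform(0.0, 1.0, size=100_000)  # standard uniforms U(0,1)
x = -mu * np.log(1.0 - u)                # inverse CDF applied to u

print(x.mean())                          # ~ 20   (E[X] = mu)
print(x.var())                           # ~ 400  (V[X] = mu^2)
print((x <= 20).mean())                  # ~ 1 - e^{-1} ~ 0.632 = F(20)
```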
Example: Exponential Distribution
The time needed to repair the engine of a car is exponentially distributed with a mean time equal to 3 hours.
The probability that the car spends more than 3 hours in repair:
P(X > 3) = 1 - P(X \le 3) = 1 - F(3) = 1 - (1 - e^{-3/3}) = e^{-1} \approx 0.368
The probability that the repair time lasts between 2 and 3 hours:
P(2 \le X \le 3) = F(3) - F(2) = e^{-2/3} - e^{-1} \approx 0.145
The probability that the repair lasts at least one more hour, given that the car has already been under repair for 2.5 hours: using the memoryless property of the exponential distribution,
P(X > 2.5 + 1 \mid X > 2.5) = P(X > 1) = 1 - P(X \le 1) = e^{-1/3} \approx 0.717

Exponential Distribution
The memoryless property: in probability theory, memorylessness is a property of certain probability distributions, the exponential distributions and the geometric distributions, wherein any derived probability from a set of random samples is distinct and has no information (i.e. "memory") of earlier samples.
Formally, the memoryless property is, for all s and t greater than or equal to 0:
P(X > s + t \mid X > s) = P(X > t)
This means that the future does not depend on the past, but only on the present. It can be read as: "the probability that you will wait s more minutes, given that you have already been waiting t minutes, is the same as the probability of waiting more than s minutes from the beginning."
Note: the fact that P(X > 40 | X > 30) = P(X > 10) does not mean that the events X > 40 and X > 30 are independent; i.e. it does not mean that P(X > 40 | X > 30) = P(X > 40).

Normal (Gaussian) Distribution

Normal Distribution
The normal distribution, also called the Gaussian distribution, is an important family of continuous probability distributions, applicable in many fields. Each member of the family is defined by two parameters, location and scale: the mean ("average", \mu) and the variance (standard deviation squared, \sigma^2) respectively.
The importance of the normal distribution as a model of quantitative phenomena in the natural and behavioral sciences is due in part to the Central Limit Theorem. It is commonly used to model system error (e.g. channel error) and the distribution of natural phenomena such as height and weight.

Normal or Gaussian Distribution
A continuous random variable X, taking all real values in the range (-\infty, +\infty), is said to follow a normal distribution with parameters \mu and \sigma if it has the following PDF and CDF:
PDF: f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)
CDF: F(x) = \frac{1}{2}\left[1 + \operatorname{erf}\left(\frac{x - \mu}{\sigma\sqrt{2}}\right)\right]
where the error function is: \operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\,dt
The normal distribution is denoted X ~ N(\mu, \sigma^2). Its PDF is a symmetrical, bell-shaped curve centered at its expected value \mu, with variance \sigma^2.

Normal Distribution: Example
The simplest case of the normal distribution, known as the standard normal distribution, has expected value zero and variance one. This is written as N(0, 1).

Normal Distribution
Evaluating the distribution: independently of \mu and \sigma, we can use the standard normal distribution Z ~ N(0, 1) by a transformation of variables. Let Z = \frac{X - \mu}{\sigma}. Then:
F(x) = P(X \le x) = P\left(Z \le \frac{x - \mu}{\sigma}\right) = \Phi\left(\frac{x - \mu}{\sigma}\right)
where \Phi(z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-t^2/2}\,dt

Normal Distribution
Example: the time required to load an oceangoing vessel, X, is distributed as N(12, 4), i.e. \mu = 12 and \sigma = 2.
The probability that the vessel is loaded in less than 10 hours:
F(10) = \Phi\left(\frac{10 - 12}{2}\right) = \Phi(-1) = 0.1587
By the symmetry property, \Phi(-1) = 1 - \Phi(1).

Other Distributions
• Weibull distribution
• Lognormal distribution

Weibull Distribution
A random variable X has a Weibull distribution if its PDF has the form:
f(x) = \frac{\beta}{\alpha} \left(\frac{x - \nu}{\alpha}\right)^{\beta - 1} \exp\left[-\left(\frac{x - \nu}{\alpha}\right)^{\beta}\right] for x \ge \nu, and 0 otherwise
It has 3 parameters:
• location parameter \nu (-\infty < \nu < +\infty)
• scale parameter \alpha (\alpha > 0)
• shape parameter \beta (\beta > 0)
Example: \nu = 0 and \alpha = 1 is used for the lifetime of objects.
When \beta = 1, X ~ exp(\lambda = 1/\alpha).

Lognormal Distribution
A random variable X has a lognormal distribution if its PDF has the form:
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma x} \exp\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right) for x > 0, and 0 otherwise
(PDF curves are typically plotted for \mu = 1 and \sigma^2 = 0.5, 1, 2.)
Mean: E(X) = e^{\mu + \sigma^2/2}
Variance: V(X) = e^{2\mu + \sigma^2}\left(e^{\sigma^2} - 1\right)
Relationship with the normal distribution: when Y ~ N(\mu, \sigma^2), then X = e^Y ~ lognormal(\mu, \sigma^2).
Note that the parameters \mu and \sigma^2 are not the mean and variance of the lognormal itself. The lognormal is used, among other applications, in general reliability analysis.

Empirical Distribution

Empirical Distributions
An empirical distribution is a distribution whose parameters are the observed values in a sample of data. It may be used when it is impossible or unnecessary to establish that a random variable has any particular parametric distribution.
Advantage: no assumption beyond the observed values in the sample.
Disadvantage: the sample might not cover the entire range of possible values.
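Two of the worked results above, the N(12, 4) vessel-loading probability and the memoryless repair-time computation, can be verified directly. This is a minimal sketch, assuming Python with SciPy (not part of the slides).

```python
# Minimal sketch (Python/SciPy assumed) checking two results above.
from scipy import stats

# Vessel loading: X ~ N(mu=12, sigma^2=4), so the scale (std dev) is 2.
vessel = stats.norm(loc=12, scale=2)
print(vessel.cdf(10))                      # Phi(-1) ~ 0.1587

# Repair time: exponential with mean 3 hours. sf(x) = 1 - F(x) = P(X > x).
repair = stats.expon(scale=3)
p_cond = repair.sf(3.5) / repair.sf(2.5)   # P(X > 2.5 + 1 | X > 2.5)
print(p_cond, repair.sf(1))                # both ~ 0.717: memoryless
```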
Empirical Distributions
In statistics, an empirical distribution function is a cumulative probability distribution function that concentrates probability 1/n at each of the n values in a sample.
Let X_1, X_2, ..., X_n be iid random variables with CDF F(x). The empirical distribution function F_n(x) based on the sample X_1, X_2, ..., X_n is a step function defined by:
F_n(x) = \frac{\text{number of elements in the sample} \le x}{n} = \frac{1}{n} \sum_{i=1}^{n} I(X_i \le x)
where I(A) is the indicator of event A: I(X_i \le x) = 1 if X_i \le x, and 0 otherwise.
For a fixed value x, I(X_i \le x) is a Bernoulli random variable with parameter p = F(x); hence nF_n(x) is a binomial random variable with mean nF(x) and variance nF(x)(1 - F(x)).

End of Chapter 4
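The step-function definition above translates directly into code. This is a minimal sketch, assuming Python with NumPy (not part of the slides); the exponential sample is a hypothetical choice so F_n(x) can be compared against a known true CDF.

```python
# Minimal sketch (Python/NumPy assumed): the empirical distribution
# function F_n(x) = (1/n) * sum of indicators I(X_i <= x), evaluated
# on an exponential sample and compared to the true CDF.
import numpy as np

def ecdf(sample, x):
    """F_n(x): fraction of sample values less than or equal to x."""
    sample = np.asarray(sample)
    return np.mean(sample <= x)     # average of Bernoulli indicators

rng = np.random.default_rng(2)
sample = rng.exponential(scale=2.0, size=1000)   # true CDF: 1 - e^{-x/2}

for x in (1.0, 2.0, 3.0):
    print(x, ecdf(sample, x), 1 - np.exp(-x / 2))  # F_n(x) vs F(x)
```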