Discrete Probability Distributions
EGR 260
R. Van Til, Industrial & Systems Engineering Dept.
Copyright 2013. Robert P. Van Til. All rights reserved.

What's It All About?
• The behavior of many random processes can be placed into a handful of categories.
– In this presentation, we will develop probability distributions for several common categories of discrete random processes.
» Note that not every discrete random process can be modeled with one of these probability distributions.
• In that case, you need to derive an appropriate probability distribution (usually using counting principles).
– In the next presentation, we will do the same for continuous random processes.

Definition
• Bernoulli trials are a set of n trials of a random process where the outcome of each trial is one of two mutually exclusive events, A or Ac, the trials are independent, and P(A) is the same on every trial.
– Examples.
» Roll a pair of dice (with some event A of interest, e.g., A = {roll a 7}).
» Flip a coin.
– A random process whose outcomes are Bernoulli trials is said to "satisfy the Bernoulli property".

Binomial Distribution
• Let RV X satisfy the Bernoulli property and be defined by
X = {# of times event A occurs in n trials}
then
f(x) = C(n,x) p^x q^(n-x),  x = 0, 1, ..., n
where on any trial, p = P(A) and q = P(Ac) = 1 - p.
– Note that order does not matter; only the number of times A occurs counts.

Binomial Distribution
• Where does this formula come from?
– Run n trials of a Bernoulli process. Determine the probability that event A occurs for the first x trials and does not occur for the remaining n-x trials. Call this event B1. Since the trials are independent,
P(B1) = p^x q^(n-x)
» Note any other arrangement of event A occurring x times and Ac occurring n-x times has the same probability.

Binomial Distribution
• Suppose there are M different arrangements in which event A occurs x times and Ac occurs n-x times (we don't yet know the value of M). Let Bi, i = 1, 2, ..., M, denote these M arrangements, then
P(x) = P(B1 ∪ B2 ∪ ... ∪ BM)
Since all Bi's are mutually exclusive,
P(x) = P(B1) + P(B2) + ... + P(BM) = M p^x q^(n-x)
• So, what's the value of M?

Binomial Distribution
• There are n locations to place the x events A; the remaining n-x locations will contain Ac.
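As a quick sanity check on this counting argument (not part of the original slides), the number of arrangements can be counted by brute force and compared against C(n, x). This is a sketch using only the Python standard library; `count_arrangements` is a name I've chosen, and `math.comb` requires Python 3.8+:

```python
from itertools import product
from math import comb

def count_arrangements(n, x):
    """Brute-force count of the length-n trial sequences in which
    event A occurs exactly x times (1 = A, 0 = Ac)."""
    return sum(1 for seq in product((0, 1), repeat=n) if sum(seq) == x)

# Every brute-force count matches C(n, x) = n!/(x!(n-x)!)
for n in range(1, 8):
    for x in range(n + 1):
        assert count_arrangements(n, x) == comb(n, x)

print(count_arrangements(5, 2))   # -> 10, i.e. C(5,2)
```

The enumeration grows as 2^n, so it is only a check for small n; the closed form C(n, x) is what the derivation that follows uses.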
So, M is the number of different ways to place x items into n different locations and is given by
M = C(n,x) = n! / (x!(n-x)!)
Hence,
f(x) = C(n,x) p^x q^(n-x),  x = 0, 1, ..., n

Aside
• A combination, C(n,x), is the # of ways to select x elements (without replacement) from a set of n distinct elements where order does not matter, and is given by
C(n,x) = n! / (x!(n-x)!)
– Example: # of ways to arrange 2 apples and 4 mangos is C(6,2) = 6!/(2!4!) = 15.

Example
• Consider an injection molding machine which makes interior trim components for cars. Define event A as
A = {machine makes a bad part}
Suppose that P(A) = 0.05 and that the quality of each part is not affected by that of the previous parts. Determine the probability that 2 of the next 10 parts produced are bad.

Example
f(2) = C(10,2) (0.05)^2 (0.95)^8 = 45(0.0025)(0.6634) ≈ 0.0746

Properties of Binomial Distribution
• The expected value and the variance for a binomial distribution are given by
E(X) = np  and  VAR(X) = npq

CDF of a Binomial Distribution
• The probability that event A occurs at most j ≤ n times is
F(j) = Σ_{x=0}^{j} C(n,x) p^x q^(n-x)
– Recall F(j) is called the cumulative distribution function (CDF).
– Table II in the book's appendices presents values of F(j) for different values of j, n and p.

Properties of Binomial Distribution
[Figure: plot of the pmf f(x) for a typical binomial distribution with n = 5 and p = 0.35]

Negative Binomial Distribution
• Let the random process satisfy the Bernoulli property and define RV X by
X = {trial # when event A occurs for the rth time}
then
f(x) = C(x-1, r-1) p^r q^(x-r),  x = r, r+1, r+2, ...
where on any trial, p = P(A) & q = P(Ac) = 1 - p.
– Note that the rth occurrence of A must fall on trial x; the order of the other r-1 occurrences among the first x-1 trials does not matter.

Negative Binomial Distribution
• Where does this formula come from?
– Trial x must be event A (its rth occurrence), which happens with probability p. The other r-1 occurrences of A can be arranged in any order among the first x-1 trials, giving C(x-1, r-1) arrangements, each with probability p^(r-1) q^(x-r). Hence
f(x) = [(x-1)! / ((r-1)!(x-r)!)] p^r q^(x-r) = C(x-1, r-1) p^r q^(x-r)

Special Case
• For r = 1, the negative binomial distribution is termed the geometric distribution, where
f(x) = p q^(x-1),  x = 1, 2, 3, ...

Example
• The probability a CNC lathe makes a defective part is 0.02 for any part made by the lathe. Determine the probability the lathe makes 22 parts before the 3rd defective part is produced.
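The binomial and negative binomial calculations above can be sketched in Python. This is a stdlib-only sketch, not from the slides; the function names are mine, and `math.comb` needs Python 3.8+. The lathe example is interpreted as: 22 good parts are made, so the 3rd defective occurs on trial x = 25.

```python
from math import comb

def binom_pmf(x, n, p):
    """P(event A occurs exactly x times in n Bernoulli trials)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(j, n, p):
    """F(j) = P(X <= j), the binomial CDF tabulated in Table II."""
    return sum(binom_pmf(x, n, p) for x in range(j + 1))

def negbinom_pmf(x, r, p):
    """P(the rth occurrence of A falls on trial x), x = r, r+1, ..."""
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

# Injection-molding example: P(2 bad parts out of the next 10), p = 0.05
print(round(binom_pmf(2, 10, 0.05), 4))      # -> 0.0746

# CNC lathe example: 22 good parts, then the 3rd defective on trial 25
print(round(negbinom_pmf(25, 3, 0.02), 6))   # -> 0.001416

# Geometric distribution is the r = 1 special case: f(x) = p q^(x-1)
assert abs(negbinom_pmf(4, 1, 0.5) - 0.5 * 0.5**3) < 1e-12
```

Summing the pmf over its full support recovers F(n) = 1, which is a convenient correctness check for any of these hand calculations.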
Properties of Negative Binomial Distribution
• The expected value and variance for a negative binomial distribution are
E(X) = r/p  and  VAR(X) = rq/p^2

Hypergeometric Distribution
• Consider a discrete sample space S where
N = # of elements in S  &  k = # of events A in S
Randomly sample n elements without replacement from S and define the RV
X = {# of times an event A is selected}
then
f(x) = C(k,x) C(N-k, n-x) / C(N,n)
– Note that x can be no larger than min(n, k).

Hypergeometric Distribution
• Where does this formula come from?
1. Number of ways to select n elements from a set of N elements where order does not matter is C(N,n).
2. Number of ways to select x events A from a set containing k events A is C(k,x).
3. Number of ways to select n-x events Ac from a set containing N-k events Ac is C(N-k, n-x).

Hypergeometric Distribution
• From the counting formula,
f(x) = C(k,x) C(N-k, n-x) / C(N,n)

Example
• Suppose a batch of 10 car engines contains 2 that are defective. If 3 are selected at random without replacement, what is the probability that 1 of the 3 is defective?
f(1) = C(2,1) C(8,2) / C(10,3) = (2)(28)/120 ≈ 0.467

Example
• What is the probability that all 3 engines selected are defective?
Since the batch contains only 2 defective engines, f(3) = 0; note C(2,3) = 0.

Properties of Hypergeometric Distribution
• The expected value and variance of a hypergeometric distribution are
E(X) = nk/N
and
VAR(X) = (nk/N)(1 - k/N)((N-n)/(N-1))

Binomial vs. Hypergeometric
Similarities
• Sample n items, x of which are event A and the remaining n-x are event Ac.
• Order does not matter in either case.
Differences
• Binomial:
– Satisfies the Bernoulli property (sampling with replacement, so P(A) is constant from trial to trial).
• Hypergeometric:
– Does not satisfy the Bernoulli property (sampling without replacement, so P(A) changes from trial to trial).

Binomial vs. Hypergeometric
• If the size of the sample space N is large relative to the sample size n, then the hypergeometric probability f(x) is approximately binomial with p = k/N:
f(x) ≈ C(n,x) (k/N)^x (1 - k/N)^(n-x)

Example
• A random process satisfies the Bernoulli property with P(bad part) = 0.05. What's the probability that 1 of 2 parts randomly selected is bad?
f(1) = C(2,1)(0.05)(0.95) = 0.0950
• Suppose 2 parts are selected at random without replacement from a population of 100 parts where 5 are bad. What's the probability that 1 is bad?
f(1) = C(5,1) C(95,1) / C(100,2) = 475/4950 ≈ 0.0960

Another Aside
• Suppose a set containing n elements has x1 elements of type 1, x2 elements of type 2, ..., xM elements of type M.
Then the # of ways to arrange all n elements of this set is
n! / (x1! x2! ... xM!)
– Note: n = x1 + x2 + ... + xM
» Order matters among dissimilar types of elements, but it does not matter among elements of the same type.

Example
• How many different ways can you arrange 2 apples, 3 oranges and 4 mangos?
9! / (2! 3! 4!) = 362880/288 = 1260

Multinomial Distribution
• Consider a discrete random process of n independent trials, each of which results in exactly one of k mutually exclusive outcomes A1, A2, ..., Ak, where A1∪A2∪...∪Ak = S. Let
pi = P(Ai),  i = 1, 2, ..., k
and define the k RV's
Xi = {# of times event Ai occurs in n trials},  i = 1, ..., k
then
f(x1, x2, ..., xk) = [n! / (x1! x2! ... xk!)] p1^x1 p2^x2 ... pk^xk
where x1 + x2 + ... + xk = n and p1 + p2 + ... + pk = 1.

Multinomial Distribution
• The binomial distribution is a special case of the multinomial distribution with k = 2.
• Note the multinomial distribution has k random variables X1, X2, ..., Xk.
– Hence, our current definitions for expected value and variance won't work.
» We will learn about these later when we study processes with multiple RV's.

Example
• Consider a CNC lathe which produces a part of diameter d. Define events
A1 = {d ok}, A2 = {d too large}, A3 = {d too small}
where P(A1) = 0.93, P(A2) = 0.04, P(A3) = 0.03, and the quality of each part is not affected by that of the previous parts. Determine the probability that of the next 12 parts, 2 will be too large and 1 will be too small.

Example
f(9, 2, 1) = [12! / (9! 2! 1!)] (0.93)^9 (0.04)^2 (0.03)^1 = 660(0.5204)(0.0016)(0.03) ≈ 0.0165

General Hypergeometric Distribution
• Consider a sample space containing N discrete elements. Each of these elements is classified as one of J events A1, A2, ..., AJ, where there are ki of each event Ai and k1 + k2 + ... + kJ = N. Select n elements at random without replacement and let the J RV's be defined as
Xi = {# of times event Ai is selected},  i = 1, ..., J
then
f(x1, x2, ..., xJ) = C(k1, x1) C(k2, x2) ... C(kJ, xJ) / C(N, n)
– Note that order does not matter and x1 + x2 + ... + xJ = n.

Example
• Suppose there are 65 cars in a parking lot where 20 are Chevys, 15 are Fords, 17 are DaimlerChryslers and 13 are Hondas. If 10 cars are selected at random for emissions testing, what is the probability that 4 are Chevys, 3 are Fords, 2 are DaimlerChryslers and 1 is a Honda?
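Both the multinomial and general hypergeometric pmfs are short products over the outcome counts, so they translate directly into code. A stdlib-only sketch (function names are mine, not from the slides; `math.comb` and `math.prod` need Python 3.8+):

```python
from math import comb, factorial, prod

def multinomial_pmf(xs, ps):
    """P(X1=x1, ..., Xk=xk) over n = sum(xs) independent trials,
    where ps are the outcome probabilities (must sum to 1)."""
    n = sum(xs)
    coef = factorial(n) // prod(factorial(x) for x in xs)
    return coef * prod(p**x for p, x in zip(ps, xs))

def gen_hypergeom_pmf(xs, ks):
    """P(X1=x1, ..., XJ=xJ) when drawing n = sum(xs) elements without
    replacement from a population with ks[i] elements of type i."""
    N, n = sum(ks), sum(xs)
    return prod(comb(k, x) for k, x in zip(ks, xs)) / comb(N, n)

# CNC lathe example: 9 ok, 2 too large, 1 too small out of 12 parts
print(round(multinomial_pmf([9, 2, 1], [0.93, 0.04, 0.03]), 4))  # -> 0.0165

# Engine example as the J = 2 case: (defective, good), 1 defective of 3 drawn
print(round(gen_hypergeom_pmf([1, 2], [2, 8]), 4))               # -> 0.4667

# Parking-lot example: 4 Chevys, 3 Fords, 2 DaimlerChryslers, 1 Honda
print(gen_hypergeom_pmf([4, 3, 2, 1], [20, 15, 17, 13]))
```

Note that `math.comb(k, x)` returns 0 when x > k, so impossible draws (like 3 defectives from a batch of 2) automatically get probability 0.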
Example
f(4, 3, 2, 1) = C(20,4) C(15,3) C(17,2) C(13,1) / C(65,10)

Uniform Distribution
• Consider a sample space with k distinct elements x1, x2, ..., xk. If all k outcomes xi are equally likely to occur, then
f(xi) = 1/k,  i = 1, 2, ..., k

Uniform Distribution
• The expected value and variance for a uniform distribution are given by
E(X) = (x1 + x2 + ... + xk)/k
and
VAR(X) = (1/k) Σ_{i=1}^{k} (xi - E(X))^2

Poisson Process
• A random process is called a Poisson process if an average of λ events occur per unit time or unit space (e.g., unit length, unit volume, etc.) and it satisfies:
1. The # of random events occurring in any segment of time or space is independent of the number that occurred in previous segments.
– Called the independent increments property.
2. The average rate λ at which events occur is constant over time or space.
3. The smaller the segment of time or space, the lower the probability of 2 random events occurring during that segment.
– Hence, 2 or more random events cannot occur at the same time or point in space.

Poisson Distribution
• Let the discrete RV for a Poisson process be defined as
X = {# of events that occur during a specified time span t (or space)}
then
f(x) = e^(-λt) (λt)^x / x!,  x = 0, 1, 2, ...

Example
• A computer network receives an average of 0.1 messages/sec and is a Poisson process. Determine the probability that the number of messages X during a 50 second interval is:
1. Equal to 7.
2. At least 4.

Example
With λt = (0.1)(50) = 5:
1. f(7) = e^(-5) 5^7/7! ≈ 0.1044
2. P(X ≥ 4) = 1 - F(3) = 1 - e^(-5)(1 + 5 + 5^2/2! + 5^3/3!) ≈ 0.735

Poisson Distribution
• The expected value and variance for a Poisson distribution are given by
E(X) = λt  and  VAR(X) = λt