Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger

Chapter 5: Joint Probability Distributions
Copyright © 2014 John Wiley & Sons, Inc. All rights reserved.

CHAPTER OUTLINE
5-1 Two or More Random Variables
    5-1.1 Joint Probability Distributions
    5-1.2 Marginal Probability Distributions
    5-1.3 Conditional Probability Distributions
    5-1.4 Independence
    5-1.5 More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions
    5-3.1 Multinomial Probability Distribution
    5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables
5-6 Moment Generating Functions

Learning Objectives for Chapter 5
After careful study of this chapter, you should be able to do the following:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint probability distributions.
3. Interpret and calculate covariances and correlations between random variables.
4. Use the multinomial distribution to determine probabilities.
5. Understand the properties of a bivariate normal distribution and draw contour plots of its probability density function.
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.
7. Determine the distribution of a general function of a random variable.
8. Calculate moment generating functions and use them to determine moments and distributions.

Joint Probability Mass Function
The joint probability mass function of the discrete random variables X and Y, denoted $f_{XY}(x,y)$, satisfies
(1) $f_{XY}(x,y) \ge 0$,
(2) $\sum_x \sum_y f_{XY}(x,y) = 1$,
(3) $f_{XY}(x,y) = P(X = x, Y = y)$.

Joint Probability Density Function
The joint probability density function for the continuous random variables X and Y, denoted $f_{XY}(x,y)$, satisfies
(1) $f_{XY}(x,y) \ge 0$ for all x, y,
(2) $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1$,
(3) $P\big((X,Y) \in R\big) = \iint_R f_{XY}(x,y)\,dx\,dy$ for any region R of two-dimensional space.
Figure 5-2: Joint probability density function for the random variables X and Y. The probability that (X, Y) is in the region R is determined by the volume of $f_{XY}(x,y)$ over the region R.

Example 5-2: Server Access Time-1
Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). X and Y measure the wait from a common starting point, so x < y. The joint probability density function for X and Y is

$$f_{XY}(x,y) = k e^{-0.001x - 0.002y} \quad \text{for } 0 < x < y, \quad \text{where } k = 6 \times 10^{-6}.$$

Figure 5-4: The joint probability density function of X and Y is nonzero over the shaded region where x < y.

Example 5-2: Server Access Time-2
The region with nonzero probability is shaded in Fig. 5-4. We verify that the density integrates to 1 as follows:

$$\int_0^{\infty}\!\!\int_x^{\infty} k e^{-0.001x-0.002y}\,dy\,dx = k\int_0^{\infty}\frac{e^{-0.002x}}{0.002}\,e^{-0.001x}\,dx = 0.003\int_0^{\infty} e^{-0.003x}\,dx = 0.003\left(\frac{1}{0.003}\right) = 1.$$
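The verification above is easy to reproduce numerically. Below is a minimal Python sketch (assuming SciPy is available; the function name is mine) that integrates the Example 5-2 density over the wedge 0 < x < y:

```python
import numpy as np
from scipy import integrate

k = 6e-6  # normalizing constant from Example 5-2

def f_xy(y, x):
    """Joint density of (X, Y); nonzero only on the wedge 0 < x < y."""
    return k * np.exp(-0.001 * x - 0.002 * y)

# dblquad integrates f(y, x) with y running from x to infinity
# for each x in (0, infinity), i.e., over the shaded region of Fig. 5-4.
total, err = integrate.dblquad(f_xy, 0, np.inf, lambda x: x, np.inf)
print(total)  # ~1.0, confirming the density is properly normalized
```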
Example 5-2: Server Access Time-3
Now calculate a probability:

$$P(X \le 1000, Y \le 2000) = \int_0^{1000}\!\!\int_x^{2000} k e^{-0.001x-0.002y}\,dy\,dx = k\int_0^{1000}\frac{e^{-0.002x} - e^{-4}}{0.002}\,e^{-0.001x}\,dx$$
$$= 0.003\int_0^{1000}\left(e^{-0.003x} - e^{-4}e^{-0.001x}\right)dx = 0.003\left[\frac{1-e^{-3}}{0.003} - e^{-4}\,\frac{1-e^{-1}}{0.001}\right] = 0.003\,(316.738 - 11.578) = 0.915.$$

Figure 5-5: Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.

Marginal Probability Distributions (discrete)
The marginal probability distribution for X is found by summing the probabilities in each column, whereas the marginal probability distribution for Y is found by summing the probabilities in each row:

$$f_X(x) = \sum_y f_{XY}(x,y), \qquad f_Y(y) = \sum_x f_{XY}(x,y).$$

Marginal probability distributions of X (number of bars of signal strength) and Y (response time, nearest second):

y \ x |   1  |   2  |   3  | f(y)
  1   | 0.01 | 0.02 | 0.25 | 0.28
  2   | 0.02 | 0.03 | 0.20 | 0.25
  3   | 0.02 | 0.10 | 0.05 | 0.17
  4   | 0.15 | 0.10 | 0.05 | 0.30
f(x)  | 0.20 | 0.25 | 0.55 | 1.00

Marginal Probability Density Function (continuous)
If the joint probability density function of random variables X and Y is $f_{XY}(x,y)$, the marginal probability density functions of X and Y are

$$f_X(x) = \int f_{XY}(x,y)\,dy, \qquad f_Y(y) = \int f_{XY}(x,y)\,dx,$$

where each integral extends over all points at which the joint density is nonzero.

Example 5-4: Server Access Time-1
For the random variables that denote times in Example 5-2, find the probability that Y exceeds 2000 milliseconds. Integrate the joint PDF directly, using the picture to determine the limits over the two parts of the dark region:

$$P(Y > 2000) = \underbrace{\int_0^{2000}\!\!\int_{2000}^{\infty} f_{XY}(x,y)\,dy\,dx}_{\text{left dark region}} \;+\; \underbrace{\int_{2000}^{\infty}\!\!\int_x^{\infty} f_{XY}(x,y)\,dy\,dx}_{\text{right dark region}}$$

Example 5-4: Server Access Time-2
Alternatively, find the marginal PDF of Y and then integrate it to find the desired probability:

$$f_Y(y) = \int_0^y k e^{-0.001x-0.002y}\,dx = k e^{-0.002y}\,\frac{1-e^{-0.001y}}{0.001} = 6\times10^{-3}\,e^{-0.002y}\left(1-e^{-0.001y}\right) \quad \text{for } y > 0.$$

$$P(Y > 2000) = \int_{2000}^{\infty} 6\times10^{-3}\,e^{-0.002y}\left(1-e^{-0.001y}\right)dy = 6\times10^{-3}\left[\frac{e^{-4}}{0.002} - \frac{e^{-6}}{0.003}\right] = 0.05.$$

Mean & Variance of a Marginal Distribution
E(X) and V(X) can be obtained by first calculating the marginal probability distribution of X and then determining E(X) and V(X) by the usual method:

$$E(X) = \sum_R x\,f_X(x), \qquad V(X) = \sum_R x^2 f_X(x) - \mu_X^2,$$
$$E(Y) = \sum_R y\,f_Y(y), \qquad V(Y) = \sum_R y^2 f_Y(y) - \mu_Y^2.$$

Mean & Variance for Example 5-1
From the marginal distributions above:
E(X) = 1(0.20) + 2(0.25) + 3(0.55) = 2.35 and E(X²) = 1(0.20) + 4(0.25) + 9(0.55) = 6.15, so
V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275.
E(Y) = 1(0.28) + 2(0.25) + 3(0.17) + 4(0.30) = 2.49 and E(Y²) = 1(0.28) + 4(0.25) + 9(0.17) + 16(0.30) = 7.61, so
V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099.

Conditional Probability Density Function
Given continuous random variables X and Y with joint probability density function $f_{XY}(x,y)$, the conditional probability density function of Y given X = x is

$$f_{Y|x}(y) = \frac{f_{XY}(x,y)}{f_X(x)} \quad \text{for } f_X(x) > 0.$$
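The discrete marginal and conditional calculations in this section reduce to a few array operations. A minimal NumPy sketch using the Example 5-1 table (the array layout is my own choice):

```python
import numpy as np

# Joint PMF of Example 5-1: rows are x = 1..3 (bars of signal strength),
# columns are y = 1..4 (response time in seconds).
f = np.array([[0.01, 0.02, 0.02, 0.15],
              [0.02, 0.03, 0.10, 0.10],
              [0.25, 0.20, 0.05, 0.05]])
x = np.arange(1, 4)
y = np.arange(1, 5)

fx = f.sum(axis=1)            # marginal of X: [0.20, 0.25, 0.55]
fy = f.sum(axis=0)            # marginal of Y: [0.28, 0.25, 0.17, 0.30]
EX, EY = x @ fx, y @ fy       # 2.35 and 2.49
VX = x**2 @ fx - EX**2        # 0.6275
VY = y**2 @ fy - EY**2        # 1.4099
f_y_given_1 = f[0] / fx[0]    # conditional PMF f(y | x=1): [0.05, 0.10, 0.10, 0.75]
print(EX, VX, EY, VY, f_y_given_1)
```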
Example 5-6: Conditional Probability-1
From Example 5-2, determine the conditional PDF for Y given X = x. First find the marginal PDF of X:

$$f_X(x) = \int_x^{\infty} k e^{-0.001x-0.002y}\,dy = k e^{-0.001x}\,\frac{e^{-0.002x}}{0.002} = 0.003\,e^{-0.003x} \quad \text{for } x > 0.$$

Then

$$f_{Y|x}(y) = \frac{f_{XY}(x,y)}{f_X(x)} = \frac{k e^{-0.001x-0.002y}}{0.003\,e^{-0.003x}} = 0.002\,e^{0.002x-0.002y} \quad \text{for } 0 < x \text{ and } x < y.$$

Example 5-6: Conditional Probability-2
Now find the probability that Y exceeds 2000 given that X = 1500:

$$P(Y > 2000 \mid X = 1500) = \int_{2000}^{\infty} 0.002\,e^{0.002(1500)-0.002y}\,dy = 0.002\,e^{3}\,\frac{e^{-4}}{0.002} = e^{-1} = 0.368.$$

Mean & Variance of Conditional Random Variables
• The conditional mean of Y given X = x, denoted E(Y|x) or $\mu_{Y|x}$, is
$$E(Y \mid x) = \sum_y y\,f_{Y|x}(y).$$
• The conditional variance of Y given X = x, denoted V(Y|x) or $\sigma^2_{Y|x}$, is
$$V(Y \mid x) = \sum_y (y - \mu_{Y|x})^2\,f_{Y|x}(y) = \sum_y y^2 f_{Y|x}(y) - \mu_{Y|x}^2.$$

Example 5-8: Conditional Mean and Variance
From Examples 5-2 and 5-6, what is the conditional mean of Y given that x = 1500?

$$E(Y \mid X = 1500) = \int_{1500}^{\infty} y\,(0.002)\,e^{0.002(1500)-0.002y}\,dy = 0.002\,e^{3}\int_{1500}^{\infty} y\,e^{-0.002y}\,dy.$$

Integrating by parts,

$$= 0.002\,e^{3}\left[\frac{1500\,e^{-3}}{0.002} + \frac{e^{-3}}{(0.002)^2}\right] = 1500 + \frac{1}{0.002} = 2000.$$

If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.

Example 5-9
For the discrete random variables in Example 5-1, what is the conditional mean of Y given X = 1? The conditional PMF is $f_{Y|1}(y) = f_{XY}(1,y)/f_X(1)$, which takes the values 0.01/0.20 = 0.05, 0.02/0.20 = 0.10, 0.02/0.20 = 0.10, and 0.15/0.20 = 0.75 for y = 1, 2, 3, 4. Then

E(Y|X=1) = 1(0.05) + 2(0.10) + 3(0.10) + 4(0.75) = 3.55
V(Y|X=1) = 1(0.05) + 4(0.10) + 9(0.10) + 16(0.75) − 3.55² = 13.35 − 12.6025 = 0.7475

Given one bar of signal strength, the mean response time is 3.55 seconds, with a variance of 0.7475.

Independent Random Variables
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent:
(1) $f_{XY}(x,y) = f_X(x)\,f_Y(y)$ for all x and y,
(2) $f_{Y|x}(y) = f_Y(y)$ for all x and y with $f_X(x) > 0$,
(3) $f_{X|y}(x) = f_X(x)$ for all x and y with $f_Y(y) > 0$,
(4) $P(X \in A, Y \in B) = P(X \in A)\,P(Y \in B)$ for any sets A and B in the range of X and Y, respectively.

Example 5-11: Independent Random Variables
• Suppose Example 5-2 is modified so that the joint PDF is
$$f_{XY}(x,y) = 2\times10^{-6}\,e^{-0.001x-0.002y} \quad \text{for } x \ge 0 \text{ and } y \ge 0.$$
• Are X and Y independent? The marginals are
$$f_X(x) = \int_0^{\infty} f_{XY}(x,y)\,dy = 0.001\,e^{-0.001x} \text{ for } x > 0, \qquad f_Y(y) = \int_0^{\infty} f_{XY}(x,y)\,dx = 0.002\,e^{-0.002y} \text{ for } y > 0.$$
Since $f_{XY}(x,y) = f_X(x)\,f_Y(y)$, X and Y are independent.
• Find the probability
$$P(X > 1000, Y < 1000) = P(X > 1000)\,P(Y < 1000) = e^{-1}\left(1 - e^{-2}\right) = 0.318.$$

Joint Probability Density Function (More Than Two Random Variables)
The joint probability density function for the continuous random variables X1, X2, ..., Xp, denoted $f_{X_1 X_2 \ldots X_p}(x_1, x_2, \ldots, x_p)$, is nonnegative, integrates to 1 over p-dimensional space, and gives probabilities as integrals over regions of that space.
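Example 5-11's independence claim can be sanity-checked numerically: integrate the joint density over the event directly and compare with the product of marginal probabilities. A short sketch, assuming SciPy:

```python
import numpy as np
from scipy import integrate

# Joint density from Example 5-11 (the independent case).
f = lambda y, x: 2e-6 * np.exp(-0.001 * x - 0.002 * y)

# Direct double integral over the event {X > 1000, Y < 1000}.
direct, err = integrate.dblquad(f, 1000, np.inf, 0, 1000)

# Product of the marginal probabilities, valid because X and Y are independent.
product = np.exp(-1.0) * (1.0 - np.exp(-2.0))

print(direct, product)  # both ~0.318
```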
Example 5-14: Component Lifetimes
In an electronic assembly, let X1, X2, X3, X4 denote the lifetimes (in hours) of four components. The joint PDF is

$$f_{X_1X_2X_3X_4}(x_1,x_2,x_3,x_4) = 9\times10^{-12}\,e^{-0.001x_1 - 0.002x_2 - 0.0015x_3 - 0.003x_4} \quad \text{for } x_i \ge 0.$$

What is the probability that the device operates more than 1000 hours? The joint PDF is a product of exponential PDFs, so

P(X1 > 1000, X2 > 1000, X3 > 1000, X4 > 1000) = e^{−1−2−1.5−3} = e^{−7.5} = 0.00055.

Marginal Probability Density Function
The marginal probability density function of Xi is obtained by integrating the joint probability density function over the other p − 1 variables.

Mean & Variance of a Joint Distribution
The mean and variance of Xi can be determined either from the marginal PDF of Xi or directly from the joint PDF:

$$E(X_i) = \int x_i f_{X_i}(x_i)\,dx_i = \int\!\cdots\!\int x_i\,f_{X_1\ldots X_p}(x_1,\ldots,x_p)\,dx_1\cdots dx_p,$$

and similarly for V(Xi).

Example 5-16
Points that have positive probability in the joint probability distribution of three random variables X1, X2, X3 are shown in the figure. Suppose the 10 points are equally likely with probability 0.1 each. The range is the set of non-negative integer triples with x1 + x2 + x3 = 3. List the marginal PMF of X2:

P(X2 = 0) = f(3,0,0) + f(2,0,1) + f(1,0,2) + f(0,0,3) = 0.4
P(X2 = 1) = f(2,1,0) + f(1,1,1) + f(0,1,2) = 0.3
P(X2 = 2) = f(1,2,0) + f(0,2,1) = 0.2
P(X2 = 3) = f(0,3,0) = 0.1

Also, E(X2) = 0(0.4) + 1(0.3) + 2(0.2) + 3(0.1) = 1.

Distribution of a Subset of Random Variables
The joint probability distribution of a subset of the variables is obtained from the joint distribution of all the variables by integrating (or, in the discrete case, summing) over the variables not in the subset.

Conditional Probability Distributions
• Conditional probability distributions can be developed for multiple random variables by extending the ideas used for two random variables.
• Suppose p = 5 and we wish to find the distribution of X1, X2, X3 conditional on X4 = x4 and X5 = x5:

$$f_{X_1X_2X_3 \mid x_4x_5}(x_1,x_2,x_3) = \frac{f_{X_1X_2X_3X_4X_5}(x_1,x_2,x_3,x_4,x_5)}{f_{X_4X_5}(x_4,x_5)} \quad \text{for } f_{X_4X_5}(x_4,x_5) > 0.$$

Independence with Multiple Variables
The concept of independence extends to multiple variables: X1, X2, ..., Xp are independent if and only if their joint distribution is the product of the marginals, $f_{X_1\ldots X_p}(x_1,\ldots,x_p) = f_{X_1}(x_1)\cdots f_{X_p}(x_p)$ for all points.

Example 5-18: Layer Thickness
Suppose X1, X2, and X3 represent the thicknesses (in μm) of a substrate, an active layer, and a coating layer of a chemical product. Assume these variables are independent and normally distributed with the parameters and specification limits tabled below.

Normal variable | Mean (μ) | Std dev (σ) | Lower limit | Upper limit | P(in limits)
X1 (substrate)  | 10,000   | 250         | 9,200       | 10,800      | 0.99863
X2 (active)     | 1,000    | 20          | 950         | 1,050       | 0.98758
X3 (coating)    | 80       | 4           | 75          | 85          | 0.78870

What proportion of this three-layer product meets all specifications? By independence,
P(all in limits) = 0.99863 × 0.98758 × 0.78870 = 0.77783, so about 0.7783.
Which of the three thicknesses has the least probability of meeting its specs? Layer 3 (the coating), at 0.78870.
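The per-layer probabilities in Example 5-18 come straight from the normal CDF. A short sketch (SciPy assumed) reproduces the table and the product:

```python
from scipy.stats import norm

# (mean, std dev, lower spec, upper spec) for substrate, active, coating layers
layers = [(10_000, 250, 9_200, 10_800),
          (1_000, 20, 950, 1_050),
          (80, 4, 75, 85)]

p_all = 1.0
for mu, sigma, lo, hi in layers:
    p = norm.cdf(hi, mu, sigma) - norm.cdf(lo, mu, sigma)
    p_all *= p  # layers are independent, so the probabilities multiply
    print(f"P({lo} <= X <= {hi}) = {p:.5f}")
print(f"P(all in limits) = {p_all:.5f}")  # ~0.77783
```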
Covariance
• Covariance is a measure of the relationship between two random variables.
• First, we need the expected value of a function of two random variables. Let h(X, Y) denote the function of interest:

$$E[h(X,Y)] = \sum_R h(x,y)\,f_{XY}(x,y) \;\;(X, Y \text{ discrete}), \qquad E[h(X,Y)] = \iint_R h(x,y)\,f_{XY}(x,y)\,dx\,dy \;\;(X, Y \text{ continuous}).$$

Example 5-19: Expected Value of a Function of Two Random Variables
For the joint probability distribution of the two random variables in Example 5-1, calculate E[(X−μX)(Y−μY)]. The result is obtained by multiplying (x − μX) times (y − μY) times f_XY(x,y) for each point in the range of (X,Y). First, μX and μY were determined previously from the marginal distributions for X and Y: μX = 2.35 and μY = 2.49. Therefore, since the table gives E(XY) = Σ x y f(x,y) = 5.27,

$$E[(X-\mu_X)(Y-\mu_Y)] = E(XY) - \mu_X\mu_Y = 5.27 - (2.35)(2.49) = -0.5815.$$

Covariance Defined
The covariance between the random variables X and Y, denoted cov(X,Y) or σXY, is

$$\sigma_{XY} = E[(X-\mu_X)(Y-\mu_Y)] = E(XY) - \mu_X\mu_Y.$$

Correlation (ρ = rho)
The correlation between the random variables X and Y, denoted ρXY, is

$$\rho_{XY} = \frac{\operatorname{cov}(X,Y)}{\sqrt{V(X)\,V(Y)}} = \frac{\sigma_{XY}}{\sigma_X\sigma_Y}, \qquad -1 \le \rho_{XY} \le +1.$$

Example 5-21: Covariance & Correlation
Determine the covariance and correlation for the discrete joint distribution of Figure 5-13, which places probability on six points:

(x, y)  | f(x, y)
(0, 0)  | 0.2
(1, 1)  | 0.1
(1, 2)  | 0.1
(2, 1)  | 0.1
(2, 2)  | 0.1
(3, 3)  | 0.4

The marginals give μX = μY = 1.8 and σX = σY = 1.1662. Summing (x − μX)(y − μY) f(x,y) over the six points gives covariance = 1.260, so correlation = 1.260/(1.1662 × 1.1662) = 0.926. Note the strong positive correlation.

Independence Implies ρ = 0
• If X and Y are independent random variables, σXY = ρXY = 0.
• ρXY = 0 is a necessary, but not a sufficient, condition for independence.

Example 5-23: Independence Implies Zero Covariance
Let $f_{XY}(x,y) = \dfrac{xy}{16}$ for $0 \le x \le 2$ and $0 \le y \le 4$. Show that $\sigma_{XY} = E(XY) - E(X)E(Y) = 0$.

$$E(X) = \frac{1}{16}\left(\int_0^2 x^2\,dx\right)\left(\int_0^4 y\,dy\right) = \frac{1}{16}\cdot\frac{8}{3}\cdot 8 = \frac{4}{3},$$
$$E(Y) = \frac{1}{16}\left(\int_0^2 x\,dx\right)\left(\int_0^4 y^2\,dy\right) = \frac{1}{16}\cdot 2\cdot\frac{64}{3} = \frac{8}{3},$$
$$E(XY) = \frac{1}{16}\left(\int_0^2 x^2\,dx\right)\left(\int_0^4 y^2\,dy\right) = \frac{1}{16}\cdot\frac{8}{3}\cdot\frac{64}{3} = \frac{32}{9},$$
$$\sigma_{XY} = E(XY) - E(X)E(Y) = \frac{32}{9} - \frac{4}{3}\cdot\frac{8}{3} = 0.$$

Figure 5-15: A planar joint distribution.

Multinomial Probability Distribution
• Suppose a random experiment consists of a series of n trials. Assume that:
1) The outcome of each trial can be classified into one of k classes.
2) The probability that a trial results in class i is constant and equal to pi, for i = 1, 2, ..., k.
3) The trials are independent.
• The random variables X1, X2, ..., Xk that denote the number of outcomes in each class have a multinomial distribution with probability mass function

$$P(X_1 = x_1, X_2 = x_2, \ldots, X_k = x_k) = \frac{n!}{x_1!\,x_2!\cdots x_k!}\,p_1^{x_1}p_2^{x_2}\cdots p_k^{x_k}, \qquad x_1 + x_2 + \cdots + x_k = n.$$

Example 5-25: Digital Channel
Of the 20 bits received over a digital channel, 14 are of excellent (E) quality, 3 are good (G), 2 are fair (F), and 1 is poor (P); the sequence received was EEEEEEEEEEEEEEGGGFFP. Let X1, X2, X3, X4 denote the number of bits that are E, G, F, and P, respectively, in a transmission of 20 bits, with class probabilities 0.6, 0.3, 0.08, and 0.02. What is the probability that 12 bits are E, 6 are G, 2 are F, and 0 are P?

$$P(X_1 = 12, X_2 = 6, X_3 = 2, X_4 = 0) = \frac{20!}{12!\,6!\,2!\,0!}\,(0.6)^{12}(0.3)^{6}(0.08)^{2}(0.02)^{0} = 0.0358.$$

Using Excel: 0.03582 = (FACT(20)/(FACT(12)*FACT(6)*FACT(2))) * 0.6^12 * 0.3^6 * 0.08^2
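The same probability drops out of SciPy's built-in multinomial distribution; a minimal cross-check of Example 5-25:

```python
from scipy.stats import multinomial

# Example 5-25: n = 20 bits, classes E, G, F, P with the stated probabilities.
prob = multinomial.pmf([12, 6, 2, 0], n=20, p=[0.6, 0.3, 0.08, 0.02])
print(round(prob, 5))  # 0.03582, matching the factorial formula above
```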
Multinomial Mean and Variance
The marginal distributions of the multinomial are binomial. If X1, X2, ..., Xk have a multinomial distribution, the marginal probability distribution of Xi is binomial with

E(Xi) = npi and V(Xi) = npi(1 − pi).

Bivariate Normal Probability Density Function
The probability density function of a bivariate normal distribution is

$$f_{XY}(x,y;\sigma_X,\sigma_Y,\mu_X,\mu_Y,\rho) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\;e^{u},$$

where

$$u = \frac{-1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho\,(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]$$

for $-\infty < x < \infty$ and $-\infty < y < \infty$. Parameter limits: $\sigma_X > 0$, $\sigma_Y > 0$, $-\infty < \mu_X < \infty$, $-\infty < \mu_Y < \infty$, $-1 < \rho < 1$.

Marginal Distributions of the Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density function $f_{XY}(x,y;\sigma_X,\sigma_Y,\mu_X,\mu_Y,\rho)$, the marginal probability distributions of X and Y are normal with means μX and μY and standard deviations σX and σY, respectively.

Conditional Distribution of Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density $f_{XY}(x,y;\sigma_X,\sigma_Y,\mu_X,\mu_Y,\rho)$, the conditional probability distribution of Y given X = x is normal with mean and variance

$$\mu_{Y|x} = \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}\,(x - \mu_X), \qquad \sigma^2_{Y|x} = \sigma_Y^2\,(1-\rho^2).$$

Correlation of Bivariate Normal Random Variables
If X and Y have a bivariate normal distribution with joint probability density function $f_{XY}(x,y;\sigma_X,\sigma_Y,\mu_X,\mu_Y,\rho)$, the correlation between X and Y is ρ.

Bivariate Normal Correlation and Independence
• In general, zero correlation does not imply independence.
• But in the special case that X and Y have a bivariate normal distribution, if ρ = 0, then X and Y are independent.

Linear Functions of Random Variables
• A function of random variables is itself a random variable.
• A function of random variables can be formed by either linear or nonlinear relationships. We limit our discussion here to linear functions.
• Given random variables X1, X2, ..., Xp and constants c1, c2, ..., cp,
Y = c1X1 + c2X2 + ... + cpXp
is a linear combination of X1, X2, ..., Xp.

Mean and Variance of a Linear Function
If X1, X2, ..., Xp are random variables and Y = c1X1 + c2X2 + ... + cpXp, then

$$E(Y) = c_1 E(X_1) + c_2 E(X_2) + \cdots + c_p E(X_p),$$
$$V(Y) = c_1^2 V(X_1) + \cdots + c_p^2 V(X_p) + 2\sum_{i<j}\sum c_i c_j\,\operatorname{cov}(X_i,X_j).$$

If X1, X2, ..., Xp are independent, the covariance terms vanish and $V(Y) = c_1^2 V(X_1) + \cdots + c_p^2 V(X_p)$.

Example 5-31: Error Propagation
A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm², respectively. What is the variance of the thickness of the finished product?
Answer: treating the total thickness as the sum of three independent layer thicknesses,
V(X1 + X2 + X3) = 25 + 40 + 30 = 95 nm².
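Example 5-31 is a one-line application of the variance formula for independent variables; the Monte Carlo sketch below confirms it. The layer means (100 nm each) are placeholder values of my own, since only the variances matter here:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Independent layer thicknesses; variances 25, 40, 30 nm^2 from Example 5-31.
total = sum(rng.normal(100, np.sqrt(v), n) for v in (25, 40, 30))
print(total.var())  # ~95 nm^2 = 25 + 40 + 30
```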
Mean and Variance of an Average
If $\bar{X} = (X_1 + X_2 + \cdots + X_p)/p$ with $E(X_i) = \mu$ for each i, then $E(\bar{X}) = \mu$. If, in addition, the Xi are independent with $V(X_i) = \sigma^2$ for each i, then $V(\bar{X}) = \sigma^2/p$.

Reproductive Property of the Normal Distribution
If X1, X2, ..., Xp are independent normal random variables with $E(X_i) = \mu_i$ and $V(X_i) = \sigma_i^2$, then
Y = c1X1 + c2X2 + ... + cpXp
is a normal random variable with

$$E(Y) = c_1\mu_1 + c_2\mu_2 + \cdots + c_p\mu_p \quad \text{and} \quad V(Y) = c_1^2\sigma_1^2 + c_2^2\sigma_2^2 + \cdots + c_p^2\sigma_p^2.$$

Example 5-32: Linear Function of Independent Normal Random Variables
Let the random variables X1 and X2 denote the length and width (in cm) of a manufactured part: X1 has mean 2 and standard deviation 0.1; X2 has mean 5 and standard deviation 0.2. What is the probability that the perimeter exceeds 14.5 cm?

Let Y = 2X1 + 2X2 (the perimeter). Then
E(Y) = 2E(X1) + 2E(X2) = 2(2) + 2(5) = 14 cm
V(Y) = 2²V(X1) + 2²V(X2) = 4(0.1)² + 4(0.2)² = 0.04 + 0.16 = 0.20, so SD(Y) = √0.20 = 0.4472 cm

$$P(Y > 14.5) = 1 - \Phi\!\left(\frac{14.5 - 14}{0.4472}\right) = 1 - \Phi(1.1180) = 0.1318.$$

Using Excel: 0.1318 = 1 - NORMDIST(14.5, 14, SQRT(0.2), TRUE)

General Function of a Discrete Random Variable
Suppose that X is a discrete random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability mass function of the random variable Y is
fY(y) = fX[u(y)].

Example 5-34: Function of a Discrete Random Variable
Let X be a geometric random variable with probability distribution
fX(x) = p(1−p)^(x−1), x = 1, 2, ...
Find the probability distribution of Y = X².
Solution:
– Since X ≥ 1, the transformation is one-to-one.
– The inverse transform function is x = √y.
– fY(y) = p(1−p)^(√y − 1), y = 1, 4, 9, 16, ...

General Function of a Continuous Random Variable
Suppose that X is a continuous random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability distribution of Y is
fY(y) = fX[u(y)]·|J|,
where J = u′(y) is called the Jacobian of the transformation, and the absolute value of J is used.

Example 5-35: Function of a Continuous Random Variable
Let X be a continuous random variable with probability distribution
$$f_X(x) = \frac{x}{8} \quad \text{for } 0 \le x \le 4.$$
Find the probability distribution of Y = h(X) = 2X + 4. Note that Y has a one-to-one relationship to X:
$$x = u(y) = \frac{y-4}{2}, \qquad J = u'(y) = \frac{1}{2},$$
$$f_Y(y) = f_X\!\left(\frac{y-4}{2}\right)\cdot\frac{1}{2} = \frac{(y-4)/2}{8}\cdot\frac{1}{2} = \frac{y-4}{32} \quad \text{for } 4 \le y \le 12.$$

Definition of Moments about the Origin
The rth moment about the origin of the random variable X is

$$\mu_r' = E(X^r) = \begin{cases}\displaystyle\sum_x x^r f(x), & X \text{ discrete},\\[2mm] \displaystyle\int x^r f(x)\,dx, & X \text{ continuous}.\end{cases}$$
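The moment definition is straightforward to exercise numerically. A minimal sketch (SciPy assumed) computes the first two moments of the Example 5-35 density fX(x) = x/8, whose exact values are E(X) = 8/3 and E(X²) = 8:

```python
from scipy import integrate

f = lambda x: x / 8  # density of X on (0, 4), from Example 5-35

moment = lambda r: integrate.quad(lambda x: x**r * f(x), 0, 4)[0]
m1, m2 = moment(1), moment(2)
print(m1, m2, m2 - m1**2)  # 2.6667 (= 8/3), 8.0, and variance 0.8889 (= 8/9)
```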
Definition of a Moment-Generating Function
The moment-generating function of the random variable X is the expected value of $e^{tX}$ and is denoted by MX(t). That is,

$$M_X(t) = E(e^{tX}) = \begin{cases}\displaystyle\sum_x e^{tx} f(x), & X \text{ discrete},\\[2mm] \displaystyle\int e^{tx} f(x)\,dx, & X \text{ continuous}.\end{cases}$$

Let X be a random variable with moment-generating function MX(t). Then

$$\mu_r' = \left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0}.$$

Example 5-36: Moment-Generating Function for a Binomial Random Variable-1
Let X follow a binomial distribution; that is,

$$f(x) = \binom{n}{x} p^x (1-p)^{n-x}, \qquad x = 0, 1, \ldots, n.$$

Determine the moment-generating function and use it to verify that the mean and variance of the binomial random variable are μ = np and σ² = np(1−p). The moment-generating function is

$$M_X(t) = \sum_{x=0}^{n} e^{tx}\binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=0}^{n}\binom{n}{x}(pe^t)^x (1-p)^{n-x},$$

which is the binomial expansion of $[pe^t + (1-p)]^n$. The first- and second-order derivatives are

$$M_X'(t) = npe^t\,[1 + p(e^t-1)]^{n-1} \quad \text{and} \quad M_X''(t) = npe^t\,(1 - p + npe^t)\,[1 + p(e^t-1)]^{n-2}.$$

Example 5-36: Moment-Generating Function for a Binomial Random Variable-2
Setting t = 0 in the two derivatives gives

$$\mu_1' = M_X'(0) = np \quad \text{and} \quad \mu_2' = M_X''(0) = np(1 - p + np).$$

The variance is then

$$\sigma^2 = \mu_2' - \mu^2 = np(1 - p + np) - (np)^2 = np - np^2 = np(1-p).$$

Hence, the mean is μ = np and the variance is σ² = np(1−p).

Properties of Moment-Generating Functions
If X is a random variable and a is a constant, then
1. $M_{X+a}(t) = e^{at} M_X(t)$
2. $M_{aX}(t) = M_X(at)$
If X1, X2, ..., Xn are independent random variables with moment-generating functions $M_{X_1}(t), M_{X_2}(t), \ldots, M_{X_n}(t)$, respectively, and if Y = X1 + X2 + ... + Xn, then the moment-generating function of Y is
3. $M_Y(t) = M_{X_1}(t)\,M_{X_2}(t)\cdots M_{X_n}(t)$

Example 5-38: Distribution of a Sum of Poisson Random Variables
Suppose that X1 and X2 are two independent Poisson random variables with parameters λ1 and λ2, respectively. Determine the probability distribution of Y = X1 + X2. The moment-generating function of a Poisson random variable with parameter λ is

$$M_X(t) = e^{\lambda(e^t - 1)},$$

so $M_{X_1}(t) = e^{\lambda_1(e^t-1)}$ and $M_{X_2}(t) = e^{\lambda_2(e^t-1)}$. Using property 3 above, the moment-generating function of Y = X1 + X2 is

$$M_Y(t) = M_{X_1}(t)\,M_{X_2}(t) = e^{\lambda_1(e^t-1)}\,e^{\lambda_2(e^t-1)} = e^{(\lambda_1+\lambda_2)(e^t-1)},$$

which is the moment-generating function of a Poisson random variable with parameter λ1 + λ2. Hence Y is Poisson with parameter λ1 + λ2.

Important Terms & Concepts for Chapter 5
Bivariate distribution; bivariate normal distribution; conditional mean; conditional probability density function; conditional probability mass function; conditional variance; contour plots; correlation; covariance; error propagation; general functions of random variables; independence; joint probability density function; joint probability mass function; linear functions of random variables; marginal probability distribution; multinomial distribution; reproductive property of the normal distribution.
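As a closing check of Example 5-36, a small SymPy sketch differentiates the binomial MGF symbolically and recovers μ = np and σ² = np(1−p):

```python
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)
M = (p * sp.exp(t) + 1 - p) ** n   # binomial MGF derived in Example 5-36

mean = sp.simplify(sp.diff(M, t).subs(t, 0))    # mu_1' = n*p
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # mu_2' = n*p*(1 - p + n*p)
var = sp.expand(m2 - mean**2)                   # reduces to n*p*(1 - p)
print(mean, sp.factor(var))
```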