Probability and Statistics, Lecture 3
Dr.-Ing. Erwin Sitompul, President University
http://zitompul.wordpress.com
2013

Chapter 3: Random Variables and Probability Distributions

Chapter 3.1 Concept of a Random Variable

A random variable is a function that associates a real number with each element in the sample space. In other words, a random variable is a numerical description of the outcome of an experiment, where each outcome is assigned a numerical value. A capital letter, say X, is used to denote a random variable, and the corresponding small letter, x in this case, denotes one of its values.

The sample space giving a detailed description of each possible outcome when three electronic components are tested may be written as

  S = {DDD, DDN, DND, DNN, NDD, NDN, NND, NNN}

where D denotes a defective component and N a non-defective one. One is concerned with the number of defectives that occur. Thus each point in the sample space is assigned a numerical value of 0, 1, 2, or 3. The random variable X then assumes the value 2 for all elements in the subset

  E = {DDN, DND, NDD}

Two balls are drawn in succession without replacement from an urn containing 4 red balls and 3 black balls. The possible outcomes and the values y of the random variable Y, where Y is the number of red balls, are RR (y = 2), RB (y = 1), BR (y = 1), and BB (y = 0).

Sample Space and Random Variable

If a sample space contains a finite number of possibilities, or an unending sequence with as many elements as there are whole numbers, it is called a discrete sample space. If a sample space contains an infinite number of possibilities equal to the number of points on a line segment, it is called a continuous sample space.
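As a quick illustration (not part of the original slides), the three-component sample space and the random variable X can be enumerated in a few lines of Python:

```python
from itertools import product

# Enumerate the sample space for testing three components:
# each component is either defective ("D") or non-defective ("N").
sample_space = ["".join(outcome) for outcome in product("DN", repeat=3)]

# The random variable X maps each outcome to its number of defectives.
def X(outcome):
    return outcome.count("D")

# Outcomes where X assumes the value 2 (the subset E from the text).
E = [s for s in sample_space if X(s) == 2]
print(sorted(E))  # ['DDN', 'DND', 'NDD']
```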
A random variable is called a discrete random variable if its set of possible outcomes is countable. A random variable is called a continuous random variable if it can take on values on a continuous scale. If X is the random variable assigned to the waiting time, in minutes, for a bus at a bus stop, then X may take on all values of waiting time x, x ≥ 0. In this case, X is a continuous random variable.

Chapter 3.2 Discrete Probability Distributions

Frequently, it is convenient to represent all the probabilities of a random variable X by a formula. Such a formula is necessarily a function of the numerical values x, denoted by f(x), g(x), r(x), and so forth. For example,

  f(x) = P(X = x)

The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x,

  1. f(x) ≥ 0
  2. Σx f(x) = 1
  3. P(X = x) = f(x)

In the experiment of tossing a fair coin twice, the random variable X represents how many times a head turns up. The possible values x of X and their probabilities can be summarized as

  x      0    1    2
  f(x)  1/4  1/2  1/4

A shipment of 20 similar laptop computers to a retail outlet contains 3 that are defective. If a school makes a random purchase of 2 of these computers, find the probability distribution for the number of defectives. Let X be a random variable whose values x are the possible numbers of defective computers purchased by the school.
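The coin-toss distribution above, and the three defining conditions of a probability mass function, can be checked by direct enumeration (an illustrative sketch, not part of the lecture):

```python
from itertools import product
from fractions import Fraction

# Toss a fair coin twice; X = number of heads. Build f(x) = P(X = x)
# by enumerating the four equally likely outcomes.
outcomes = list(product("HT", repeat=2))
f = {}
for outcome in outcomes:
    x = outcome.count("H")
    f[x] = f.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# Check the defining conditions of a probability mass function.
assert all(p >= 0 for p in f.values())   # 1. f(x) >= 0
assert sum(f.values()) == 1              # 2. sum over x of f(x) = 1
print(f[0], f[1], f[2])                  # 1/4 1/2 1/4
```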
Using combinations,

  f(0) = P(X = 0) = C(3,0) C(17,2) / C(20,2) = 136/190
  f(1) = P(X = 1) = C(3,1) C(17,1) / C(20,2) = 51/190
  f(2) = P(X = 2) = C(3,2) C(17,0) / C(20,2) = 3/190

There are many problems where we may wish to compute the probability that the observed value of a random variable X will be less than or equal to some real number x. The cumulative distribution F(x) of a discrete random variable X with probability distribution f(x) is

  F(x) = P(X ≤ x) = Σ f(t) over all t ≤ x, for −∞ < x < ∞

[Figures: an example of a probability distribution and the corresponding cumulative distribution.]

Chapter 3.3 Continuous Probability Distributions

In case the sample space is continuous, there is an unlimited number of possible values for the samples. Thus, it is more meaningful to deal with an interval rather than a point value of a random variable. For example, it does not make sense to ask for the probability of selecting a person at random who is exactly 164 cm tall; it is more useful to talk about the probability of selecting a person who is at least 163 cm but not more than 165 cm.

We shall concern ourselves now with computing probabilities for various intervals of continuous random variables such as P(a < X < b), P(W ≥ c), P(U ≤ d), and so forth. Note that when X is continuous,

  P(X = a) = 0   (the probability of a point value is zero)

so that

  P(a < X ≤ b) = P(a < X < b) + P(X = b) = P(a < X < b)

In dealing with continuous variables, the notation commonly used is f(x), and it is usually called the probability density function, or simply the density function, of X. For most practical applications, density functions are continuous and differentiable.
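Returning to the discrete laptop example for a moment, the hypergeometric probabilities and the running-sum cumulative distribution F(x) can be computed directly (a sketch using Python's standard library):

```python
from math import comb
from fractions import Fraction

# Laptop example: 20 computers, 3 defective, 2 purchased.
# f(x) = C(3, x) * C(17, 2 - x) / C(20, 2), x = 0, 1, 2.
def f(x):
    return Fraction(comb(3, x) * comb(17, 2 - x), comb(20, 2))

# Cumulative distribution F(x) = P(X <= x) as a running sum.
def F(x):
    return sum(f(t) for t in range(x + 1))

print(f(0), f(1), f(2))  # 68/95 51/190 3/190  (68/95 = 136/190)
assert F(2) == 1
```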
Their graphs may take any form, but since the density function is used to represent probabilities, it must lie entirely above the x axis. [Figure: three example shapes of density functions f(x).]

A probability density function is constructed so that the area under its curve bounded by the x axis is equal to 1 when computed over the range of X for which f(x) is defined. In the figure below, the probability that X assumes a value between a and b is equal to the shaded area under the density function between the ordinates at x = a and x = b:

  P(a < X < b) = ∫ab f(x) dx

The function f(x) is a probability density function for the continuous random variable X, defined over the set of real numbers R, if

  1. f(x) ≥ 0, for all x ∈ R
  2. ∫−∞∞ f(x) dx = 1
  3. P(a < X < b) = ∫ab f(x) dx

Suppose that the error in the reaction temperature, in °C, for a controlled laboratory experiment is a continuous random variable X having the probability density function

  f(x) = x²/3, −1 < x < 2
  f(x) = 0, elsewhere

(a) Verify that ∫−∞∞ f(x) dx = 1.
(b) Find P(0 < X ≤ 1).

(a) ∫−∞∞ f(x) dx = ∫−1² (x²/3) dx = x³/9 evaluated from −1 to 2 = 8/9 + 1/9 = 1

(b) P(0 < X ≤ 1) = ∫0¹ (x²/3) dx = x³/9 evaluated from 0 to 1 = 1/9

The cumulative distribution F(x) of a continuous random variable X with density function f(x) is

  F(x) = P(X ≤ x) = ∫−∞x f(t) dt, for −∞ < x < ∞

For the density function in the last example, find F(x) and use it to evaluate P(0 < X ≤ 1).
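The two integrals in the temperature-error example can also be approximated numerically as a sanity check (an illustrative midpoint-rule sketch, not the lecture's analytic method):

```python
# Numerical check for the density f(x) = x**2 / 3 on (-1, 2).
def f(x):
    return x * x / 3 if -1 < x < 2 else 0.0

# Simple midpoint rule: a rough stand-in for exact integration.
def integrate(func, a, b, n=100_000):
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, -1, 2)   # should be close to 1
p_0_1 = integrate(f, 0, 1)    # should be close to 1/9
print(round(total, 6), round(p_0_1, 6))
```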
For −1 ≤ x < 2,

  F(x) = ∫−∞x f(t) dt = ∫−1x (t²/3) dt = (x³ + 1)/9

Hence

  F(x) = 0, for x < −1
  F(x) = (x³ + 1)/9, for −1 ≤ x < 2
  F(x) = 1, for x ≥ 2

and

  P(0 < X ≤ 1) = F(1) − F(0) = 2/9 − 1/9 = 1/9

Chapter 3.4 Joint Probability Distributions

If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of the random variables X and Y. Such a function is referred to as the joint probability distribution of X and Y.

The function f(x, y) is a joint probability distribution (joint probability mass function) of the discrete random variables X and Y if

  1. f(x, y) ≥ 0, for all (x, y)
  2. Σx Σy f(x, y) = 1
  3. P(X = x, Y = y) = f(x, y)

For any region A in the xy plane, P[(X, Y) ∈ A] = ΣΣA f(x, y).

Two ballpoint pens are selected at random from a box that contains 3 blue pens, 2 red pens, and 3 green pens. If X is the number of blue pens selected and Y is the number of red pens selected, find
(a) the joint probability function f(x, y)
(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}

(a) f(x, y) = C(3,x) C(2,y) C(3, 2−x−y) / C(8,2), for x = 0, 1, 2; y = 0, 1, 2; 0 ≤ x + y ≤ 2

(b) P[(X, Y) ∈ A] = P(X + Y ≤ 1) = f(0,0) + f(0,1) + f(1,0) = 3/28 + 3/14 + 9/28 = 9/14

The function f(x, y) is a joint probability density function of the continuous random variables X and Y if

  1. f(x, y) ≥ 0, for all (x, y)
  2. ∫∫ f(x, y) dx dy = 1
  3. P[(X, Y) ∈ A] = ∫∫A f(x, y) dx dy

for any region A in the xy plane.

A privately owned business operates both a drive-in facility and a walk-in facility.
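Before moving on to the continuous joint case, the ballpoint-pen joint distribution above can be verified by direct computation (an illustrative sketch):

```python
from math import comb
from fractions import Fraction

# Joint pmf for the ballpoint-pen example: 3 blue, 2 red, 3 green,
# 2 pens drawn. X = number of blue pens, Y = number of red pens.
def f(x, y):
    if 0 <= x + y <= 2:
        return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y),
                        comb(8, 2))
    return Fraction(0)

# Condition 2: the probabilities over all (x, y) sum to 1.
assert sum(f(x, y) for x in range(3) for y in range(3)) == 1

# P(X + Y <= 1) = f(0,0) + f(0,1) + f(1,0)
p = f(0, 0) + f(0, 1) + f(1, 0)
print(p)  # 9/14
```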
On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and the walk-in facilities are in use, and suppose that the joint density function of these random variables is

  f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1
  f(x, y) = 0, elsewhere

(a) Verify that f(x, y) is a joint density function.
(b) Find P[(X, Y) ∈ A], where A is {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.

(a) ∫∫ f(x, y) dx dy = ∫0¹ ∫0¹ (2/5)(2x + 3y) dx dy
  = ∫0¹ [ (2/5)(x² + 3yx) ] evaluated from x = 0 to x = 1, dy
  = ∫0¹ (2/5 + (6/5)y) dy
  = [ (2/5)y + (3/5)y² ] from 0 to 1 = 2/5 + 3/5 = 1

(b) P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2)
  = ∫1/41/2 ∫01/2 (2/5)(2x + 3y) dx dy
  = ∫1/41/2 [ (2/5)(x² + 3yx) ] evaluated from x = 0 to x = 1/2, dy
  = ∫1/41/2 (1/10 + (3/5)y) dy
  = [ y/10 + (3/10)y² ] from 1/4 to 1/2
  = 1/40 + 9/160 = 13/160

Marginal Probability Distributions

The marginal probability distributions of X alone and of Y alone are

  g(x) = Σy f(x, y)  and  h(y) = Σx f(x, y)

for the discrete case, and

  g(x) = ∫−∞∞ f(x, y) dy  and  h(y) = ∫−∞∞ f(x, y) dx

for the continuous case.

The term marginal is used here because, in the discrete case, the values of g(x) and h(y) are just the marginal totals of the respective columns and rows when the values of f(x, y) are displayed in a rectangular table.

Show that the column and row totals of the "ballpoint pens" example give the marginal distributions of X alone and of Y alone.
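The two double integrals of the drive-in/walk-in example can be spot-checked numerically (a sketch; `integrate2d` is an illustrative midpoint-rule helper, not a library routine):

```python
# Midpoint-rule check of the joint density f(x, y) = (2/5)(2x + 3y)
# on the unit square.
def f(x, y):
    return 0.4 * (2 * x + 3 * y) if 0 < x < 1 and 0 < y < 1 else 0.0

def integrate2d(func, ax, bx, ay, by, n=400):
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += func(x, y)
    return total * hx * hy

whole = integrate2d(f, 0, 1, 0, 1)            # should be close to 1
region = integrate2d(f, 0, 0.5, 0.25, 0.5)    # close to 13/160 = 0.08125
print(round(whole, 5), round(region, 5))
```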
For X,

  g(0) = Σy f(0, y) = f(0,0) + f(0,1) + f(0,2) = 3/28 + 3/14 + 1/28 = 5/14
  g(1) = Σy f(1, y) = f(1,0) + f(1,1) + f(1,2) = 9/28 + 3/14 + 0 = 15/28
  g(2) = Σy f(2, y) = f(2,0) + f(2,1) + f(2,2) = 3/28 + 0 + 0 = 3/28

It is found that the values of g(x) are just the column totals of the joint probability table. In a similar manner we could show that the values of h(y) are given by the row totals.

Find g(x) and h(y) for the joint density function of the "drive-in walk-in facility" example,

  f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1
  f(x, y) = 0, elsewhere

  g(x) = ∫−∞∞ f(x, y) dy = ∫0¹ (2/5)(2x + 3y) dy = (2/5)(2xy + (3/2)y²) from 0 to 1 = (4x + 3)/5, 0 < x < 1

  h(y) = ∫−∞∞ f(x, y) dx = ∫0¹ (2/5)(2x + 3y) dx = (2/5)(x² + 3yx) from 0 to 1 = (2 + 6y)/5, 0 < y < 1

Conditional Probability Distributions

Let X and Y be two random variables, discrete or continuous. The conditional probability distribution of the random variable Y, given that X = x, is

  f(y | x) = f(x, y) / g(x), g(x) > 0

Similarly, the conditional distribution of the random variable X, given that Y = y, is

  f(x | y) = f(x, y) / h(y), h(y) > 0

If one wishes to find the probability that the discrete random variable X falls between a and b when it is known that the discrete variable Y = y, we evaluate

  P(a < X < b | Y = y) = Σ f(x | y)

where the summation extends over all available values of X between a and b.
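The marginal totals for the ballpoint-pen table can be computed as row and column sums (an illustrative sketch):

```python
from math import comb
from fractions import Fraction

# Joint pmf of the ballpoint-pen example, as before.
def f(x, y):
    if 0 <= x + y <= 2:
        return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y),
                        comb(8, 2))
    return Fraction(0)

# g(x): column totals; h(y): row totals.
g = {x: sum(f(x, y) for y in range(3)) for x in range(3)}
h = {y: sum(f(x, y) for x in range(3)) for y in range(3)}

print(g[0], g[1], g[2])  # 5/14 15/28 3/28
assert sum(g.values()) == 1 and sum(h.values()) == 1
```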
When X and Y are continuous, we can find the probability that X lies between a and b by evaluating

  P(a < X < b | Y = y) = ∫ab f(x | y) dx

Referring back to the "ballpoint pens" example, find the conditional distribution of X, given that Y = 1, and use it to determine P(X = 0 | Y = 1).

  f(x | 1) = f(x, 1) / h(1), x = 0, 1, 2, where h(1) = 3/7

  f(0 | 1) = f(0,1)/h(1) = (3/14)/(3/7) = 1/2
  f(1 | 1) = f(1,1)/h(1) = (3/14)/(3/7) = 1/2
  f(2 | 1) = f(2,1)/h(1) = 0

  P(X = 0 | Y = 1) = f(0 | 1) = 1/2

Given the joint density function

  f(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1
  f(x, y) = 0, elsewhere

find g(x), h(y), f(x | y), and evaluate P(1/4 < X < 1/2 | Y = 1/3).

  g(x) = ∫0¹ x(1 + 3y²)/4 dy = x(y + y³)/4 from 0 to 1 = x/2, 0 < x < 2

  h(y) = ∫0² x(1 + 3y²)/4 dx = (1 + 3y²)x²/8 from 0 to 2 = (1 + 3y²)/2, 0 < y < 1

  f(x | y) = f(x, y)/h(y) = [x(1 + 3y²)/4] / [(1 + 3y²)/2] = x/2, 0 < x < 2, 0 < y < 1

  P(1/4 < X < 1/2 | Y = 1/3) = ∫1/41/2 (x/2) dx = x²/4 from 1/4 to 1/2 = 1/16 − 1/64 = 3/64

Statistical Independence

Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if

  f(x, y) = g(x) h(y)

for all (x, y) within their range.

Consider the following joint probability density function of the random variables X and Y:

  f(x, y) = (3x − y)/18, 1 < x < 4, 1 < y < 2
  f(x, y) = 0, elsewhere

(a) Find the marginal density functions of X and Y.
(b) Are X and Y statistically independent?
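The conditional distribution of X given Y = 1 in the ballpoint-pen example can be computed directly from the joint pmf (an illustrative sketch):

```python
from math import comb
from fractions import Fraction

# Joint pmf of the ballpoint-pen example, as before.
def f(x, y):
    if 0 <= x + y <= 2:
        return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y),
                        comb(8, 2))
    return Fraction(0)

# f(x | 1) = f(x, 1) / h(1), where h(1) is the row total for y = 1.
h1 = sum(f(x, 1) for x in range(3))
cond = {x: f(x, 1) / h1 for x in range(3)}

print(h1, cond[0], cond[1], cond[2])  # 3/7 1/2 1/2 0
```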
(c) Find P(X > 2 | Y = 2).

(a) g(x) = ∫1² (3x − y)/18 dy = (3xy − y²/2)/18 from y = 1 to y = 2 = (3x − 3/2)/18 = (2x − 1)/12, 1 < x < 4

  h(y) = ∫1⁴ (3x − y)/18 dx = (3x²/2 − yx)/18 from x = 1 to x = 4 = (45/2 − 3y)/18 = (15 − 2y)/12, 1 < y < 2

(b) g(x) h(y) = (2x − 1)(15 − 2y)/144 ≠ (3x − y)/18 = f(x, y), so X and Y are not statistically independent.

(c) With h(2) = (15 − 4)/12 = 11/12,

  P(X > 2 | Y = 2) = ∫2⁴ f(x | y = 2) dx = ∫2⁴ [f(x, 2)/h(2)] dx
  = (12/11) ∫2⁴ (3x − 2)/18 dx = (12/11)(3x²/2 − 2x)/18 from 2 to 4
  = (12/11)(14/18) = 28/33

Let X1, X2, ..., Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, ..., xn) and marginal distributions f1(x1), f2(x2), ..., fn(xn), respectively. The random variables X1, X2, ..., Xn are said to be mutually statistically independent if and only if

  f(x1, x2, ..., xn) = f1(x1) f2(x2) ··· fn(xn)

for all (x1, x2, ..., xn) within their range.

Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given by

  f(x) = e^(−x), x > 0
  f(x) = 0, elsewhere

Let X1, X2, and X3 represent the shelf lives for three of these containers selected independently, and find P(X1 < 2, 1 < X2 < 3, X3 > 2).

  f(x1, x2, x3) = f(x1) f(x2) f(x3) = e^(−x1 − x2 − x3)

  P(X1 < 2, 1 < X2 < 3, X3 > 2) = ∫2∞ ∫1³ ∫0² e^(−x1 − x2 − x3) dx1 dx2 dx3
  = (1 − e^(−2))(e^(−1) − e^(−3))(e^(−2)) ≈ 0.0372

Probability and Statistics Homework 3A

1.
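Because the three shelf lives are mutually independent, the probability factors into three one-dimensional exponential probabilities, which is easy to confirm numerically (an illustrative sketch):

```python
from math import exp

# Shelf-life example: X1, X2, X3 independent with density e**(-x), x > 0.
# By mutual independence the joint probability is a product of factors.
p_x1 = 1 - exp(-2)          # P(X1 < 2)
p_x2 = exp(-1) - exp(-3)    # P(1 < X2 < 3)
p_x3 = exp(-2)              # P(X3 > 2)

p = p_x1 * p_x2 * p_x3
print(round(p, 4))  # 0.0372
```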
A game is played with the rule that a counter moves forward one, two, or four places according to whether the scores on the two dice rolled differ by three or more, differ by one or two, or are equal. Here we define a random variable M, the number of places moved, which can take the value 1, 2, or 4. Determine the probability distribution of M. (Sou.04.E1 s.2)

2. Let the random variable X denote the time until a computer server connects to your notebook (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each of these random variables measures the wait from a common starting time. Assume that the joint probability density function for X and Y is

  f(x, y) = 2 × 10⁻⁶ e^(−0.001x − 0.002y), x > 0, y > 0
  f(x, y) = 0, elsewhere

(a) Show that X and Y are independent.
(b) Determine P(X > 1000, Y < 1000). (Mo.E5.20)

Probability and Statistics Homework 3B

1. A player throws a die whose faces are numbered 1 to 6. If the player obtains a six, he throws the die a second time, and in this case his score is the sum of 6 and the second number; otherwise his score is the number obtained from the first throw only. The player cannot throw more than two times. Let X be the random variable denoting the player's score.
(a) Write down the probability distribution of X and the cumulative distribution F(x) of X. (Sou.04.4.5.N7 s.12)
(b) Determine the probability that the sum of two successive scores is 8 or more.

2. The joint probability density function for the number of minutes waiting to catch the first fish, X, and the number of minutes waiting to catch the second fish, Y, is given below. Determine the probability of:
(a) waiting 3 minutes to catch the first fish and 3 minutes to catch the second fish.
(b) waiting 2 minutes to catch the first fish.
(c) waiting at least two minutes to catch the first fish.
(d) waiting 2 minutes to catch the second fish, after catching the first fish within 1 minute.
(Sou.04.4.5.N7 s.12)