UNIT II - PART II
Random variables – Discrete probability distributions – Continuous probability distributions – Expectation – Moment generating function – Probability generating function – Probability mass and density functions.
Prepared by Dr. V. Valliammal

Random Variables
A real variable X whose value is determined by the outcome of a random experiment is called a random variable.
Example: A random experiment consists of two tosses of a coin. Let X be the number of heads obtained (0, 1 or 2).
Outcome:     HH  HT  TH  TT
Value of X:   2   1   1   0

Discrete Random Variable
A random variable X which takes a countable number of real values is called a discrete random variable.
Examples: 1. the number of telephone calls per unit time, 2. the marks obtained in a test, 3. the number of printing mistakes on each page of a book.

Probability Mass Function
If X is a discrete random variable taking at most a countably infinite number of values x1, x2, ..., we associate with each value a number p(xi) = P(X = xi), called the probability mass function of X. The function p(xi) satisfies the following conditions:
(i) p(xi) ≥ 0 for all i = 1, 2, ...
(ii) Σ_{i≥1} p(xi) = 1

Continuous Random Variable
A random variable X is said to be continuous if it can take all possible values between certain limits.
Example: the length of time during which a vacuum tube functions before failing is a continuous random variable.

Probability Density Function
Consider a small interval (x, x + dx) of length dx. The quantity f(x) dx represents the probability that X falls in this interval, i.e., P(x ≤ X ≤ x + dx) = f(x) dx. The function f(x) of a continuous random variable X is called the probability density function and satisfies the following conditions:
(i) f(x) ≥ 0 for all x
(ii) ∫_{-∞}^{∞} f(x) dx = 1

Distribution Function
The distribution function of a random variable X is denoted F(x) and is defined as F(x) = P(X ≤ x). It is also called the cumulative distribution function.
F(x) = P(X ≤ x) = Σ_{t ≤ x} p(t)          when X is discrete
F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt      when X is continuous

Properties of the Cumulative Distribution
1. If a ≤ b, then F(a) ≤ F(b), where a and b are real numbers.
2. If F is the distribution function of a one-dimensional random variable X, then 0 ≤ F(x) ≤ 1.
3. If F is the distribution function of a one-dimensional random variable X, then F(−∞) = 0 and F(∞) = 1.

Problems
1. If a random variable X takes the values 1, 2, 3, 4 such that 2P(X=1) = 3P(X=2) = P(X=3) = 5P(X=4), find the probability distribution of X.
Solution: Let P(X=3) = α. By the given relation,
P(X=1) = α/2,  P(X=2) = α/3,  P(X=4) = α/5.
For a probability distribution (mass function), Σ p(x) = 1:
P(1) + P(2) + P(3) + P(4) = 1
α/2 + α/3 + α + α/5 = 1  ⇒  (15 + 10 + 30 + 6)α/30 = 1  ⇒  61α/30 = 1  ⇒  α = 30/61.
Hence P(X=1) = 15/61, P(X=2) = 10/61, P(X=3) = 30/61, P(X=4) = 6/61.
The probability distribution is given by
X:     1      2      3      4
p(x):  15/61  10/61  30/61  6/61

2. Let X be a continuous random variable having the probability density function
f(x) = 2/x³ for x ≥ 1, and f(x) = 0 otherwise.
Find the distribution function of X.
Solution: For x ≥ 1,
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{1}^{x} (2/t³) dt = [−1/t²]_{1}^{x} = 1 − 1/x².
Hence F(x) = 1 − 1/x² for x ≥ 1 and F(x) = 0 for x < 1.
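The two worked problems above can be cross-checked numerically. The short Python sketch below is illustrative only; the function names, the test point x = 2 and the midpoint-rule step count are choices made here, not part of the notes. It verifies that the probabilities 15/61, 10/61, 30/61, 6/61 satisfy the given ratios and sum to 1, and compares F(2) = 1 − 1/2² = 0.75 with a numerical integration of the density 2/x³.

```python
from fractions import Fraction

# Problem 1: check the distribution obtained from 2P(1) = 3P(2) = P(3) = 5P(4).
alpha = Fraction(30, 61)                            # P(X = 3) found above
p = {1: alpha / 2, 2: alpha / 3, 3: alpha, 4: alpha / 5}
assert sum(p.values()) == 1                         # probabilities sum to one
assert 2 * p[1] == 3 * p[2] == p[3] == 5 * p[4]     # the given ratios hold
print("Problem 1 distribution:", {k: str(v) for k, v in p.items()})

# Problem 2: compare F(x) = 1 - 1/x**2 with a midpoint-rule integral of f(t) = 2/t**3.
def cdf_closed_form(x: float) -> float:
    return 1 - 1 / x**2 if x >= 1 else 0.0

def cdf_numeric(x: float, steps: int = 100_000) -> float:
    # integrate f(t) = 2/t**3 from 1 to x with the midpoint rule
    h = (x - 1) / steps
    return sum(2 / (1 + (i + 0.5) * h) ** 3 for i in range(steps)) * h

x = 2.0
print("F(2) closed form:", cdf_closed_form(x))      # 0.75
print("F(2) numeric    :", round(cdf_numeric(x), 6))
```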
4. A continuous random variable X has the probability density function
f(x) = c e^{−|x|},  −∞ < x < ∞.
Find the value of c and the CDF of X.
Solution:
∫_{-∞}^{∞} f(x) dx = 1
∫_{-∞}^{∞} c e^{−|x|} dx = 1
2c ∫_{0}^{∞} e^{−x} dx = 1
2c [−e^{−x}]_{0}^{∞} = 1
2c(1) = 1  ⇒  c = 1/2.
Case (i) x < 0:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{x} (1/2) e^{t} dt = (1/2) e^{x}.
Case (ii) x ≥ 0:
F(x) = ∫_{-∞}^{0} (1/2) e^{t} dt + ∫_{0}^{x} (1/2) e^{−t} dt = 1/2 + (1/2)(1 − e^{−x}) = 1 − (1/2) e^{−x}.
Hence
F(x) = (1/2) e^{x}          for x < 0
F(x) = 1 − (1/2) e^{−x}     for x ≥ 0.

5. A random variable X has the following probability distribution.
X:     0   1   2    3    4    5    6     7
p(x):  0   k   2k   2k   3k   k²   2k²   7k² + k
Find (i) the value of k, (ii) P(1.5 < X < 4.5 | X > 2) and (iii) the smallest value of λ such that P(X ≤ λ) > 1/2.
Solution:
(i) Σ p(x) = 1:
0 + k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1
10k² + 9k − 1 = 0  ⇒  (10k − 1)(k + 1) = 0  ⇒  k = 1/10 = 0.1 (k = −1 is rejected, since probabilities cannot be negative).
(ii) Let A = {1.5 < X < 4.5} = {2, 3, 4} and B = {X > 2} = {3, 4, 5, 6, 7}, so A ∩ B = {3, 4}. Then
P(1.5 < X < 4.5 | X > 2) = P(A | B) = P(A ∩ B)/P(B)
= (2k + 3k)/(2k + 3k + k² + 2k² + 7k² + k)
= 5k/(6k + 10k²) = 0.5/0.7 = 5/7.
(iii) The cumulative probabilities are
X:     0    1    2    3    4    5     6     7
p(x):  0    0.1  0.2  0.2  0.3  0.01  0.02  0.17
F(x):  0    0.1  0.3  0.5  0.8  0.81  0.83  1.00
From the table, P(X ≤ λ) > 1/2 for λ = 4, 5, 6, 7, and the smallest such value is 4. Therefore λ = 4.

Expectation of a Random Variable
The expectation of a random variable X is denoted E(X). It gives a representative (mean) value of the probability distribution.
For a discrete probability distribution, E(X) = Σ x p(x).
For a continuous random variable X which assumes values in (a, b),
E(X) = ∫_{a}^{b} x f(x) dx.

Properties of Expectation
1. The expectation of a constant is that constant.
2. E[aX] = a E(X), where a is a constant.
3. E(aX + b) = a E(X) + b, where a and b are constants.
4. |E(X)| ≤ E|X| for any random variable X.
5. If X ≤ Y, then E(X) ≤ E(Y).

Variance of a Random Variable
The variance of a random variable X, denoted V(X), is defined as the expectation of the squared deviation from the expected value:
V(X) = E[(X − E(X))²] = E(X²) − (E(X))².
Properties of Variance
1. The variance of a constant is 0.
2. V(aX) = a² V(X), where a is a constant.

Moments and Other Statistical Constants
Raw moments about the origin:
μ′_r = E(X^r) = ∫_{a}^{b} x^r f(x) dx.
Raw moments about an arbitrary value A:
μ′_r(A) = E[(X − A)^r] = ∫_{a}^{b} (x − A)^r f(x) dx.
Central moments:
μ_r = E[(X − E(X))^r] = ∫_{a}^{b} (x − E(X))^r f(x) dx.

Relationship between Raw Moments and Central Moments
μ_1 = 0 (always)
μ_2 = μ′_2 − μ′_1²
μ_3 = μ′_3 − 3μ′_2 μ′_1 + 2μ′_1³
μ_4 = μ′_4 − 4μ′_3 μ′_1 + 6μ′_2 μ′_1² − 3μ′_1⁴

Moment Generating Function (M.G.F.)
The moment generating function is a function which automatically generates the raw moments. For a random variable X it is denoted M_X(t) and is defined as M_X(t) = E(e^{tX}).
Reason for the name M.G.F.:
M_X(t) = E(e^{tX}) = E[1 + tX + t²X²/2! + t³X³/3! + ...]
= E(1) + t E(X) + (t²/2!) E(X²) + ...
= 1 + t μ′_1 + (t²/2!) μ′_2 + ...
Here μ′_1 is the coefficient of t in M_X(t), μ′_2 is the coefficient of t²/2! in M_X(t), and in general μ′_r is the coefficient of t^r/r! in M_X(t).

Problems
1. The p.m.f. of a RV X is given by p(x) = 1/2^x, x = 1, 2, 3, ... Find the MGF, mean and variance.
Solution:
M_X(t) = E(e^{tX}) = Σ_{x=1}^{∞} e^{tx} p(x) = Σ_{x=1}^{∞} (e^t/2)^x
= (e^t/2) + (e^t/2)² + (e^t/2)³ + (e^t/2)⁴ + ...
= (e^t/2)[1 + (e^t/2) + (e^t/2)² + ...]
= (e^t/2) / (1 − e^t/2) = e^t / (2 − e^t),  for e^t < 2.
Differentiating twice with respect to t:
M′_X(t) = [(2 − e^t) e^t + e^t · e^t] / (2 − e^t)² = 2e^t / (2 − e^t)²
M″_X(t) = [2e^t (2 − e^t) + 4e^{2t}] / (2 − e^t)³ = (4e^t + 2e^{2t}) / (2 − e^t)³
Putting t = 0:
E(X) = M′_X(0) = 2,  E(X²) = M″_X(0) = 6.
Variance = E(X²) − [E(X)]² = 6 − 4 = 2.
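As a numerical illustration of Problem 1 above, the Python sketch below (illustrative only; the truncation limit N and the finite-difference step h are choices made here, not part of the notes) computes the mean and variance of p(x) = 1/2^x both directly from a truncated sum and from finite-difference approximations of M′_X(0) and M″_X(0). Both routes give E(X) ≈ 2 and Var(X) ≈ 2, matching the derivation above.

```python
import math

def M(t: float) -> float:
    """MGF of p(x) = 1/2**x, x = 1, 2, 3, ... as derived above."""
    return math.exp(t) / (2 - math.exp(t))

# Route 1: truncated sums over the pmf (terms beyond N are negligible).
N = 200
mean_direct = sum(x * 0.5**x for x in range(1, N + 1))
ex2_direct = sum(x**2 * 0.5**x for x in range(1, N + 1))

# Route 2: central finite differences of the MGF at t = 0
# (M'(0) = E(X), M''(0) = E(X^2)).
h = 1e-4
mean_mgf = (M(h) - M(-h)) / (2 * h)
ex2_mgf = (M(h) - 2 * M(0) + M(-h)) / h**2

print("E(X)  :", round(mean_direct, 4), round(mean_mgf, 4))    # both ~2
print("Var(X):", round(ex2_direct - mean_direct**2, 4),
      round(ex2_mgf - mean_mgf**2, 4))                         # both ~2
```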
2. Find the MGF of the RV X whose pdf is given by f(x) = λ e^{−λx}, x > 0 (and 0 otherwise), and hence find the first four central moments.
Solution:
M_X(t) = E(e^{tX}) = ∫ e^{tx} f(x) dx = ∫_{0}^{∞} e^{tx} λ e^{−λx} dx = λ ∫_{0}^{∞} e^{−(λ − t)x} dx
= λ [−e^{−(λ − t)x}/(λ − t)]_{0}^{∞} = λ/(λ − t),  for t < λ.
Expanding in powers of t:
M_X(t) = (1 − t/λ)^{−1} = 1 + t/λ + t²/λ² + t³/λ³ + ...
Taking the coefficients, we get the raw moments about the origin:
μ′_1 = E(X)  = coefficient of t/1!  = 1/λ
μ′_2 = E(X²) = coefficient of t²/2! = 2/λ²
μ′_3 = E(X³) = coefficient of t³/3! = 6/λ³
μ′_4 = E(X⁴) = coefficient of t⁴/4! = 24/λ⁴
and the central moments are
μ_1 = 0
μ_2 = μ′_2 − μ′_1² = 2/λ² − 1/λ² = 1/λ²
μ_3 = μ′_3 − 3μ′_2 μ′_1 + 2μ′_1³ = 6/λ³ − 6/λ³ + 2/λ³ = 2/λ³
μ_4 = μ′_4 − 4μ′_3 μ′_1 + 6μ′_2 μ′_1² − 3μ′_1⁴ = 24/λ⁴ − 24/λ⁴ + 12/λ⁴ − 3/λ⁴ = 9/λ⁴.
(A symbolic cross-check of these moments is given at the end of this section.)

3. If the MGF of a (discrete) RV X is 1/(5 − 4e^t), find the distribution of X and P(X = 5 or 6).
Solution:
M_X(t) = 1/(5 − 4e^t) = 1/[5(1 − 4e^t/5)] = (1/5)[1 + (4e^t/5) + (4e^t/5)² + (4e^t/5)³ + ...].
By definition,
M_X(t) = E(e^{tX}) = Σ_x e^{tx} p(x) = p(0) + e^{t} p(1) + e^{2t} p(2) + ...
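Returning to Problem 2 above, the sketch below cross-checks the exponential results symbolically: it differentiates the MGF λ/(λ − t) to recover the raw moments and then applies the raw-to-central-moment relations stated earlier, reproducing μ_2 = 1/λ², μ_3 = 2/λ³ and μ_4 = 9/λ⁴. It is illustrative only, assumes the third-party sympy package is available, and the variable names are choices made here.

```python
import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)
M = lam / (lam - t)                        # MGF of f(x) = lambda*exp(-lambda*x), x > 0

# Raw moments: mu'_r is the r-th derivative of the MGF evaluated at t = 0.
m1, m2, m3, m4 = [sp.simplify(sp.diff(M, t, r).subs(t, 0)) for r in range(1, 5)]
print("raw moments:", m1, m2, m3, m4)      # 1/lambda, 2/lambda**2, 6/lambda**3, 24/lambda**4

# Central moments via the relations given in the notes.
mu2 = sp.simplify(m2 - m1**2)                            # expect 1/lambda**2
mu3 = sp.simplify(m3 - 3*m2*m1 + 2*m1**3)                # expect 2/lambda**3
mu4 = sp.simplify(m4 - 4*m3*m1 + 6*m2*m1**2 - 3*m1**4)   # expect 9/lambda**4
print("central moments:", mu2, mu3, mu4)
```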