Probability & Statistics
Professor Wei Zhu
July 22nd

(1) Let's have some fun: A Fair Game or Not?

A gambler goes to bet. The dealer has 3 dice, which are fair, meaning that the chance that each face shows up is exactly 1/6. The dealer says: "You can choose your bet on a number, any number from 1 to 6. Then I'll roll the 3 dice. If none shows the number you bet, you'll lose $1. If one shows the number you bet, you'll win $1. If two or three dice show the number you bet, you'll win $3 or $5, respectively." Is this a fair game?

(*A fair game is one in which the expected, or equivalently the long-term average, winning is zero.)

Solution: Let X be the number of dice that show the number you bet. Then X ~ Binomial(3, 1/6), because the dealer rolls the 3 dice independently and each die shows the number you bet with probability 1/6. That is,

$$P(X = k) = \binom{3}{k} \left(\frac{1}{6}\right)^k \left(\frac{5}{6}\right)^{3-k}, \quad k = 0, 1, 2, 3.$$

Let W be the amount of money you win. Then

$$P(W = -1) = P(X = 0) = \left(\frac{5}{6}\right)^3 = \frac{125}{216}$$
$$P(W = 1) = P(X = 1) = \binom{3}{1}\left(\frac{1}{6}\right)\left(\frac{5}{6}\right)^2 = \frac{75}{216}$$
$$P(W = 3) = P(X = 2) = \binom{3}{2}\left(\frac{1}{6}\right)^2\left(\frac{5}{6}\right) = \frac{15}{216}$$
$$P(W = 5) = P(X = 3) = \left(\frac{1}{6}\right)^3 = \frac{1}{216}$$

Thus, your expected winning is

$$E(W) = \sum_w w \cdot P(W = w) = (-1)\cdot\frac{125}{216} + 1\cdot\frac{75}{216} + 3\cdot\frac{15}{216} + 5\cdot\frac{1}{216} = 0.$$

Therefore, it is a fair game.

(2) *Review: The derivative and integral of some important functions one should remember.

$$\frac{d}{dx}(x^n) = n x^{n-1}, \qquad \frac{d}{dx}(e^x) = e^x, \qquad \frac{d}{dx}(\ln x) = \frac{1}{x}$$

$$\int_a^b x^n \, dx = \left.\frac{x^{n+1}}{n+1}\right|_{x=a}^{x=b} = \frac{b^{n+1} - a^{n+1}}{n+1}, \qquad \int_a^b e^x \, dx = \left. e^x \right|_{x=a}^{x=b} = e^b - e^a$$

*The Chain Rule

$$\frac{d}{dx} f[g(x)] = f'[g(x)] \cdot g'(x)$$

For example: $\frac{d}{dx}\left(e^{x^2}\right) = e^{x^2} \cdot 2x$.

*The Product Rule

$$\frac{d}{dx}[f(x) \cdot g(x)] = f'(x)\,g(x) + f(x)\,g'(x)$$

*To Find the Maximum/Minimum of a (continuous & differentiable) Function

e.g. g(x) = (x - 1)^2, x in [0, 2]. We first set the first derivative to zero and find the solutions, then compare with the values at the boundary; this yields the minimum and maximum. Here

$$\frac{d}{dx} g(x) = 2(x - 1) = 0 \implies x = 1.$$

At x = 1 we have the minimum, g(1) = 0, while the maximum, g(0) = g(2) = 1, is achieved at the two boundary points.

(3) Normal Distribution

Q. Who invented the normal distribution?

[Portraits, from Wikipedia: Left, Abraham de Moivre (26 May 1667, Vitry-le-François, Champagne, France – 27 November 1754, London, England); Right, Johann Carl Friedrich Gauss (30 April 1777 – 23 February 1855).]

<i> Probability Density Function (p.d.f.)

X ~ N(mu, sigma^2): X follows the normal distribution with mean mu and variance sigma^2, with

$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty$$

$$P(a \le X \le b) = \int_a^b f(x)\,dx = \text{area under the pdf curve bounded by } a \text{ and } b$$

<ii> Cumulative Distribution Function (c.d.f.)

$$F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt, \qquad f(x) = F'(x) = \frac{d}{dx}F(x)$$

(4) Mathematical Expectation (Review)

Continuous random variable: $E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx$

Discrete random variable: $E[g(X)] = \sum_{\text{all } x} g(x)\, P(X = x)$

Properties of Expectations:

(1) E(c) = c, where c is a constant
(2) E[c*g(X)] = c*E[g(X)], where c is a constant
(3) E[g(X)+h(Y)] = E[g(X)]+E[h(Y)], for any X & Y
(4) E[g(X)*h(Y)] = E[g(X)]*E[h(Y)], if X & Y are independent; otherwise it is usually not true.

Special cases:

1) (Population) mean: $\mu = E(X) = \int_{-\infty}^{\infty} x \cdot f(x)\,dx$

Note: E(aX+b) = aE(X)+b, where a & b are constants.

2) (Population) variance:

$$\mathrm{Var}(X) = \sigma^2 = E[(X-\mu)^2] = \int_{-\infty}^{\infty} (x-\mu)^2 \cdot f(x)\,dx = E(X^2) - [E(X)]^2$$

Note: Var(aX+b) = a^2 Var(X), where a & b are constants.

3) Moment generating function:

$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx, \text{ when } X \text{ is continuous.}$$
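As a quick check, the discrete expectation and variance formulas just reviewed can be applied to the fair-game example in (1). The following is a sketch in exact rational arithmetic; the variable names are ours:

```python
from fractions import Fraction
from math import comb

# Pmf of X ~ Binomial(3, 1/6): number of dice showing the number you bet on
p = Fraction(1, 6)
pmf = {k: comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)}

# Winnings W for each value of X: -$1, +$1, +$3, +$5
payoff = {0: -1, 1: 1, 2: 3, 3: 5}

# E(W) = sum over all x of g(x) P(X = x), with g mapping dice count to winnings
E_W = sum(payoff[k] * pmf[k] for k in range(4))

# Var(W) = E(W^2) - [E(W)]^2, the variance shortcut from section (4)
E_W2 = sum(payoff[k] ** 2 * pmf[k] for k in range(4))
Var_W = E_W2 - E_W**2

print(E_W)    # 0 -- the game is indeed fair
print(Var_W)  # 5/3
```

The exact fractions confirm the hand computation: the four probabilities sum against the payoffs to exactly zero.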
$$M_X(t) = E(e^{tX}) = \sum_{\text{all possible values } x \text{ of } X} e^{tx}\, P(X = x), \text{ when } X \text{ is discrete.}$$

For the normal distribution, X ~ N(mu, sigma^2) with $f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}$, $-\infty < x < \infty$, we have

$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$$

4) Moments:

1st (population) moment: $E(X) = \int x \cdot f(x)\,dx$
2nd (population) moment: $E(X^2) = \int x^2 \cdot f(x)\,dx$
...
kth (population) moment: $E(X^k) = \int x^k \cdot f(x)\,dx$

Theorem. If $X_1, X_2$ are independent, then $M_{X_1 + X_2}(t) = M_{X_1}(t)\, M_{X_2}(t)$.

Theorem. Under regularity conditions, there is a 1-1 correspondence between the pdf and the mgf of a given random variable X. That is, pdf $f(x) \stackrel{1\text{-}1}{\longleftrightarrow}$ mgf $M_X(t)$.

e.g. Sgt. Jones wishes to select one army recruit into his unit. It is known that the IQ distribution for all recruits is normal with mean 180 and standard deviation 10. What is the chance that Sgt. Jones would select a recruit with an IQ of at least 200?

Sol) Let X represent the IQ of a randomly selected recruit, so X ~ N(mu = 180, sigma = 10). Then

$$P(X \ge 200) = P\left(\frac{X - 180}{10} \ge \frac{200 - 180}{10}\right) = P(Z \ge 2) \approx 2.28\%$$

e.g. Sgt. Jones wishes to select three army recruits into his unit. It is known that the IQ distribution for all recruits is normal with mean 180 and standard deviation 10. What is the chance that Sgt. Jones would select three recruits with an average IQ of at least 200?

Sol) Let $X_1, X_2, X_3$ represent the IQs of three randomly selected recruits, $X_i$ i.i.d. $\sim N(\mu, \sigma^2)$, i = 1, 2, 3. Then the sample mean satisfies

$$\bar{X} \sim N\left(\mu^* = 180,\; \sigma^* = \frac{10}{\sqrt{3}}\right)$$

$$P(\bar{X} \ge 200) = P\left(\frac{\bar{X} - 180}{10/\sqrt{3}} \ge \frac{200 - 180}{10/\sqrt{3}}\right) = P(Z \ge 2\sqrt{3}) \approx 0.03\%$$

*(5) Joint distribution and independence

Definition. The joint cdf of two random variables X and Y is defined as:

$$F_{X,Y}(x, y) = F(x, y) = P(X \le x, Y \le y)$$

Definition. The joint pdf of two discrete random variables X and Y is defined as:

$$f_{X,Y}(x, y) = f(x, y) = P(X = x, Y = y)$$
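To make the discrete joint pdf concrete, here is a small sketch in Python with a hypothetical 2x2 joint pmf (the table values are ours, chosen only for illustration). It computes the marginals by summing the joint pmf over the other variable, checks whether f(x, y) = f_X(x) f_Y(y) for every pair (the factorization criterion for independence), and computes Cov(X, Y) = E(XY) - E(X)E(Y):

```python
from fractions import Fraction
from itertools import product

# Hypothetical joint pmf f(x, y) = P(X = x, Y = y) on {0,1} x {0,1}
f = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
     (1, 0): Fraction(1, 2), (1, 1): Fraction(0)}
xs, ys = [0, 1], [0, 1]

# Marginal pmfs: f_X(x) = sum_y f(x, y), f_Y(y) = sum_x f(x, y)
fX = {x: sum(f[(x, y)] for y in ys) for x in xs}
fY = {y: sum(f[(x, y)] for x in xs) for y in ys}

# Independence holds iff f(x, y) = f_X(x) * f_Y(y) for every pair (x, y)
independent = all(f[(x, y)] == fX[x] * fY[y] for x, y in product(xs, ys))

# Covariance via Cov(X, Y) = E(XY) - E(X)E(Y)
EX = sum(x * fX[x] for x in xs)
EY = sum(y * fY[y] for y in ys)
EXY = sum(x * y * f[(x, y)] for x, y in product(xs, ys))
cov = EXY - EX * EY

print(independent)  # False: this joint pmf does not factor into its marginals
print(cov)          # -1/8
```

Here the factorization fails already at (0, 0): f(0, 0) = 1/4 while f_X(0) f_Y(0) = (1/2)(3/4) = 3/8, so X and Y are dependent, and the covariance is nonzero.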
Definition. The joint pdf of two continuous random variables X and Y is defined as:

$$f_{X,Y}(x, y) = f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y)$$

Definition. The marginal pdf of the discrete random variable X or Y can be obtained by summation of the joint pdf as follows:

$$f_X(x) = \sum_y f(x, y); \qquad f_Y(y) = \sum_x f(x, y)$$

Definition. The marginal pdf of the continuous random variable X or Y can be obtained by integration of the joint pdf as follows:

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy; \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$$

Definition. The conditional pdf of the random variable X or Y is defined as:

$$f(x|y) = \frac{f(x, y)}{f(y)}; \qquad f(y|x) = \frac{f(x, y)}{f(x)}$$

Definition. The joint moment generating function of two random variables X and Y is defined as

$$M_{X,Y}(t_1, t_2) = E(e^{t_1 X + t_2 Y})$$

Note that we can obtain the marginal mgf for X or Y as follows:

$$M_X(t_1) = M_{X,Y}(t_1, 0) = E(e^{t_1 X + 0 \cdot Y}) = E(e^{t_1 X}); \qquad M_Y(t_2) = M_{X,Y}(0, t_2) = E(e^{0 \cdot X + t_2 Y}) = E(e^{t_2 Y})$$

Theorem. Two random variables X and Y are independent $\iff$ (if and only if)

$$F_{X,Y}(x, y) = F_X(x) F_Y(y) \iff f_{X,Y}(x, y) = f_X(x) f_Y(y) \iff M_{X,Y}(t_1, t_2) = M_X(t_1)\, M_Y(t_2)$$

Definition. The covariance of two random variables X and Y is defined as

$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$$

Theorem. If two random variables X and Y are independent, then Cov(X, Y) = 0. (*Note: However, Cov(X, Y) = 0 does not necessarily mean that X and Y are independent.)

Homework #2, Due Friday, July 25, before our lecture

Q1. Let $X_1$ & $X_2$ be independent $N(\mu, \sigma^2)$. Prove that $X_1 + X_2$ and $X_1 - X_2$ are independent.

Q2. Let $Z \sim N(0, 1)$. (1) Please derive the covariance between $Z$ and $Z^2$. (2) Are $Z$ and $Z^2$ independent?

Q3. Let $X \sim N(3, 4)$. Please calculate $P(1 < X < 3)$.

Q4. Jack & Jill plan to meet at the gate of the famous Los Angeles Chinese Theater (LACT) between 12 noon and 1pm tomorrow. The one who arrives first will wait for the other for 30 minutes, and then leave.
What is the chance that the two friends will be able to meet at LACT during their appointed time period, assuming that each one will arrive at LACT independently, each at a random time between 12 noon and 1pm?
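Normal-tail probabilities like the ones in the Sgt. Jones examples can be checked numerically with only Python's standard library. `normal_cdf` below is our own helper built on `math.erf`, not a library function:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ N(mu, sigma^2), computed via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# One recruit: X ~ N(180, 10^2), P(X >= 200) = P(Z >= 2), about 2.28%
p_one = 1.0 - normal_cdf(200, mu=180, sigma=10)

# Average of three: Xbar ~ N(180, (10/sqrt(3))^2), P(Z >= 2*sqrt(3)), about 0.03%
p_avg = 1.0 - normal_cdf(200, mu=180, sigma=10 / sqrt(3))

print(f"{p_one:.4%}")  # about 2.28%
print(f"{p_avg:.4%}")  # about 0.03%
```

The same helper gives interval probabilities as P(a < X < b) = normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma), which is useful for checking hand calculations done with a Z table.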