Some Rules on Taking Expectations in Probability and Statistics

Each of these can be verified by using the definition of an expectation, namely E[g(X)] = ∫ g(x) f(x) dx, where f(x) is the pdf of the random variable X.

#1. E[αX + β] = αE[X] + β when both α and β are constants.
#2. E[E[X]] = E[X], sometimes called the law of iterated expectations.
#3. E[X/Y] ≠ E[X]/E[Y], in general.
#4. E[XY] = E[X]E[Y] if X and Y are statistically independent.
#5. E[ Σ_{t=0}^∞ θ^t X_t ] = μ/(1 − θ) if E[X_t] = μ for all t and if |θ| < 1.
#6. E[ (X − E[X])² ] = var(X), which is just the definition of var(X).
#7. E[ g(X) ] < g( E[X] ) if g″ < 0, which is called Jensen's Inequality.
#8. If X ~ N(μ, σ²), then E[ e^X ] = e^(μ + σ²/2), not e^μ.
#9. If X ~ χ²_k, then E[X] = k, where k is the degrees of freedom.
#10. If X ~ F_{k,p}, then E[X] = p/(p − 2) for p > 2, where p = denominator degrees of freedom.
#11. var(αX + β) = α²var(X).
#12. var(X) = E[X²] if E[X] = 0.
#13. var(αX − βY) = α²var(X) + β²var(Y) − 2αβ·cov(X, Y).
#14. var(X) = E[ {X − E[X]}² ] by definition.
#15. var(X) = E[X²] − E²[X].
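Several of these rules can be checked numerically by Monte Carlo simulation: draw a large sample, replace each expectation with a sample mean, and compare against the closed-form value. The sketch below is a minimal illustration assuming NumPy is available; the particular distributions chosen for X, Y, Z, and W are illustrative assumptions and are not part of the notes above.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Rule #1: E[aX + b] = a*E[X] + b (linearity of expectation)
X = rng.normal(loc=2.0, scale=1.5, size=n)
a, b = 3.0, -1.0
print(np.mean(a * X + b), a * np.mean(X) + b)

# Rule #4: E[XY] = E[X]E[Y] when X and Y are independent
Y = rng.normal(loc=-1.0, scale=0.5, size=n)   # drawn independently of X
print(np.mean(X * Y), np.mean(X) * np.mean(Y))

# Rule #7 (Jensen): E[g(X)] < g(E[X]) for concave g, e.g. g = log on a positive variable
Z = rng.exponential(scale=1.0, size=n)
print(np.mean(np.log(Z)), np.log(np.mean(Z)))   # left value is smaller

# Rule #8: for W ~ N(mu, sigma^2), E[e^W] = exp(mu + sigma^2/2), not exp(mu)
mu, sigma = 0.3, 0.8
W = rng.normal(mu, sigma, size=n)
print(np.mean(np.exp(W)), np.exp(mu + sigma**2 / 2))

# Rule #9: for C ~ chi-square with k degrees of freedom, E[C] = k
k = 5
C = rng.chisquare(df=k, size=n)
print(np.mean(C), k)

# Rule #15: var(X) = E[X^2] - (E[X])^2
print(np.var(X), np.mean(X**2) - np.mean(X)**2)

Each print statement pairs a Monte Carlo estimate with the corresponding closed-form value; the two should agree up to simulation error, which shrinks as n grows.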