MIXED MOMENTS
CORRELATION & INDEPENDENCE,
INDEPENDENT SUM
Tutorial 7, STAT1301 Fall 2010, 09NOV2010, MB103@HKU
By Joseph Dong

Recall: Univariate Moment of order $k$ and Generalization: Mixed Moments
• The $k$th moment of a random variable $X$ is defined as $E[X^k]$.
• The $k$th central moment of a random variable $X$ is defined as $E[(X - EX)^k]$.
• Q: How to generalize these two definitions to the case of a random vector of $n$ dimensions?
• A: We can define the $(k_1, \dots, k_n)$ mixed moment of an $n$-dimensional random vector.
• Define the $(k_1, \dots, k_n)$ mixed moment of a random vector $(X_1, \dots, X_n)$ as
  $E\left[X_1^{k_1} \cdot X_2^{k_2} \cdots X_n^{k_n}\right]$
• Define the $(k_1, \dots, k_n)$ mixed central moment of a random vector $(X_1, \dots, X_n)$ as
  $E\left[(X_1 - EX_1)^{k_1} \cdots (X_n - EX_n)^{k_n}\right]$
  (a small numerical sketch follows below).
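The following is a minimal Monte-Carlo sketch (not part of the original handout) that estimates a (2,1) mixed moment and the (1,1) mixed central moment of a bivariate sample; the particular joint distribution, sample size, and seed are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Arbitrary dependent pair for illustration: Y = X + noise, X ~ U(-1, 1).
x = rng.uniform(-1.0, 1.0, size=n)
y = x + rng.normal(0.0, 0.5, size=n)

# (2,1) mixed moment: E[X^2 * Y]  (here about 0, since E[X^3] = 0 and the noise is independent)
mixed_moment_21 = np.mean(x**2 * y)

# (1,1) mixed central moment: E[(X - EX)(Y - EY)], i.e. the covariance (here about Var X = 1/3)
mixed_central_11 = np.mean((x - x.mean()) * (y - y.mean()))

print(mixed_moment_21)
print(mixed_central_11, np.cov(x, y)[0, 1])   # the two covariance estimates agree
```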
Bivariate Mixed Moment: Covariance
• The mixed moment of order $(j, k)$ of $X$ and $Y$ is defined by $E[X^j Y^k]$.
• The mixed central moment of order $(j, k)$ of $X$ and $Y$ is defined by $E[(X - EX)^j (Y - EY)^k]$.
• The covariance of two random variables is defined as the 2nd bivariate mixed central moment, of order $(1,1)$:
  $\mathrm{Cov}(X, Y) = E[(X - EX)(Y - EY)]$
• Covariance is a bivariate concept.
• A convenient identity: $\mathrm{Cov}(X, Y) = E[XY] - EX \cdot EY$
• Properties of $\mathrm{Cov}(X, Y)$:
  • Symmetry: $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$
  • Positive semi-definiteness: $\mathrm{Cov}(X, X) = \mathrm{Var}(X) \ge 0$
  • Linearity: $\mathrm{Cov}(aX + b, Y) = a\,\mathrm{Cov}(X, Y)$; in particular, if $X \equiv c$ is a constant then $\mathrm{Cov}(X, Y) = 0$.
  (A numerical check of these facts follows below.)
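As a quick sanity check (not from the original slides), here is a small simulation that verifies the convenient identity, symmetry, and the linearity property on sample analogues; the particular distributions and constants are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.0, size=500_000)
y = 3.0 * x + rng.normal(0.0, 2.0, size=x.size)

def cov(a, b):
    """Sample analogue of Cov(A, B) = E[(A - EA)(B - EB)]."""
    return np.mean((a - a.mean()) * (b - b.mean()))

# Convenient identity: Cov(X, Y) = E[XY] - EX * EY
print(cov(x, y), np.mean(x * y) - x.mean() * y.mean())

# Symmetry: Cov(X, Y) = Cov(Y, X)
print(cov(x, y), cov(y, x))

# Linearity: Cov(aX + b, Y) = a * Cov(X, Y)
a, b = 5.0, 7.0
print(cov(a * x + b, y), a * cov(x, y))
```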
Standardization in Statistics
• Express the position of a number label using its distance from the expectation, in terms of a multiple of the standard deviation.
• This is as if we re-coordinatize the state space using the location of the expectation as the origin and the standard deviation as the unit length.
• Standardization of a random variable is a one-one transformation (a centralization plus a rescaling) of the random variable.
• Using angle brackets to denote the resultant random variable of standardization:
  $\langle X \rangle \leftrightarrow \dfrac{X - EX}{\sqrt{\mathrm{Var}\,X}}$, or in shorthand notation, $\langle X \rangle = \dfrac{X - \mu}{\sigma}$.
• Purpose of standardization: ease of describing positions. For example, what is the relative position of the number label 5.3 in the state space of a random variable following $N(4, 0.16)$? We get $5.3 \leftrightarrow \dfrac{5.3 - 4}{0.4} = 3.25$. Since $\langle X \rangle$ follows $N(0, 1)$ (show this if you are not convinced) and 3.25 is a very high quantile, 5.3 is located unusually far to the right in the original state space (see the numerical check below).
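A minimal numerical check of the example above (not in the original handout), computing the standardized position of 5.3 and its upper-tail probability under the standard normal via the error function:

```python
from math import erf, sqrt

mu, var = 4.0, 0.16              # X ~ N(4, 0.16), so sd = 0.4
sd = sqrt(var)

z = (5.3 - mu) / sd              # standardized position of the label 5.3
upper_tail = 0.5 * (1.0 - erf(z / sqrt(2.0)))   # P(Z > z) for Z ~ N(0, 1)

print(z)             # 3.25
print(upper_tail)    # about 0.00058, i.e. a very high quantile indeed
```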
Correlation as standardized Covariance
• Covariance is a bivariate concept, and so is correlation.
• Compare:
  • Covariance of $X$ & $Y$: $\mathrm{Cov}(X, Y) = E[(X - EX) \cdot (Y - EY)]$
    • Quick question: what if $X$ and $Y$ are independent?
  • Correlation ($\rho$) of $X$ & $Y$: $\rho(X, Y) = E[\langle X \rangle \cdot \langle Y \rangle] = \dfrac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}\,X \cdot \mathrm{Var}\,Y}}$
• $-1 \le \rho(X, Y) \le 1$: very interestingly, the correlation of any pair of r.v.'s is always bounded, while their covariance can explode (see the sketch below).
• Pf.
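A small illustration (not from the handout) of the boundedness claim: rescaling one variable inflates the covariance arbitrarily, while the correlation is unchanged and stays in [-1, 1]. The simulated pair below is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = 0.6 * x + rng.normal(size=x.size)

for scale in (1.0, 1_000.0):
    xs = scale * x                       # rescale X
    c = np.cov(xs, y)[0, 1]              # covariance grows with the scale
    r = np.corrcoef(xs, y)[0, 1]         # correlation is scale-free and bounded
    print(scale, round(c, 3), round(r, 3))
```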
Exploring Correlation
demonstration.wolfram.com
• Download Mathematica™ player if you don’t have one.
• Search “correlation”
• And explore…
Covariance/Correlation Calculation Exercises
• Handout Problem 1
• Handout Problem 2
• Handout Problem 3
• Find the correlation of two random variables $X$ and $Y$ that are functionally dependent. Four cases are given (the functional forms involve, e.g., $X \sim U(-1, 1)$ and a $N(2, 1)$ distribution). What about $\mathrm{Cov}(X, Y)$?
Independent sum
• Independent sum refers to the sum of independent random variables: $S = X_1 + X_2 + \cdots + X_n$.
• $S$ is a random variable itself: it has a sample space, a state space, and a probability measure (and distribution) on the sample space.
• We are now interested in finding the following moments of $S$:
  • Expectation (too easy, and actually does not require independence):
    • Just sum the expectations: $E[S] = E[X_1] + \cdots + E[X_n]$.
  • Variance (a bit of proof work required; uses the fact that all $\mathrm{Cov}(X_i, X_j) = 0$ for $i \ne j$):
    • It turns out that this is also just the sum of the variances: $\mathrm{Var}(S) = \mathrm{Var}(X_1) + \cdots + \mathrm{Var}(X_n)$.
    • The proof uses properties of covariance.
  • MGF (now easy, because we have proved Theorem C of Tutorial 6): $M_S(t) = M_{X_1}(t) \cdot M_{X_2}(t) \cdots M_{X_n}(t)$.
    • Finding the MGF is equivalent to finding the distribution.
• If we consider a pair of independent sums, we are also interested in finding their covariance (this is easy too), using properties of covariance. (A simulation sketch follows below.)
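A quick simulation sketch (not part of the handout) checking that, for independent summands, expectations and variances both add; the three distributions below are arbitrary choices with known moments.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# Three independent summands with known moments (arbitrary illustrative choices).
x1 = rng.uniform(0.0, 1.0, size=n)    # E = 1/2,  Var = 1/12
x2 = rng.exponential(2.0, size=n)     # E = 2,    Var = 4
x3 = rng.normal(1.0, 3.0, size=n)     # E = 1,    Var = 9

s = x1 + x2 + x3
print(s.mean(), 0.5 + 2.0 + 1.0)            # expectations add (independence not needed)
print(s.var(), 1.0 / 12.0 + 4.0 + 9.0)      # variances add because all covariances vanish
```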
Independent sum: finding its distribution
• The previous slide gives one method for finding the distribution of $S$: through its moment generating function, since under the independence condition the moment generating function is very convenient to derive. But there is one problem: what if you don't recognize the resulting form of the MGF? (Assuming we are blind to the divinely clever integral-transformation methods.)
• We can also find the distribution of $S$ by working with the probability measure directly; for example, for $S = X + Y$,
  $\mathbb{P}(S \le s) = \mathbb{P}(X + Y \le s) = \iint_{\{x + y \le s\}} f_{X,Y}(x, y)\, dx\, dy.$
  (A discrete sketch follows below.)
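For discrete independent summands, working with the probability measure directly amounts to convolving the pmfs. A minimal sketch (not from the handout), using two fair six-sided dice:

```python
import numpy as np

# pmf of one fair six-sided die, on the values 1..6
die = np.full(6, 1.0 / 6.0)

# Independence => the pmf of the sum is the convolution of the individual pmfs:
# P(S = s) = sum over x of P(X = x) * P(Y = s - x)
pmf_sum = np.convolve(die, die)          # supported on the values 2..12

for value, p in zip(range(2, 13), pmf_sum):
    print(value, round(p, 4))
```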
Exercises: Handout Problems 4,5,6