13 Variance and Covariance
The central notions are the variance and the covariance. While the variance is interpreted as an 'index of the spread of a random variable', the covariance can be regarded as an 'index of the linear relationship between random variables'.
For the definition of the variance as an index of the spread of a random variable, the squared distances between the values of the random variable and its expectation, weighted by the underlying probability measure, are a natural choice.
13.1 Definition (Variance)
Let (Ω, A, P ) be a probability space and X : (Ω, A, P ) →
(R, B) a random variable with expectation E(X).
Then the value
(13.1.1)
V(X) := V_P(X) := E_P((X − E(X))^2)
is called the variance of X w.r.t. P, if it is finite.
13.2 Remarks
The variance has been introduced as the expectation
of the squared deviation of a random variable X from
its expectation E(X). Accordingly, the variance is an index of the average squared distance of the random variable X from its expectation.
A large variance thus indicates a wide spread, a small variance a narrow spread, of the values X(ω), ω ∈ Ω, around the expectation.
This is confirmed by the fact that V(X) = 0 is equivalent to the validity of X = E(X) on the whole of Ω with the exception of at most a P-null set.
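As a quick illustration of Definition 13.1, the following sketch computes E(X) and V(X) for a small discrete distribution; the values and probabilities are made-up illustration data, not taken from the text.

```python
# Variance as the probability-weighted squared deviation from the mean.
# Values and probabilities are made-up illustration data.
values = [1.0, 2.0, 3.0, 4.0]
probs = [0.1, 0.4, 0.4, 0.1]

# E(X) = sum of x * P(X = x)
mean = sum(x * p for x, p in zip(values, probs))

# V(X) = E((X - E(X))^2)
var = sum((x - mean) ** 2 * p for x, p in zip(values, probs))

print(mean, var)
```

A distribution concentrated on a single value has variance 0, matching the remark that V(X) = 0 forces X = E(X) outside a P-null set.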
13.3 Calculation rules for variances
Let (Ω, A, P ) be a probability space and let the random variable X : (Ω, A, P ) → (R, B) have the variance
V (X). Then
13.3.1 V(aX + b) = a^2 V(X)   (a, b ∈ R)
13.3.2 V(X) = E(X^2) − (E(X))^2   (shifting theorem)
Proof:
13.3.1: Due to the linearity of the expectation, we have
V(aX + b) = E((aX + b − E(aX + b))^2) = E(a^2 (X − E(X))^2) = a^2 V(X).
13.3.2: Applying the calculation rules for expectation,
cf. 12.4 we obtain
V(X) = E((X − E(X))^2) = E(X^2 − 2X E(X) + (E(X))^2)
= E(X^2) − 2E(X)E(X) + (E(X))^2 = E(X^2) − (E(X))^2.
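Both calculation rules can be checked numerically. The sketch below uses an arbitrary made-up discrete distribution and arbitrary constants a, b to verify 13.3.1 and the shifting theorem 13.3.2.

```python
# Made-up discrete distribution for checking the calculation rules.
values = [1.0, 2.0, 3.0]
probs = [0.2, 0.5, 0.3]

def E(f):
    # expectation of f(X) under the discrete distribution above
    return sum(f(x) * p for x, p in zip(values, probs))

mean = E(lambda x: x)
var = E(lambda x: (x - mean) ** 2)

# 13.3.2 (shifting theorem): V(X) = E(X^2) - (E(X))^2
assert abs(var - (E(lambda x: x * x) - mean ** 2)) < 1e-12

# 13.3.1: V(aX + b) = a^2 V(X); shown here for a = 3, b = 7
a, b = 3.0, 7.0
mean_ab = E(lambda x: a * x + b)
var_ab = E(lambda x: (a * x + b - mean_ab) ** 2)
assert abs(var_ab - a * a * var) < 1e-12
print("rules verified")
```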
The following list contains the variances of particular distributions. Analogously to 12.8 for expectations, by the variance of B(n, p), for example, we understand the variance V_{B(n,p)}(id_{{0,…,n}}) of the random variable id_{{0,…,n}} w.r.t. the distribution (probability measure) B(n, p), etc.
13.4 Variances of particular distributions
13.4.1 V_{B(1,p)}(id_{{0,1}}) = p(1 − p)
13.4.2 V_{B(n,p)}(id_{{0,…,n}}) = np(1 − p)
13.4.3 V_{π(λ)}(id_N) = λ
13.4.4 V_{N(a,σ^2)}(id_R) = σ^2.
(We deliberately refrain from presenting the calculation techniques here!)
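The tabulated variances 13.4.2 and 13.4.3 can nonetheless be checked directly from the probability mass functions. In the following sketch the parameters n, p, and lam are arbitrary illustration choices, and the Poisson sum is truncated at a cutoff beyond which the tail mass is negligible.

```python
import math

def variance(pmf_pairs):
    # variance of a discrete distribution given as (value, probability) pairs
    mean = sum(k * p for k, p in pmf_pairs)
    return sum((k - mean) ** 2 * p for k, p in pmf_pairs)

n, p, lam = 10, 0.3, 2.0  # arbitrary illustration parameters

# B(n, p): pmf on {0, ..., n}; variance should be np(1 - p)  (13.4.2)
binom = [(k, math.comb(n, k) * p**k * (1 - p)**(n - k)) for k in range(n + 1)]
assert abs(variance(binom) - n * p * (1 - p)) < 1e-12

# Poisson(lam), truncated at k = 99 (remaining tail mass is negligible
# for lam = 2); variance should be lam  (13.4.3)
poisson = [(k, math.exp(-lam) * lam**k / math.factorial(k)) for k in range(100)]
assert abs(variance(poisson) - lam) < 1e-9

print("13.4.2 and 13.4.3 confirmed numerically")
```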
13.5 Definition (Covariance)
Let (Ω, A, P ) be a probability space and X, Y : (Ω, A, P ) →
(R, B) two random variables whose variances exist.
Then the value
Cov(X, Y) := Cov_P(X, Y) := E((X − E(X))(Y − E(Y)))
is called the covariance of X and Y w.r.t. P.
Theorem 13.6 shows in what way the notion of covariance enters the theory presented so far.
13.6 Theorem
Let (Ω, A, P ) be a probability space and X, Y two real
random variables with the variances V (X) and V (Y )
respectively. Then the random variable X + Y also has a variance, and
V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y).
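Theorem 13.6 can be illustrated on a small joint distribution; the joint probabilities in the sketch below are made-up illustration data.

```python
# Made-up joint distribution of (X, Y) for checking Theorem 13.6.
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    # expectation of f(X, Y) under the joint distribution above
    return sum(f(x, y) * p for (x, y), p in joint.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
vx = E(lambda x, y: (x - mx) ** 2)
vy = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))

# variance of the sum X + Y, computed directly from its definition
ms = E(lambda x, y: x + y)
vs = E(lambda x, y: (x + y - ms) ** 2)

# Theorem 13.6: V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y)
assert abs(vs - (vx + vy + 2 * cov)) < 1e-12
print(vs, vx + vy + 2 * cov)
```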
13.7 Calculation rules for covariances
Let X, Y be two random variables with the variances
V (X) and V (Y ). Then
13.7.1 Cov(aX + c, bY + d) = ab Cov(X, Y)   (a, b, c, d ∈ R)
13.7.2 Cov(X, Y) = E(XY) − E(X)E(Y)
If moreover X and Y are stochastically independent, then
13.7.3 Cov(X, Y) = 0
13.7.4 V(X + Y) = V(X) + V(Y)   (formula of Bienaymé)
Hints concerning the proof:
13.7.1 and 13.7.2 are the analogues, for covariances, of the facts 13.3.1 and 13.3.2 (shifting theorem), which were formulated for variances.
13.7.3 follows from 13.7.2 and Theorem 12.9, while 13.7.4
is a consequence of Theorem 13.6 and 13.7.3.
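For 13.7.3 and 13.7.4, the joint probabilities of stochastically independent X and Y factor into the product of the marginals. The sketch below builds such a product distribution from two made-up marginals and checks both rules.

```python
import itertools

# Made-up marginal distributions; independence means the joint
# probabilities are the products of the marginal probabilities.
px = {0: 0.4, 1: 0.6}          # marginal distribution of X
py = {1: 0.5, 2: 0.3, 3: 0.2}  # marginal distribution of Y
joint = {(x, y): px[x] * py[y] for x, y in itertools.product(px, py)}

def E(f):
    # expectation of f(X, Y) under the product distribution
    return sum(f(x, y) * p for (x, y), p in joint.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)

# 13.7.2 gives Cov(X, Y) = E(XY) - E(X)E(Y); by 13.7.3 it vanishes here
cov = E(lambda x, y: x * y) - mx * my
assert abs(cov) < 1e-12

# 13.7.4 (formula of Bienaymé): V(X + Y) = V(X) + V(Y)
vx = E(lambda x, y: (x - mx) ** 2)
vy = E(lambda x, y: (y - my) ** 2)
ms = E(lambda x, y: x + y)
vs = E(lambda x, y: (x + y - ms) ** 2)
assert abs(vs - (vx + vy)) < 1e-12
print("Bienaymé holds for independent X, Y")
```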
13.8 Remark
The covariance Cov(X, Y) is a 'measure' of the linear relationship between the random variables X and Y; the stochastic independence of X and Y implies that Cov(X, Y) = 0, cf. 13.7.3; the converse of this statement does not hold. However, in the case of stochastic independence the variance of the sum of two random variables turns out to be simply the sum of the variances, cf. 13.7.4 (formula of Bienaymé).
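A standard counterexample for this remark (not taken from the text) is X uniformly distributed on {−1, 0, 1} together with Y = X^2: the covariance vanishes although Y is a function of X, so the two are certainly not independent.

```python
# Counterexample: Cov(X, Y) = 0 does NOT imply independence.
# X uniform on {-1, 0, 1}, Y = X^2.
dist = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

ex = sum(x * p for x, p in dist.items())          # E(X)  = 0
ey = sum(x**2 * p for x, p in dist.items())       # E(Y)  = 2/3
exy = sum(x * x**2 * p for x, p in dist.items())  # E(XY) = E(X^3) = 0

cov = exy - ex * ey
assert abs(cov) < 1e-12  # uncorrelated ...

# ... but not independent: P(X = 0, Y = 0) != P(X = 0) * P(Y = 0),
# since X = 0 forces Y = 0
p_joint = dist[0]           # P(X = 0, Y = 0) = P(X = 0) = 1/3
p_prod = dist[0] * dist[0]  # product of the marginals = 1/9
assert abs(p_joint - p_prod) > 0.1
print("uncorrelated but dependent")
```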