Chapter #7 – Properties of Expectation
Question #4: Let f_XY(x, y) = 1/y when 0 < y < 1 and 0 < x < y, and f_XY(x, y) = 0 otherwise. (a) Verify that f_XY(x, y) is a density function, (b) find E[XY], (c) find E[X], (d) find E[Y], and (e) compute the covariance between X and Y, given by Cov(X, Y).
a) We must show that 1 = ∫_0^1 ∫_0^y (1/y) dx dy. Indeed, ∫_0^1 ∫_0^y (1/y) dx dy = ∫_0^1 [x/y]_0^y dy = ∫_0^1 1 dy = [y]_0^1 = 1.
b) From Proposition 2.1, we know that if the random variables X and Y have the joint density function f_XY(x, y), then E[g(X, Y)] = ∫_{−∞}^∞ ∫_{−∞}^∞ g(x, y) f_XY(x, y) dx dy. Here we are given g(x, y) = xy, so E(XY) = ∫_0^1 ∫_0^y xy · (1/y) dx dy = ∫_0^1 ∫_0^y x dx dy = (1/2) ∫_0^1 [x^2]_0^y dy = (1/2) ∫_0^1 y^2 dy = (1/6) [y^3]_0^1 = 1/6.
c) We are now given g(x, y) = x, so we can calculate E(X) = ∫_0^1 ∫_0^y x · (1/y) dx dy = ∫_0^1 (1/y) [x^2/2]_0^y dy = (1/2) ∫_0^1 y dy = (1/4) [y^2]_0^1 = 1/4.
d) We have g(x, y) = y, so E(Y) = ∫_0^1 ∫_0^y y · (1/y) dx dy = ∫_0^1 [x]_0^y dy = ∫_0^1 y dy = (1/2) [y^2]_0^1 = 1/2.
e) To derive a computational formula, we have Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E[XY − X E(Y) − Y E(X) + E(X)E(Y)] = E(XY) − E(X)E(Y) − E(X)E(Y) + E(X)E(Y) = E(XY) − E(X)E(Y). We therefore see that for these random variables, Cov(X, Y) = E(XY) − E(X)E(Y) = 1/6 − (1/4)(1/2) = 1/6 − 1/8 = (8 − 6)/48 = 2/48 = 1/24.
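These values are easy to confirm numerically. The sketch below (assuming NumPy is available) uses the fact that under f_XY(x, y) = 1/y the marginal of Y is uniform on (0, 1) and, given Y = y, X is uniform on (0, y); the sample averages should be close to 1/6, 1/4, 1/2, and 1/24.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Y ~ Uniform(0, 1); given Y = y, X ~ Uniform(0, y).  This pair has joint
# density f(x, y) = 1/y on 0 < x < y < 1, the density from Question #4.
y = rng.uniform(0.0, 1.0, n)
x = rng.uniform(0.0, y)

print("E[XY]     ~", np.mean(x * y))                            # about 1/6
print("E[X]      ~", np.mean(x))                                # about 1/4
print("E[Y]      ~", np.mean(y))                                # about 1/2
print("Cov(X, Y) ~", np.mean(x * y) - np.mean(x) * np.mean(y))  # about 1/24
```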
Question #5: The county hospital is located at the center of a square whose sides are 3 miles
wide. If an accident occurs within this square, then the hospital sends out an ambulance. The
road network is rectangular, so the travel distance from the hospital, whose coordinates are (0, 0), to the point (x, y) is |x| + |y|. If an accident occurs at a point that is uniformly distributed in the square, find the expected travel distance of the ambulance.
• We first let Z = |X| + |Y|, so we would like to compute E(Z). Since the sides of the square have length 3, we have −3/2 < X < 3/2 and −3/2 < Y < 3/2, which implies that |X| < 3/2 and |Y| < 3/2. Finally, note that X, Y ~ UNIF(−3/2, 3/2). Therefore, we have that E(Z) = E(|X| + |Y|) = E(|X|) + E(|Y|) = ∫_{−3/2}^{3/2} |x|/(3/2 − (−3/2)) dx + ∫_{−3/2}^{3/2} |y|/(3/2 − (−3/2)) dy = 2 ∫_0^{3/2} (x/3) dx + 2 ∫_0^{3/2} (y/3) dy = 2 [x^2/6]_0^{3/2} + 2 [y^2/6]_0^{3/2} = 2(9/24) + 2(9/24) = 36/24 = 3/2.
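A minimal simulation (again assuming NumPy) that drops the accident uniformly on the 3-by-3 square and averages the rectangular travel distance should land near 3/2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Accident location uniform over the 3-by-3 square centered at the hospital (0, 0).
x = rng.uniform(-1.5, 1.5, n)
y = rng.uniform(-1.5, 1.5, n)

print("E[|X| + |Y|] ~", np.mean(np.abs(x) + np.abs(y)))  # about 1.5 miles
```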
Question #6: A fair die is rolled 10 times. Calculate the expected sum of the 10 rolls.
• We have a sum X of 10 independent random variables X_i. We therefore have that E(X) = E(Σ_{i=1}^{10} X_i) = Σ_{i=1}^{10} E(X_i) = Σ_{i=1}^{10} [1 · (1/6) + ⋯ + 6 · (1/6)] = Σ_{i=1}^{10} (21/6) = 10(21/6) = 35. We could also have found this by noting that E(X_i) = 3.5 for all i = 1, 2, …, 10, so that E(X) = Σ_{i=1}^{10} E(X_i) = Σ_{i=1}^{10} 3.5 = 10(3.5) = 35.
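A one-line check (assuming NumPy): simulate many batches of 10 fair-die rolls and average the batch sums.

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=(1_000_000, 10))         # 10 rolls per trial
print("E[sum of 10 rolls] ~", rolls.sum(axis=1).mean())  # about 35
```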
Question #9: A total of n balls, numbered 1 through n, are put into n urns, also numbered 1 through n, in such a way that ball i is equally likely to go into any of the urns 1, 2, …, i. Find (a) the expected number of urns that are empty, and (b) the probability that none of the urns is empty.
a) Let X be the number of empty urns and define an indicator variable X_i = 1 if urn i is empty and X_i = 0 otherwise. We must find the expected value of this indicator random variable, which is simply the probability that it is one: E(X_i) = P(X_i = 1). Since ball i lands in any of the urns 1, …, i with equal probability, the probability that the ith ball does not land in urn i is (1 − 1/i). Similarly, the probability that the (i + 1)st ball does not land in urn i is (1 − 1/(i + 1)). Using this reasoning, we calculate that E(X_i) = P(X_i = 1) = (1 − 1/i)(1 − 1/(i + 1))(1 − 1/(i + 2)) ⋯ (1 − 1/n) = ((i − 1)/i)(i/(i + 1)) ⋯ ((n − 1)/n) = (i − 1)/n. We can then use this to calculate E(X) = E(Σ_{i=1}^n X_i) = Σ_{i=1}^n E(X_i) = Σ_{i=1}^n (i − 1)/n = (1/n) Σ_{i=0}^{n−1} i = (1/n) · n(n − 1)/2 = (n − 1)/2.
b) For all of the urns to have at least one ball in them, the nth ball must be dropped into the nth urn, which occurs with probability 1/n. Similarly, the (n − 1)st ball must be dropped into the (n − 1)st urn, which has probability 1/(n − 1), and so on. We can therefore calculate that P(no empty urns) = (1/n)(1/(n − 1)) ⋯ (1/2)(1/1) = 1/n!.
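Both answers can be checked by simulating the urn scheme directly; the sketch below (assuming NumPy) uses n = 6, for which (n − 1)/2 = 2.5 and 1/6! ≈ 0.00139.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 6, 200_000

empty_counts = np.empty(trials)
none_empty = 0
for t in range(trials):
    # Ball i goes into a uniformly chosen urn among urns 1..i.
    urns = {rng.integers(1, i + 1) for i in range(1, n + 1)}
    n_empty = n - len(urns)
    empty_counts[t] = n_empty
    none_empty += (n_empty == 0)

print("E[# empty urns]  ~", empty_counts.mean())  # about (n - 1)/2 = 2.5
print("P(no empty urns) ~", none_empty / trials)  # about 1/6! ~ 0.00139
```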
Question #11: Consider n independent flips of a coin having probability p of landing on heads. Say that a changeover occurs whenever an outcome differs from the one preceding it. For instance, if n = 5 and the outcome is HHTHT, then there are 3 changeovers. Find the expected number of changeovers. Hint: Express the number of changeovers as the sum of a total of n − 1 Bernoulli random variables.
• Let X_i = 1 if a changeover occurs on the ith flip and X_i = 0 otherwise. Thus, E(X_i) = P(X_i = 1) = P(flip i − 1 is H, flip i is T) + P(flip i − 1 is T, flip i is H) = p(1 − p) + (1 − p)p = 2p(1 − p) whenever i ≥ 2. If we let X denote the number of changeovers, then we have that E(X) = E(Σ_{i=2}^n X_i) = Σ_{i=2}^n E(X_i) = Σ_{i=2}^n 2p(1 − p) = 2(n − 1)p(1 − p).
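To check this numerically (a sketch assuming NumPy), simulate n flips with heads probability p and count positions where consecutive outcomes differ; with n = 5 and p = 0.3 the formula gives 2 · 4 · 0.3 · 0.7 = 1.68.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 5, 0.3, 500_000

flips = rng.random((trials, n)) < p                       # True means heads
changeovers = (flips[:, 1:] != flips[:, :-1]).sum(axis=1)

print("E[changeovers] ~", changeovers.mean())             # simulated value
print("2(n-1)p(1-p)   =", 2 * (n - 1) * p * (1 - p))      # formula: 1.68
```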
Question #20: In an urn containing n balls, the ith ball has weight W(i), i = 1, …, n. The balls are removed without replacement, one at a time, according to the following rule: at each selection, the probability that a given ball in the urn is chosen is equal to its weight divided by the sum of the weights remaining in the urn. For instance, if at some time i_1, …, i_r is the set of balls remaining in the urn, then the next selection will be i_j with probability W(i_j) / Σ_{k=1}^r W(i_k), for j = 1, …, r.
Compute the expected number of balls that are withdrawn before ball number 1 is removed.
• Let X_j = 1 if ball j is removed before ball 1 and X_j = 0 otherwise. Restricting attention to balls 1 and j only, ball j is selected before ball 1 with probability proportional to its weight, so P(X_j = 1) = W(j)/(W(j) + W(1)). The expected number of balls withdrawn before ball 1 is therefore E(Σ_{j≠1} X_j) = Σ_{j≠1} E(X_j) = Σ_{j≠1} P(X_j = 1) = Σ_{j≠1} W(j)/(W(j) + W(1)).
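The formula can be sanity-checked by simulating the weighted-removal scheme; the sketch below (assuming NumPy) uses an arbitrary illustrative weight vector, for which the formula gives 2/3 + 3/4 + 4/5 ≈ 2.217.

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([1.0, 2.0, 3.0, 4.0])   # illustrative weights W(1), ..., W(4)
trials = 100_000

def balls_before_one(weights):
    """Remove balls with probability proportional to weight; count removals before ball 1."""
    remaining = list(range(len(weights)))
    count = 0
    while True:
        probs = weights[remaining] / weights[remaining].sum()
        pick = rng.choice(remaining, p=probs)
        if pick == 0:                # ball number 1 (index 0) was drawn
            return count
        remaining.remove(pick)
        count += 1

simulated = np.mean([balls_before_one(w) for _ in range(trials)])
formula = sum(w[j] / (w[j] + w[0]) for j in range(1, len(w)))
print("simulated ~", simulated, "  formula =", formula)   # both about 2.217
```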
Question #33: If E(X) = 1 and Var(X) = 5, find (a) E[(2 + X)^2], and (b) Var(4 + 3X).
a) We calculate that E[(2 + X)^2] = E(4 + 4X + X^2) = E(4) + 4E(X) + E(X^2) = 4 + 4(1) + Var(X) + [E(X)]^2 = 4 + 4 + 5 + 1 = 14, where we use the identity Var(X) = E(X^2) − [E(X)]^2 to write E(X^2) = Var(X) + [E(X)]^2.
b) Var(4 + 3X) = Var(3X) = 3^2 Var(X) = 9(5) = 45.
Question #38: If the joint density is f_XY(x, y) = 2e^(−2x)/x when 0 ≤ x < ∞ and 0 ≤ y ≤ x, and f_XY(x, y) = 0 otherwise, compute Cov(X, Y).
• Since Cov(X, Y) = E(XY) − E(X)E(Y), we must compute the following:
o E(XY) = ∫_0^∞ ∫_0^x xy · (2e^(−2x)/x) dy dx = ∫_0^∞ ∫_0^x 2y e^(−2x) dy dx = ∫_0^∞ [y^2 e^(−2x)]_0^x dx = ∫_0^∞ x^2 e^(−2x) dx = (1/8) ∫_0^∞ t^2 e^(−t) dt = Γ(3)/8 = 1/4, where we substitute t = 2x so that dt = 2 dx.
o Since f_X(x) = ∫_0^x (2e^(−2x)/x) dy = 2e^(−2x), we have E(X) = ∫_0^∞ 2x e^(−2x) dx = 1/2 by doing integration by parts with u = x and dv = 2e^(−2x) dx, so that du = dx and v = −e^(−2x).
o E(Y) = ∫_0^∞ ∫_y^∞ y · (2e^(−2x)/x) dx dy = ∫_0^∞ ∫_0^x 2y · (e^(−2x)/x) dy dx = ∫_0^∞ [y^2 e^(−2x)/x]_0^x dx = ∫_0^∞ x e^(−2x) dx = (1/4) ∫_0^∞ t e^(−t) dt = Γ(2)/4 = 1/4, again with t = 2x and dt = 2 dx.
• Thus, Cov(X, Y) = E(XY) − E(X)E(Y) = 1/4 − (1/2)(1/4) = 1/4 − 1/8 = 1/8.
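For a numerical check (sketch assuming NumPy): the marginal of X worked out above is 2e^(−2x), i.e. exponential with rate 2, and given X = x the conditional density of Y is (2e^(−2x)/x)/(2e^(−2x)) = 1/x, i.e. uniform on (0, x); sampling that way should give Cov(X, Y) ≈ 1/8.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.exponential(scale=0.5, size=n)   # marginal f_X(x) = 2 e^{-2x} (rate 2)
y = rng.uniform(0.0, x)                  # Y | X = x is uniform on (0, x)

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print("Cov(X, Y) ~", cov)                # about 1/8 = 0.125
```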
Question #41: A pond contains 100 fish, of which 30 are carp. If 20 fish are caught, what are
the mean and variance of the number of carp among the 20?
• Let X denote the number of carp among the 20 fish caught. We assume that each of the C(100, 20) ways to catch 20 fish from 100 is equally likely, so X ~ HYPG(n = 20, N = 100, m = 30). This implies that E(X) = nm/N = (20 · 30)/100 = 6, while Var(X) = (nm/N)[(n − 1)(m − 1)/(N − 1) + 1 − nm/N] = 6[(19 · 29)/99 + 1 − 6] = 6[(19 · 29)/99 − 5] = 112/33.
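NumPy's hypergeometric sampler gives a direct check (30 carp as the "good" type, 70 other fish, 20 drawn):

```python
import numpy as np

rng = np.random.default_rng(0)
draws = rng.hypergeometric(ngood=30, nbad=70, nsample=20, size=1_000_000)

print("E[X]   ~", draws.mean())   # about 6
print("Var(X) ~", draws.var())    # about 112/33 ~ 3.394
```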
Question #22: Suppose that X_1, X_2 and X_3 are independent Poisson random variables with respective parameters λ_1, λ_2 and λ_3. Let X = X_1 + X_2 and Y = X_2 + X_3. Calculate (a) E(X) and E(Y), and (b) Cov(X, Y).
a) E(X) = E(X_1 + X_2) = E(X_1) + E(X_2) = λ_1 + λ_2 and E(Y) = E(X_2 + X_3) = λ_2 + λ_3, since whenever a random variable Z ~ POIS(λ), we have E(Z) = Var(Z) = λ.
b) Cov(X, Y) = Cov(X_1 + X_2, X_2 + X_3) = Cov(X_1, X_2) + Cov(X_1, X_3) + Cov(X_2, X_2) + Cov(X_2, X_3) = 0 + 0 + Var(X_2) + 0 = λ_2, since the three random variables are independent (implying that their pairwise covariances are zero) and Cov(Z, Z) = Var(Z).
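A last numerical check (sketch assuming NumPy), with arbitrary illustrative rates λ_1 = 2, λ_2 = 3, λ_3 = 5:

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, lam3 = 2.0, 3.0, 5.0          # illustrative rates
n = 1_000_000

x1, x2, x3 = rng.poisson(lam1, n), rng.poisson(lam2, n), rng.poisson(lam3, n)
x, y = x1 + x2, x2 + x3

print("E[X]      ~", x.mean())                              # about lam1 + lam2 = 5
print("E[Y]      ~", y.mean())                              # about lam2 + lam3 = 8
print("Cov(X, Y) ~", np.mean(x * y) - x.mean() * y.mean())  # about lam2 = 3
```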