Chalmers TU course ESS011, 2014: mathematical statistics homework
Week 3
These are the homeworks for the course week 3 sessions (2.4. & 4.4.2014), related to topics discussed in weeks 2 and 3. The solutions will be discussed at the exercise session.
8. Axioms of probability
a) Let C1 , C2 , C3 be sets with P (C1 ) = 4/10, P (C2 ) = 3/10, P (C3 ) = 5/10. Are C1 , C2 , C3 mutually exclusive?
Solution: If they were mutually exclusive, the probability of their union would simply be the sum of the individual probabilities: P (C1 ∪ C2 ∪ C3 ) = 4/10 + 3/10 + 5/10 = 12/10. However, this sum is larger than one, hence they cannot be mutually exclusive.
b) Show that the conditional probability P (A|B) = P (A ∩ B)/P (B) fulfills the axioms of probability as a function
of A.
Solution: The three axioms we need to show can be found on page 25 of the book. Note that we can assume that P (B) > 0.
Axiom 1: Let A = S; the conditional probability gives P (S|B) = P (S ∩ B)/P (B) = P (B)/P (B) = 1.
Axiom 2: We know that P (A ∩ B) ≥ 0, since probabilities are nonnegative, and we have said that P (B) > 0, which in turn gives P (A|B) ≥ 0.
Axiom 3: Let A1 and A2 be two mutually exclusive events. Then
P (A1 ∪ A2 | B) = P ((A1 ∪ A2 ) ∩ B)/P (B) = [P (A1 ∩ B) + P (A2 ∩ B)]/P (B) = P (A1 |B) + P (A2 |B).
c) Prove the general addition rule: P (A ∪ B) = P (A) + P (B) − P (A ∩ B) for any events A, B.
Solution: First note that we can write P (A) as P (A) = P (A ∩ B) + P (A ∩ B′), and in the same way P (B) = P (B ∩ A) + P (B ∩ A′). Since A ∪ B is the disjoint union of A ∩ B′, A′ ∩ B and A ∩ B, we can now write
P (A ∪ B) = P (A ∩ B′) + P (A′ ∩ B) + P (A ∩ B) = [P (A) − P (A ∩ B)] + [P (B) − P (A ∩ B)] + P (A ∩ B) = P (A) + P (B) − P (A ∩ B).
9. Winner’s dice Consider a peculiar dice game, where you have three dice with non-standard faces: A red
die with faces {2, 4, 9, 2, 4, 9}, a green die with faces {1, 6, 8, 1, 6, 8} and a blue die with faces {3, 5, 7, 3, 5, 7}.
You let your opponent choose a die, and then you choose a die. You both roll, and the one who gets the bigger
number wins.
Show that when the opponent has chosen his die, you can always choose so that your chances of winning are better
than his.
Solution: First enumerate the outcomes when rolling each pair of dice, and count the number of times each die rolls the greater number. Since each die has only three distinct faces, each appearing twice, it suffices to compare the 3 × 3 pairs of distinct faces and multiply each count by four. We find that red beats green 20 out of 36 times, green beats blue 20 out of 36 times, and blue beats red 20 out of 36 times. Thus, whichever die the opponent picks, it is always possible to choose a die with a greater chance of winning.
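The counting can be checked by enumerating all outcomes in Python (a quick sketch, not part of the original solution; the die faces are taken from the problem):

```python
from itertools import product

# Distinct faces of the three non-standard dice; each face appears twice,
# so each ordered pair of distinct faces occurs 4 times among 36 outcomes.
dice = {"red": [2, 4, 9], "green": [1, 6, 8], "blue": [3, 5, 7]}

def wins(a, b):
    """Number of outcomes (out of 36) where die a beats die b."""
    return 4 * sum(x > y for x, y in product(dice[a], dice[b]))

for a, b in [("red", "green"), ("green", "blue"), ("blue", "red")]:
    print(f"{a} beats {b} in {wins(a, b)} of 36 outcomes")
```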
10. Independence
a) How many times should a fair coin be tossed so that the probability of getting at least one head is at least
99.9%? How about if the coin is not fair, but lands tails 75% of the time?
Solution: First note that P ({at least one head}) = 1 − P ({only tails}). For a fair coin the probability P ({only tails in x tosses}) = (1/2)^x. We require 1 − (1/2)^x ≥ 0.999, i.e. (1/2)^x ≤ 0.001. Solving this using logarithms gives x ≥ log(0.001)/log(1/2) ≈ 9.97, so x = 10 after rounding up to the next integer. Using 3/4 instead as the probability of tails gives x ≥ log(0.001)/log(3/4) ≈ 24.0, so x = 25.
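The same logarithm computation can be sketched in Python (the helper name is ours):

```python
import math

def tosses_needed(p_tails, target=0.999):
    """Smallest x with 1 - p_tails**x >= target."""
    return math.ceil(math.log(1 - target) / math.log(p_tails))

print(tosses_needed(0.5))   # fair coin -> 10
print(tosses_needed(0.75))  # biased coin -> 25
```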
b) Roll two dice. Let Ai = {die i is even}, i = 1, 2, and B = {sum is even}. Show that the events are pairwise
independent but not jointly independent.
Solution: One way to solve this is to calculate the probabilities of each individual event and of the joint events by enumerating the possible outcomes: P (A1 ) = P (A2 ) = P (B) = 1/2, and each pairwise intersection has probability 1/4, so P (Ai ∩ Aj ) = P (Ai )P (Aj ) and P (Ai ∩ B) = P (Ai )P (B) hold for each pair. For joint independence we would need P (A1 ∩ A2 ∩ B) = P (A1 )P (A2 )P (B) = 1/8. But we can easily argue that if both the first and the second die are even then the sum must be even, so P (A1 ∩ A2 ∩ B) = P (A1 ∩ A2 ) = 1/4 and the events cannot be jointly independent.
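The enumeration argument can be verified with exact arithmetic (a sketch; the event names mirror the solution):

```python
from itertools import product
from fractions import Fraction

# The 36 equally likely outcomes of two dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A1 = lambda o: o[0] % 2 == 0           # die 1 is even
A2 = lambda o: o[1] % 2 == 0           # die 2 is even
B  = lambda o: (o[0] + o[1]) % 2 == 0  # sum is even

both = lambda e, f: (lambda o: e(o) and f(o))
# Pairwise independence holds ...
print(prob(both(A1, B)) == prob(A1) * prob(B))  # True
# ... but joint independence fails (1/4 != 1/8).
print(prob(lambda o: A1(o) and A2(o) and B(o)) == prob(A1) * prob(A2) * prob(B))  # False
```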
1
Chalmers TU course ESS011, 2014: mathematical statistics homework
Week 3
c) Show that if A and B are independent, then so are the pairs A and B′, and A′ and B′.
Solution: Independence gives that P (A ∩ B) = P (A)P (B). Now write P (A)P (B) = P (A)(1 − P (B′)) = P (A) − P (A)P (B′)
⇒ P (A)P (B′) = P (A) − P (A ∩ B) = P (A ∩ B′).
In the same way we can show that it holds for the pair A′ and B′, starting from the just-derived result P (A ∩ B′) = P (A)P (B′).
11. Inversion
Use Bayes’s theorem to solve the following:
a) You have two coins, a fair one and a double-headed one, in your pocket. You take one out without looking and
flip it, getting heads. What is the probability that the coin is the fair one?
Solution: Define the events F : picked the fair coin, H : got heads. We are given that P (F ) = 0.5, P (H|F ) = 0.5 and P (H|F ′) = 1. We seek P (F |H). Using Bayes' theorem we get
P (F |H) = P (H|F )P (F ) / [P (H|F )P (F ) + P (H|F ′)P (F ′)] = (0.5 · 0.5)/(0.5 · 0.5 + 1 · 0.5) = 1/3.
b) A student knows 70% of the questions in a multiple-choice exam. When he does not know the answer, he picks one of the choices at random. If he answers correctly on question 1, which has 4 choices, what are the chances that he actually knows the answer?
Solution: Define the events K : knows the answer, C : answers correctly. We are given that P (K) = 0.7, P (C|K) = 1 and P (C|K′) = 1/4 = 0.25. We seek P (K|C). Using Bayes' theorem we get
P (K|C) = P (C|K)P (K) / [P (C|K)P (K) + P (C|K′)P (K′)] = (1 · 0.7)/(1 · 0.7 + 0.25 · 0.3) ≈ 0.903.
c) You want to plant some yellow tulips, but your tulip bulbs are all mixed up. All you know is that out of your 7 sacks, 3 contain bulbs for 1 red and 9 yellow tulips, and 4 contain bulbs for 5 red and 5 yellow tulips. You pick a sack at random, and then a bulb at random. What are the chances it is for a yellow tulip? If you later see it was yellow, what is the probability it came from a 1-red-and-9-yellow sack?
Solution: Define the events U : picked an uneven (1 red and 9 yellow) sack, Y : picked a yellow tulip bulb. We are given that P (U ) = 3/7, P (Y |U ) = 9/10 and P (Y |U ′) = 5/10. We seek P (Y ) and P (U |Y ). First we calculate
P (Y ) = P (Y |U )P (U ) + P (Y |U ′)P (U ′) = (9/10)(3/7) + (5/10)(4/7) = 47/70.
Using Bayes' theorem we get
P (U |Y ) = P (Y |U )P (U )/P (Y ) = (9/10)(3/7)/(47/70) = 27/47.
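All three Bayes computations in this problem can be checked with exact arithmetic (a sketch; the `bayes` helper is ours, not part of the original solutions):

```python
from fractions import Fraction as F

def bayes(prior, like_h, like_not_h):
    """P(H | E) for a binary hypothesis H, given P(H), P(E|H) and P(E|H')."""
    evidence = like_h * prior + like_not_h * (1 - prior)
    return like_h * prior / evidence

print(bayes(F(1, 2), F(1, 2), F(1, 1)))   # a) fair coin: 1/3
print(bayes(F(7, 10), F(1, 1), F(1, 4)))  # b) knows the answer: 28/31, approx. 0.903
print(bayes(F(3, 7), F(9, 10), F(1, 2)))  # c) uneven sack: 27/47
```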
12. Expected gains
Some expectations in games of chance:
a) What is your expected return on a game of roulette, if you are going to play SEK 1000 on red? (Roulette wheel: 37 slots, of which one is colorless and the rest are split evenly between black and red.)
Solution: Let the random variable X denote the money won or lost playing roulette. Note the probabilities for losing, P (X = −1000) = 19/37, and winning, P (X = 1000) = 18/37. By the definition of expected value we get the expected winnings to be
E[X] = −1000 · (19/37) + 1000 · (18/37) ≈ −27.
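As a check with exact arithmetic (a sketch):

```python
from fractions import Fraction

# SEK 1000 on red: win with probability 18/37, lose with probability 19/37.
stake = 1000
expected = -stake * Fraction(19, 37) + stake * Fraction(18, 37)
print(expected)         # -1000/37
print(float(expected))  # approx. -27.03
```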
b) A bowl contains 10 chips, of which 8 are marked SEK100 each and 2 are marked SEK1000 each. You are
given a chance to randomly pick three, without replacement, and get the sum in cash. How much is the
expected sum?
Solution: Let the random variable X denote the sum of the chips. There are three possible outcomes {300, 1200, 2100} with the respective probabilities 56/120, 56/120 and 8/120. Note that the probabilities can be derived using combinatorics, for example P (2100) = (2 choose 2)(8 choose 1)/(10 choose 3) = 8/120. The expected sum then becomes
E[X] = 300 · (56/120) + 1200 · (56/120) + 2100 · (8/120) = 840.
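The expectation can also be confirmed by enumerating all (10 choose 3) = 120 equally likely draws (a sketch):

```python
from fractions import Fraction
from itertools import combinations

# 8 chips worth SEK 100 and 2 worth SEK 1000; draw 3 without replacement.
chips = [100] * 8 + [1000] * 2
draws = list(combinations(chips, 3))  # 120 equally likely draws

expected = Fraction(sum(sum(d) for d in draws), len(draws))
print(expected)  # 840
```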
13. Densities
Find c such that the function is a density for some random variable:
a. f (x) = cx with x = 1, 2, 3, 4, 5, 6.
Solution: For X to be a random variable it must hold that ∫_Ω f (x)dx = 1 for continuous distributions, and Σ_{x∈Ω} f (x) = 1 for discrete distributions.
For the given distribution this gives c(1 + 2 + 3 + 4 + 5 + 6) = 1 ⇒ c = 1/21.
b. f (x) = c(2/3)^x with x = 1, 2, 3, ...
Solution: To solve Σ_{x=1}^∞ c(2/3)^x = 1 we need some knowledge of geometric series, found for example in BETA or online. Using this we get
Σ_{x=1}^∞ c(2/3)^x = Σ_{x=0}^∞ c(2/3)^x − c = c/(1 − 2/3) − c = 3c − c = 2c = 1 ⇒ c = 1/2.
c. f (x) = c(1 − x²) where −1 < x < 1.
Solution: Solving the integral ∫_{−1}^{1} c(1 − x²)dx = 1 gives c = 3/4.
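The three normalization constants can be checked with exact arithmetic (a sketch using the closed forms derived above):

```python
from fractions import Fraction

# a) discrete: c * (1 + 2 + ... + 6) = 1
c_a = Fraction(1, sum(range(1, 7)))
print(c_a)  # 1/21

# b) geometric series: sum_{x>=1} (2/3)^x = (2/3)/(1 - 2/3) = 2, so 2c = 1
r = Fraction(2, 3)
c_b = 1 / (r / (1 - r))
print(c_b)  # 1/2

# c) integral of (1 - x^2) over (-1, 1) is 4/3, so c = 3/4
c_c = 1 / Fraction(4, 3)
print(c_c)  # 3/4
```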
Suppose X has the density function as in c. :
i) Find the cumulative distribution function F (x).
Solution: Solving F (x) = ∫_{−1}^{x} f (t)dt gives
F (x) = 0 for x < −1,
F (x) = 3x/4 − x³/4 + 1/2 for −1 ≤ x ≤ 1,
F (x) = 1 for x > 1.
ii) Sketch a picture of f and F .
iii) Compute P (X > 0.5) and P (0 < X < 0.5).
Solution: By definition F (x) = P (X ≤ x). Then P (X > 0.5) = 1 − P (X ≤ 0.5) = 1 − F (0.5) = 0.15625 ≈ 0.156. In the same way P (0 < X < 0.5) = F (0.5) − F (0) = 0.34375 ≈ 0.344.
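A quick numerical check of iii) using the CDF from i) (a sketch):

```python
def F(x):
    """CDF from i): F(x) = 3x/4 - x^3/4 + 1/2 on [-1, 1]."""
    if x < -1:
        return 0.0
    if x > 1:
        return 1.0
    return 3 * x / 4 - x**3 / 4 + 0.5

print(1 - F(0.5))     # P(X > 0.5)     = 0.15625
print(F(0.5) - F(0))  # P(0 < X < 0.5) = 0.34375
```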
14. More densities
Compute the mean µ, the variance σ² and P (µ − 2σ < X < µ + 2σ) when the density of X is
a) f (x) = 6x(1 − x) with 0 < x < 1
Solution: First note the formula for the variance, Var(X) = E[X²] − E[X]². We calculate the first two moments using integrals.
E[X] = ∫_{−∞}^{∞} x f (x)dx = ∫_0^1 x · 6x(1 − x)dx = ... = 1/2 = µ
E[X²] = ∫_{−∞}^{∞} x² f (x)dx = ∫_0^1 x² · 6x(1 − x)dx = ... = 3/10
The variance then becomes Var(X) = 3/10 − 1/4 = 1/20 = σ² ⇒ σ = 1/√20.
The probability we seek is
P (µ − 2σ < X < µ + 2σ) = P (1/2 − 2/√20 < X < 1/2 + 2/√20) = ∫_{1/2 − 2/√20}^{1/2 + 2/√20} 6x(1 − x)dx ≈ 0.984.
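The moments and the interval probability can be checked numerically (a sketch; the antiderivative 3x² − 2x³ of 6x(1 − x) is used for the last integral):

```python
from fractions import Fraction

# E[X] = 6*(1/3 - 1/4) and E[X^2] = 6*(1/4 - 1/5), from the integrals above.
mean = 6 * (Fraction(1, 3) - Fraction(1, 4))
second = 6 * (Fraction(1, 4) - Fraction(1, 5))
var = second - mean**2
print(mean, var)  # 1/2 1/20

# Interval probability via the antiderivative G(x) = 3x^2 - 2x^3.
sigma = float(var) ** 0.5
G = lambda x: 3 * x**2 - 2 * x**3
lo, hi = 0.5 - 2 * sigma, 0.5 + 2 * sigma
print(round(G(hi) - G(lo), 3))  # 0.984
```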
b) f (x) = (1/2)^x, with x = 1, 2, 3, ...
Solution: The procedure is similar to that of a) but we use sums to calculate the first two moments.
E[X] = Σ_{x=1}^∞ x(1/2)^x = ... = (1/2)(1 − 1/2)^{−2} = 2 = µ
E[X²] = Σ_{x=1}^∞ x²(1/2)^x = ... = (1 − 1/2)^{−3}(1/2 + (1/2)²) = 6
The variance then becomes Var(X) = 6 − 4 = 2 = σ² ⇒ σ = √2. The probability we seek is
P (µ − 2σ < X < µ + 2σ) = P (2 − 2√2 < X < 2 + 2√2) ≈ P (−0.83 < X < 4.83) = P (X = 1) + P (X = 2) + P (X = 3) + P (X = 4) = ... = 15/16.
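A quick check of the final probability, plus a partial-sum check of the mean (a sketch):

```python
from fractions import Fraction

# X has P(X = x) = (1/2)^x for x = 1, 2, 3, ...
# Probability of the interval (-0.83, 4.83) is P(X = 1) + ... + P(X = 4).
prob = sum(Fraction(1, 2) ** x for x in range(1, 5))
print(prob)  # 15/16

# Partial sum of x * (1/2)^x converges to the mean 2.
approx_mean = sum(x * 0.5**x for x in range(1, 60))
print(round(approx_mean, 6))  # 2.0
```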
15. Continuity theorem
One of the theorems in probability theory states that for so-called decreasing sequences of sets C1 ⊃ C2 ⊃ C3 ⊃ ..., for which we write lim_{n→∞} Cn = ∩_{n=1}^∞ Cn, it holds that
lim_{n→∞} P (Cn ) = P (lim_{n→∞} Cn ) = P (∩_{n=1}^∞ Cn ).
Consider now the probability of randomly picking a point from the interval [0, 1] with uniform distribution.
1. Show that the probability of taking the number from any subinterval [a, b] ⊂ [0, 1] is b − a.
Solution: With a uniform distribution on [0, 1] we know that f (x) = 1 for 0 ≤ x ≤ 1 and 0 otherwise. We can then calculate the sought probability: P (a ≤ X ≤ b) = ∫_a^b f (x)dx = ∫_a^b 1 dx = b − a.
2. Show using the theorem above that if X follows this distribution, P (X = c) = 0 for any single value c ∈ [0, 1].
Solution: The key idea here is to construct a sequence of sets Cn that converges to the point of interest. An example of such a sequence is Cn = [c − 1/n, c + 1/n]. By the answer to the previous exercise we know that P (Cn ) = c + 1/n − (c − 1/n) = 2/n. (For small values of n, or for c near the edges of the interval [0, 1], the interval must be intersected with [0, 1]; this only makes P (Cn ) smaller, so the argument is unaffected.) First we note that lim_{n→∞} Cn = {c} and that lim_{n→∞} 2/n = 0. We have
lim_{n→∞} P (Cn ) = P (lim_{n→∞} Cn ) = lim_{n→∞} 2/n = 0,
which gives the desired result P (X = c) = 0.
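A small numerical illustration of the shrinking intervals (a sketch; the choice c = 0.3 is arbitrary):

```python
def interval_prob(c, n):
    """Length of C_n = [c - 1/n, c + 1/n] clipped to [0, 1]; under the
    uniform density on [0, 1] this is exactly P(C_n)."""
    lo, hi = max(0.0, c - 1 / n), min(1.0, c + 1 / n)
    return hi - lo

# P(C_n) shrinks toward 0 as n grows.
for n in [1, 10, 100, 1000]:
    print(n, interval_prob(0.3, n))
```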