Second Assignment
Credits for problems: Davar Khosnevishan, Sergey Foss, Dima Korshunov, Igor Borisov, William Feller.

1. (2 points) Let (Ω, F, P) be a probability space. In other words, F is a σ-algebra of subsets of Ω, and P : F → R+ is such that P(Ω) = 1 and $P\big(\bigcup_i A_i\big) = \sum_i P(A_i)$ whenever the countable collection $\{A_i\} \subset F$ consists of pairwise disjoint sets. Show that if $\{B_i\}_{i=1}^{\infty} \subset F$ is any countable collection such that
$$\bigcup_{n=1}^{\infty} \bigcap_{i=n}^{\infty} B_i \;=\; \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} B_i, \qquad (*)$$
then $\lim_{n\to\infty} P(B_n)$ exists, and the limit is P(B), where B is the set in (*).
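In the standard notation, the two sides of (*) are the limit inferior and the limit superior of the sequence of sets:
$$\liminf_{n\to\infty} B_n := \bigcup_{n=1}^{\infty}\bigcap_{i=n}^{\infty} B_i, \qquad \limsup_{n\to\infty} B_n := \bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} B_i,$$
so (*) says precisely that these two events coincide.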
2. (4 points) A fair coin is tossed independently, 60 thousand times.
(i) Estimate the probability that 30 thousand heads show up.
(ii) Estimate the probability that 31 thousand heads show up.
(iii) Estimate the probability that the actual number of heads differs from 30 thousand by at
most 1%.
(iv) Suppose that the number of heads is more than 59,500. What, then, is the (conditional) distribution of the number of heads in excess of 59,500? Choose from the following options:
–(a) It is uniform on the set {1, 2, . . . , 500}.
–(b) It is concentrated near the center of the interval. That is, with very large probability, it is
within 1% of 250.
–(c) It is concentrated near the left endpoint of the interval. That is, with probability about 1,
the excess number of heads is 1 or 2 or 3.
Suggestion: I do not mind if you use a calculator to give me an answer. Of course, I would prefer an
answer by hand. But, in all cases, I want a number, not an abstract sum; the abstract sum is trivial. Part (iv) is a
multiple-choice question. Choose whichever answer seems right to you. You do not have to justify it. Of
course, I would not mind a justification (a proof?).
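For instance (one possibility only; the helper binom_pmf below is my own convenience, and the log-factorial trick merely avoids overflow), the numbers in (i)–(iii) can be computed directly:

```python
import math

def binom_pmf(n, k, p=0.5):
    # P(exactly k heads in n fair tosses), via log-factorials to avoid overflow
    log_pmf = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
               + k * math.log(p) + (n - k) * math.log(1 - p))
    return math.exp(log_pmf)

n = 60_000
print(binom_pmf(n, 30_000))                                  # (i)  exactly 30,000 heads
print(binom_pmf(n, 31_000))                                  # (ii) exactly 31,000 heads
print(sum(binom_pmf(n, k) for k in range(29_700, 30_301)))   # (iii) within 1% of 30,000
```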
3. (5 points) Consider a finite measure µ on the Borel subsets of the interval [0, 1]. (i) Prove that if f : [0, 1] → R is a continuous function and $\{\lambda_j,\ j = 1, 2, \ldots\}$ is a sequence of numbers in [0, 1], then
$$\lim_{n\to\infty} \sum_{j=1}^{n} f\!\left(\frac{j-\lambda_j}{n}\right) \mu\!\left(\left(\frac{j-1}{n}, \frac{j}{n}\right]\right) \;=\; \int f\, d\mu \;-\; f(0)\,\mu(\{0\}),$$
where $\int f\, d\mu$ is the Lebesgue integral of f against µ. (ii) Use this to prove that if µ is the Lebesgue measure, then the Riemann integral of f agrees with its Lebesgue integral.
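As a sanity check of (i) (an illustration only; the choices µ = Lebesgue measure, f(x) = x², and λ_j = 1/2 for all j are mine), the sum can be compared with the integral numerically:

```python
# With µ = Lebesgue measure on [0,1], f(x) = x^2, and lambda_j = 1/2 (midpoint tags),
# each interval ((j-1)/n, j/n] has measure 1/n and µ({0}) = 0, so the sum should tend to 1/3.
f = lambda x: x ** 2

for n in (10, 100, 1000):
    s = sum(f((j - 0.5) / n) * (1 / n) for j in range(1, n + 1))
    print(n, s)   # approaches 1/3 as n grows
```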
4. (2 points) Let f, g : [0, 1] → R+ be increasing functions: x < y ⇒ f(x) ≤ f(y), g(x) ≤ g(y). Show that
$$\int f(x)\,g(x)\,dx \;\ge\; \int f(x)\,dx \int g(x)\,dx,$$
where the integrals are Lebesgue integrals on [0, 1].
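For intuition (a numerical illustration only; f(x) = x and g(x) = x² are just one pair of increasing functions), the two sides can be approximated by Riemann sums:

```python
# Approximate both sides of the inequality for f(x) = x, g(x) = x^2 on [0, 1].
n = 100_000
xs = [(i + 0.5) / n for i in range(n)]
lhs = sum(x * x**2 for x in xs) / n                            # ∫ f g dx  ≈ 1/4
rhs = (sum(x for x in xs) / n) * (sum(x**2 for x in xs) / n)   # ∫ f dx · ∫ g dx ≈ 1/6
print(lhs, rhs, lhs >= rhs)
```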
5. (8 points) (i) Prove that if $\{A_n\}_{n=1}^{\infty}$ is a sequence of events (elements of F) in a probability space (Ω, F, P), such that P(A_n) → 0 and
$$\sum_{n=1}^{\infty} P(A_n \cap A_{n+1}^c) < \infty,$$
then
$$P\left(\bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} A_i\right) = 0.$$
(In other words, the probability of those ω ∈ Ω such that ω belongs to infinitely many terms of the sequence $\{A_n\}_{n=1}^{\infty}$ is zero.)
(ii) Let $X_1, X_2, \ldots$ be i.i.d. random variables, uniformly distributed in [0, 1], and let $M_n := \max(X_1, \ldots, X_n)$. Use (i) to show that
$$P\left( \lim_{n\to\infty} M_n^{\,n/\log n} = 1 \right) = 1. \qquad (**)$$
If you cannot show (**), then show the weaker statement $\lim_{n\to\infty} P\big(|M_n^{\,n/\log n} - 1| > \varepsilon\big) = 0$, for all ε > 0. For this, you do not need (i).
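A quick simulation makes (**) plausible (an illustration only; the natural logarithm is assumed in n/log n):

```python
import math, random

# Simulate M_n = max of n i.i.d. Uniform[0,1] draws and look at M_n^(n / log n).
random.seed(0)
for n in (10**2, 10**4, 10**6):
    m_n = max(random.random() for _ in range(n))
    # drifts toward 1, though slowly: the gap from 1 is of order 1/log n
    print(n, m_n ** (n / math.log(n)))
```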
6. (5 points) (i) Complete the proof of Bernstein's theorem which I started in the lectures: if f : [0, 1] → R is a continuous function, and if $B_n$ is the operator defined by
$$B_n f(x) := \sum_{k=0}^{n} \binom{n}{k}\, x^k (1-x)^{n-k}\, f(k/n),$$
then $B_n f$ converges to f uniformly, as n → ∞.
(ii) Show that, for all x, δ ∈ [0, 1], and all n ∈ N,
$$|B_n f(x) - f(x)| \;\le\; m(\delta) + \frac{\|f\|_{\infty}}{2\delta^2 n},$$
where $m(\delta) := \sup_{|x-y|\le\delta} |f(x) - f(y)|$ and $\|f\|_{\infty} := \sup_{0\le x\le 1} |f(x)|$.
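For intuition (an illustration only; the test function f(x) = |x − 1/2| is just one continuous choice), one can evaluate $B_n f$ on a grid and watch the sup-norm error shrink:

```python
import math

def bernstein(f, n, x):
    # B_n f(x) = sum_{k=0}^{n} C(n,k) x^k (1-x)^(n-k) f(k/n)
    return sum(math.comb(n, k) * x**k * (1 - x)**(n - k) * f(k / n)
               for k in range(n + 1))

f = lambda x: abs(x - 0.5)          # continuous but not smooth
grid = [i / 200 for i in range(201)]
for n in (10, 50, 250):
    err = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
    print(n, err)                    # the sup-norm error decreases as n grows
```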
7. (4 points) You are invited to play the following game of chance: A button is pressed and a
random positive number X is drawn according to some probability measure with (unknown)
continuous and strictly increasing distribution function F . You do not see X. You are asked to
press the button again so that another, independent, random number Y is produced, which you
see. You then have two options.
—Option 1 = Pass (do not gamble).
—Option 2 = Bet on the event Y > X.
The game is played again and again, independently each time. Question: Find a strategy
so that you make money in the long run, for sure (with probability 1). Part of the problem is to
convince yourself (not me!) that the question makes perfect sense, mathematically. The other part is to
find the strategy and prove that it works; you have to use the strong law of large numbers in the proof.
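To experiment with candidate strategies (a sketch only; the Exp(1) choice for F, the even-money stake of ±1 per bet, and the rule that a strategy sees only the current Y and past values of Y are assumptions made for illustration, not part of the problem):

```python
import random

# Simulation harness for the game; plug in any candidate strategy.
def play(strategy, rounds=100_000, seed=0):
    rng = random.Random(seed)
    history, winnings = [], 0
    for _ in range(rounds):
        x = rng.expovariate(1.0)      # hidden draw X (F assumed Exp(1) here)
        y = rng.expovariate(1.0)      # observed draw Y
        if strategy(y, history):      # decide whether to bet on {Y > X}
            winnings += 1 if y > x else -1
        history.append(y)
    return winnings

# Example: always betting has expected gain 0 per round, since P(Y > X) = 1/2.
print(play(lambda y, history: True))
```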
8. (6 points) A game of chance is called "fair" if the expected net winnings equal zero. So, if we are told that on the outcome k we agree on net winnings equal to $x_k$ (interpreted as a payment, if negative), then the game is fair if $\sum_k x_k p_k = 0$, where $p_k$ is the probability of outcome k.
Consider now a game with outcomes {0, 1, 2, . . .}, such that
pk =
1
,
k(k + 1)2k
k = 1, 2, . . . ,
and $p_0 = 1 - (p_1 + p_2 + \cdots)$, and the following payment schedule:
$$x_k := 2^k - 1, \quad k = 1, 2, \ldots, \qquad x_0 := -1.$$
Notice that
$$\sum_{k=0}^{\infty} x_k p_k = 0,$$
so that the game is fair. Let $X_1, X_2, \ldots$ be i.i.d. random variables with $P(X_n = x_k) = p_k$,
k = 0, 1, 2, . . . Then $S_n := X_1 + \cdots + X_n$ is the total net winnings in the first n games. Show
that, with $c_n := n/\log_2 n$, for all 0 < α < 1 < β,
$$\lim_{n\to\infty} P(-\beta c_n \le S_n \le -\alpha c_n) = 1.$$
In other words, the probability that $S_n$ is about $-c_n$ tends to 1. In particular,
$$P(S_n \to -\infty) = 1.$$
Does this violate the strong law of large numbers? What about “fairness”?
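As a numerical aside (an illustration only; the truncation points, the seed, and the sample size are arbitrary choices of mine), one can check the fairness identity and place a simulated $S_n$ next to $-c_n$:

```python
import math, random

# 1) Check the fairness identity sum_k x_k p_k = 0.  For k >= 1,
#    x_k p_k = (2^k - 1) / (k (k+1) 2^k) = (1 - 2^(-k)) / (k (k+1)).
K = 200_000
positive_part = sum((1 - 2.0 ** -k) / (k * (k + 1)) for k in range(1, K + 1))
p0 = 1 - sum(1 / (k * (k + 1) * 2.0 ** k) for k in range(1, 60))   # tail < 2^(-59)
print(positive_part - p0)       # ≈ 0 (the truncation error is of order 1/K)

# 2) Simulate S_n and print it next to -c_n = -n / log2(n).
#    Outcomes beyond k = 50 have probability < 2^(-50) and are ignored.
rng = random.Random(1)
outcomes = list(range(51))
probs = [p0] + [1 / (k * (k + 1) * 2.0 ** k) for k in range(1, 51)]
payoffs = [-1] + [2 ** k - 1 for k in range(1, 51)]
n = 1_000_000
draws = rng.choices(outcomes, weights=probs, k=n)
s_n = sum(payoffs[k] for k in draws)
print(s_n, -n / math.log2(n))   # single runs fluctuate a lot, but S_n is typically negative
```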