Exercises - Probability Theory III
October 2007
1 Compute the characteristic function of a uniform distribution on the interval [a, b]. What result is obtained in the special case a = −1, b = 1?
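One way to check such a computation numerically is to estimate the characteristic function E exp(itX) by Monte Carlo and compare it against the closed form you derive. A minimal sketch; the sample size, seed, and t grid are arbitrary choices:

```python
import numpy as np

# Empirical characteristic function of a Uniform[a, b] sample at a few t
# values -- a Monte Carlo sanity check to compare against the closed form
# the exercise asks for.
rng = np.random.default_rng(0)
a, b = -1.0, 1.0
x = rng.uniform(a, b, size=200_000)

ts = np.array([0.5, 1.0, 2.0])
phi_hat = np.array([np.exp(1j * t * x).mean() for t in ts])

# For the symmetric case a = -1, b = 1 the estimate should be (nearly) real.
print(np.round(phi_hat.real, 3), np.round(phi_hat.imag, 3))
```

In the symmetric case a = −1, b = 1, symmetry forces the characteristic function to be real, which the near-zero imaginary parts of the estimates reflect.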
2 Determine the distribution with characteristic function φ(t) = cos t, −∞ < t < ∞.
3 Determine the distribution with characteristic function
φ(t) = (t + sin t)/(2t), −∞ < t < ∞.
4 Determine the distribution with characteristic function
φ(t) = (1/5)(2 cos t + 3 cos 2t + i sin 2t), −∞ < t < ∞.
Then compute the expected value and variance – directly and by using φ(t).
5 Let φ(t) be the characteristic function of a distribution symmetric around 0. Show that
1 + φ(2t) ≥ 2φ(t)².
6 Show that φ(t) = cos t², −∞ < t < ∞, cannot be the characteristic function of a random variable X with EX² < ∞.
7 Show that if φ(t) is a characteristic function, then exp{λ(φ(t) − 1)} is a
characteristic function as well for positive λ. Hint: What is the probability
generating function of a random variable with a Poisson distribution?
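The claim in exercise 7 can be probed numerically: if N ~ Poisson(λ) and ξ1, ξ2, . . . are i.i.d. with characteristic function φ, the random sum ξ1 + · · · + ξN should have characteristic function exp{λ(φ(t) − 1)}. A Monte Carlo sketch, with ξ ~ Uniform[0, 1] chosen purely for illustration:

```python
import numpy as np

# Monte Carlo probe: the random sum S = xi_1 + ... + xi_N with N ~ Poisson(lam)
# should have characteristic function exp(lam * (phi(t) - 1)), where phi is the
# characteristic function of xi. xi ~ Uniform[0, 1] is an arbitrary choice.
rng = np.random.default_rng(8)
lam, reps = 3.0, 100_000
counts = rng.poisson(lam, size=reps)
s = np.array([rng.random(k).sum() for k in counts])  # one random sum per rep

t = 1.3
phi_xi = (np.exp(1j * t) - 1) / (1j * t)  # cf of Uniform[0, 1] at t
target = np.exp(lam * (phi_xi - 1))
emp = np.exp(1j * t * s).mean()
print(round(abs(emp - target), 4))
```

The printed gap between the empirical and predicted characteristic function values should be of Monte Carlo order, roughly 1/√reps.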
8 A random variable X is said to be infinitely divisible if, for each n = 1, 2, . . ., there are independent and identically distributed random variables X1, . . . , Xn such that Σ_{i=1}^{n} Xi has the same distribution as X. Show, by means of characteristic functions, that X is infinitely divisible when
a) X is normally distributed with mean 0 and variance σ²;
b) X is Poisson distributed with mean λ.
9 Compute E(cos X) and Var(cos X) when X is normally distributed with
mean 0 and variance 1.
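A Monte Carlo sketch for checking an answer to this exercise; the sample size and seed are arbitrary, and the printed estimates can be compared against the closed forms obtained from the characteristic function of N(0, 1):

```python
import numpy as np

# Monte Carlo estimates of E(cos X) and Var(cos X) for X ~ N(0, 1). Note that
# E(cos X) = Re E(exp(iX)), the real part of the characteristic function at
# t = 1, which connects this exercise to the preceding ones.
rng = np.random.default_rng(1)
c = np.cos(rng.standard_normal(1_000_000))
mean_est, var_est = c.mean(), c.var()
print(round(float(mean_est), 3), round(float(var_est), 3))
```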
10 Let g be a given non-negative function.
a) If X is a continuous random variable with density fX, show that
∫_{−∞}^{∞} g(x) fX(x) dx < ∞  if  ∫_{−∞}^{∞} g²(x) fX(x) dx < ∞.
b) Is it true that
∫_{−∞}^{∞} g(x) dx < ∞  if  ∫_{−∞}^{∞} g²(x) dx < ∞ ?
Prove or give a counterexample.
11 Let
Z = (1/n) Σ_{i=1}^{n} Xi
be the arithmetic mean of n independent and Cauchy-distributed random variables. Show that Z is Cauchy-distributed as well. Doesn’t this violate the Law of Large Numbers?
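The phenomenon is easy to see in simulation: the sample mean of n standard Cauchy variables has the same quartiles (±1) as a single standard Cauchy, however large n is. A sketch with arbitrary n, replication count, and seed:

```python
import numpy as np

# Empirical quartiles of the mean of n standard Cauchy variables; for a
# standard Cauchy these quartiles are -1 and 1, so matching values illustrate
# that averaging does not concentrate the distribution.
rng = np.random.default_rng(2)
n, reps = 50, 100_000
z = rng.standard_cauchy(size=(reps, n)).mean(axis=1)
q25, q75 = np.quantile(z, [0.25, 0.75])
print(round(float(q25), 2), round(float(q75), 2))
```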
12 Compute the density function of the continuous random variable X, if the characteristic function of X is given by
φ(t) = 1 − |t| for |t| ≤ 1, and 0 otherwise.
13 Let (X, Y) be a bivariate random variable with finite second moments and joint characteristic function φ(s, t). Suppose there exists a function g(t) such that
φ(at, bt) = g(t)
for all a and b such that a² + b² = 1. Prove that the random variables X and Y are uncorrelated and equally distributed.
14 In the Continuity Theorem, it is required that the limit function φ of a sequence of characteristic functions is continuous at the origin in order for φ to be a characteristic function itself. Show that this requirement cannot be dropped. Hint: Consider a sequence Xn of random variables, where Xn is uniformly distributed on [−n, n].
15 Suppose that X is Poisson distributed with parameter λ. Show that asymptotically, Y = (X − EX)/√(Var X) is normally distributed when λ → ∞.
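The claimed normal limit can be checked by simulation: standardize Poisson(λ) samples for a large λ and compare empirical probabilities with the standard normal CDF. The value of λ, the sample size, and the t values below are arbitrary:

```python
import math
import numpy as np

# Standardize Poisson(lam) samples for large lam and compare the empirical
# P(Y <= t) with the standard normal CDF.
def std_normal_cdf(t):
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

rng = np.random.default_rng(3)
lam = 400.0
x = rng.poisson(lam, size=500_000)
y = (x - lam) / math.sqrt(lam)
for t in (0.0, 1.0):
    print(t, round(float(np.mean(y <= t)), 3), round(std_normal_cdf(t), 3))
```

The residual gap at finite λ is a discreteness effect of order 1/√λ, consistent with the limit being asymptotic.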
16 Suppose X has a Γ(p, a)-distribution. Show that asymptotically, Y = (X − EX)/√(Var X) has a normal distribution when p → ∞.
17 Given a sequence Xn of random variables, where Xn ∼ Bin(n, λ/n).
Show that Xn converges in distribution to a Poisson distribution.
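Since both pmfs are explicit, the convergence can be illustrated exactly, without sampling: the pointwise gap between the Bin(n, λ/n) pmf and the Poisson(λ) pmf shrinks as n grows. The value of λ and the range of k below are arbitrary:

```python
import math

# Pointwise comparison of the Bin(n, lam/n) pmf with the Poisson(lam) pmf:
# the maximum gap over small k should shrink as n grows.
lam = 2.0

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

gaps = []
for n in (10, 100, 1000):
    gap = max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam)) for k in range(10))
    gaps.append(gap)
    print(n, round(gap, 5))
```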
18 Given a sequence Xn of random variables, where Xn ∼ Geom(λ/n).
Show that Xn /n converges in distribution to an exponential distribution.
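Here too the comparison can be made exactly: for Xn ~ Geom(λ/n) on {1, 2, . . .}, the tail P(Xn/n > t) equals (1 − λ/n)^⌊nt⌋, which can be set against the exponential tail exp(−λt). The values of λ and t below are arbitrary:

```python
import math

# Exact tail P(X_n / n > t) = (1 - lam/n)^floor(n t) for X_n ~ Geom(lam/n)
# on {1, 2, ...}, compared with the exponential tail exp(-lam t) as n grows.
lam, t = 1.5, 0.8
tails = []
for n in (10, 100, 10_000):
    tail = (1.0 - lam / n) ** math.floor(n * t)
    tails.append(tail)
    print(n, round(tail, 4), round(math.exp(-lam * t), 4))
```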
19 Let X and Y be two independent symmetric random variables such that
X + Y and X − Y are independent as well. (Symmetry of X means that X
and −X have the same distribution.) Show, e.g. by means of characteristic
functions, that X and Y have the same distribution.
20 Let Xi be a sequence of independent random variables, all having a uniform distribution on [0, θ]. Define Yn = min{X1 , X2 , . . . , Xn }, n ≥ 1. Show
that nYn converges in distribution and write down the limit distribution.
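A simulation hint for identifying the limit: the minimum of n i.i.d. Uniform[0, 1] variables has a Beta(1, n) law, so nYn can be sampled cheaply and its empirical tail inspected. θ, n, the replication count, and the t grid below are arbitrary:

```python
import numpy as np

# min of n i.i.d. Uniform[0, 1] variables ~ Beta(1, n), so n * Y_n for
# uniforms on [0, theta] can be simulated as n * theta * Beta(1, n). The
# printed empirical tail P(n Y_n > t) suggests the limit distribution.
rng = np.random.default_rng(4)
theta, n, reps = 2.0, 1000, 200_000
z = n * theta * rng.beta(1.0, n, size=reps)
for t in (0.5, 1.0, 2.0):
    print(t, round(float(np.mean(z > t)), 3))
```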
21 Suppose that Xn is a sequence of random variables converging to X in
distribution, and that an is a sequence of positive numbers converging to 0
as n → ∞. Show that an Xn converges to 0 in probability as n → ∞.
22 Suppose Xn is a sequence of random variables converging almost surely
to X, and that cn is a sequence of numbers converging to 0 as n → ∞. Show
that Xn + cn converges almost surely to X as n → ∞.
23 Let Xn be a sequence of independent random variables, defined on the
same probability space, with
P(Xn = n) = pn and P(Xn = 0) = 1 − pn.
Give conditions on {pn } that imply convergence of Xn to 0
a) almost surely (you only need to give a sufficient condition);
b) in rth moment;
c) in probability;
d) in distribution.
24 Let Xi be a sequence of independent random variables with expected value 0 and Var Xi ≤ M for all i. Show that the sequence
Yn = (1/n) Σ_{i=1}^{n} Xi
of arithmetic means converges to 0 in probability as n → ∞.
25 Let Xi be a sequence of independent random variables, uniformly distributed on [0, 1]. Show that the sequence
Yn = (X1 X2 X3 · · · Xn)^{1/n}
of geometric means converges in probability to a certain constant as n → ∞.
Which constant?
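A simulation sketch that suggests the constant without giving it away: by the law of large numbers applied to log Xi, the geometric mean settles near a fixed value as n grows. The n values and the seed are arbitrary:

```python
import numpy as np

# Geometric mean of n Uniform[0, 1] variables for growing n; it should
# stabilize near the constant the exercise asks for.
rng = np.random.default_rng(5)
vals = []
for n in (10, 1000, 100_000):
    x = rng.random(n)
    gm = float(np.exp(np.log(x).mean()))
    vals.append(gm)
    print(n, round(gm, 4))
```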
26 Let Xi be a sequence of independent random variables, all uniformly distributed on [0, θ]. Show that max{X1, X2, . . . , Xn} → θ in probability.
27 Let Xi be a sequence of random variables with expected value 0 and the same finite variance. Suppose, in addition, that Cov(Xi, Xj) ≤ 0 if i ≠ j. Show that the sequence
Yn = (1/n) Σ_{i=1}^{n} Xi
converges to 0 in probability.
28 Prove or give a counterexample:
a) Xn → X almost surely and Yn → Y almost surely imply Xn + Yn → X + Y almost surely;
b) Xn → X in r-th mean and Yn → Y in r-th mean imply Xn + Yn → X + Y in r-th mean;
c) Xn → X in probability and Yn → Y in probability imply Xn + Yn → X + Y in probability;
d) Xn → X in distribution and Yn → Y in distribution imply Xn + Yn → X + Y in distribution.
29 Prove or give a counterexample:
a) Xn → X almost surely and Yn → Y almost surely imply Xn Yn → XY almost surely;
b) Xn → X in r-th mean and Yn → Y in r-th mean imply Xn Yn → XY in r-th mean;
c) Xn → X in probability and Yn → Y in probability imply Xn Yn → XY in probability;
d) Xn → X in distribution and Yn → Y in distribution imply Xn Yn → XY in distribution.
30 Let A1, A2, . . . be a sequence of independent events such that
P(Ai+1) ≥ (i/(i + 1)) P(Ai)
for i = 1, 2, . . .. Show that P(Ai i.o.) is either 0 or 1.
31 Let X and Y be random variables defined on the same probability space such that X is N(0, 1)-distributed and Y is Cauchy-distributed. According to which convergence criteria (in first mean, almost surely, in probability, in distribution) do the sequences
Zn = aⁿX + bⁿY
converge for different a and b such that −1 ≤ a, b ≤ 1?
32 Let g : R → R be continuous. Show that g(Xn) → g(X) in probability if Xn → X in probability.
33 Let g : R → R be continuous and bounded. Show that Eg(Xn) → Eg(X) if Xn → X in distribution. Hint: Use Skorokhod’s Representation Theorem.
34 Let Yj be independent and identically distributed random variables, each one uniformly distributed on {0, 1, . . . , 9}. Show, e.g. by means of characteristic functions, that the sequence
Xn = Σ_{j=1}^{n} Yj 10^{−j}
converges in distribution to a uniform distribution on [0, 1]. Then show that Xn → X almost surely for some X that is uniformly distributed on [0, 1].
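The first claim can be checked numerically by building Xn from simulated decimal digits and comparing its sample moments with those of Uniform[0, 1], namely mean 1/2 and variance 1/12. The truncation depth n and the replication count below are arbitrary:

```python
import numpy as np

# Build X_n = sum_{j=1}^{n} Y_j * 10^(-j) from independent uniform decimal
# digits and compare the sample mean and variance with 1/2 and 1/12, the
# moments of Uniform[0, 1].
rng = np.random.default_rng(6)
n, reps = 12, 200_000
digits = rng.integers(0, 10, size=(reps, n))
weights = 10.0 ** -np.arange(1, n + 1)
x = digits @ weights
print(round(float(x.mean()), 3), round(float(x.var()), 4))
```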
35 Suppose that
Σ_{n=1}^{∞} E|Xn − X|^r < ∞  for some r > 0.
Show that Xn → X almost surely.
36 Prove the following version of the Strong Law of Large Numbers: If Xi is a sequence of independent and identically distributed random variables with expected value 0 and finite fourth moment, EX1⁴ < ∞, then
Yn = (1/n) Σ_{i=1}^{n} Xi → 0 almost surely.
37 If X has mean 0 and variance σ², show that
P(X ≥ t) ≤ σ²/(σ² + t²)  for t ≥ 0.
Hint: Introduce Y = X + c and then use Chebyshev’s Inequality.
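A numerical illustration of this one-sided bound; the choice of a centered Exp(1) variable (mean 0, variance 1) is arbitrary, made only for the sketch:

```python
import numpy as np

# Check empirically that P(X >= t) <= sigma^2 / (sigma^2 + t^2) holds for a
# mean-zero variable; here X is a centered Exp(1) variable with variance 1.
rng = np.random.default_rng(7)
x = rng.exponential(1.0, size=500_000) - 1.0
pairs = []
for t in (0.5, 1.0, 2.0):
    tail = float(np.mean(x >= t))
    bound = 1.0 / (1.0 + t * t)
    pairs.append((tail, bound))
    print(t, round(tail, 3), round(bound, 3))
```

For this distribution the empirical tails sit well below the bound, as expected for an inequality that must hold over all mean-zero, variance-σ² laws.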