SOLUTION FOR HOMEWORK 8, STAT 4351
Welcome to your 8th homework. We begin our work on classical distributions. This HW
will be primarily devoted to binomial and negative binomial discrete distributions.
As usual, try to find mistakes and get extra points!
In what follows θ ∈ (0, 1); you can always study the two boundary points separately
because they imply trivial RVs.
Now let us look at your problems.
1. Problem 5.1. Let f_X(x) = k^{−1}, x = 1, 2, . . . , k. Then:
(a)
E(X) = Σ_{x=1}^{k} x f_X(x) = k^{−1} Σ_{x=1}^{k} x = k^{−1}[k(k + 1)/2] = (k + 1)/2.
Please note that the result is reasonable — the mean is in the middle of the range.
(b)
Var(X) = E(X²) − [E(X)]² = E(X²) − [(k + 1)/2]².
Further,
E(X²) = Σ_{x=1}^{k} x² f_X(x) = k^{−1} Σ_{x=1}^{k} x².
Using the formula for S(k, 2) on page 554 we get
E(X²) = k^{−1}[(1/6)k(k + 1)(2k + 1)] = (1/6)(k + 1)(2k + 1).
We conclude that
Var(X) = (1/6)(k + 1)(2k + 1) − (k + 1)²/4 = (1/12)(k + 1)[4k + 2 − 3k − 3]
= (k + 1)(k − 1)/12 = (k² − 1)/12.
Please note that if k = 1 then the variance is zero as it should be because in this case X is
deterministic; this is a way to check your calculations.
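The k = 1 check above can be automated. Here is a small Python sketch (an illustration added to the solution, not part of the graded answer) that recomputes the mean and variance of the discrete uniform distribution by brute force with exact rational arithmetic and compares them with the closed forms (k + 1)/2 and (k² − 1)/12:

```python
from fractions import Fraction

def uniform_moments(k):
    """Brute-force mean and variance of the discrete uniform on {1, ..., k}."""
    p = Fraction(1, k)                       # f_X(x) = 1/k
    mean = sum(x * p for x in range(1, k + 1))
    var = sum(x * x * p for x in range(1, k + 1)) - mean**2
    return mean, var

for k in (1, 2, 6, 10):
    mean, var = uniform_moments(k)
    assert mean == Fraction(k + 1, 2)        # (k + 1)/2
    assert var == Fraction(k * k - 1, 12)    # (k^2 - 1)/12, zero when k = 1
```

Using `Fraction` makes the comparison exact, so the check does not depend on floating-point tolerances.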
2. Problem 5.8. Note that b(x, n, θ) = P(X = x|n, θ) with X ∼ Bin(n, θ).
Remark: in other texts/tables/software you may see Bin(n, θ) or Bin(θ, n); just keep in
mind that n is the sample size (number of trials) and thus an integer, while θ (p is also a
popular notation) is the probability of S and thus between 0 and 1. Thus you can always
figure out which is which.
Now:
b(x + 1, n, θ) = [n!/((x + 1)!(n − x − 1)!)] θ^{x+1}(1 − θ)^{n−x−1}
= [θ(n − x)/((1 − θ)(x + 1))] [n!/(x!(n − x)!)] θ^{x}(1 − θ)^{n−x}
= [θ(n − x)/((1 − θ)(x + 1))] b(x, n, θ).
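The recursion can be verified numerically. A minimal Python check (my own illustration; `b` here is just the binomial pmf written with `math.comb`):

```python
from math import comb

def b(x, n, theta):
    """Binomial pmf b(x; n, theta)."""
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

n, theta = 10, 0.3
for x in range(n):
    lhs = b(x + 1, n, theta)
    rhs = theta * (n - x) / ((1 - theta) * (x + 1)) * b(x, n, theta)
    assert abs(lhs - rhs) < 1e-12   # recursion holds for every x
```

Recursions like this are how binomial tables are built column by column without recomputing factorials.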
3. Problem 5.16. Let Y be the number of failures before the kth success. Then the last
trial is S, and the earlier Y + (k − 1) trials form a binomial experiment with exactly (k − 1) Ss. Then
P(Y = y|θ, k) = [((y + k − 1)!/((k − 1)! y!)) θ^{k−1}(1 − θ)^{y}] θ
= ((y + k − 1)!/((k − 1)! y!)) θ^{k}(1 − θ)^{y}, y = 0, 1, 2, . . .
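To see that this pmf is a proper distribution, one can sum it numerically. A short Python sketch (added for illustration; the truncation point 200 is arbitrary but makes the neglected tail negligible):

```python
from math import factorial

def negbin_pmf(y, k, theta):
    """P(Y = y): probability of y failures before the k-th success."""
    coef = factorial(y + k - 1) // (factorial(k - 1) * factorial(y))
    return coef * theta**k * (1 - theta)**y

total = sum(negbin_pmf(y, 3, 0.4) for y in range(200))
assert abs(total - 1.0) < 1e-9   # pmf sums to one over y = 0, 1, 2, ...
```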
4. Problem 5.17. Here one should be smart about the approach. It is given that
Y = X − k, where X is the total number of trials needed to get k successes. We know (page 175) that
E(X) = k/θ and Var(X) = (k/θ)(θ^{−1} − 1). Thus
E(Y) = E(X) − k = k/θ − k = k(1 − θ)/θ,
Var(Y) = Var(X) = (k/θ)(θ^{−1} − 1).
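These moments can be confirmed by direct (truncated) summation over the pmf of Y from Problem 5.16. A quick Python check, added for illustration with example values k = 3, θ = 0.4:

```python
from math import comb

k, theta = 3, 0.4

def pmf(y):
    """P(Y = y) = C(y + k - 1, y) theta^k (1 - theta)^y."""
    return comb(y + k - 1, y) * theta**k * (1 - theta)**y

ys = range(500)   # tail beyond 500 is negligible for theta = 0.4
mean = sum(y * pmf(y) for y in ys)
var = sum(y * y * pmf(y) for y in ys) - mean**2
assert abs(mean - k * (1 - theta) / theta) < 1e-9        # k(1 - theta)/theta
assert abs(var - (k / theta) * (1 / theta - 1)) < 1e-9   # (k/theta)(theta^{-1} - 1)
```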
5. Problem 5.20. Here X ∼ Geom(θ) = Negbin(1, θ). Thus
f_X(x) = (1 − θ)^{x−1} θ, x = 1, 2, . . .
Recall that X is the total number of Bernoulli trials until the first S occurs. Then
M_X(t) = E{e^{tX}} = Σ_{x=1}^{∞} e^{tx}(1 − θ)^{x−1} θ
= Σ_{x=1}^{∞} e^{tx + x ln(1−θ)} θ/(1 − θ) = θ(1 − θ)^{−1} Σ_{x=1}^{∞} e^{x(t + ln(1−θ))}.
Recall the formula for a geometric sum (we discussed it in class):
Σ_{k=1}^{∞} q^{k} = (q − q^{∞})/(1 − q) = q/(1 − q), |q| < 1.
Thus for sufficiently small t we can write
M_X(t) = [θ/(1 − θ)] e^{t + ln(1−θ)}/[1 − e^{t + ln(1−θ)}]
= θ(1 − θ)e^{t}/((1 − θ)[1 − (1 − θ)e^{t}]) = θe^{t}/(1 − e^{t} + θe^{t}).
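For small t the closed form can be compared against a truncated version of the defining series. A Python sketch (illustration only; "sufficiently small t" means (1 − θ)e^t < 1, so the series converges):

```python
from math import exp

theta, t = 0.3, 0.1   # here (1 - theta) * exp(t) ≈ 0.77 < 1, so the series converges
mgf_closed = theta * exp(t) / (1 - exp(t) + theta * exp(t))
mgf_series = sum(exp(t * x) * (1 - theta)**(x - 1) * theta for x in range(1, 2000))
assert abs(mgf_closed - mgf_series) < 1e-9
```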
6. Problem 5.23. Here X ∼ Geom(θ), f_X(x) = θ(1 − θ)^{x−1}, x = 1, 2, . . . Then, for any
positive integer x we can write
P(X = x + n|X > n) = P(X = x + n, X > n)/P(X > n) = P(X = x + n)/P(X > n)
= θ(1 − θ)^{x+n−1}/[θ Σ_{k=n+1}^{∞} (1 − θ)^{k−1}].
Working on the geometric sum in the denominator (x is already in use, so we sum over k) we get
Σ_{k=n+1}^{∞} (1 − θ)^{k−1} = Σ_{k=n}^{∞} (1 − θ)^{k} = [(1 − θ)^{n} − (1 − θ)^{∞}]/[1 − (1 − θ)] = (1 − θ)^{n}/θ.
We plug in the result and get
P(X = x + n|X > n) = θ(1 − θ)^{x+n−1}/(1 − θ)^{n} = θ(1 − θ)^{x−1}.
Note that this is the case of a memoryless random variable (do you recall another example
of a continuous RV with the same property?).
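The memoryless property can also be checked numerically without using the closed form for P(X > n). A small Python illustration (the truncation point 5000 is arbitrary; the neglected tail is far below the tolerance):

```python
theta, n = 0.25, 4

def pmf(x):
    """Geometric pmf: theta * (1 - theta)^(x - 1)."""
    return theta * (1 - theta)**(x - 1)

tail = sum(pmf(x) for x in range(n + 1, 5000))   # P(X > n), truncated
for x in range(1, 20):
    cond = pmf(x + n) / tail                      # P(X = x + n | X > n)
    assert abs(cond - pmf(x)) < 1e-9              # equals the unconditional pmf
```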
7. Problem 5.24. Consider the familiar failure rate function
Z(x) = f(x)/[1 − F(x − 1)].
Note that
1 − F(x − 1) = P(X ≥ x) = θ Σ_{k=x}^{∞} (1 − θ)^{k−1} = θ[(1 − θ)^{x−1} − (1 − θ)^{∞}]/[1 − (1 − θ)] = (1 − θ)^{x−1}.
We plug in and get for the failure rate function:
Z(x) = θ(1 − θ)^{x−1}/(1 − θ)^{x−1} = θ.
Please think about how this constant failure rate is related to the memoryless property.
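A numerical check of the constant failure rate, with the survival probability computed by truncated summation rather than the closed form (an illustration I added):

```python
theta = 0.2

def pmf(x):
    """Geometric pmf: theta * (1 - theta)^(x - 1)."""
    return theta * (1 - theta)**(x - 1)

for x in range(1, 30):
    surv = sum(pmf(k) for k in range(x, 4000))   # P(X >= x) = 1 - F(x - 1), truncated
    assert abs(pmf(x) / surv - theta) < 1e-9     # Z(x) = theta for every x
```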
8. Problem 5.25. Here the RV of interest is X = Σ_{i=1}^{n} X_i with X_i ∼ Bern(θ_i). Then:
(a) Denote θ̄ := n^{−1} Σ_{i=1}^{n} θ_i (note that it is the average probability of Ss) and write:
E(X) = Σ_{i=1}^{n} E(X_i) = Σ_{i=1}^{n} θ_i = nθ̄.
(b) For the variance we use independence of the X_i and write:
Var(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} Var(X_i) = Σ_{i=1}^{n} θ_i(1 − θ_i) = Σ_{i=1}^{n} θ_i − Σ_{i=1}^{n} θ_i² = nθ̄ − Σ_{i=1}^{n} θ_i².
Note that
Σ_{i=1}^{n} θ_i² = Σ_{i=1}^{n} [(θ_i − θ̄) + θ̄]² = Σ_{i=1}^{n} (θ_i − θ̄)² + 2θ̄ Σ_{i=1}^{n} (θ_i − θ̄) + nθ̄²,
and because Σ_{i=1}^{n} (θ_i − θ̄) = 0 (use the definition of θ̄ to check this) we can write
Var(Σ_{i=1}^{n} X_i) = nθ̄ − nθ̄² − Σ_{i=1}^{n} (θ_i − θ̄)² = nθ̄(1 − θ̄) − nσ_θ²,
where σ_θ² := n^{−1} Σ_{i=1}^{n} (θ_i − θ̄)² is the variance of the θ_i around their average.
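The decomposition Var(X) = nθ̄(1 − θ̄) − nσ_θ² can be checked on any set of probabilities. A short Python check (the particular θ_i values are my own example):

```python
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]   # arbitrary example probabilities theta_i
n = len(thetas)
tbar = sum(thetas) / n                              # average success probability
var_direct = sum(t * (1 - t) for t in thetas)       # sum of Var(X_i) = theta_i(1 - theta_i)
sigma2 = sum((t - tbar)**2 for t in thetas) / n     # variance of the theta_i
assert abs(var_direct - (n * tbar * (1 - tbar) - n * sigma2)) < 1e-12
```

Note the moral: heterogeneous success probabilities always reduce the variance below the Binomial value nθ̄(1 − θ̄).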
9. Problem 5.41. Here X ∼ Binom(θ = .1, n = 5), and
P(X ≥ 3) = P(X = 3) + P(X = 4) + P(X = 5)
= [5!/(3!2!)](.1)³(.9)² + [5!/(4!1!)](.1)⁴(.9)¹ + [5!/(5!0!)](.1)⁵(.9)⁰.
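Numerically (a check I added; with θ = .1 large counts are rare, so the answer is small):

```python
from math import comb

theta, n = 0.1, 5
p = sum(comb(n, x) * theta**x * (1 - theta)**(n - x) for x in range(3, n + 1))
assert abs(p - 0.00856) < 1e-9   # 0.0081 + 0.00045 + 0.00001
```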
10. Problem 5.44. Here X ∼ Binom(θ = .5, n = 18), and
(a)
P(X = 10) = [18!/(10!8!)](.5)^{18}.
(b)
P(X ≥ 10) = Σ_{i=10}^{18} P(X = i) = Σ_{i=10}^{18} [18!/(i!(18 − i)!)](.5)^{18}.
(c)
P(X ≤ 8) = Σ_{i=0}^{8} P(X = i) = Σ_{i=0}^{8} [18!/(i!(18 − i)!)](.5)^{18}.
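These three probabilities can be evaluated and cross-checked: by symmetry of Binom(18, 1/2) around 9 we must have P(X ≥ 10) = P(X ≤ 8), and the three events in (a)–(c) together with {X = 9} partition the sample space. A Python sketch added for illustration:

```python
from math import comb

n = 18

def pmf(x):
    """Binom(18, 1/2) pmf: C(18, x) (1/2)^18."""
    return comb(n, x) * 0.5**n

p_a = pmf(10)                                # (a) P(X = 10)
p_b = sum(pmf(i) for i in range(10, n + 1))  # (b) P(X >= 10)
p_c = sum(pmf(i) for i in range(0, 9))       # (c) P(X <= 8)
assert abs(p_b - p_c) < 1e-12                # symmetry about 9
assert abs(p_b + p_c + pmf(9) - 1.0) < 1e-12 # the pieces partition the sample space
```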
11. Problem 5.50.
(a) σ = [np(1 − p)]^{1/2}, so to reduce it by half, choose n* = n/4.
(b) Here
σ_k = [knp(1 − p)]^{1/2} = k^{1/2}σ.
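Both scaling facts follow from σ = [np(1 − p)]^{1/2} and are easy to verify on example numbers (n = 100, p = 0.3 below are my arbitrary choices):

```python
from math import sqrt

n, p = 100, 0.3                      # arbitrary example values
sigma = sqrt(n * p * (1 - p))
# (a) replacing n by n/4 halves the standard deviation
assert abs(sqrt((n / 4) * p * (1 - p)) - sigma / 2) < 1e-12
# (b) replacing n by k*n multiplies it by sqrt(k)
k = 9
assert abs(sqrt(k * n * p * (1 - p)) - sqrt(k) * sigma) < 1e-12
```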