STAT 353 – Solutions: Assignment 5
Winter, 2016 (Total 25 marks)
Problem 1 (From Sheet.) (4 marks)
(a) (2 marks) We can define three random variables, say $Y_1$, $Y_2$ and $Y_3$, where $Y_1$ is the number of $X$'s that are less than .2, $Y_2$ is the number of $X$'s that are greater than .8, and $Y_3$ is the number of $X$'s that fall in the interval $[.2, .8]$. Then $(Y_1, Y_2, Y_3)^T$ will have a Multinomial distribution with parameters $n = 10$, $p_1 = .2$, $p_2 = .2$, and $p_3 = .6$. The probability that exactly two of the $X_i$'s are less than .2 and exactly two of them are greater than .8 is then
\[
P(Y_1 = 2, Y_2 = 2, Y_3 = 6) = \frac{10!}{2!\,2!\,6!}\,(.2)^2(.2)^2(.6)^6 = 1260 \times 0.0000746496 \approx 0.094.
\]
(b) (2 marks) Here we can define five random variables, say $Z_i$, $i = 0, \ldots, 4$, where $Z_i$ is the number of $X$'s that fall in the interval $[0.2i, 0.2i + .2)$. Then $(Z_0, Z_1, Z_2, Z_3, Z_4)^T$ has a Multinomial distribution with parameters $n = 10$ and $p_i = .2$, for $i = 0, \ldots, 4$. Then the probability that each of the intervals $[0.2i, 0.2i + .2)$, $i = 0, 1, 2, 3, 4$, contains exactly two of the $X_i$'s is
\[
P(Z_0 = 2, Z_1 = 2, Z_2 = 2, Z_3 = 2, Z_4 = 2) = \frac{10!}{2!\,2!\,2!\,2!\,2!}\,(.2)^{10} = 113400 \times 0.0000001024 \approx 0.0116.
\]
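As a quick numeric check (not part of the graded solution), both multinomial probabilities can be recomputed with only the Python standard library; `multinomial_pmf` below is an illustrative helper, not a library function:

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """Multinomial pmf: n!/(y1!...yk!) * p1^y1 * ... * pk^yk."""
    n = sum(counts)
    coef = factorial(n)
    for y in counts:
        coef //= factorial(y)
    p = 1.0
    for y, q in zip(counts, probs):
        p *= q ** y
    return coef * p

# (a) (Y1, Y2, Y3) ~ Multinomial(10; .2, .2, .6)
pa = multinomial_pmf([2, 2, 6], [0.2, 0.2, 0.6])
# (b) (Z0, ..., Z4) ~ Multinomial(10; .2, .2, .2, .2, .2)
pb = multinomial_pmf([2, 2, 2, 2, 2], [0.2] * 5)
print(round(pa, 3), round(pb, 4))  # 0.094 0.0116
```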
Problem 2 (From Sheet.) (5 marks) We have seen in class that $\Gamma(1/2) = \sqrt{\pi}$. This handles the $n = 0$ case. For $n \geq 1$, using the recursive property of the Gamma function, $\Gamma(\alpha) = (\alpha - 1)\Gamma(\alpha - 1)$ for $\alpha > 1$, we have
\begin{align*}
\Gamma(n + 1/2) &= (n - 1 + 1/2)\Gamma(n - 1 + 1/2)\\
&\;\;\vdots\\
&= (n - 1 + 1/2)(n - 2 + 1/2)\cdots(n - n + 1/2)\,\Gamma(n - n + 1/2)\\
&= \frac{2n-1}{2}\cdot\frac{2n-3}{2}\cdots\frac{2n-(2n-1)}{2}\,\sqrt{\pi}\\
&= \frac{(2n)!}{2^n\,(2n)(2n-2)\cdots(2n-(2n-2))}\,\sqrt{\pi}\\
&= \frac{(2n)!}{2^n\,2^n\,(n)(n-1)\cdots(n-(n-1))}\,\sqrt{\pi}\\
&= \frac{(2n)!\,\sqrt{\pi}}{4^n\,n!}.
\end{align*}
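The closed form can be sanity-checked against Python's `math.gamma` for small $n$; this is an illustrative verification sketch, and `gamma_half` is a made-up helper name:

```python
from math import gamma, factorial, sqrt, pi, isclose

def gamma_half(n):
    """Closed form Gamma(n + 1/2) = (2n)! * sqrt(pi) / (4^n * n!)."""
    return factorial(2 * n) * sqrt(pi) / (4 ** n * factorial(n))

# Compare against math.gamma for n = 0, ..., 5.
for n in range(6):
    assert isclose(gamma_half(n), gamma(n + 0.5), rel_tol=1e-12)
print("identity verified for n = 0..5")
```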
Problem 3 (From Sheet.) (4 marks) The probability that the point $(X, Y)$ is at a distance of more than $1.5\sigma$ from the origin is
\[
P\left(\sqrt{X^2 + Y^2} > 1.5\sigma\right) = P\left(X^2 + Y^2 > 2.25\sigma^2\right) = P\left(\frac{X^2 + Y^2}{\sigma^2} > 2.25\right).
\]
Since $X/\sigma$ and $Y/\sigma$ are independent $N(0, 1)$ random variables, we have (from results in class) that $X^2/\sigma^2$ and $Y^2/\sigma^2$ are independent $\chi^2$ random variables with 1 degree of freedom. Therefore (also from results in class), $X^2/\sigma^2 + Y^2/\sigma^2$ has a $\chi^2$ distribution with 2 degrees of freedom, which is the same as an exponential distribution with parameter 1/2. Therefore,
\[
P\left(\sqrt{X^2 + Y^2} > 1.5\sigma\right) = P\left(\frac{X^2 + Y^2}{\sigma^2} > 2.25\right) = e^{-2.25/2} \approx 0.3246.
\]
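The exponential tail value can be confirmed both analytically and by a small Monte Carlo simulation; the choice of `sigma`, the seed, and the number of trials below are arbitrary illustrative values:

```python
import random
from math import exp, sqrt

random.seed(42)
sigma = 2.0                    # any sigma > 0; the probability does not depend on it
analytic = exp(-2.25 / 2)      # Exponential(1/2) survival function at 2.25

# Simulate (X, Y) as independent N(0, sigma^2) and count distant points.
trials = 100_000
hits = sum(
    sqrt(random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2) > 1.5 * sigma
    for _ in range(trials)
)
print(analytic, hits / trials)  # the empirical frequency should be close to 0.3246
```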
Problem 4 (From Sheet.) (5 marks) We integrate $y$ out of the joint pmf/pdf of $(X, Y)$ to get the marginal pmf of $X$. For $x \in \{0, 1, \ldots, n\}$ we obtain
\begin{align*}
P(X = x) &= \int_0^1 \binom{n}{x} y^x (1 - y)^{n-x}\,dy\\
&= \binom{n}{x} \frac{\Gamma(x + 1)\Gamma(n - x + 1)}{\Gamma(x + 1 + n - x + 1)} \int_0^1 \frac{\Gamma(x + 1 + n - x + 1)}{\Gamma(x + 1)\Gamma(n - x + 1)}\, y^{x+1-1} (1 - y)^{n-x+1-1}\,dy\\
&= \binom{n}{x} \frac{\Gamma(x + 1)\Gamma(n - x + 1)}{\Gamma(x + 1 + n - x + 1)},
\end{align*}
since the last integrand is a proper Beta$(x + 1, n - x + 1)$ pdf. Simplifying, we have
\[
P(X = x) = \frac{n!\,x!\,(n - x)!}{x!\,(n - x)!\,(n + 1)!} = \frac{1}{n + 1}.
\]
That is, the marginal pmf of $X$ is discrete uniform on $\{0, \ldots, n\}$.
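The Beta-integral identity behind this simplification can be checked numerically with `math.gamma`; `marginal_pmf` below is a hypothetical helper name used only for this check:

```python
from math import comb, gamma

def marginal_pmf(n, x):
    """binom(n, x) * Gamma(x+1) * Gamma(n-x+1) / Gamma(n+2), from the Beta integral."""
    return comb(n, x) * gamma(x + 1) * gamma(n - x + 1) / gamma(n + 2)

n = 10
vals = [marginal_pmf(n, x) for x in range(n + 1)]
print(vals)  # every entry equals 1/(n+1) = 1/11, so the pmf is discrete uniform
```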
Problem 5 (From Sheet.) (7 marks) We use the bivariate change of variable formula to get the joint density of $U$ and $V$. We have
\[
U = h_1(X, Y) = \frac{X/m}{Y/n} \quad\text{and}\quad V = h_2(X, Y) = Y,
\]
and the inverse transformation is
\[
X = w_1(U, V) = \frac{mUV}{n} \quad\text{and}\quad Y = w_2(U, V) = V.
\]
The Jacobian of the transformation is
\[
J = \det \begin{bmatrix} \dfrac{\partial w_1(u,v)}{\partial u} & \dfrac{\partial w_1(u,v)}{\partial v} \\[2ex] \dfrac{\partial w_2(u,v)}{\partial u} & \dfrac{\partial w_2(u,v)}{\partial v} \end{bmatrix} = \det \begin{bmatrix} mv/n & mu/n \\ 0 & 1 \end{bmatrix} = \frac{mv}{n}.
\]
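The determinant $mv/n$ can be double-checked with a central-difference approximation of the partial derivatives; the names `w1` and `jacobian_fd` and the test values are illustrative only:

```python
def w1(u, v, m, n):
    """Inverse map x = w1(u, v) = m*u*v/n."""
    return m * u * v / n

def jacobian_fd(u, v, m, n, h=1e-6):
    """Central-difference approximation of det of the 2x2 Jacobian.

    Since w2(u, v) = v exactly, its partials are 0 and 1.
    """
    dw1_du = (w1(u + h, v, m, n) - w1(u - h, v, m, n)) / (2 * h)
    dw1_dv = (w1(u, v + h, m, n) - w1(u, v - h, m, n)) / (2 * h)
    dw2_du, dw2_dv = 0.0, 1.0
    return dw1_du * dw2_dv - dw1_dv * dw2_du

m, n, u, v = 4, 6, 1.3, 0.7   # arbitrary test point with u > 0, v > 0
print(jacobian_fd(u, v, m, n), m * v / n)  # both values agree: J = mv/n
```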
Since $X$ and $Y$ are independent, the joint density of $X$ and $Y$ is given by
\[
f(x, y) = \begin{cases} \dfrac{(1/2)^{m/2}}{\Gamma(m/2)}\, x^{(m/2)-1} e^{-x/2} \cdot \dfrac{(1/2)^{n/2}}{\Gamma(n/2)}\, y^{(n/2)-1} e^{-y/2} & \text{for } x > 0 \text{ and } y > 0\\[2ex] 0 & \text{otherwise.} \end{cases}
\]
It is not hard to see that the sample space of $(U, V)$ is
\[
S_{U,V} = \{(u, v) \in \mathbb{R}^2 : u > 0, v > 0\}.
\]
By the bivariate change of variable formula, the joint density of $U$ and $V$, say $g(u, v)$, is computed as
\begin{align*}
g(u, v) &= f(w_1(u, v), w_2(u, v))\,|J|\\
&= f(muv/n,\, v)\,\frac{mv}{n}\\
&= \begin{cases} \dfrac{(1/2)^{m/2}}{\Gamma(m/2)} \left(\dfrac{muv}{n}\right)^{(m/2)-1} e^{-muv/(2n)}\, \dfrac{(1/2)^{n/2}}{\Gamma(n/2)}\, v^{(n/2)-1} e^{-v/2}\, \dfrac{mv}{n} & \text{for } u > 0 \text{ and } v > 0\\[2ex] 0 & \text{otherwise} \end{cases}\\
&= \begin{cases} \dfrac{(1/2)^{(m+n)/2} (m/n)^{m/2}}{\Gamma(m/2)\Gamma(n/2)}\, u^{(m/2)-1}\, v^{((m+n)/2)-1}\, e^{-(mu/(2n) + 1/2)v} & \text{for } u > 0 \text{ and } v > 0\\[2ex] 0 & \text{otherwise.} \end{cases}
\end{align*}
Now we integrate $g(u, v)$ over $v$ to get the marginal density function of $U$, say $g_U(u)$. For fixed $u > 0$, $g(u, v)$ as a function of $v$ is proportional to the pdf of a Gamma distribution with parameters $\frac{m+n}{2}$ and $\frac{mu}{2n} + \frac{1}{2}$. Therefore, computing the integral $\int_0^\infty g(u, v)\,dv$ can be done by multiplying $g(u, v)$ by the appropriate normalizing constant (which may depend on $u$). For $u > 0$, we have
\begin{align*}
g_U(u) &= \int_0^\infty g(u, v)\,dv\\
&= \frac{(1/2)^{(m+n)/2} (m/n)^{m/2}}{\Gamma(m/2)\Gamma(n/2)}\, u^{(m/2)-1} \int_0^\infty v^{((m+n)/2)-1} e^{-(mu/(2n)+1/2)v}\,dv\\
&= \frac{(1/2)^{(m+n)/2} (m/n)^{m/2}}{\Gamma(m/2)\Gamma(n/2)}\, u^{(m/2)-1} \times \frac{\Gamma((m+n)/2)}{\left(\frac{mu}{2n} + \frac{1}{2}\right)^{(m+n)/2}}\\
&\qquad \times \int_0^\infty \frac{\left(\frac{mu}{2n} + \frac{1}{2}\right)^{(m+n)/2}}{\Gamma((m+n)/2)}\, v^{((m+n)/2)-1} e^{-(mu/(2n)+1/2)v}\,dv\\
&= \frac{(1/2)^{(m+n)/2} (m/n)^{m/2}}{\Gamma(m/2)\Gamma(n/2)}\, u^{(m/2)-1} \times \frac{\Gamma((m+n)/2)}{\left(\frac{mu}{2n} + \frac{1}{2}\right)^{(m+n)/2}}\\
&= \frac{m}{n\,B(m/2, n/2)} \cdot \frac{(mu/n)^{(m-2)/2}}{[1 + (mu/n)]^{(m+n)/2}},
\end{align*}
where
\[
B(\alpha, \beta) = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha + \beta)} \quad\text{for } \alpha > 0,\ \beta > 0
\]
is the Beta function. Clearly, $g_U(u) = 0$ for $u < 0$ since $g(u, v) = 0$ for $u < 0$ and for any $v$.
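Since $g_U(u)$ is the density of the $F(m, n)$ distribution, a crude numeric integration of $g(u, v)$ over $v$ should reproduce the closed form; the values of `m`, `n`, `u`, and the integration grid below are arbitrary test inputs:

```python
from math import gamma, exp

def g_joint(u, v, m, n):
    """Joint density g(u, v) obtained from the change of variables."""
    c = 0.5 ** ((m + n) / 2) * (m / n) ** (m / 2) / (gamma(m / 2) * gamma(n / 2))
    return c * u ** (m / 2 - 1) * v ** ((m + n) / 2 - 1) * exp(-(m * u / (2 * n) + 0.5) * v)

def g_U(u, m, n):
    """Closed-form marginal density of U (the F(m, n) density)."""
    B = gamma(m / 2) * gamma(n / 2) / gamma((m + n) / 2)
    return (m / (n * B)) * (m * u / n) ** ((m - 2) / 2) / (1 + m * u / n) ** ((m + n) / 2)

m, n, u = 4, 6, 1.3
# Crude rectangle-rule integration of g(u, v) over v in (0, 40).
dv = 0.001
integral = sum(g_joint(u, k * dv, m, n) for k in range(1, 40000)) * dv
print(integral, g_U(u, m, n))  # the two values should agree closely
```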