Midterm #1 – Practice Exam
Question #1: Let $X_1$ and $X_2$ be independent and identically distributed random variables with probability density function $f_X(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \le 0 \end{cases}$, so that they come from the distribution $X \sim \mathrm{EXP}(1)$. Compute the joint density function of $U = X_1$ and $V = X_1 X_2$.
• Since $X_1$ and $X_2$ are independent, their joint density function is given by $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = e^{-x_1} e^{-x_2} = e^{-(x_1 + x_2)}$ whenever $x_1 > 0$, $x_2 > 0$. We have the transformation $u = x_1$ and $v = x_1 x_2$, so $x_1 = u$ and $x_2 = \frac{v}{x_1} = \frac{v}{u}$, and we can compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\ \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{bmatrix} = \det \begin{bmatrix} 1 & 0 \\ -\frac{v}{u^2} & \frac{1}{u} \end{bmatrix} = \frac{1}{u}.$$
This implies that the joint density is $f_{UV}(u, v) = f_{X_1 X_2}\!\left(u, \frac{v}{u}\right) |J| = \frac{1}{u}\, e^{-\left(u + \frac{v}{u}\right)}$ whenever $u > 0$ and $v > 0$.
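A minimal Monte Carlo sketch, assuming Python with NumPy and SciPy, can check this density: it compares a simulated estimate of $P(U \le a, V \le b)$ against the numerical integral of $f_{UV}$, with the test point $(a, b) = (1, 1)$ chosen arbitrarily.

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(0)
x1 = rng.exponential(1.0, size=1_000_000)
x2 = rng.exponential(1.0, size=1_000_000)
u, v = x1, x1 * x2                     # the transformed pair (U, V)

a, b = 1.0, 1.0                        # arbitrary test point
mc = np.mean((u <= a) & (v <= b))      # Monte Carlo estimate of P(U <= a, V <= b)

# Integrate the derived density f_UV(u, v) = (1/u) * exp(-(u + v/u))
# over the rectangle (0, a) x (0, b); dblquad takes the inner variable first.
f_uv = lambda vv, uu: np.exp(-(uu + vv / uu)) / uu
num, _ = integrate.dblquad(f_uv, 0, a, 0, b)

print(f"Monte Carlo: {mc:.4f}, numerical integral: {num:.4f}")
```

The two numbers should agree to a couple of decimal places.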
Question #2: Let $X$ be a standard normal random variable. Compute the Moment Generating Function (MGF) of $Y = |X|$ using the standard normal distribution function $\Phi(z)$.
• For a continuous random variable $X$ with density $f_X(x)$, its moment generating function is given by $M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx$. Since $X \sim N(0,1)$, we know that its density function is given by $f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. We can therefore compute
$$M_{|X|}(t) = E\left(e^{t|X|}\right) = \int_{-\infty}^{\infty} e^{t|x|} \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{\frac{2tx - x^2}{2}}\,dx = \frac{2}{\sqrt{2\pi}} \int_0^{\infty} e^{\frac{t^2 - (x-t)^2}{2}}\,dx,$$
where the second equality uses the symmetry of the integrand in $x$ and the third completes the square. Separating out the $t$ term and making the substitution $u = x - t$ and $du = dx$ then yields
$$2 e^{\frac{t^2}{2}} \int_{-t}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{u^2}{2}}\,du = 2 e^{\frac{t^2}{2}} \left[\Phi(\infty) - \Phi(-t)\right] = 2 e^{\frac{t^2}{2}} \left[1 - \Phi(-t)\right] = 2 e^{\frac{t^2}{2}} \Phi(t).$$
Thus, the Moment Generating Function of $Y = |X|$ is given by $M_{|X|}(t) = 2\Phi(t)\,e^{t^2/2}$.
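A short numerical check, assuming NumPy and SciPy are available, compares the sample mean of $e^{t|X|}$ against the closed form $2\Phi(t)e^{t^2/2}$ at a few values of $t$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)

for t in (-1.0, 0.0, 0.5, 1.0):
    empirical = np.mean(np.exp(t * np.abs(x)))          # sample E[exp(t|X|)]
    closed_form = 2 * norm.cdf(t) * np.exp(t**2 / 2)    # 2 * Phi(t) * exp(t^2/2)
    print(f"t={t:+.1f}: empirical={empirical:.4f}, formula={closed_form:.4f}")
```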
Question #3: Let $X_1$ and $X_2$ be independent random variables with respective probability density functions $f_{X_1}(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \le 0 \end{cases}$ and $f_{X_2}(x) = \begin{cases} \frac{1}{2} e^{-\frac{x-1}{2}} & \text{if } x > 1 \\ 0 & \text{if } x \le 1 \end{cases}$. Calculate the probability density function of the transformed random variable $Y = X_1 + X_2$.
• Since $X_1$ and $X_2$ are independent random variables, their joint density is $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \frac{1}{2} e^{-x_1} e^{-\frac{x_2 - 1}{2}} = \frac{1}{2} e^{-\frac{2x_1 + x_2 - 1}{2}}$ if $x_1 > 0$ and $x_2 > 1$. We have $y = x_1 + x_2$ and generate $z = x_1$, so $x_1 = z$ and $x_2 = y - x_1 = y - z$, and we can compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial y} & \frac{\partial x_1}{\partial z} \\ \frac{\partial x_2}{\partial y} & \frac{\partial x_2}{\partial z} \end{bmatrix} = \det \begin{bmatrix} 0 & 1 \\ 1 & -1 \end{bmatrix} = -1.$$
This implies that the joint density is $f_{YZ}(y, z) = f_{X_1 X_2}(z, y - z)\,|J| = \frac{1}{2} e^{-\frac{2z + y - z - 1}{2}} = \frac{1}{2} e^{-\frac{z + y - 1}{2}}$ whenever $z > 0$ and $y - z > 1$, i.e. $0 < z < y - 1$, which requires $y > 1$. We then integrate out $z$ to find the PDF:
$$f_Y(y) = \int_{-\infty}^{\infty} f_{YZ}(y, z)\,dz = \frac{1}{2} \int_0^{y-1} e^{-\frac{z + y - 1}{2}}\,dz = \frac{1}{2} e^{-\frac{y-1}{2}} \int_0^{y-1} e^{-\frac{z}{2}}\,dz = \frac{1}{2} e^{-\frac{y-1}{2}} \left[-2 e^{-\frac{z}{2}}\right]_0^{y-1} = e^{-\frac{y-1}{2}} \left(1 - e^{-\frac{y-1}{2}}\right).$$
Thus, we find that $f_Y(y) = e^{\frac{1-y}{2}} \left(1 - e^{\frac{1-y}{2}}\right)$ whenever $y > 1$ and zero otherwise; the support is $y > 1$ rather than $y > 0$, since $x_1 > 0$ and $x_2 > 1$ force $y = x_1 + x_2 > 1$.
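The formula and the support are easy to verify by simulation, as in the sketch below (assuming NumPy; note that $X_2$ is a shifted exponential, i.e. $1$ plus an exponential draw with scale $2$):

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.exponential(1.0, size=1_000_000)
x2 = 1.0 + rng.exponential(2.0, size=1_000_000)   # density (1/2)exp(-(x-1)/2), x > 1
y = x1 + x2

# Compare a normalized histogram of Y with the derived density at a few points.
hist, edges = np.histogram(y, bins=400, range=(1, 21), density=True)
centers = (edges[:-1] + edges[1:]) / 2
grid = np.array([1.5, 2.0, 3.0, 5.0, 8.0])
f_y = np.exp((1 - grid) / 2) * (1 - np.exp((1 - grid) / 2))
print("derived  :", np.round(f_y, 4))
print("empirical:", np.round(np.interp(grid, centers, hist), 4))
print("min(Y)   :", y.min())   # should exceed 1, confirming the support y > 1
```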
Question #4: Let $X_1, \dots, X_n$ be independent and identically distributed random variables from a cumulative distribution function $F_X(x) = \begin{cases} 1 - \frac{1}{x} & \text{if } x > 1 \\ 0 & \text{if } x \le 1 \end{cases}$. Find the limiting distribution of the transformed first order statistic given by $X_{1:n}^n$.
• We can calculate that
$$F_{X_{1:n}^n}(x) = P(X_{1:n}^n \le x) = P\!\left(X_{1:n} \le x^{1/n}\right) = 1 - P\!\left(X_{1:n} > x^{1/n}\right) = 1 - \prod_{i=1}^{n} P\!\left(X_i > x^{1/n}\right) = 1 - \left[1 - F_X\!\left(x^{1/n}\right)\right]^n = 1 - \left[1 - \left(1 - \frac{1}{x^{1/n}}\right)\right]^n = 1 - \frac{1}{x}$$
when $x > 1$. Since there is no dependence on $n$, we may simply compute the desired limiting distribution as
$$\lim_{n \to \infty} F_{X_{1:n}^n}(x) = \begin{cases} \lim_{n \to \infty} \left(1 - \frac{1}{x}\right) & \text{if } x > 1 \\ \lim_{n \to \infty} (0) & \text{if } x \le 1 \end{cases} = \begin{cases} 1 - \frac{1}{x} & \text{if } x > 1 \\ 0 & \text{if } x \le 1 \end{cases} = F_X(x).$$
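A simulation sketch (assuming NumPy) makes the result concrete: samples from $F_X$ can be drawn by inverse transform as $X = 1/U$ with $U \sim \mathrm{UNIF}(0,1)$, and the empirical CDF of $X_{1:n}^n$ should match $1 - 1/x$ for any $n$.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 50, 200_000
x = 1.0 / rng.random((reps, n))     # inverse-CDF sampling from F(x) = 1 - 1/x
t = np.min(x, axis=1) ** n          # transformed first order statistic X_{1:n}^n

for pt in (1.5, 2.0, 5.0):
    print(f"x={pt}: empirical={np.mean(t <= pt):.4f}, 1 - 1/x={1 - 1/pt:.4f}")
```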
Question #5: Let $X_1, \dots, X_n$ be independent and identically distributed random variables with density function $f_X(x) = \begin{cases} 1 & \text{if } x \in (0,1) \\ 0 & \text{if } x \notin (0,1) \end{cases}$. That is, they are sampled from a random variable distributed as $X \sim \mathrm{UNIF}(0,1)$. Approximate $P\left(\sum_{i=1}^{90} X_i \le 77\right)$ using $Z \sim N(0,1)$.
• Since $X \sim \mathrm{UNIF}(0,1)$, we know that $E(X) = \frac{0+1}{2} = \frac{1}{2}$ and $\mathrm{Var}(X) = \frac{(1-0)^2}{12} = \frac{1}{12}$. Then if we let $Y = \sum_{i=1}^{90} X_i$, we have $E(Y) = E\left(\sum_{i=1}^{90} X_i\right) = \sum_{i=1}^{90} E(X_i) = \sum_{i=1}^{90} \frac{1}{2} = \frac{90}{2} = 45$ and $\mathrm{Var}(Y) = \mathrm{Var}\left(\sum_{i=1}^{90} X_i\right) = \sum_{i=1}^{90} \mathrm{Var}(X_i) = \sum_{i=1}^{90} \frac{1}{12} = \frac{90}{12} = 7.5$, so that $sd(Y) = \sqrt{7.5}$. Thus
$$P\left(\textstyle\sum_{i=1}^{90} X_i \le 77\right) = P(Y \le 77) = P\left(\frac{Y - E(Y)}{sd(Y)} \le \frac{77 - E(Y)}{sd(Y)}\right) = P\left(\frac{Y - 45}{\sqrt{7.5}} \le \frac{77 - 45}{\sqrt{7.5}}\right) \approx P\left(Z \le \frac{32}{\sqrt{7.5}}\right) = P(Z \le 11.68) = \Phi(11.68) \approx 1,$$
where $Z \sim N(0,1)$. Since the $z$-score is so large, the event is essentially certain.
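A quick check of this conclusion, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
sums = rng.random((1_000_000, 90)).sum(axis=1)   # one million replicates of Y
print("Monte Carlo  :", np.mean(sums <= 77))                 # essentially 1.0
print("Normal approx:", norm.cdf((77 - 45) / np.sqrt(7.5)))  # Phi(11.68), ~1.0
```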
Question #6: If $X \sim \mathrm{EXP}(1)$ such that $f_X(x) = \begin{cases} e^{-x} & \text{if } x > 0 \\ 0 & \text{if } x \le 0 \end{cases}$, what transformation of $X$ will yield a random variable $Y \sim \mathrm{UNIF}(0,1)$? What about a random variable $Z \sim \mathrm{UNIF}(-1,1)$?
• Let $Y = 1 - e^{-X}$, so we can compute that $F_Y(y) = P(Y \le y) = P(1 - e^{-X} \le y) = P(X \le -\ln(1-y)) = F_X(-\ln(1-y))$ for $0 < y < 1$, and then by differentiating we obtain $f_Y(y) = \frac{d}{dy}\left[F_X(-\ln(1-y))\right] = f_X(-\ln(1-y)) \cdot \frac{1}{1-y} = \frac{1-y}{1-y} = 1$, so that $Y \sim \mathrm{UNIF}(0,1)$.
• Since we need to end up with $f_Z(z) = \frac{1}{2}$ on $(-1,1)$ to conclude that $Z \sim \mathrm{UNIF}(-1,1)$, we rescale the uniform variable above via $Z = 2Y - 1 = 1 - 2e^{-X}$. We can therefore compute $F_Z(z) = P(Z \le z) = P(1 - 2e^{-X} \le z) = P\left(X \le -\ln\left(\frac{1-z}{2}\right)\right) = F_X\left(-\ln\left(\frac{1-z}{2}\right)\right)$, so after differentiating we have $f_Z(z) = \frac{d}{dz}\left[F_X\left(-\ln\left(\frac{1-z}{2}\right)\right)\right] = f_X\left(-\ln\left(\frac{1-z}{2}\right)\right) \cdot \frac{1}{1-z} = \frac{1-z}{2} \cdot \frac{1}{1-z} = \frac{1}{2}$ for $-1 < z < 1$, so that we conclude $Z \sim \mathrm{UNIF}(-1,1)$.
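Both transformations are easy to check empirically (a sketch assuming NumPy): histograms of $Y$ and $Z$ should be flat at heights $1$ and $\frac{1}{2}$ on $(0,1)$ and $(-1,1)$, respectively.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.exponential(1.0, size=1_000_000)
y = 1 - np.exp(-x)          # should be UNIF(0, 1)
z = 1 - 2 * np.exp(-x)      # should be UNIF(-1, 1)

# density=True normalizes the bars, so expect heights near 1.0 and 0.5.
print(np.round(np.histogram(y, bins=5, range=(0, 1), density=True)[0], 3))
print(np.round(np.histogram(z, bins=5, range=(-1, 1), density=True)[0], 3))
```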
Question #7: Does convergence in probability imply convergence in distribution? Does
convergence in distribution imply convergence in probability? Give a counterexample if not.
• Convergence in probability implies convergence in distribution, but convergence in distribution does not imply convergence in probability. As a counterexample, consider $Y_n = -X$ for all $n$, where $X \sim N(0,1)$. Then $Y_n \sim N(0,1)$ by symmetry, so $Y_n \to_d X$. However, we have $P(|Y_n - X| > \epsilon) = P(|-X - X| > \epsilon) = P(2|X| > \epsilon) = 2\left[1 - \Phi\left(\frac{\epsilon}{2}\right)\right] > 0$. This expression cannot converge to zero as $n \to \infty$ since it is a positive constant with no dependence on $n$. This implies that there is no convergence in probability, that is $Y_n \not\to_p X$.
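The counterexample can be illustrated numerically (a sketch assuming NumPy and SciPy): $Y = -X$ matches $X$ in distribution, yet $|Y - X| = 2|X|$ exceeds any fixed $\epsilon$ with constant positive probability.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)
x = rng.standard_normal(1_000_000)
y = -x                                   # same N(0,1) distribution as x
eps = 0.5
print("empirical P(|Y - X| > eps):", np.mean(np.abs(y - x) > eps))
print("2 * (1 - Phi(eps/2))      :", 2 * (1 - norm.cdf(eps / 2)))
```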
Question #8: State the law of large numbers (LLN), including its assumptions. Then state
the central limit theorem (CLT), including its assumptions.
• Law of Large Numbers: If $X_1, \dots, X_n$ are independent and identically distributed with finite mean $\mu$ and variance $\sigma^2$, then the sample mean $\bar{X} \to_p \mu$.
• Central Limit Theorem: If $X_1, \dots, X_n$ are independent and identically distributed with finite mean $\mu$ and variance $\sigma^2$, then $\frac{\left[\sum_{i=1}^{n} X_i\right] - n\mu}{\sigma\sqrt{n}} \to_d N(0,1)$ and, equivalently, $\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \to_d N(0,1)$.
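A brief simulation, assuming NumPy and SciPy, illustrates both statements with $\mathrm{EXP}(1)$ draws ($\mu = \sigma^2 = 1$):

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(7)
n, reps = 500, 10_000
samples = rng.exponential(1.0, size=(reps, n))   # iid EXP(1): mu = 1, sigma = 1

# CLT: standardized sums should look N(0,1); the KS p-value should not be tiny.
zs = (samples.sum(axis=1) - n * 1.0) / (1.0 * np.sqrt(n))
print("KS p-value vs N(0,1):", kstest(zs, "norm").pvalue)

# LLN: sample means cluster tightly around mu = 1.
means = samples.mean(axis=1)
print("mean of sample means:", means.mean(), " spread:", means.std())
```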
Question #10: Let $X_1$ and $X_2$ be independent random variables with Cumulative Distribution Function (CDF) given by $F_X(x) = 1 - e^{-x^2/2}$ for $x > 0$. Find the joint probability density function $f_{Y_1 Y_2}(y_1, y_2)$ of the random variables $Y_1 = \sqrt{X_1 + X_2}$ and $Y_2 = \frac{X_2}{X_1 + X_2}$.
• We first find the density $f_X(x) = \frac{d}{dx} F_X(x) = -e^{-\frac{x^2}{2}}(-x) = x e^{-\frac{x^2}{2}}$ for $x > 0$. Then since $X_1$ and $X_2$ are independent random variables, their joint density function is
$$f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \left[x_1 e^{-\frac{x_1^2}{2}}\right]\left[x_2 e^{-\frac{x_2^2}{2}}\right] = x_1 x_2\, e^{-\frac{x_1^2 + x_2^2}{2}} \quad \text{for } x_1 > 0,\ x_2 > 0.$$
We then have that $y_1 = \sqrt{x_1 + x_2}$ and $y_2 = \frac{x_2}{x_1 + x_2}$, so that $x_2 = y_2(x_1 + x_2) = y_2 y_1^2$ and $x_1 = y_1^2 - x_2 = y_1^2 - y_2 y_1^2$. These computations allow us to compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} \\ \frac{\partial x_2}{\partial y_1} & \frac{\partial x_2}{\partial y_2} \end{bmatrix} = \det \begin{bmatrix} 2y_1 - 2y_1 y_2 & -y_1^2 \\ 2y_1 y_2 & y_1^2 \end{bmatrix} = 2y_1^3 - 2y_1^3 y_2 + 2y_1^3 y_2 = 2y_1^3.$$
This implies that the joint density function is
$$f_{Y_1 Y_2}(y_1, y_2) = f_{X_1 X_2}(y_1^2 - y_2 y_1^2,\ y_2 y_1^2)\,|J| = (y_1^2 - y_2 y_1^2)(y_2 y_1^2)\, e^{-\frac{(y_1^2 - y_2 y_1^2)^2 + (y_2 y_1^2)^2}{2}} \cdot 2y_1^3 = 2 y_1^7\, y_2 (1 - y_2)\, e^{-\frac{y_1^4\left[(1-y_2)^2 + y_2^2\right]}{2}}$$
whenever $y_1^2 - y_2 y_1^2 > 0$ and $y_2 y_1^2 > 0$, that is, whenever $y_1 > 0$ and $0 < y_2 < 1$. This work also reveals that $Y_1$ and $Y_2$ are not independent, since their joint probability density function $f_{Y_1 Y_2}(y_1, y_2)$ cannot be factored into a function of $y_1$ alone times a function of $y_2$ alone (the exponent couples $y_1$ and $y_2$).
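The dependence can be seen in a simulation sketch (assuming NumPy; note that $F_X(x) = 1 - e^{-x^2/2}$ is the Rayleigh CDF with unit scale): conditioning on whether $Y_1$ is small or large visibly shifts the distribution of $Y_2$.

```python
import numpy as np

rng = np.random.default_rng(8)
x1 = rng.rayleigh(1.0, size=1_000_000)   # CDF 1 - exp(-x^2/2) for x > 0
x2 = rng.rayleigh(1.0, size=1_000_000)
y1 = np.sqrt(x1 + x2)
y2 = x2 / (x1 + x2)

lo, hi = y1 < np.median(y1), y1 >= np.median(y1)
# If Y1 and Y2 were independent, these conditional probabilities would agree.
print("P(Y2 <= 0.25 | Y1 below median):", np.mean(y2[lo] <= 0.25))
print("P(Y2 <= 0.25 | Y1 above median):", np.mean(y2[hi] <= 0.25))
```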
Question #11: Let $X_1, X_2, X_3$ be independent random variables and suppose that we know $X_1 \sim N(2,4)$, $X_2 \sim N(3,9)$, and $X_1 + X_2 + X_3 \sim N(4,16)$. What is the distribution of $X_3$?
• We use the Moment Generating Function (MGF) technique: since the three random variables are independent, we have $M_{X_1 + X_2 + X_3}(t) = M_{X_1}(t) M_{X_2}(t) M_{X_3}(t)$, so
$$M_{X_3}(t) = \frac{M_{X_1 + X_2 + X_3}(t)}{M_{X_1}(t) M_{X_2}(t)} = \frac{e^{4t + 16t^2/2}}{\left(e^{2t + 4t^2/2}\right)\left(e^{3t + 9t^2/2}\right)} = \frac{e^{4t + 16t^2/2}}{e^{5t + 13t^2/2}} = e^{-t + 3t^2/2}.$$
This reveals that the random variable $X_3 \sim N(-1, 3)$, since $M_A(t) = e^{\mu t + \sigma^2 t^2/2}$ when $A \sim N(\mu, \sigma^2)$.
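The MGF algebra can be confirmed symbolically (a sketch assuming SymPy is available):

```python
import sympy as sp

t = sp.symbols("t")
mgf = lambda mu, var: sp.exp(mu * t + var * t**2 / 2)   # normal MGF

# Divide the MGF of the sum by those of X1 ~ N(2,4) and X2 ~ N(3,9).
m3 = sp.simplify(mgf(4, 16) / (mgf(2, 4) * mgf(3, 9)))
print(m3)   # expect exp(-t + 3*t**2/2), possibly written in an equivalent form
```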
Question #12: Let $X_1, \dots, X_5$ be independent $N(\mu, \sigma^2)$ random variables. Give examples of random variables $Y_1, \dots, Y_4$ defined in terms of the random variables $X_1, \dots, X_5$ such that we have $Y_1 \sim N(0,1)$, $Y_2 \sim \chi^2(3)$, $Y_3 \sim F(1,3)$, and $Y_4 \sim t(3)$.
• We have that $Y_1 = \frac{X_1 - \mu}{\sigma} \sim N(0,1)$ and $Y_2 = \sum_{i=2}^{4} \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2(3)$; building $Y_2$ from $X_2, X_3, X_4$ keeps it independent of $Y_1$, which the $F$ and $t$ constructions below require. Then
$$Y_3 = \frac{Y_1^2/1}{Y_2/3} = \frac{\left(\frac{X_1 - \mu}{\sigma}\right)^2 \Big/ 1}{\left[\sum_{i=2}^{4}\left(\frac{X_i - \mu}{\sigma}\right)^2\right] \Big/ 3} \sim F(1,3) \qquad \text{and} \qquad Y_4 = \frac{Y_1}{\sqrt{Y_2/3}} = \frac{\frac{X_1 - \mu}{\sigma}}{\sqrt{\left[\sum_{i=2}^{4}\left(\frac{X_i - \mu}{\sigma}\right)^2\right] \Big/ 3}} \sim t(3).$$
These random variables are constructed as follows: if some $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0,1)$. This implies that $Z^2 \sim \chi^2(1)$, so a sum $U = \sum_{i=1}^{n} Z_i^2$ of $n$ independent such terms satisfies $U \sim \chi^2(n)$. Finally, we note that $V = \frac{U/n}{S/k} \sim F(n,k)$ when $U \sim \chi^2(n)$ and $S \sim \chi^2(k)$ are independent, and that $T = \frac{Z}{\sqrt{U/n}} \sim t(n)$ when $Z \sim N(0,1)$ and $U \sim \chi^2(n)$ are independent.
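A Monte Carlo sketch, assuming NumPy and SciPy and picking $\mu = 2$, $\sigma = 3$ arbitrarily, checks all four constructions with Kolmogorov-Smirnov tests:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
mu, sigma, reps = 2.0, 3.0, 200_000
z = (rng.normal(mu, sigma, size=(reps, 5)) - mu) / sigma   # standardized X_i

y1 = z[:, 0]                          # N(0,1) from X1
y2 = (z[:, 1:4] ** 2).sum(axis=1)     # chi^2(3) from X2, X3, X4 (independent of Y1)
y3 = (y1**2 / 1) / (y2 / 3)           # F(1,3)
y4 = y1 / np.sqrt(y2 / 3)             # t(3)

for name, data, dist in [("Y1", y1, stats.norm), ("Y2", y2, stats.chi2(3)),
                         ("Y3", y3, stats.f(1, 3)), ("Y4", y4, stats.t(3))]:
    print(name, "KS p-value:", stats.kstest(data, dist.cdf).pvalue)
```

We expect no rejection at conventional levels; reusing $X_1$ inside $Y_2$ should make the $F$ and $t$ tests fail, which is why the index sets must be disjoint.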
Question #13: If $X_1$ and $X_2$ are independent $\mathrm{EXP}(1)$ random variables such that their densities are $f(x) = e^{-x}\,\mathbf{1}\{x > 0\}$, then find the joint density of $U = \frac{X_1}{X_2}$ and $V = X_1 + X_2$.
• By independence, $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = [e^{-x_1}][e^{-x_2}] = e^{-(x_1 + x_2)}$ if $x_1 > 0$ and $x_2 > 0$. We have $u = \frac{x_1}{x_2}$ and $v = x_1 + x_2$, so $x_1 = u x_2 = u(v - x_1)$, which implies that $x_1 + u x_1 = uv$, so $x_1 = \frac{uv}{1+u}$. Then $x_2 = v - x_1 = v - \frac{uv}{1+u} = \frac{v + uv - uv}{1+u} = \frac{v}{1+u}$. This allows us to compute the Jacobian as
$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\ \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{bmatrix} = \det \begin{bmatrix} \frac{v(1+u) - uv}{(1+u)^2} & \frac{u}{1+u} \\ \frac{-v}{(1+u)^2} & \frac{1}{1+u} \end{bmatrix} = \frac{v}{(1+u)^3} + \frac{uv}{(1+u)^3} = \frac{v(1+u)}{(1+u)^3} = \frac{v}{(1+u)^2}.$$
Thus, the joint density is
$$f_{UV}(u, v) = f_{X_1 X_2}\!\left(\frac{uv}{1+u}, \frac{v}{1+u}\right) |J| = \frac{v}{(1+u)^2}\, e^{-\left(\frac{uv}{1+u} + \frac{v}{1+u}\right)} = \frac{v}{(1+u)^2}\, e^{-v}$$
whenever we have that $u > 0$, $v > 0$ and zero otherwise.
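The same kind of sketch as in Question #1 (assuming NumPy and SciPy) works here:

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(10)
x1 = rng.exponential(1.0, size=1_000_000)
x2 = rng.exponential(1.0, size=1_000_000)
u, v = x1 / x2, x1 + x2                 # the transformed pair (U, V)

a, b = 2.0, 3.0                         # arbitrary test point
mc = np.mean((u <= a) & (v <= b))       # Monte Carlo estimate of P(U <= a, V <= b)

# Integrate f_UV(u, v) = v * exp(-v) / (1 + u)^2 over (0, a) x (0, b).
f_uv = lambda vv, uu: vv * np.exp(-vv) / (1 + uu) ** 2
num, _ = integrate.dblquad(f_uv, 0, a, 0, b)
print(f"Monte Carlo: {mc:.4f}, numerical integral: {num:.4f}")
```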
Question #14: If $X_1, X_2, X_3$ are a random sample of size $n = 3$ from the $\mathrm{UNIF}(0,1)$ distribution, then a) find the joint density of the order statistics $Y_1, Y_2, Y_3$, b) find the marginal densities of $Y_1$ and $Y_3$, and c) find the mean of the sample range $R = Y_3 - Y_1$.
a) We have $f_{Y_1 Y_2 Y_3}(y_1, y_2, y_3) = 3! \prod_{i=1}^{3} f_X(y_i) = 6 \prod_{i=1}^{3} 1 = 6$ if $0 < y_1 < y_2 < y_3 < 1$.
b) We have $f_{Y_1}(y_1) = 3 f_X(y_1)\left[1 - F_X(y_1)\right]^2 = 3(1 - y_1)^2$ if $0 < y_1 < 1$ and that $f_{Y_3}(y_3) = 3 f_X(y_3)\left[F_X(y_3)\right]^2 = 3y_3^2$ whenever $0 < y_3 < 1$.
c) We have $E(R) = E(Y_3) - E(Y_1)$, so we compute $E(Y_3) = \int_0^1 y_3 \cdot 3y_3^2\,dy_3 = \frac{3}{4}\left[y_3^4\right]_0^1 = \frac{3}{4}$ and $E(Y_1) = \int_0^1 y_1 \cdot 3(1 - y_1)^2\,dy_1 = 3\int_0^1 \left(y_1 - 2y_1^2 + y_1^3\right) dy_1 = 3\left[\frac{y_1^2}{2} - \frac{2y_1^3}{3} + \frac{y_1^4}{4}\right]_0^1 = \frac{1}{4}$. Then, we have that $E(R) = E(Y_3) - E(Y_1) = \frac{3}{4} - \frac{1}{4} = \frac{1}{2}$. Alternatively, we could have found $f_{Y_1 Y_3}(y_1, y_3)$ and computed $E(R)$ as a double integral.
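The value $E(R) = \frac{1}{2}$ is easy to confirm by simulation (a sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.random((1_000_000, 3))          # one million samples of size n = 3
r = x.max(axis=1) - x.min(axis=1)       # sample range R = Y3 - Y1
print("mean sample range:", r.mean())   # expect approximately 0.5
```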