HW_2_AMS 570
Q.1. Suppose that Y1 and Y2 are random variables with joint pdf
f_{Y1,Y2}(y1, y2) = 8 y1 y2, 0 < y1 < y2 < 1; 0, otherwise.
Find the pdf of U1 = Y1/Y2.
We are given that the joint pdf of Y1 and Y2 is f_{Y1,Y2}(y1, y2) = 8 y1 y2, 0 < y1 < y2 < 1.
Now we need to introduce a second random variable U2 which is a function of Y1 and Y2.
We wish to do this in such a way that the resulting bivariate transformation is one-to-one
and our actual task of finding the pdf of U1 is as easy as possible. Our choice of U2 is, of
course, not unique. Let us define U2 = Y2. Then the inverse transformation is (using u1, u2,
y1, y2, since we are really dealing with the range spaces here):
y1 = u1 u2
y2 = u2
From it, we find the Jacobian,
J = | u2  u1 |
    |  0   1 | = u2.
To determine ℬ, the range space of U1 and U2, we note that
0 < y1 < y2 < 1 ⇒ 0 < u1 u2 < u2 < 1, equivalently,
0 < u1 < 1 and 0 < u2 < 1.
So ℬ is the open unit square. On ℬ,
f_{U1,U2}(u1, u2) = f_{Y1,Y2}(u1 u2, u2) |J| = 8 (u1 u2)(u2)(u2) = 8 u1 u2^3.
Thus, the marginal pdf of U1 is obtained by integrating f_{U1,U2}(u1, u2) with respect to u2,
giving
f_{U1}(u1) = ∫_0^1 8 u1 u2^3 du2 = 2 u1, 0 < u1 < 1.
2.3 Suppose X has the geometric pmf f_X(x) = (1/3)(2/3)^x, x = 0, 1, 2, … Determine the
probability distribution of Y = X/(X+1). Note that here both X and Y are discrete
random variables. To specify the probability distribution of Y, specify its pmf.
P(Y = y) = P(X/(X+1) = y) = P(X = y/(1−y)) = (1/3)(2/3)^{y/(1−y)},
where y = 0, 1/2, 2/3, …, x/(x+1), …
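The mapping of mass points can be tabulated directly (a sketch using exact rational arithmetic; the truncation length is arbitrary):

```python
from fractions import Fraction

# Sketch: tabulate the pmf of Y = X/(X+1). Each support point y = x/(x+1)
# simply inherits P(X = x) = (1/3)*(2/3)^x, since the map is one-to-one.
def pmf_Y(num_terms=20):
    return {Fraction(x, x + 1): Fraction(1, 3) * Fraction(2, 3) ** x
            for x in range(num_terms)}

table = pmf_Y()
assert table[Fraction(0)] == Fraction(1, 3)       # y = 0   (x = 0)
assert table[Fraction(1, 2)] == Fraction(2, 9)    # y = 1/2 (x = 1)
assert table[Fraction(2, 3)] == Fraction(4, 27)   # y = 2/3 (x = 2)
```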
2.9 If the random variable X has pdf
f(x) = (x − 1)/2, 1 < x < 3; 0, otherwise.
Find a monotone function u(x) such that the random variable Y = u(X) has a
uniform(0,1) distribution.
From the probability integral transformation, Theorem 2.1.10, we know that if u(x) =
F_X(x), then Y = u(X) ~ uniform(0,1). Therefore, for the given pdf, calculate:
F_X(x) = ∫_1^x (t − 1)/2 dt = (x − 1)^2/4, 1 < x < 3.
Thus u(x) = (x − 1)^2/4, which is monotone increasing on (1, 3), and Y = u(X) ~ uniform(0,1).
2.11 Let X have the standard normal pdf f_X(x) = (1/√(2π)) e^{−x^2/2}.
(a) Find EX^2 directly, and then by using the pdf of Y = X^2 from Example 2.1.7 and
calculating EY.
(b) Find the pdf of Y = |X|, and find its mean and variance.
π‘₯2
a. Using integration by parts with 𝑒 = π‘₯, π‘Žπ‘›π‘‘ 𝑑𝑣 = π‘₯𝑒 βˆ’ 2 𝑑π‘₯.
∞
𝐸𝑋 2 = ∫ π‘₯ 2
βˆ’βˆž
1
√2πœ‹
=1
π‘₯2
𝑒 βˆ’ 2 𝑑π‘₯ =
1
√2πœ‹
π‘₯2
∞
π‘₯2
βˆ’
[βˆ’π‘₯𝑒 βˆ’ 2 |∞
βˆ’βˆž + ∫ 𝑒 2 𝑑π‘₯ ] =
βˆ’βˆž
1
√2πœ‹
∞
π‘₯2
∫ 𝑒 βˆ’ 2 𝑑π‘₯
βˆ’βˆž
2
Using Example 2.1.7, let Y = X^2. Then
f_Y(y) = (1/(2√y)) [ (1/√(2π)) e^{−y/2} + (1/√(2π)) e^{−y/2} ] = (1/√(2πy)) e^{−y/2}, y > 0, and
EY = ∫_0^∞ y (1/√(2πy)) e^{−y/2} dy = 1 (do integration by parts).
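Both routes to EX^2 = 1 can be cross-checked numerically (a sketch; the midpoint-rule integrator and integration limits are arbitrary choices):

```python
import math

# Numerical sanity check (a sketch) of part (a): EX^2 = 1 both directly and
# via Y = X^2 with pdf f_Y(y) = e^{-y/2} / sqrt(2*pi*y).
def midpoint(f, a, b, n=200_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
ex2 = midpoint(lambda x: x * x * phi(x), -10.0, 10.0)

f_Y = lambda y: math.exp(-y / 2) / math.sqrt(2 * math.pi * y)
ey = midpoint(lambda y: y * f_Y(y), 1e-9, 100.0)

assert abs(ex2 - 1.0) < 1e-3
assert abs(ey - 1.0) < 1e-3
```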
b. Y = |X|, where −∞ < x < ∞. Therefore, 0 < y < ∞. Then
F_Y(y) = P(Y ≤ y) = P(|X| ≤ y) = P(−y ≤ X ≤ y) = P(X ≤ y) − P(X ≤ −y) = F_X(y) − F_X(−y).
Thus, f_Y(y) = (d/dy) F_Y(y) = f_X(y) − f_X(−y)(−1) = f_X(y) + f_X(−y) = √(2/π) e^{−y^2/2} for y > 0.
EY = ∫_0^∞ y √(2/π) e^{−y^2/2} dy = √(2/π) ∫_0^∞ e^{−u} du = √(2/π) [ −e^{−u} |_0^∞ ] = √(2/π), where u = y^2/2.
EY^2 = ∫_0^∞ y^2 √(2/π) e^{−y^2/2} dy = √(2/π) [ −y e^{−y^2/2} |_0^∞ + ∫_0^∞ e^{−y^2/2} dy ] = √(2/π) √(π/2) = 1.
Var(Y) = EY^2 − (EY)^2 = 1 − 2/π.
2.13 Consider a sequence of independent coin flips, each of which has probability 𝒑
of being heads. Define a random variable 𝑿 as the length of the run (of either heads
or tails) started by the first trial. (For example, X=3 if either TTTH or HHHT is
observed.) Find the distribution of X, and find EX.
𝑃(𝑋 = π‘˜) = (1 βˆ’ 𝑝)π‘˜ 𝑝 + π‘π‘˜ (1 βˆ’ 𝑝), π‘˜ = 1,2,3 … Therefore,
∞
π‘˜βˆ’1
π‘˜βˆ’1 )
𝐸𝑋 = βˆ‘βˆž
+ βˆ‘βˆž
=
π‘˜=1 π‘˜ βˆ— 𝑃(𝑋 = π‘˜) = (1 βˆ’ 𝑝)𝑝(βˆ‘π‘˜=1 π‘˜(1 βˆ’ 𝑝)
π‘˜=1 π‘˜π‘
(1 βˆ’ 𝑝)𝑝 [βˆ’ βˆ‘βˆž
π‘˜=1
𝑑(1βˆ’π‘)π‘˜
𝑑𝑝
+ βˆ‘βˆž
π‘˜=1
π‘‘π‘π‘˜
1
1
] = (1 βˆ’ 𝑝)𝑝 (𝑝2 + (1βˆ’π‘)2 ) =
𝑑𝑝
1βˆ’2𝑝+2𝑝2
𝑝(1βˆ’π‘)
2.18 Show that if X is a continuous random variable, then
min_a E|X − a| = E|X − m|,
where m is the median of X.
E|X − a| = ∫_{−∞}^∞ |x − a| f(x) dx = ∫_{−∞}^a −(x − a) f(x) dx + ∫_a^∞ (x − a) f(x) dx. Then,
(d/da) E|X − a| = ∫_{−∞}^a f(x) dx − ∫_a^∞ f(x) dx = 0; the solution is a = median. This is a
minimum since (d^2/da^2) E|X − a| = 2f(a) > 0.
2.34 A distribution cannot be uniquely determined by a finite collection of moments,
as this example from Romano and Siegel (1986) shows. Let X have the normal
distribution, that is, X has pdf
f_X(x) = (1/√(2π)) e^{−x^2/2}, −∞ < x < ∞.
Define a discrete random variable Y by
P(Y = √3) = P(Y = −√3) = 1/6, P(Y = 0) = 2/3.
Show that EX^r = EY^r for r = 1, 2, 3, 4, 5.
(Romano and Siegel point out that for any finite n there exists a discrete, and hence
nonnormal, random variable whose first n moments are equal to those of X.)
π‘Ÿ
𝐸𝑋 =
∞
βˆ«βˆ’βˆž π‘₯ π‘Ÿ
1
βˆ—
π‘Ÿ
1
√2πœ‹
𝑒
(βˆ’
π‘₯2
)
2
𝑑π‘₯. Thus, 𝐸𝑋 π‘Ÿ = 0, π‘€β„Žπ‘’π‘› π‘Ÿ = 1,3,5.
π‘Ÿ
1
πΈπ‘Œ π‘Ÿ = 6 βˆ— 32 + 6 βˆ— (βˆ’1)π‘Ÿ βˆ— 32 . Thus, 𝐸𝑋 π‘Ÿ = 0, π‘€β„Žπ‘’π‘› π‘Ÿ = 1,3,5.
𝑑2
2
𝑑2
𝑑4
𝑀𝑋 (𝑑) = 𝑒 , 𝐸𝑋 2 = 𝑑𝑑 2 𝑀𝑋 (𝑑)| 𝑑=0 = 1, 𝐸𝑋 4 = 𝑑𝑑 4 𝑀𝑋 (𝑑)| 𝑑=0 = 3.
1
2
1
2
4
1
1
4
πΈπ‘Œ 2 = 6 (√3) + 6 (βˆ’βˆš3) = 1, πΈπ‘Œ 4 = 6 (√3) + 6 (βˆ’βˆš3) = 3.
Thus, 𝐸𝑋 π‘Ÿ = πΈπ‘Œ π‘Ÿ , π‘“π‘œπ‘Ÿ π‘Ÿ = 1,2,3,4,5.
2.38 Let X have the negative binomial distribution with pmf
f_X(x) = C(r + x − 1, x) p^r (1 − p)^x, x = 0, 1, 2, …,
where 0 < p < 1 and r > 0 is an integer.
(a) Calculate the mgf of X.
(b) Define a new random variable by Y = 2pX. Show that as p → 0+, the mgf of Y
converges to that of a chi-squared random variable with 2r degrees of freedom by
showing that lim_{p→0+} M_Y(t) = (1/(1 − 2t))^r, |t| < 1/2.
a. M_X(t) = E(e^{tX}) = Σ_{x=0}^∞ e^{tx} C(r + x − 1, x) p^r (1 − p)^x = Σ_{x=0}^∞ C(r + x − 1, x) p^r [(1 − p)e^t]^x
= p^r / [1 − (1 − p)e^t]^r · Σ_{x=0}^∞ C(r + x − 1, x) [1 − (1 − p)e^t]^r [(1 − p)e^t]^x
= p^r / [1 − (1 − p)e^t]^r,
since the remaining sum is over a negative binomial pmf (with success probability 1 − (1 − p)e^t) and equals 1, provided (1 − p)e^t < 1.
b. M_Y(t) = E(e^{tY}) = E(e^{2ptX}) = E(e^{t'X}), where t' = 2pt. So
M_Y(t) = ( p / [1 − (1 − p)e^{t'}] )^r = ( p / (1 − (1 − p)e^{2pt}) )^r, and
lim_{p→0+} p / (1 − (1 − p)e^{2pt}) = 1/(1 − 2t), by L'Hôpital's rule. Thus
lim_{p→0+} M_Y(t) = (1/(1 − 2t))^r.
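The limit can also be seen numerically (a sketch; r = 3 and t = 0.2 are arbitrary illustrative choices inside |t| < 1/2):

```python
import math

# Sketch: check numerically that M_Y(t) = (p / (1 - (1-p)e^{2pt}))^r
# approaches (1/(1-2t))^r as p -> 0+, for illustrative r = 3, t = 0.2.
def mgf_Y(p, t, r=3):
    return (p / (1 - (1 - p) * math.exp(2 * p * t))) ** r

t, r = 0.2, 3
target = (1 / (1 - 2 * t)) ** r
errs = [abs(mgf_Y(p, t, r) - target) for p in (0.1, 0.01, 0.001)]
assert errs[0] > errs[1] > errs[2]         # error shrinks as p -> 0+
assert errs[2] < 1e-2
```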