Statistics Exam
Telecommunication Engineering
29th of May 2012
Solution
Questions (2 h)
C1 (1.5 points).
A firm has two categories of workers: A (80 %) and B (20 %). The direction
wants to restructure the staff and is going to do a Labor Force Adjustment Plan (LFAP). Particularly, if
a worker belongs to category A, he has a probability of 10 % of being sacked; if he belongs to category B,
he has a probability of 60 % of being sacked.
(a) What is the probability of being sacked?
(b) We know that Pepe worked at the firm, but he has been sacked. What is more likely: that he
belonged to category A or that he belonged to category B?
(c) What is the probability of either belonging to category A or being sacked (that is, that at least
one of the two events occurs)?
Solution:
(a) We define the following events:
• A = “the worker belongs to category A”;
• B = “the worker belongs to category B”;
• D = “the worker is sacked”.
We have that P (A) = 0.8, P (B) = 0.2, P (D|A) = 0.1 and P (D|B) = 0.6. Thus, due to the Total
Probability Theorem, the probability of being sacked is
P (D) = P (A) · P (D|A) + P (B) · P (D|B) = 0.8 × 0.1 + 0.2 × 0.6 = 0.08 + 0.12 = 0.2.
(b) Using Bayes’ Theorem, we have that
P (A|D) = P (A) · P (D|A) / P (D) = (0.8 × 0.1) / 0.2 = 0.08 / 0.2 = 0.4.
Since P (B|D) = 1 − P (A|D) = 0.6, we conclude that it is more likely that he belonged to category B.
(c) P (A ∪ D) = P (A) + P (D) − P (A ∩ D) = 0.8 + 0.2 − P (D|A) · P (A) = 1 − 0.1 × 0.8 = 0.92.
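The three results of C1 can be spot-checked by simulation. The sketch below is a Python Monte Carlo check (the exam's only code is MATLAB; Python and the variable names are illustrative):

```python
import random

# Simulate workers: category A with probability 0.8, then sack with
# P(D|A) = 0.1 or P(D|B) = 0.6, and count the events of interest.
random.seed(1)
n = 200_000
d_count = ad_count = aord_count = 0
for _ in range(n):
    in_a = random.random() < 0.8                        # P(A) = 0.8
    sacked = random.random() < (0.1 if in_a else 0.6)   # P(D|A), P(D|B)
    d_count += sacked
    ad_count += in_a and sacked
    aord_count += in_a or sacked

p_d = d_count / n                 # should be close to P(D) = 0.2
p_a_given_d = ad_count / d_count  # close to P(A|D) = 0.4
p_a_or_d = aord_count / n         # close to P(A U D) = 0.92
print(p_d, p_a_given_d, p_a_or_d)
```

Since the estimated P(A|D) is near 0.4, the simulation also confirms that a sacked worker is more likely to have belonged to category B.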
C2 (2 points).
Given the following joint density function:
fX,Y (x, y) = (x + y)/4, if 1 ≤ x ≤ 2 and 2 ≤ y ≤ 3; 0, otherwise.
(a) Find the marginal density functions and check whether the random variables X and Y are independent.
(b) Find P (2 ≤ Y ≤ 2.5|X = 1.25).
(c) Compute the probability P (X + Y ≤ 4).
Solution:
(a)
fX (x) = ∫_2^3 (x + y)/4 dy = (2x + 5)/8, if 1 ≤ x ≤ 2,
fY (y) = ∫_1^2 (x + y)/4 dx = (2y + 3)/8, if 2 ≤ y ≤ 3.
It can be verified that X and Y are not independent random variables, since
fX,Y (x, y) ≠ fX (x) fY (y):
(x + y)/4 ≠ ((2x + 5)/8) × ((2y + 3)/8).
(b) For fixed x ∈ [1, 2], it follows that
fY |X=x (y) = fX,Y (x, y) / fX (x) = ((x + y)/4) / ((2x + 5)/8) = 2(x + y)/(2x + 5), if 2 ≤ y ≤ 3.
Evaluating this function at the point x = 1.25, we get that:
fY |X=1.25 (y) = (2y + 2.5)/7.5, if 2 ≤ y ≤ 3.
Therefore,
P (2 ≤ Y ≤ 2.5|X = 1.25) = ∫_2^2.5 (2y + 2.5)/7.5 dy = 3.5/7.5 ≈ 0.467.
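The conditional density found in (b) integrates to 3.5/7.5 over [2, 2.5]; a quick numerical check (a Python sketch, not part of the exam) confirms it. The midpoint rule is exact here because the integrand is linear in y:

```python
# Hypothetical check: integrate f_{Y|X=1.25}(y) = 2(x+y)/(2x+5) over [2, 2.5]
# with the midpoint rule, which is exact for a linear integrand.
def f_cond(y, x=1.25):
    return 2 * (x + y) / (2 * x + 5)

m = 10_000
h = 0.5 / m   # step over the interval [2, 2.5]
prob = sum(f_cond(2 + (k + 0.5) * h) * h for k in range(m))
print(round(prob, 3))  # -> 0.467
```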
(c)
P (X + Y ≤ 4) = ∫_2^3 ∫_1^{4−y} (x + y)/4 dx dy
= (1/4) ∫_2^3 [x²/2 + xy]_{x=1}^{x=4−y} dy
= (1/4) ∫_2^3 ( (4 − y)²/2 + (4 − y)y − 1/2 − y ) dy
= (1/8) ∫_2^3 (15 − 2y − y²) dy
= (1/8) [15y − y² − y³/3]_2^3
= 11/24 ≈ 0.4583.
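The double integral above can be spot-checked numerically. The sketch below (Python, illustrative names) applies a midpoint rule over the rectangle [1, 2] × [2, 3], keeping only the cells where x + y ≤ 4:

```python
# Hypothetical numeric check of P(X + Y <= 4) for f(x,y) = (x+y)/4 on
# [1,2] x [2,3]: midpoint rule restricted to the half-plane x + y <= 4.
m = 400
hx = 1.0 / m   # x step on [1, 2]
hy = 1.0 / m   # y step on [2, 3]
total = 0.0
for i in range(m):
    x = 1 + (i + 0.5) * hx
    for j in range(m):
        y = 2 + (j + 0.5) * hy
        if x + y <= 4:
            total += (x + y) / 4 * hx * hy
print(total)   # close to 11/24 = 0.4583...
```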
C3 (1.5 points).
Consider the following stochastic process, X(t) = e^{−A|t|}, where A is an
exponentially distributed random variable with parameter λ = 2.
(a) Calculate the statistical mean and the temporal mean of X(t).
(b) Determine if X(t) is ergodic in mean.
Solution:
(a) Statistical mean:
E[X(t)] = ∫_0^∞ e^{−a|t|} · 2e^{−2a} da = ∫_0^∞ 2e^{−(2+|t|)a} da = [−2/(2 + |t|) · e^{−(2+|t|)a}]_{a=0}^{a=∞} = 2/(2 + |t|) (which depends on t).
Temporal mean:
MX = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t) dt
= lim_{T→∞} (1/(2T)) ∫_{−T}^{T} e^{−A|t|} dt
= lim_{T→∞} (1/(2T)) ( ∫_{−T}^{0} e^{At} dt + ∫_{0}^{T} e^{−At} dt )
= lim_{T→∞} (1/(2T)) ( [e^{At}/A]_{t=−T}^{t=0} + [−e^{−At}/A]_{t=0}^{t=T} )
= lim_{T→∞} (1 − e^{−AT})/(AT) = 0.
(b) A stochastic process is ergodic in mean if its statistical mean coincides with its temporal mean.
However, in this case we have that E[X(t)] = 2/(2 + |t|) ≠ 0 = MX. Therefore, X(t) is not ergodic in
mean.
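Both means can be approximated numerically. The Python sketch below (illustrative, not the exam's notation) estimates the statistical mean at a fixed t by averaging over draws of A, and evaluates the temporal mean in closed form for one realization of A:

```python
import math
import random

random.seed(3)
n = 200_000
t = 1.0
# Statistical mean at fixed t: average of exp(-A|t|) over A ~ Exp(lambda = 2).
stat_mean = sum(math.exp(-random.expovariate(2.0) * abs(t))
                for _ in range(n)) / n
print(stat_mean)   # close to 2/(2 + |t|) = 2/3 for t = 1

# Temporal mean for one realization A = a:
# (1/2T) * integral_{-T}^{T} exp(-a|t|) dt = (1 - exp(-a*T)) / (a*T) -> 0.
a = random.expovariate(2.0)
T = 1e6
temp_mean = (1 - math.exp(-a * T)) / (a * T)
print(temp_mean)   # essentially 0 for large T
```

The two estimates disagree (2/3 versus roughly 0 at t = 1), consistent with the conclusion that X(t) is not ergodic in mean.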
Statistics Exam
Telecommunication Engineering
25th May 2011
Solution
Problems (1 h 30 m)
P1 (2.5 points).
The lifetimes (in thousands of hours) of two devices A and B have, respectively,
the following density functions:
fX (x) = x, if 0 < x < 1; x − 1, if 1 < x < 2; 0, otherwise;
fY (y) = 1/3, if 0 < y < 3; 0, otherwise.
If there is the same proportion of components of each device:
(a) Calculate the probability that a randomly chosen component lasts more than 500 hours.
(b) If a component has lasted more than 500 hours, calculate the probability that it was from device
A.
(c) If we have 10 components of device A, calculate the probability that exactly 6 of them last more
than 500 hours.
(d) If we have 100 components of type B, what is the probability that more than 20 of them last less
than 500 hours?
Solution:
(a) Define the event: C = “the lifetime of a component is greater than 500 hours”. By the total probability
theorem,
P (C) = P (C|A) × P (A) + P (C|B) × P (B).
Given that
P (C|A) = P (X > 1/2) = 1 − P (X ≤ 1/2) = 1 − ∫_0^{1/2} x dx = 7/8,
P (C|B) = P (Y > 1/2) = ∫_{1/2}^3 (1/3) dy = 5/6,
then
P (C) = 7/8 × 1/2 + 5/6 × 1/2 = 41/48 ≈ 0.8542.
(b) Using Bayes’ Theorem:
P (A|C) = P (C|A) × P (A) / P (C) = (7/8 × 1/2) / (41/48) = 21/41 ≈ 0.5122.
(c) Define the random variable:
W = Number of components of device A that last more than 500 hours.
We know that W is a Binomial random variable with n = 10 and p = 7/8, that is, W ∼ B(10, 7/8).
We are asked:
P (W = 6) = (10 choose 6) (7/8)^6 (1/8)^4 = 10!/(6! × 4!) × (7/8)^6 (1/8)^4 ≈ 0.023.
(d) Now we define the variable:
M = number of components of device B that last less than 500 hours.
We know that M is a Binomial random variable with n = 100 and p = 1/6, that is, M ∼ B(100, 1/6).
Since n > 30, we check whether we can approximate it by a Normal random variable; this requires
n × p × (1 − p) > 5, and in this case n × p × (1 − p) = 100 × 1/6 × 5/6 = 13.89 > 5. Therefore
M ≈ N (np, √(np(1 − p))) → M ≈ N (16.67, σ = 3.73).
Then,
P (M > 20) = P (M ≥ 21) = P (Z > (21 − 16.67)/3.73) = P (Z > 1.16) = 1 − P (Z ≤ 1.16) = 1 − 0.8770 = 0.1230.
Or, correcting for continuity,
P (M > 20) = P (M ≥ 21) = P (Z > (20.5 − 16.67)/3.73) = P (Z > 1.03) = 1 − P (Z ≤ 1.03) = 1 − 0.8485 = 0.1515.
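The quality of the normal approximation in (d) can be checked against the exact binomial tail. The Python sketch below (illustrative; the exam itself does not require it) computes both:

```python
from math import comb, erf, sqrt

# Exact tail of M ~ B(100, 1/6): P(M > 20) = P(M >= 21).
n, p = 100, 1 / 6
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(21, n + 1))

def phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# Continuity-corrected normal approximation, as in the solution above.
mu, sigma = n * p, sqrt(n * p * (1 - p))
approx = 1 - phi((20.5 - mu) / sigma)
print(exact, approx)
```

The continuity-corrected value should land within a couple of percentage points of the exact tail, which is why the solution prefers it to the uncorrected one.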
P2 (2.5 points).
Assume that two-dimensional points are selected as follows. First,
their x-coordinates are selected from a random variable X ∼ U (0, 1) and then their y-coordinates are
selected from a random variable Y |X = x ∼ U (0, x²).
(a) Calculate the joint density of (X, Y ) and draw its domain of definition.
(b) Determine the marginal densities of X and Y. Are X and Y uncorrelated?
(c) Determine P (Y < 0.5|X > 0.5).
(d) Fill in the following MATLAB code to approximate P (X > 0.5, Y < 0.5). Indicate as well the
approximate value of p.
n=10000;
x=rand(n,1);
y=rand(n,1).* _____________ ;
cond=( _____________ );
p=sum(cond)/n
Solution:
(a) Since f(X,Y ) (x, y) = fX (x) × fY |X (y), the joint density of (X, Y ) is given by the following function:
f(X,Y ) (x, y) = 1 · (1/x²) = 1/x², if x ∈ (0, 1) and y ∈ (0, x²); 0, otherwise.
[Figure: the domain of (X, Y ), the region between the x-axis and the curve y = x² for x ∈ (0, 1).]
The domain of definition where (X, Y ) takes values is plotted in the figure above. This domain can
be expressed in two equivalent ways:
{(x, y) ∈ ℝ² such that x ∈ (0, 1), y ∈ (0, x²)},
or
{(x, y) ∈ ℝ² such that y ∈ (0, 1), x ∈ (√y, 1)}.
(b) Marginal density of X: since X ∼ U (0, 1), we know that its density is
fX (x) = 1, if x ∈ (0, 1); 0, otherwise.
Marginal density of Y: by definition, we know that fY (y) = ∫_{−∞}^{∞} f(X,Y ) (x, y) dx.
If y ∈ (0, 1), then fY (y) = ∫_{√y}^{1} (1/x²) dx = 1/√y − 1. Therefore:
fY (y) = 1/√y − 1, if y ∈ (0, 1); 0, otherwise.
If X and Y were uncorrelated, then:
Cov(X, Y ) = 0 ⇒ E[XY ] = E[X]E[Y ].
However, we will see below that this equality does not hold.
It is known that:
E[XY ] = ∫_0^1 ∫_0^{x²} xy (1/x²) dy dx = ∫_0^1 (1/x) [y²/2]_{y=0}^{y=x²} dx = ∫_0^1 (x³/2) dx = [x⁴/8]_0^1 = 1/8,
E[X] = ∫_0^1 x · 1 dx = [x²/2]_{x=0}^{x=1} = 1/2,
E[Y ] = ∫_0^1 y (1/√y − 1) dy = ∫_0^1 (√y − y) dy = [2y^{3/2}/3 − y²/2]_{y=0}^{y=1} = 2/3 − 1/2 = 1/6.
Since:
E[XY ] = 1/8 ≠ (1/2) × (1/6) = E[X]E[Y ],
we conclude that X and Y are correlated.
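The moments just computed can be verified by simulating (X, Y ) exactly as the problem describes. The sketch below is a Python check (illustrative names; the exam's own code is MATLAB):

```python
import random

# Simulate X ~ U(0,1) and Y | X = x ~ U(0, x^2), then estimate the moments.
random.seed(7)
n = 400_000
sx = sy = sxy = 0.0
for _ in range(n):
    x = random.random()           # X ~ U(0, 1)
    y = random.random() * x * x   # Y | X = x ~ U(0, x^2)
    sx += x
    sy += y
    sxy += x * y
ex, ey, exy = sx / n, sy / n, sxy / n
print(exy, ex * ey)   # near E[XY] = 1/8 vs E[X]E[Y] = 1/12 -> correlated
```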
(c) To calculate P (Y < 0.5|X > 0.5) we first use the definition of conditional probability and get that
P (Y < 0.5|X > 0.5) = P ({Y < 0.5} ∩ {X > 0.5}) / P (X > 0.5).
Then, we calculate separately the two probabilities involved (one in the numerator and the other in
the denominator) as detailed below.
P ({Y < 0.5} ∩ {X > 0.5})
= ∫_{0.5}^{√0.5} ( ∫_0^{x²} (1/x²) dy ) dx + ∫_{√0.5}^{1} ( ∫_0^{0.5} (1/x²) dy ) dx
= ∫_{0.5}^{√0.5} 1 dx + ∫_{√0.5}^{1} (0.5/x²) dx
= [x]_{0.5}^{√0.5} + [−0.5/x]_{√0.5}^{1}
= (√0.5 − 0.5) + (√0.5 − 0.5) ≈ 0.4142.
P (X > 0.5) = ∫_{0.5}^{1} 1 dx = 1 − 0.5 = 0.5.
We conclude that:
P (Y < 0.5|X > 0.5) ≈ 0.4142/0.5 = 0.8284.
(d) The complete MATLAB code is as follows:
n=10000;
x=rand(n,1);
y=rand(n,1).*x.^2;
cond=(x>0.5 & y<0.5);
p=sum(cond)/n
With this code we generate 10000 values of the random vector (X, Y ). In the second line, we generate
10000 values of X using the MATLAB function rand, which generates values uniformly distributed in
(0, 1). In the third line, we generate 10000 values of Y |X = x, using the inverse transformation
method and taking into account that Y |X = x is uniformly distributed in (0, x²). More specifically, it
is known that the distribution function of Y |X = x is:
FY |X=x (y) = ∫_0^y (1/x²) ds = y/x², if y ∈ (0, x²),
and so, solving for y in the equation u = y/x², we get the formula that allows us to generate values from
Y |X = x in terms of values u uniformly distributed in (0, 1). That formula is given by: y = u x². In the
fourth line, we use a logical expression to count the number of times the event {X > 0.5} ∩ {Y < 0.5}
happens in the sample of 10000 points (x, y) drawn from the random vector (X, Y ). Finally, the last
line estimates the probability that the event {X > 0.5} ∩ {Y < 0.5} occurs using its observed relative
frequency.
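The same Monte Carlo estimate can be reproduced outside MATLAB. The sketch below is a line-by-line Python analogue of the completed snippet (illustrative only):

```python
import random

# Python analogue of the MATLAB code: sample (X, Y) and count the event
# {X > 0.5} and {Y < 0.5}, then estimate its probability by relative frequency.
random.seed(11)
n = 10_000
count = 0
for _ in range(n):
    x = random.random()          # x = rand(n,1)
    y = random.random() * x**2   # y = rand(n,1) .* x.^2  (inverse transform)
    if x > 0.5 and y < 0.5:      # cond = (x > 0.5 & y < 0.5)
        count += 1
p = count / n                    # p = sum(cond)/n
print(p)   # close to P(X > 0.5, Y < 0.5) ≈ 0.4142
```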
Note: In case you need it, the standard normal distribution table is provided on the other side of
the page.
Table entry for z is the area (probability) under the standard normal curve to the left of z.

TABLE A  Standard normal probabilities (continued)

 z    .00    .01    .02    .03    .04    .05    .06    .07    .08    .09
0.0  .5000  .5040  .5080  .5120  .5160  .5199  .5239  .5279  .5319  .5359
0.1  .5398  .5438  .5478  .5517  .5557  .5596  .5636  .5675  .5714  .5753
0.2  .5793  .5832  .5871  .5910  .5948  .5987  .6026  .6064  .6103  .6141
0.3  .6179  .6217  .6255  .6293  .6331  .6368  .6406  .6443  .6480  .6517
0.4  .6554  .6591  .6628  .6664  .6700  .6736  .6772  .6808  .6844  .6879
0.5  .6915  .6950  .6985  .7019  .7054  .7088  .7123  .7157  .7190  .7224
0.6  .7257  .7291  .7324  .7357  .7389  .7422  .7454  .7486  .7517  .7549
0.7  .7580  .7611  .7642  .7673  .7704  .7734  .7764  .7794  .7823  .7852
0.8  .7881  .7910  .7939  .7967  .7995  .8023  .8051  .8078  .8106  .8133
0.9  .8159  .8186  .8212  .8238  .8264  .8289  .8315  .8340  .8365  .8389
1.0  .8413  .8438  .8461  .8485  .8508  .8531  .8554  .8577  .8599  .8621
1.1  .8643  .8665  .8686  .8708  .8729  .8749  .8770  .8790  .8810  .8830
1.2  .8849  .8869  .8888  .8907  .8925  .8944  .8962  .8980  .8997  .9015
1.3  .9032  .9049  .9066  .9082  .9099  .9115  .9131  .9147  .9162  .9177
1.4  .9192  .9207  .9222  .9236  .9251  .9265  .9279  .9292  .9306  .9319
1.5  .9332  .9345  .9357  .9370  .9382  .9394  .9406  .9418  .9429  .9441
1.6  .9452  .9463  .9474  .9484  .9495  .9505  .9515  .9525  .9535  .9545
1.7  .9554  .9564  .9573  .9582  .9591  .9599  .9608  .9616  .9625  .9633
1.8  .9641  .9649  .9656  .9664  .9671  .9678  .9686  .9693  .9699  .9706
1.9  .9713  .9719  .9726  .9732  .9738  .9744  .9750  .9756  .9761  .9767
2.0  .9772  .9778  .9783  .9788  .9793  .9798  .9803  .9808  .9812  .9817
2.1  .9821  .9826  .9830  .9834  .9838  .9842  .9846  .9850  .9854  .9857
2.2  .9861  .9864  .9868  .9871  .9875  .9878  .9881  .9884  .9887  .9890
2.3  .9893  .9896  .9898  .9901  .9904  .9906  .9909  .9911  .9913  .9916