Chapter 6
Jointly Distributed Random Variables
(聯合隨機變數)
So far we have been concerned only with probability distributions for single random variables.
However, we are often interested in probability statements involving two or more
random variables. (When?) We now introduce the case of two discrete random
variables. In order to deal with such probabilities, we define the following.
Definition 6.1: Let X and Y be two discrete random variables. The joint probability mass
function of X and Y is
f(x, y) = P(X = x, Y = y),   −∞ < x, y < ∞.
Example 6.2: Suppose that 3 balls are randomly selected from an urn containing 3 red,
4 white, and 5 blue balls. If we let X and Y denote, respectively, the number of red and white
balls chosen, then find the joint probability mass function of X and Y.
Since there are C(12, 3) = 220 equally likely ways to draw 3 balls, f(x, y) = C(3, x) C(4, y) C(5, 3 − x − y) / 220:
f(0, 0) = 10/220
f(0, 1) = 40/220
f(0, 2) = 30/220
f(0, 3) = 4/220
f(1, 0) = 30/220
f(1, 1) = 60/220
f(1, 2) = 18/220
f(2, 0) = 15/220
f(2, 1) = 12/220
f(3, 0) = 1/220
Table for f(x, y) = P(X = x, Y = y), in units of 1/220 (blank entries are impossible, i.e. 0):
X\Y    0    1    2    3
 0    10   40   30    4
 1    30   60   18
 2    15   12
 3     1
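The counting argument above can be checked mechanically. A minimal sketch in Python (the function name is mine, not from the text):

```python
from math import comb

def joint_pmf(x, y):
    """P(X = x red, Y = y white) when 3 balls are drawn
    from an urn with 3 red, 4 white, and 5 blue balls."""
    blue = 3 - x - y  # the remaining draws must be blue
    if x < 0 or y < 0 or blue < 0:
        return 0.0
    return comb(3, x) * comb(4, y) * comb(5, blue) / comb(12, 3)

# The probabilities over all feasible (x, y) should sum to 1.
total = sum(joint_pmf(x, y) for x in range(4) for y in range(4))
```

Summing the table entries is a useful sanity check that no case was missed.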
Definition 6.3: The random variables X and Y are said to be independent if, for any two
sets of real numbers A and B,
P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).
In other words, X and Y are independent if, for all A and B, the events E_A = {X ∈ A} and
F_B = {Y ∈ B} are independent.
It can be shown, using the three axioms of probability, that the above equation holds
if and only if, for all a, b,
P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b).
Hence, in terms of the joint distribution function F of X and Y, we have that X and Y are
independent if
F(a, b) = F_X(a) F_Y(b)   for all a, b.
When X and Y are discrete random variables, the condition of independence in the
above definition is equivalent to
f(x, y) = f_X(x) f_Y(y)   for all x, y.
Thus, loosely speaking, X and Y are independent if knowing the value of one does not change
the distribution of the other. Random variables that are not independent are said to be
dependent.
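The discrete factorization criterion can be tested mechanically for any joint pmf given as a table. A sketch (the function name and the two example tables are mine, chosen for illustration):

```python
from itertools import product

def is_independent(joint, tol=1e-12):
    """Check f(x, y) == fX(x) * fY(y) for every (x, y),
    where joint maps (x, y) pairs to probabilities."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    fX = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}  # marginal of X
    fY = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}  # marginal of Y
    return all(abs(joint.get((x, y), 0) - fX[x] * fY[y]) <= tol
               for x, y in product(xs, ys))

# Two independent fair coin flips: f(x, y) = 1/4 for all four pairs.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
# Two perfectly correlated coins: clearly dependent.
dep = {(0, 0): 0.5, (1, 1): 0.5}
```

Note that the check must cover all (x, y) in the product of the supports, including pairs where f(x, y) = 0.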
Exercise 6.4: Suppose that n + m independent trials (for instance, tossing a fair coin and
counting the occurrences of heads), having a common success probability p, are performed. If
X is the number of successes in the first n trials, and Y is the number of successes in the final
m trials, then X and Y are independent, since knowing the number of successes in the first n
trials does not affect the distribution of the number of successes in the final m trials (by the
assumption of independent trials). Now let Z be the total number of successes in the n + m
trials.
Are X and Z independent?
Why?
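A small exact computation suggests why the answer is no (the values n = 3, m = 2, p = 0.5 are my arbitrary choices for the check):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(Bin(n, p) = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

n, m, p = 3, 2, 0.5

def joint_xz(x, z):
    """P(X = x, Z = z) = P(X = x) P(Y = z - x), since X and Y are independent."""
    return binom_pmf(x, n, p) * binom_pmf(z - x, m, p)

def pz(z):
    """Marginal pmf of Z = X + Y, i.e. Bin(n + m, p)."""
    return sum(joint_xz(x, z) for x in range(n + 1))

# Independence would force joint_xz(x, z) == P(X = x) * P(Z = z) everywhere,
# yet the event {X = 3, Z = 0} is impossible while both marginals are positive:
lhs = joint_xz(3, 0)
rhs = binom_pmf(3, n, p) * pz(0)
```

Knowing Z clearly constrains X (for instance X ≤ Z), which is exactly the intuition the exercise asks for.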
Exercise 6.5: Suppose that two fair dice are tossed. Let M be the maximum number on the
dice and N be the minimum number on the dice. Are M and N independent?
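Since there are only 36 equally likely outcomes, Exercise 6.5 can be settled by brute-force enumeration; a sketch:

```python
from itertools import product

# Joint pmf of (M, N) = (max, min) over the 36 equally likely dice rolls.
joint = {}
for a, b in product(range(1, 7), repeat=2):
    key = (max(a, b), min(a, b))
    joint[key] = joint.get(key, 0) + 1 / 36

pM = {m: sum(p for (mm, _), p in joint.items() if mm == m) for m in range(1, 7)}
pN = {n: sum(p for (_, nn), p in joint.items() if nn == n) for n in range(1, 7)}

# M >= N always, so P(M = 1, N = 6) = 0, while P(M = 1) P(N = 6) > 0:
# the factorization fails, hence M and N are dependent.
```

The structural constraint M ≥ N alone is enough to rule out independence.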
Exercise 6.6: (i) Prove that
C_k^{n+m} = Σ_{i=0}^{k} C_i^n C_{k−i}^m.
[Hint: Consider a group of people consisting of n boys and m girls. In how many ways can a
committee of k people be formed?]
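This identity (often called Vandermonde's identity) is easy to verify numerically for small parameters; a sketch:

```python
from math import comb

def vandermonde_rhs(n, m, k):
    """Right-hand side: sum over how many of the k chosen come from the first group."""
    # comb(m, k - i) is 0 whenever k - i > m, matching the combinatorial argument.
    return sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))

# Check C(n+m, k) against the sum for a grid of small values.
ok = all(vandermonde_rhs(n, m, k) == comb(n + m, k)
         for n in range(6) for m in range(6) for k in range(n + m + 1))
```

The index i counts how many committee members are boys, which is exactly the hint's double-counting argument.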
(ii) Let X and Y be independent binomial random variables with respective parameters
(n, p) and (m, p). Calculate the distribution of X + Y.
(iii) Jack and Mary each independently toss a fair coin 15 times. What is the
probability that the total number of heads they obtain is equal to 13?
(iv) What is the probability that the total number of heads they obtain is
greater than 8?
(v) What is the probability that the number of heads Jack obtains exceeds
Mary's by 10?
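Parts (iii) and (iv) can be computed by convolving the two binomial pmfs, which also illustrates part (ii); a sketch (function names are mine):

```python
from math import comb

def binom_pmf(k, n, p=0.5):
    """P(Bin(n, p) = k), returning 0 outside the support."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

def sum_pmf(k, n=15, m=15, p=0.5):
    """P(X + Y = k) for independent X ~ Bin(n, p), Y ~ Bin(m, p),
    by conditioning on how many of the k successes came from X."""
    return sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p) for i in range(k + 1))

p13 = sum_pmf(13)                               # part (iii)
p_gt8 = sum(sum_pmf(k) for k in range(9, 31))   # part (iv)
```

By part (ii) the convolution should agree term by term with Bin(n + m, p), so here X + Y ~ Bin(30, 1/2).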
Exercise 6.7: Jack is throwing a fair die 6 times and Mary is flipping a fair coin 4 times.
Let X be the number of prime numbers obtained and Y be the number of tails obtained.
(i) Find P(X = Y).
(ii) Find P(X < Y).
(iii) Find P(X + Y = 5).
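Since a fair die shows a prime (2, 3, or 5) with probability 1/2, X ~ Bin(6, 1/2) and Y ~ Bin(4, 1/2) are independent, and all three parts reduce to sums over the joint pmf; a sketch:

```python
from math import comb

def binom_pmf(k, n, p=0.5):
    """P(Bin(n, p) = k), returning 0 outside the support."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

# X = number of primes in 6 die throws ~ Bin(6, 1/2)
# Y = number of tails in 4 coin flips ~ Bin(4, 1/2)
p_eq = sum(binom_pmf(k, 6) * binom_pmf(k, 4) for k in range(5))        # (i)
p_lt = sum(binom_pmf(x, 6) * binom_pmf(y, 4)
           for x in range(7) for y in range(5) if x < y)               # (ii)
p_sum5 = sum(binom_pmf(x, 6) * binom_pmf(5 - x, 4) for x in range(7))  # (iii)
```

For (iii), since both variables have the same success probability 1/2, X + Y ~ Bin(10, 1/2), which gives an independent check of the convolution.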
Exercise 6.8: Two fair dice are tossed. Let X be the sum of the two outcomes and Y be the
product of the outcomes.
Draw a table listing all the possible ordered pairs (X, Y). Each of the 36 equally likely
dice outcomes has probability 1/36. Find the following.
(i) P(X = Y) =
(ii) P(X < Y) =
(iii) P(X > Y) =
(iv) P(X divides Y) =
(v) P(X ≡ Y mod 3) =
(vi) P(2X = Y) =
(vii) P(Y − X = 5) =
(viii) P(X is prime) =
(ix) P(Y − X is prime) =
(x) P(X + Y + 3 is triangular or X + Y + 1 is a perfect square) =
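All of these probabilities follow from enumerating the 36 outcomes; a sketch showing the pattern for a few parts (the helper `prob` is mine):

```python
from itertools import product

# (X, Y) = (sum, product) for each of the 36 equally likely rolls.
pairs = [(a + b, a * b) for a, b in product(range(1, 7), repeat=2)]

def prob(event):
    """Probability of an event given as a predicate on (X, Y)."""
    return sum(1 for x, y in pairs if event(x, y)) / 36

p_eq = prob(lambda x, y: x == y)        # (i): only a = b = 2 gives a + b = a * b
p_lt = prob(lambda x, y: x < y)         # (ii)
p_gt = prob(lambda x, y: x > y)         # (iii)
p_div = prob(lambda x, y: y % x == 0)   # (iv): X is never 0, so y % x is safe
```

The remaining parts only change the predicate passed to `prob`.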
Exercise 6.9: If X and Y are independent Poisson random variables with respective
parameters λ1 and λ2, compute the distribution of X + Y.
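The distribution of X + Y in Exercise 6.9 can again be obtained by convolution; a numerical sketch (λ1 = 2 and λ2 = 3 are my arbitrary choices for the check):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(Poisson(lam) = k)."""
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.0

def sum_pmf(n):
    """P(X + Y = n) by conditioning on the value of X."""
    return sum(poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2)
               for k in range(n + 1))
```

Comparing `sum_pmf(n)` against a single Poisson pmf is a good way to check the closed form you derive.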
Exercise 6.10: If X and Y are independent Poisson random variables with respective
parameters λ1 and λ2, calculate the conditional distribution of X, given that X + Y = n.