Chapter 9: Joint distributions and independence
CIS 3033
9.1 Joint distributions: discrete
From one random variable to two or more, especially when they are defined on the same sample space. [Otherwise, consider products of sample spaces, as in Section 2.4.]
What is new: the influence between variables, expressed as relations among events.
For example, consider two random variables S and M: the sum and the maximum of two throws of a die.
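To make this concrete, here is a minimal Python sketch (not part of the original slides) that tabulates the joint probability mass function of S and M by enumerating all 36 equally likely outcomes:

```python
from itertools import product
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of two die throws and
# tabulate the joint pmf p(s, m) = P(S = s, M = m).
p = {}
for d1, d2 in product(range(1, 7), repeat=2):
    s, m = d1 + d2, max(d1, d2)
    p[(s, m)] = p.get((s, m), 0) + Fraction(1, 36)

print(p[(7, 4)])  # P(S = 7, M = 4) = 2/36 = 1/18, from outcomes (3,4) and (4,3)
```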
9.1 Joint distributions: discrete
The joint probability mass function of discrete random variables X and Y (on the same sample space Ω) is the function p : ℝ² → [0, 1] defined by p(a, b) = P(X = a, Y = b) for −∞ < a, b < ∞.
The joint distribution function of random variables X and Y is the function F : ℝ² → [0, 1] defined by F(a, b) = P(X ≤ a, Y ≤ b) for −∞ < a, b < ∞.
The marginal probability mass function of X (or of Y) can be obtained from p(a, b) by summing over the values of the other variable: pX(a) = Σb p(a, b) and pY(b) = Σa p(a, b).
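As an illustration (again a sketch, not from the slides), the marginal of S falls out of the joint pmf of S and M by summing over the values of M:

```python
from itertools import product
from fractions import Fraction

# Joint pmf of S (sum) and M (maximum) of two die throws, as above.
p = {}
for d1, d2 in product(range(1, 7), repeat=2):
    s, m = d1 + d2, max(d1, d2)
    p[(s, m)] = p.get((s, m), 0) + Fraction(1, 36)

# Marginal of S: sum p(a, b) over all values b of the other variable.
pS = {}
for (a, b), v in p.items():
    pS[a] = pS.get(a, 0) + v

print(pS[7])  # 6/36 = 1/6, the familiar pmf of the sum of two dice
```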
9.1 Joint distributions: discrete
In general, the joint probability mass function of X and Y cannot be recovered from the marginal probability mass functions pX and pY alone. The same holds for the distribution functions.
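A small counterexample (added here for illustration) makes this concrete: two different joint pmfs on {0, 1} × {0, 1} can share exactly the same marginals:

```python
# Two joint pmfs on {0,1} x {0,1} with identical marginals.
independent = {(0, 0): 1/4, (0, 1): 1/4, (1, 0): 1/4, (1, 1): 1/4}
coupled     = {(0, 0): 1/2, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 1/2}

def marginals(p):
    """Marginal pmfs of both coordinates, obtained by summation."""
    px, py = {0: 0, 1: 0}, {0: 0, 1: 0}
    for (a, b), v in p.items():
        px[a] += v
        py[b] += v
    return px, py

print(marginals(independent) == marginals(coupled))  # True: same marginals,
                                                     # different joint pmfs
```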
9.2 Joint distributions: continuous
Random variables X and Y have a jointly continuous distribution if there is a function f, the joint probability density function, such that F(a, b) = ∫∫ f(x, y) dx dy, the integral taken over all x ≤ a and y ≤ b. For the distribution functions, the relation is the same as in the discrete case, as given in formulas (9.1) and (9.2).
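As a sketch of how F is computed from f, the snippet below (using a hypothetical density, X and Y independent Exp(1), chosen only for illustration) evaluates the double integral numerically and compares it with the closed form:

```python
import numpy as np
from scipy.integrate import dblquad

# Hypothetical joint density: X and Y independent Exp(1), so
# f(x, y) = exp(-x - y) for x, y >= 0 (and 0 elsewhere).
f = lambda y, x: np.exp(-x - y)       # dblquad expects func(y, x)

a, b = 1.0, 2.0
F_ab, _ = dblquad(f, 0, a, 0, b)      # x over [0, a], y over [0, b]

# Closed form for this density: F(a, b) = (1 - e^-a)(1 - e^-b)
print(F_ab, (1 - np.exp(-a)) * (1 - np.exp(-b)))
```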
9.3 More than two random variables
The joint distribution function F of X1, X2, . . . , Xn (all defined on the same sample space Ω) is defined by
F(a1, a2, . . . , an) = P(X1 ≤ a1, X2 ≤ a2, . . . , Xn ≤ an)
for −∞ < a1, a2, . . . , an < ∞.
A joint probability mass function p can be defined for discrete random variables, and a joint density function f for continuous random variables, just as in the two-variable case.
9.3 More than two random variables
Suppose a vase contains N balls numbered 1, 2, . . . , N, and we draw n balls without replacement; let Xi denote the number on the ball obtained in the i-th draw. Since there are N(N − 1) · · · (N − n + 1) possible ordered outcomes for the values of X1, X2, . . . , Xn, each having the same probability, the joint probability mass function is given by
p(a1, a2, . . . , an) = P(X1 = a1, X2 = a2, . . . , Xn = an) = 1 / [N(N − 1) · · · (N − n + 1)]
for all distinct values a1, a2, . . . , an with 1 ≤ aj ≤ N.
The marginal distribution of each Xi is uniform: pXi(k) = 1/N for k = 1, 2, . . . , N.
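This can be checked by exhaustive enumeration; the sketch below (with the illustrative choice N = 5, n = 3) verifies both the joint pmf and the uniform marginal:

```python
import itertools
from fractions import Fraction

N, n = 5, 3   # illustrative choice: 5 balls, 3 draws

# Every ordered outcome with distinct values is equally likely.
outcomes = list(itertools.permutations(range(1, N + 1), n))
p = Fraction(1, len(outcomes))        # 1 / (N(N-1)...(N-n+1)) = 1/60 here

# Marginal of X1: sum the joint pmf over all values of the other draws.
marginal = {k: sum(p for o in outcomes if o[0] == k) for k in range(1, N + 1)}
print(marginal)   # every value has probability 1/5 = 1/N
```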
9.4 Independent random variables
Random variables X and Y are independent if
every event involving only X is independent of
every event involving only Y. Random variables
that are not independent are called dependent.
Equivalently, X and Y are independent if
P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b),
that is, if the joint distribution function factorizes: F(a, b) = FX(a) FY(b) for all possible values of a and b. The same criterion applies to the probability mass function and the density function: p(a, b) = pX(a) pY(b) in the discrete case, and f(x, y) = fX(x) fY(y) in the continuous case.
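The factorization criterion is easy to test mechanically for a finite joint pmf; the helper below (a sketch, using the two-dice outcomes as an assumed example) checks p(a, b) = pX(a) pY(b) over all pairs of marginal values:

```python
from itertools import product

# Joint pmf of two fair dice: independent by construction.
p = {(x, y): 1/36 for x, y in product(range(1, 7), repeat=2)}

def is_independent(p):
    """Check p(a, b) == pX(a) * pY(b) for every pair of marginal values."""
    px, py = {}, {}
    for (a, b), v in p.items():
        px[a] = px.get(a, 0) + v
        py[b] = py.get(b, 0) + v
    return all(abs(p.get((a, b), 0) - px[a] * py[b]) < 1e-12
               for a, b in product(px, py))

print(is_independent(p))  # True

# The sum and maximum of the same two throws are dependent:
q = {}
for x, y in product(range(1, 7), repeat=2):
    s, m = x + y, max(x, y)
    q[(s, m)] = q.get((s, m), 0) + 1/36
print(is_independent(q))  # False
```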
9.5 Propagation of independence
If X1, X2, . . . , Xn are independent random variables and h1, h2, . . . , hn are functions ℝ → ℝ, then the random variables h1(X1), h2(X2), . . . , hn(Xn) are also independent: independence propagates through transformations of the individual variables.
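A quick numerical check (an illustrative sketch, not from the slides, with arbitrarily chosen transformations): applying separate functions to two independent dice leaves the results independent:

```python
from itertools import product

# X, Y independent fair dice; apply separate functions h1(x) = x % 2
# and h2(y) = (y >= 4). Propagation of independence says the
# transformed variables are again independent.
joint = {}
for x, y in product(range(1, 7), repeat=2):
    key = (x % 2, y >= 4)
    joint[key] = joint.get(key, 0) + 1/36

px, py = {}, {}
for (a, b), v in joint.items():
    px[a] = px.get(a, 0) + v
    py[b] = py.get(b, 0) + v

print(all(abs(joint.get((a, b), 0) - px[a] * py[b]) < 1e-12
          for a, b in product(px, py)))  # True: factorization holds
```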