Stochastic Processes
Chapter 2 Random Variables
Prof. Jernan Juang
Prof. Chun-Hung Liu
Dept. of Engineering Science
Dept. of Electrical and Computer Eng.
National Cheng Kung University
National Chiao Tung University
Spring 2015
15/3/2
Chapter 2 : Random Variables
1
What is a Random Variable?
•  Random Experiments Associated with Numerical Results
•  In many probabilistic models, the outcomes are of a numerical nature,
e.g., if they correspond to instrument readings or stock prices.
•  In some experiments, the outcomes are NOT numerical, but they may
be associated with some numerical values of interest. For example, if
the experiment is the selection of students from a given population, we
may wish to consider their grade point average.
•  Basic Concepts of Random Variables
Given an experiment and the corresponding set of possible outcomes (the
sample space), a random variable associates a particular number with each
outcome. This number is referred to as the numerical value or the
experimental value of the random variable.
Mathematically, a Random Variable (RV) is a real-valued
function of the experimental outcome.
What is a Random Variable?
•  Visualization of a random variable: It is a function that assigns a
numerical value to each possible outcome of the experiment.
•  An example of a random variable: The experiment consists of two rolls
of a 4-sided die, and the random variable is the maximum of the two
rolls. If the outcome of the experiment is (4, 2), the experimental
value of this random variable is 4.
More Examples of Random Variables
•  In an experiment involving a sequence of 5 tosses of a coin, the
number of heads in the sequence is a random variable. However, the
5-long sequence of heads and tails is not considered a random
variable because it does not have an explicit numerical value.
•  In an experiment involving two rolls of a die, the following are
examples of random variables:
1.  The sum of the two rolls.
2.  The number of sixes in the two rolls.
3.  The second roll raised to the fifth power.
•  In an experiment involving the transmission of a message, the time
needed to transmit the message, the number of symbols received in
error, and the delay with which the message is received are all
random variables.
Random Variables : A More Mathematical Point of View
•  Schematic Explanation for Random Variable X (Borel set: a member of
the sigma-field generated by the intervals of the real line):
Consider an experiment H with sample space Ω. The elements or points ζ
of Ω are the random outcomes of H. If to every ζ we assign a real number
X(ζ), we establish a correspondence rule between ζ and R, the real line.
Such a rule (subject to certain constraints) is called a Random Variable (RV).
Random Variables : A More Mathematical Point of View
Defn : Let H be an experiment with sample space Ω. Then the
random variable X is a function whose domain is Ω that satisfies
the following :
(i)  for every Borel set of numbers B, the set X⁻¹(B) ≜ {ζ ∈ Ω : X(ζ) ∈ B}
is an event, and
(ii) P[X = −∞] = P[X = +∞] = 0.
Example : A person, chosen at random in the street, is asked if he or she has
a younger brother. If the answer is No (Yes), the data is encoded as zero
(one). This experiment has sample space Ω = {No, Yes} and sigma
field F = {∅, {No}, {Yes}, Ω}, with P[{No}] = 3/4 and P[{Yes}] = 1/4.
More Examples of Random Variables
Example : A bus arrives at random in [0, T ]; let t denote the time of
arrival. The sample description space is Ω = {t : t ∈ [0, T ]}. A RV X is
defined by X(t) = t. Assume that the arrival time is uniform over [0, T ].
We can now ask and compute what P[X ≤ x] is.
Example : An urn contains three colored balls. The balls are colored
white (W), black (B), and red (R), respectively, so Ω = {W, B, R}. We
can define an RV X by assigning a number to each color, and then try to
compute P[X ≤ x] for any number x.
More Concepts on Random Variables
Starting with a probabilistic model of an experiment:
•  A random variable is a real-valued function of the outcome of
the experiment.
•  A function of a random variable defines another random
variable.
•  We can associate with each random variable certain
“averages” of interest, such as the mean and the variance.
•  A random variable can be conditioned on an event or on
another random variable.
•  There is a notion of independence of a random variable from
an event or from another random variable.
Discrete Random Variables : A random variable is called discrete if
its range (the set of values that it can take) is finite or at most
countably infinite.
Concepts related to Discrete RVs
Starting with a probabilistic model of an experiment:
•  A discrete random variable is a real-valued function of the
outcome of the experiment that can take a finite or countably
infinite number of values.
•  A (discrete) random variable has an associated probability mass
function (PMF), which gives the probability of each numerical
value that the random variable can take.
•  A function of a random variable defines another random
variable, whose PMF can be obtained from the PMF of the
original random variable.
Probability Mass Function (PMF)
•  For a discrete random variable X, the probabilities of its values are
captured by the probability mass function (PMF for short) of X, denoted
pX . In particular, if x is any possible value of X, the probability
mass of x, denoted pX (x), is the probability of the event {X = x}
consisting of all outcomes that give rise to a value of X equal to x:
pX (x) = P[{X = x}]
For example, let the experiment consist of two independent tosses of
a fair coin, and let X be the number of heads obtained. Then the
PMF of X is
pX (x) = 1/4 if x = 0 or x = 2, 1/2 if x = 1, and 0 otherwise.
Throughout this course, we will use upper case characters to
denote random variables, and lower case characters to denote real
numbers such as the numerical values of a random variable.
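As a quick check (a sketch, not part of the original slides), the two-coin-toss PMF can be obtained by enumerating the four equally likely outcomes:

```python
from itertools import product
from fractions import Fraction

# Each of the 4 outcomes (HH, HT, TH, TT) has probability 1/4;
# X counts the heads in each outcome.
pmf = {}
for outcome in product("HT", repeat=2):
    x = outcome.count("H")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, 4)

assert pmf == {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```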
Probability Mass Function (PMF)
Note that
Σ_x pX (x) = 1,
where in the summation above, x ranges over all the possible
numerical values of X. By a similar argument, for any set S of real
numbers, we also have
P[X ∈ S] = Σ_{x ∈ S} pX (x).
•  For example, if X is the number of heads obtained in two independent
tosses of a fair coin, as above, the probability of at least one head is
P[X > 0] = pX (1) + pX (2) = 1/2 + 1/4 = 3/4.
Calculating the PMF of X is conceptually straightforward, and is
illustrated in the following figure.
Probability Mass Function (PMF)
(a) Illustration of the method to calculate the PMF of a random
variable X. For each possible value x, we collect all the outcomes
that give rise to X = x and add their probabilities to obtain pX (x).
(b) Calculation of the PMF pX of the random variable X = maximum roll
in two independent rolls of a fair 4-sided die. There are four
possible values x, namely, 1, 2, 3, 4. To calculate pX (x) for a
given x, we add the probabilities of the outcomes that give rise to x.
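The calculation in (b) can be reproduced by enumerating the 16 equally likely outcomes (a sketch under the slide's setup):

```python
from itertools import product
from fractions import Fraction

# Two independent rolls of a fair 4-sided die: 16 outcomes, each with
# probability 1/16; X is the maximum of the two rolls.
pmf = {x: Fraction(0) for x in range(1, 5)}
for roll in product(range(1, 5), repeat=2):
    pmf[max(roll)] += Fraction(1, 16)

# Collecting outcomes gives pX(x) = (2x - 1)/16 for x = 1, 2, 3, 4.
for x in range(1, 5):
    assert pmf[x] == Fraction(2 * x - 1, 16)
```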
Bernoulli Random Variable
Consider the toss of a biased coin, which comes up a head with
probability p, and a tail with probability 1 − p. The Bernoulli random
variable takes the two values 1 and 0, depending on whether the
outcome is a head or a tail. Its PMF is
pX (k) = p if k = 1, and pX (k) = 1 − p if k = 0.
•  For all its simplicity, the Bernoulli random variable is very important.
In practice, it is used to model generic probabilistic situations with
just two outcomes, such as:
•  The state of a telephone at a given time that can be either free or busy.
•  A person who can be either healthy or sick with a certain disease.
Binomial Random Variable
The Motivating Sense of “Binomial”:
A biased coin is tossed n times. At each toss, the coin comes up a head
with probability p, and a tail with probability 1 − p, independently of
prior tosses. Let X be the number of heads in the n-toss sequence. We
refer to X as a binomial random variable with parameters n and p.
The PMF of X consists of the binomial probabilities
pX (k) = C(n, k) p^k (1 − p)^{n−k},  k = 0, 1, . . . , n,
where C(n, k) denotes the binomial coefficient.
•  The normalization property Σ_x pX (x) = 1, specialized to the binomial
random variable, is written as
Σ_{k=0}^{n} C(n, k) p^k (1 − p)^{n−k} = 1.
Some special cases of the binomial PMF are sketched in the following figure:
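A short numerical sketch of the binomial PMF and its normalization property (the parameter values below are illustrative only):

```python
from math import comb

def binomial_pmf(n, p):
    """Binomial probabilities pX(k) = C(n, k) p**k (1 - p)**(n - k)."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

pmf = binomial_pmf(9, 0.3)            # hypothetical n = 9, p = 0.3
assert abs(sum(pmf) - 1.0) < 1e-12    # normalization property
```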
Binomial Random Variable
The PMF of a binomial random variable. If p = 1/2, the PMF is
symmetric around n/2. Otherwise, the PMF is skewed towards 0 if
p < 1/2, and towards n if p > 1/2.
Geometric Random Variable
Suppose we repeatedly and independently perform an experiment with
probability of success p, where 0 < p < 1. The geometric random
variable is the number X of trials needed for a success to come up
for the first time. Its PMF is given by
pX (k) = (1 − p)^{k−1} p,  k = 1, 2, . . . ,
since (1 − p)^{k−1} p is the probability of the sequence consisting of
k − 1 successive failures followed by a success, as shown in the
following figure.
The PMF (1 − p)^{k−1} p decreases as a geometric progression with
parameter 1 − p. Is it a legitimate PMF? Yes, because
Σ_{k=1}^{∞} (1 − p)^{k−1} p = p · 1/(1 − (1 − p)) = 1.
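The legitimacy claim can also be sketched numerically: the partial sums of (1 − p)^{k−1} p approach 1 (p = 0.2 is an illustrative choice):

```python
# Geometric PMF pX(k) = (1 - p)**(k - 1) * p for k = 1, 2, ...
p = 0.2  # hypothetical success probability
partial_sum = sum((1 - p)**(k - 1) * p for k in range(1, 500))

# The geometric series sums to p / (1 - (1 - p)) = 1.
assert abs(partial_sum - 1.0) < 1e-12
```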
Poisson Random Variable
A Poisson random variable takes nonnegative integer values. Its PMF is
given by
pX (k) = e^{−λ} λ^k / k!,  k = 0, 1, 2, . . . ,
where λ is a positive parameter characterizing the PMF; see the figure.
The PMF e^{−λ} λ^k / k! of the Poisson random variable for different
values of λ. Note that if λ < 1, then the PMF is monotonically
decreasing, while if λ > 1, the PMF first increases and then decreases
as the value of k increases.
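These two shape regimes can be checked directly from the PMF formula (the λ values here are illustrative):

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson probabilities pX(k) = exp(-lam) * lam**k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# lam < 1: the PMF is monotonically decreasing in k.
small = [poisson_pmf(0.5, k) for k in range(10)]
assert all(a > b for a, b in zip(small, small[1:]))

# lam > 1: the PMF first increases, then decreases.
large = [poisson_pmf(3.0, k) for k in range(10)]
assert large[0] < large[1] < large[3] and large[3] > large[5] > large[7]
```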
Continuous Random Variables and Their PDFs
A random variable X is called continuous if its probability law can be
described in terms of a nonnegative function fX , called the probability
density function (PDF) of X, which satisfies
P[X ∈ B] = ∫_B fX (x) dx
for every subset B of the real line. In particular, the probability that
the value of X falls within an interval is
P[a ≤ X ≤ b] = ∫_a^b fX (x) dx.
The probability that X takes a value in an interval [a, b] is
∫_a^b fX (x) dx, which is the shaded area in the figure.
Continuous Random Variables and Their PDFs
For any single value a, we have P[X = a] = ∫_a^a fX (x) dx = 0. For this
reason, including or excluding the endpoints of an interval has no
effect on its probability:
P[a ≤ X ≤ b] = P[a < X < b] = P[a ≤ X < b] = P[a < X ≤ b].
•  Note that to qualify as a PDF, a function fX (·) must be
nonnegative, i.e., fX (x) ≥ 0 for every x, and must also satisfy the
normalization equation
∫_{−∞}^{∞} fX (x) dx = 1.
Graphically, this means that the entire area under the graph of the
PDF must be equal to 1.
Continuous Random Variables and Their PDFs
To interpret the PDF, note that for an interval [x, x + δ] with very
small length δ, we have
P[x ≤ X ≤ x + δ] = ∫_x^{x+δ} fX (t) dt ≈ fX (x) · δ.
What physical meaning does the above equation imply?
fX (x) can be interpreted as “probability mass per unit length” around
x. If δ is very small, the probability that X takes a value in the
interval [x, x + δ], which is the shaded area in the figure, is
approximately equal to fX (x) · δ.
Continuous Uniform RV : Example
Example: Continuous Uniform Random Variable. A gambler spins
a wheel of fortune, continuously calibrated between 0 and 1, and
observes the resulting number. Assuming that all subintervals of [0, 1]
of the same length are equally likely, this experiment can be modeled in
terms of a random variable X with PDF
fX (x) = c for 0 ≤ x ≤ 1, and fX (x) = 0 otherwise,
for some constant c. This constant can be determined by using the
normalization property
1 = ∫_{−∞}^{∞} fX (x) dx = ∫_0^1 c dx = c,
so that c = 1.
Continuous Uniform Random Variable
•  Generalization: consider a random variable X that takes values in an
interval [a, b], and again assume that all subintervals of the same
length are equally likely. We refer to this type of random variable as
uniform or uniformly distributed.
•  Its PDF has the form
fX (x) = c for a ≤ x ≤ b, and fX (x) = 0 otherwise,
where c is a constant. For fX (·) to satisfy the normalization
property, we must have
1 = ∫_a^b c dx = c (b − a),  so that  c = 1/(b − a).
Continuous Uniform Random Variable
Note that the probability P[X ∈ I] that X takes a value in a set I is
P[X ∈ I] = ∫_I fX (x) dx.
Example (Piecewise Constant PDF) : Alvin’s driving time to work is
between 15 and 20 minutes if the day is sunny, and between 20 and 25
minutes if the day is rainy, with all times being equally likely in each
case. Assume that a day is sunny with probability 2/3 and rainy with
probability 1/3. What is the PDF of the driving time, viewed as a
random variable X?
We interpret the statement that “all times are equally likely” in the
sunny and the rainy cases to mean that the PDF of X is constant in
each of the intervals [15, 20] and [20, 25]. Furthermore, since these
two intervals contain all possible driving times, the PDF should be
zero everywhere else:
fX (x) = c1 for 15 ≤ x < 20, c2 for 20 ≤ x ≤ 25, and 0 otherwise,
Continuous Uniform RV : Example
where c1 and c2 are some constants. We can determine these constants
by using the given probabilities of a sunny and of a rainy day:
2/3 = P[sunny] = ∫_{15}^{20} c1 dx = 5 c1,  1/3 = P[rainy] = ∫_{20}^{25} c2 dx = 5 c2,
so that c1 = 2/15 and c2 = 1/15.
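A small sketch confirming the constants and the normalization of this piecewise constant PDF:

```python
# fX(x) = c1 on [15, 20) (sunny) and c2 on [20, 25] (rainy).
c1 = (2 / 3) / 5   # P[sunny] = 2/3 spread uniformly over 5 minutes
c2 = (1 / 3) / 5   # P[rainy] = 1/3 spread uniformly over 5 minutes

assert abs(c1 - 2 / 15) < 1e-12
assert abs(c2 - 1 / 15) < 1e-12
assert abs(5 * c1 + 5 * c2 - 1.0) < 1e-12  # total area under the PDF is 1
```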
Cumulative Distribution Function (CDF)
•  The Cumulative Distribution Function (CDF) of a random variable
X is denoted by FX (·) and provides the probability P[X ≤ x]. In
particular, for every x we have
FX (x) = P[X ≤ x].
Loosely speaking, the CDF FX (x) “accumulates” probability “up
to” the value x.
Remark : Any random variable associated with a given probability
model has a CDF, regardless of whether it is discrete, continuous, or
other. This is because {X ≤ x} is always an event and therefore has a
well-defined probability.
Discrete CDF : Example
The CDF is related to the PMF through the formula
FX (x) = P[X ≤ x] = Σ_{k ≤ x} pX (k).
Continuous CDF : Example
The CDF is related to the PDF through the formula
FX (x) = P[X ≤ x] = ∫_{−∞}^{x} fX (t) dt.
Cumulative Distribution Function (CDF)
•  Generalization: The cumulative distribution function (CDF) is
defined by
FX (x) = P[{ζ ∈ Ω : X(ζ) ≤ x}].
The above equation gives the probability of the set of all outcomes ζ
in the sample space such that the function X(ζ) takes a value less than
or equal to x.
•  Properties of FX (x)
CDF : Examples
Example : A bus arrives at random in (0, T ]. Let RV X denote the time
of arrival. Suppose it is known that the bus is equally likely
(uniformly likely) to come at any time within (0, T ].
What is the CDF of X ?
CDF : Examples
Example : Compute the CDF for a binomial RV with parameters (n, p).
Let X be the binomial RV, with X ∈ {0, 1, 2, 3, . . . , n}.
Since X only takes on integer values, the event {X ≤ x} = {X ≤ ⌊x⌋},
where ⌊x⌋ is the largest integer equal to or smaller than x.
Then FX (x) is given by
FX (x) = Σ_{k=0}^{⌊x⌋} C(n, k) p^k (1 − p)^{n−k},
from which one can compute, e.g., P[1.99 < X ≤ 3] = FX (3) − FX (1.99) = 0.6656.
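The staircase CDF can be sketched in code; the slide's (n, p) values are not shown, so the parameters below are hypothetical:

```python
from math import comb, floor

def binomial_cdf(x, n, p):
    """FX(x) = sum of C(n, k) p**k (1 - p)**(n - k) for k = 0..floor(x)."""
    kmax = min(n, floor(x))
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(kmax + 1))

# P[1.99 < X <= 3] = FX(3) - FX(1.99) picks up pX(2) + pX(3).
n, p = 4, 0.5  # hypothetical parameters
prob = binomial_cdf(3, n, p) - binomial_cdf(1.99, n, p)
assert abs(prob - (comb(4, 2) + comb(4, 3)) / 2**4) < 1e-12
```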
CDF : Examples
Example: The Maximum of Several Random Variables. You are
allowed to take a certain test three times, and your final score will be
the maximum of the test scores. Thus,
X = max{X1 , X2 , X3 },
where X1 , X2 , X3 are the three test scores and X is the final score.
Assume that your score in each test takes one of the values from 1 to
10 with equal probability 1/10, independently of the scores in other
tests. What is the PMF pX of the final score?
We calculate the PMF indirectly. We first compute the CDF FX (k) and
then obtain the PMF as
pX (k) = FX (k) − FX (k − 1),  k = 1, . . . , 10,
where FX (k) = P[max{X1 , X2 , X3 } ≤ k] = (k/10)³ by independence.
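Sketching this CDF-first computation (scores 1 to 10, three independent tests, as in the example):

```python
from fractions import Fraction

# FX(k) = P[max(X1, X2, X3) <= k] = (k/10)**3, since the three scores
# are independent and each is at most k with probability k/10.
F = lambda k: Fraction(k, 10) ** 3
pmf = {k: F(k) - F(k - 1) for k in range(1, 11)}

assert sum(pmf.values()) == 1
assert pmf[10] == Fraction(271, 1000)  # 1 - (9/10)**3
assert pmf[1] == Fraction(1, 1000)     # all three scores equal 1
```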
More Properties of CDF
Probability Density Function (PDF)
PDF : Properties and Examples
Normal (Gaussian) Distribution
The normal (Gaussian) PDF with mean µ and variance σ² is
fX (x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)},  −∞ < x < ∞.
Definitions of Expectation (Mean) and Variance for a continuous RV :
E[X] = ∫_{−∞}^{∞} x fX (x) dx,   Var[X] = E[(X − E[X])²].
Other Important PDFs
Exponential, Rayleigh and Uniform PDFs
Table of Continuous PDFs and CDFs
erf(x) ≜ (1/√(2π)) ∫_0^x e^{−t²/2} dt
Table of Discrete PDFs and CDFs
Some Examples of Discrete CDF and PMF
Example :
Some Examples of Discrete CDF and PMF
Example :
Expectation of a Discrete RV
Definition : X is a discrete RV and its expectation is defined by
E[X] = Σ_x x pX (x).
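As an illustration (reusing the max-of-two-rolls PMF from the earlier 4-sided-die example):

```python
from fractions import Fraction

# PMF of X = maximum of two rolls of a fair 4-sided die:
# pX(x) = (2x - 1)/16 for x = 1, 2, 3, 4.
pmf = {x: Fraction(2 * x - 1, 16) for x in range(1, 5)}

# E[X] = sum over x of x * pX(x)
mean = sum(x * p for x, p in pmf.items())
assert mean == Fraction(25, 8)  # (1 + 6 + 15 + 28) / 16
```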
Expectation of the Function of a RV
If Y = g(X), then E[Y ] = E[g(X)] = Σ_x g(x) pX (x).
Jointly Distributed Random Variables
•  A Motivating Example : Consider a probability space (Ω, F, P)
involving an underlying experiment consisting of the simultaneous
throwing of two fair coins. Since the ordering is not important here
and the key outcomes are ζ1 = HH, ζ2 = HT, ζ3 = TT, the sample
space is Ω = {HH, HT, TT}, and the sigma-field of events is
F = {∅, Ω, {HT}, {TT}, {HH}, {TT, HT}, {HH, TT}, {HH, HT}}.
The corresponding probabilities are 0, 1, 1/2, 1/4, 1/4, 3/4, 3/4,
and 1/2. Now define two random variables
X1 (ζ) = 0 if at least one H, and X1 (ζ) = 1 otherwise;
X2 (ζ) = 1 if one H and one T, and X2 (ζ) = −1 otherwise.
Then P[X1 = 0] = 3/4, P[X1 = 1] = 1/4, P[X2 = 1] = 1/2,
P[X2 = −1] = 1/2, and
P[X1 = 0, X2 = −1] = P[{HH}] = 1/4,  P[X1 = 1, X2 = 1] = P[∅] = 0.
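These probabilities can be checked by enumeration, assuming the unordered-outcome probabilities P[{HH}] = P[{TT}] = 1/4 and P[{HT}] = 1/2:

```python
from fractions import Fraction

P = {"HH": Fraction(1, 4), "HT": Fraction(1, 2), "TT": Fraction(1, 4)}
X1 = {"HH": 0, "HT": 0, "TT": 1}    # 0 if at least one H, 1 otherwise
X2 = {"HH": -1, "HT": 1, "TT": -1}  # 1 if one H and one T, -1 otherwise

def prob(pred):
    """Total probability of the outcomes satisfying pred."""
    return sum(p for w, p in P.items() if pred(w))

assert prob(lambda w: X1[w] == 0) == Fraction(3, 4)
assert prob(lambda w: X2[w] == 1) == Fraction(1, 2)
assert prob(lambda w: X1[w] == 0 and X2[w] == -1) == Fraction(1, 4)
assert prob(lambda w: X1[w] == 1 and X2[w] == 1) == 0
```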
Jointly Distributed Random Variables
•  Definition of Joint CDF of RVs X and Y :
FXY (x, y) = P[X ≤ x, Y ≤ y]
Jointly Distributed Random Variables
In other words, we also have
FXY (x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fXY (ξ, η) dξ dη.
•  Properties of Joint CDF FXY (x, y)
(1)  FXY (∞, ∞) = 1; FXY (−∞, y) = FXY (x, −∞) = 0;
also FXY (x, ∞) = FX (x) and FXY (∞, y) = FY (y);
(2)  If x1 ≤ x2 and y1 ≤ y2 , then FXY (x1 , y1 ) ≤ FXY (x2 , y2 );
(3)  FXY (x, y) = lim_{ε,δ→0, ε,δ>0} FXY (x + ε, y + δ)
(continuity from the right and from above);
(4)  For all x2 ≥ x1 and y2 ≥ y1 , we must have
FXY (x2 , y2 ) − FXY (x2 , y1 ) − FXY (x1 , y2 ) + FXY (x1 , y1 ) ≥ 0.
(How to prove this?)
Examples of a Joint CDF
Joint PDF and Its Marginal PDF
fX (x) = ∫_{−∞}^{∞} fXY (x, y) dy,   fY (y) = ∫_{−∞}^{∞} fXY (x, y) dx.
Summary of PDF, CDF and Expectation
•  Definition of CDF : FX (x) = P[X ≤ x]
•  Definition of Expectation and Variance
Summary of PDF, CDF and Expectation
Independent Random Variables
•  Definition of the Independence of two RVs :
fXY (x, y) = fX (x) · fY (y)  for all x and y.
Conditional CDF and PDF
•  Definition of Conditional CDF : Consider the event C consisting of
all outcomes ζ ∈ Ω such that X(ζ) ≤ x and ζ ∈ B ⊂ Ω, where B is
another event. So we know C = {X ≤ x} ∩ B.
The conditional CDF of X given event B is defined by
FX (x|B) = P[X ≤ x | B] = P[{X ≤ x} ∩ B] / P[B].
The conditional PDF is simply defined by
fX (x|B) = dFX (x|B)/dx.
•  Example : Let B = {X ≤ 10}. We want to find FX (x|B).
Conditional CDF and PDF : Examples
The previous results can be shown in the following figure.
Can you calculate FX (x|B) when B = {b < X ≤ a} ?
Conditional CDF and PDF : Examples
•  Example (Poisson Conditioned on Even) : Let X be a Poisson RV
with parameter µ > 0. We wish to compute the conditional PMF and CDF
of X given the event {X = 0, 2, 4, . . .} = {X is even}.
First observe that P[X even] is given by
P[X = 0, 2, . . .] = Σ_{k=0,2,...} (µ^k / k!) e^{−µ}.
Then for X odd, we have
P[X = 1, 3, . . .] = Σ_{k=1,3,...} (µ^k / k!) e^{−µ}.
From these relations, we obtain
Σ_{k≥0, k even} (µ^k / k!) e^{−µ} − Σ_{k≥0, k odd} (µ^k / k!) e^{−µ}
= Σ_{k=0}^{∞} ((−µ)^k / k!) e^{−µ} = e^{−2µ}
Conditional CDF and PDF : Examples
and
Σ_{k≥0, k even} (µ^k / k!) e^{−µ} + Σ_{k≥0, k odd} (µ^k / k!) e^{−µ} = 1.
Hence, P[X even] = P[X = 0, 2, . . .] = (1/2)(1 + e^{−2µ}). Using the
definition of conditional PMF, we obtain
pX [k|X even] = P[X = k, X even] / P[X even].
If k is even, then {X = k} is a subset of {X even}. If k is odd,
{X = k} ∩ {X even} = ∅. Hence P[X = k, X even] = P[X = k] for k
even and it equals 0 for k odd. So we have
pX [k|X even] = 2 µ^k e^{−µ} / ((1 + e^{−2µ}) k!)  for k ≥ 0 and even,
and pX [k|X even] = 0 for k odd.
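A numerical sanity check of P[X even] and the conditional PMF (µ = 1.7 is an arbitrary illustrative value):

```python
from math import exp, factorial

mu = 1.7  # hypothetical Poisson parameter
pmf = lambda k: exp(-mu) * mu**k / factorial(k)

# P[X even] = (1/2) * (1 + exp(-2 mu))
p_even = sum(pmf(k) for k in range(0, 60, 2))
assert abs(p_even - 0.5 * (1 + exp(-2 * mu))) < 1e-12

# The conditional PMF 2 mu**k exp(-mu) / ((1 + exp(-2 mu)) k!) over
# even k sums to 1, as any PMF must.
cond_total = sum(2 * mu**k * exp(-mu) / ((1 + exp(-2 * mu)) * factorial(k))
                 for k in range(0, 60, 2))
assert abs(cond_total - 1.0) < 1e-12
```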
The Weighted Sum of Conditional CDFs
The conditional CDF is then
FX (x|X even) = Σ_{k ≤ x} pX (k|X even)
             = Σ_{0 ≤ k ≤ x, k even} 2 µ^k e^{−µ} / ((1 + e^{−2µ}) k!).
•  The CDF can be written as a weighted sum of conditional distribution
functions. Consider now n mutually exclusive and exhaustive events
{Ai }, i = 1, . . . , n, defined on the same probability space. With
B ≜ {X ≤ x}, we immediately obtain from the total probability formula:
FX (x) = Σ_{i=1}^{n} FX (x|Ai ) P[Ai ].
The above result describes FX (x) as a weighted sum of conditional
distribution functions. One way to view it is as an “average” over all
the conditional CDFs.
Conditional CDF and PDF : Examples
Example (Defective Memory Chips) : In the automated manufacturing
of computer memory chips, company Z produces one defective chip for
every five good chips. The defective chips (DC) have a time of failure X
that obeys the CDF (x in months)
FX (x|DC) = (1 − e^{−x/2}) u(x),
while the time of failure for the good chips (GC) obeys the CDF
FX (x|GC) = (1 − e^{−x/10}) u(x)  (x in months).
The chips are visually indistinguishable. A chip is purchased. What
is the probability that the chip will fail before six months of use?
Sol : The unconditional CDF for the chip is
FX (x) = FX (x|DC) P[DC] + FX (x|GC) P[GC],
where P[DC] and P[GC] are the probabilities of selecting a defective
and a good chip, respectively. From the given data, P[DC] = 1/6 and
P[GC] = 5/6. Thus,
FX (6) = (1/6)(1 − e^{−3}) + (5/6)(1 − e^{−0.6}) = 0.534.
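The mixture CDF in this example evaluates as follows (a direct transcription of the solution):

```python
from math import exp

def chip_cdf(x):
    """Failure-time CDF: mixture of defective (1/6) and good (5/6) chips,
    with x in months."""
    if x < 0:
        return 0.0
    return (1 - exp(-x / 2)) / 6 + (1 - exp(-x / 10)) * 5 / 6

# Probability that a purchased chip fails before six months:
assert abs(chip_cdf(6) - 0.534) < 5e-4
```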
Bayes’ Formula for PDFs
Consider the events B and {X = x} defined on the same probability
space. Then from the definition of conditional probability, it seems
reasonable to write
P[B|X = x] = P[B, X = x] / P[X = x].
What’s wrong with the above equation?
The problem is that if X is a continuous RV, then P[X = x] = 0, and
the ratio becomes undefined. Nevertheless, we can compute P[B|X = x]
by taking appropriate limits of probabilities involving the event
{x < X ≤ x + Δx}. Thus, consider the expression
P[B|x < X ≤ x + Δx] = P[x < X ≤ x + Δx | B] P[B] / P[x < X ≤ x + Δx].
Then note that
P[x < X ≤ x + Δx | B] = F(x + Δx|B) − F(x|B).
Bayes’ Formula for PDFs
Dividing the numerator and denominator on the right side by Δx and
letting Δx → 0, we obtain
P[B|X = x] = lim_{Δx→0} P[B|x < X ≤ x + Δx] = fX (x|B) P[B] / fX (x).
The left quantity is sometimes called the a posteriori probability (or
a posteriori density) of B given X = x. Then we can have the following
result:
P[B] = ∫_{−∞}^{∞} P[B|X = x] fX (x) dx.
Why? How should the above expression be interpreted? P[B] can be
called the average probability of B, as suggested by its form.
Bayes’ Formula for PDFs : Example
Example (Detecting Closed Switch) : A signal, X, can come from one of
three different sources designated as A, B, or C. The signal from A is
distributed as N(−1, 4); the signal from B is distributed as N(0, 1);
and the signal from C has an N(1, 4) distribution. In order for the
signal to reach its destination at R, the switch in the line must be
closed. Only one switch can be closed when the signal X is observed at
R, but it is unknown which switch it is. However, it is known that
switch a is closed twice as often as switch b, which is closed twice
as often as switch c (see the figure).
Bayes’ Formula for PDFs : Example
(a)  Compute P[X ≤ −1].
(b)  Given that we observe the event {X > −1}, from which source was
this signal most likely?
Sol : (a) Let P[A] denote the probability that A is responsible for the
observation at R. From the information about the switches we get
P[A] = 2 P[B] = 4 P[C]  and  P[A] + P[B] + P[C] = 1,
so that P[A] = 4/7, P[B] = 2/7, P[C] = 1/7.
Next we compute P[X ≤ −1] from
P[X ≤ −1] = P[X ≤ −1|A] P[A] + P[X ≤ −1|B] P[B] + P[X ≤ −1|C] P[C],
where P[X ≤ −1|A] = 1/2.
Bayes’ Formula for PDFs : Example
P[X ≤ −1|B] = 1/2 − erf(1) = 0.159,
P[X ≤ −1|C] = 1/2 − erf(1) = 0.159.
Hence,
P[X ≤ −1] = (1/2)(4/7) + 0.159 · (2/7) + 0.159 · (1/7) = 0.354.
(b) We wish to compute max{P[A|X > −1], P[B|X > −1], P[C|X > −1]}.
Note that P[X > −1|A] = 1 − P[X ≤ −1|A], and the other cases are the
same. So concentrating on source A, and using Bayes’ rule, we get
P[A|X > −1] = (1 − P[X ≤ −1|A]) P[A] / (1 − P[X ≤ −1]).
Thus,
P[A|X > −1] = 0.44,  P[B|X > −1] = 0.372,  P[C|X > −1] = 0.186.
(Source A was the most likely cause of the event {X > −1}.)
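The whole computation can be replayed with the standard normal CDF; note that Python's math.erf is the standard error function, not the erf convention used in the slides' table, so Φ(z) = (1 + erf(z/√2))/2 is used instead:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF, via the standard error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

priors = {"A": 4 / 7, "B": 2 / 7, "C": 1 / 7}
dists = {"A": (-1, 2), "B": (0, 1), "C": (1, 2)}  # (mean, std. dev.)

# (a) Total probability of {X <= -1}
p_le = {s: Phi((-1 - m) / sd) for s, (m, sd) in dists.items()}
p_total = sum(p_le[s] * priors[s] for s in priors)
assert abs(p_total - 0.354) < 1e-3

# (b) Posterior of each source given {X > -1}, by Bayes' rule
post = {s: (1 - p_le[s]) * priors[s] / (1 - p_total) for s in priors}
assert max(post, key=post.get) == "A"  # source A is the most likely
assert abs(post["A"] - 0.44) < 5e-3
```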
Conditional CDF and PDF