PROBABILITY REVIEW
Notation, Basic Probability
• Sample spaces S with events Ai, probabilities P(Ai);
union A ∪ B, intersection AB, and complement A^c.
• Axioms: 0 ≤ P(A) ≤ 1; P(S) = 1;
for mutually exclusive Ai, P(∪i Ai) = Σi P(Ai).
• Conditional probability: P(A|B) = P(AB)/P(B);
P(A) = P(A|B)P(B) + P(A|B^c)P(B^c).
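As a quick numeric illustration, a minimal Python sketch of the total-probability formula; the probabilities below are made up for the example:

    # Total probability: P(A) = P(A|B)P(B) + P(A|B^c)P(B^c), with invented numbers.
    p_B = 0.3
    p_A_given_B, p_A_given_Bc = 0.9, 0.2
    p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)
    print(p_A)                                # 0.41
    # Conditional probability then recovers P(B|A) = P(AB)/P(A):
    print(p_A_given_B * p_B / p_A)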
• Random variables (RVs) X;
the cumulative distribution function (cdf)
F(x) = P{X ≤ x};
for a discrete RV, the probability mass function (pmf)
f(x) = P{X = x}, x = x1, x2, . . . ; F(x) = Σ_{xi ≤ x} f(xi);
for a continuous RV, the probability density function (pdf)
f(x), with P{X ∈ C} = ∫_C f(x)dx; F(x) = ∫_{−∞}^{x} f(t)dt.
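A minimal sketch of these definitions, assuming Python with numpy/scipy is available; the Poisson(3) and standard normal choices are arbitrary examples, not part of the review:

    # cdf from pmf (discrete) and from pdf (continuous), with illustrative distributions.
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    X = stats.poisson(mu=3.0)                        # discrete RV: pmf f(x) = P{X = x}
    print(X.cdf(4), X.pmf(np.arange(0, 5)).sum())    # F(4) equals the sum of f(xi) for xi <= 4

    Y = stats.norm(loc=0.0, scale=1.0)               # continuous RV: pdf f(y)
    area, _ = quad(Y.pdf, -np.inf, 1.5)              # F(1.5) equals the integral of the pdf up to 1.5
    print(area, Y.cdf(1.5))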
PROBABILITY REVIEW CONTINUED
Notation, Basic Probability Continued
• Generalizations for more than one variable, e.g.,
two RVs X and Y: joint cdf F(x, y) = P{X ≤ x, Y ≤ y};
pmf f(x, y) = P{X = x, Y = y}; or
pdf f(x, y), with P{(X, Y) ∈ A} = ∫∫_A f(x, y)dxdy;
independent X and Y iff f(x, y) = fX(x)fY(y).
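A small sketch of the independence factorization, assuming Python with numpy; the two pmfs are invented for illustration:

    # Joint pmf of independent discrete RVs factors as fX(x) * fY(y).
    import numpy as np

    fX = np.array([0.4, 0.6])                 # pmf of X on {0, 1}
    fY = np.array([0.1, 0.2, 0.7])            # pmf of Y on {0, 1, 2}
    joint = np.outer(fX, fY)                  # f(x, y) = fX(x) fY(y) under independence

    print(joint.sum())                        # joint pmf sums to 1
    print(joint.sum(axis=1), fX)              # marginal of X recovered from the joint pmf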
• Expected value or mean: for RV X, µ = E[X];
for discrete RVs
E[X] = Σi xi f(xi), or E[g(X)] = Σi g(xi) f(xi);
for continuous RVs
E[X] = ∫_{−∞}^{∞} x f(x)dx, or E[g(X)] = ∫_{−∞}^{∞} g(x) f(x)dx;
E[aX + b] = aE[X] + b = aµ + b.
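A short Python sketch of these formulas; the discrete pmf and the exponential density are arbitrary choices used only to evaluate the sums and integrals:

    # Expected values from the definitions (hand-picked pmf, exponential pdf).
    import numpy as np
    from scipy.integrate import quad

    xs = np.array([0.0, 1.0, 2.0])                   # support of a small discrete RV
    ps = np.array([0.2, 0.5, 0.3])                   # its pmf values f(xi)
    EX = np.sum(xs * ps)                             # E[X] = sum of xi f(xi)
    Eg = np.sum(xs**2 * ps)                          # E[g(X)] with g(x) = x^2
    print(EX, Eg)

    lam = 2.0                                        # continuous example: exponential(lam) pdf
    pdf = lambda x: lam * np.exp(-lam * x)
    EX_c, _ = quad(lambda x: x * pdf(x), 0, np.inf)  # E[X] = integral of x f(x) dx
    print(EX_c, 1 / lam)                             # matches the known mean 1/lam

    a, b = 3.0, -1.0
    print(a * EX + b)                                # E[aX + b] = a E[X] + b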
PROBABILITY REVIEW CONTINUED
• Variance: Var(X) = E[(X − µ)^2], with
Var(X) = E[X^2] − µ^2, Var(aX + b) = a^2 Var(X),
and standard deviation σ = √Var(X);
with RVs X, Y, covariance
Cov(X, Y) = E[(X − µX)(Y − µY)], and
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y);
independent RVs have Cov(X, Y) = 0;
the correlation
Corr(X, Y) = Cov(X, Y)/√(Var(X)Var(Y)).
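A simulation sketch of these identities, assuming Python with numpy; the normal samples and the constants a, b are arbitrary:

    # Checking the variance and covariance identities on simulated data.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(1.0, 2.0, size=200_000)           # RV X with mu = 1, sigma = 2
    Y = X + rng.normal(0.0, 1.0, size=200_000)       # Y correlated with X

    print(np.var(X), np.mean(X**2) - np.mean(X)**2)  # Var(X) = E[X^2] - mu^2
    a, b = 3.0, 5.0
    print(np.var(a * X + b), a**2 * np.var(X))       # Var(aX + b) = a^2 Var(X)

    cov = np.mean((X - X.mean()) * (Y - Y.mean()))   # Cov(X, Y) = E[(X - muX)(Y - muY)]
    print(np.var(X + Y), np.var(X) + np.var(Y) + 2 * cov)
    print(cov / np.sqrt(np.var(X) * np.var(Y)))      # Corr(X, Y)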
Chebyshev’s Inequality: for RV X with mean µ and standard deviation σ,
P{|X − µ| ≥ kσ} ≤ 1/k^2.
Weak Law of Large Numbers: if X1, X2, . . . is a sequence of
independent and identically distributed (iid) RVs
with mean µ, then for any ε > 0,
lim_{n→∞} P{|(X1 + X2 + · · · + Xn)/n − µ| > ε} = 0.
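A simulation sketch of both statements, assuming Python with numpy; exponential(1) samples are used only because their mean and standard deviation are both 1:

    # Chebyshev's bound and the weak law of large numbers by simulation.
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma = 1.0, 1.0                              # exponential(1): mean 1, standard deviation 1

    X = rng.exponential(scale=1.0, size=500_000)
    k = 2.0
    print(np.mean(np.abs(X - mu) >= k * sigma), 1 / k**2)   # P{|X - mu| >= k*sigma} vs bound 1/k^2

    eps = 0.05
    for n in (10, 100, 1000, 10_000):
        means = rng.exponential(1.0, size=(1000, n)).mean(axis=1)   # 1000 copies of (X1+...+Xn)/n
        print(n, np.mean(np.abs(means - mu) > eps))   # P{|sample mean - mu| > eps} shrinks toward 0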
PROBABILITY REVIEW CONTINUED
Some Discrete RVs
• Binomial RVs: n independent trials, each with success probability p.
If X is the number of successes,
P{X = i} = (n choose i) p^i (1 − p)^(n−i);
with E[X] = np, Var(X) = np(1 − p);
if n = 1, X is a Bernoulli RV.
• Poisson RVs: take values 0, 1, 2, . . . , with
P{X = i} = e^(−λ) λ^i / i!;
and E[X] = Var(X) = λ.
For small p, Poisson RVs approximate the number of
successes in a large number n of trials, with λ ≈ np.
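A sketch comparing the exact binomial pmf with its Poisson approximation, assuming Python with scipy; n = 60 and p = 0.05 are illustrative:

    # Binomial moments and the Poisson approximation for small p.
    import numpy as np
    from scipy import stats

    n, p = 60, 0.05
    B = stats.binom(n, p)
    print(B.mean(), n * p)                            # E[X] = np
    print(B.var(), n * p * (1 - p))                   # Var(X) = np(1 - p)

    P = stats.poisson(mu=n * p)                       # Poisson with lambda = np
    for i in range(6):
        print(i, B.pmf(i), P.pmf(i))                  # the two pmfs are close when p is small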
PROBABILITY REVIEW CONTINUED
• Geometric RVs: independent trials, each with success probability p.
If X is the number of the trial on which the first success occurs,
P{X = i} = (1 − p)^(i−1) p;
with E[X] = 1/p, Var(X) = (1 − p)/p^2.
• Negative Binomial RVs: independent trials, each with success
probability p.
If X is the number of trials needed for r successes,
P{X = n} = (n−1 choose r−1) (1 − p)^(n−r) p^r;
with E[X] = r/p, Var(X) = r(1 − p)/p^2.
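A sketch checking these means and variances, assuming Python with numpy/scipy; note that scipy's nbinom counts failures rather than trials, so the negative binomial pmf is evaluated directly from the formula above:

    # Geometric via scipy; negative binomial checked from its pmf formula.
    import numpy as np
    from math import comb
    from scipy import stats

    p, r = 0.3, 4

    G = stats.geom(p)                                 # number of trials until the first success
    print(G.mean(), 1 / p)                            # E[X] = 1/p
    print(G.var(), (1 - p) / p**2)                    # Var(X) = (1 - p)/p^2

    # Negative binomial: P{X = n} = C(n-1, r-1) (1-p)^(n-r) p^r for n = r, r+1, ...
    ns = np.arange(r, 200)
    pmf = np.array([comb(n - 1, r - 1) * (1 - p) ** (n - r) * p**r for n in ns])
    print(pmf.sum())                                  # ~ 1 (tail beyond 200 is negligible here)
    print(np.sum(ns * pmf), r / p)                    # E[X] = r/p
    print(np.sum(ns**2 * pmf) - np.sum(ns * pmf) ** 2, r * (1 - p) / p**2)   # Var(X)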
Some Continuous RVs
• Uniform RVs: RV X uniform on [a, b] has pdf
f(x) = 1/(b − a) if a ≤ x ≤ b, and f(x) = 0 otherwise,
and cdf F(x) = (x − a)/(b − a) for a ≤ x ≤ b; with E[X] = (b + a)/2,
E[X^2] = (a^2 + b^2 + ab)/3, so Var(X) = (b − a)^2/12.
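A sketch of the uniform moments by direct integration of the pdf, assuming Python with scipy; a and b are arbitrary:

    # Uniform(a, b) moments from the pdf by quadrature.
    from scipy.integrate import quad

    a, b = 2.0, 5.0
    pdf = lambda x: 1.0 / (b - a)                     # constant density on [a, b]

    EX, _ = quad(lambda x: x * pdf(x), a, b)
    EX2, _ = quad(lambda x: x**2 * pdf(x), a, b)
    print(EX, (a + b) / 2)                            # E[X] = (a + b)/2
    print(EX2, (a**2 + b**2 + a * b) / 3)             # E[X^2] = (a^2 + b^2 + ab)/3
    print(EX2 - EX**2, (b - a) ** 2 / 12)             # Var(X) = (b - a)^2/12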
PROBABILITY REVIEW CONTINUED
• Normal RVs: pdf f(x) = (1/(√(2π) σ)) e^(−(x−µ)^2/(2σ^2)), −∞ < x < ∞,
and cdf F(x) = (1/(√(2π) σ)) ∫_{−∞}^{x} e^(−(t−µ)^2/(2σ^2)) dt = Φ((x − µ)/σ);
with E[X] = µ, Var(X) = σ^2.
Standardized Z = (X − µ)/σ has pdf φ(x) = (1/√(2π)) e^(−x^2/2),
cdf Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^(−t^2/2) dt; E[Z] = 0, Var(Z) = 1.
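A sketch of the standardization identity F(x) = Φ((x − µ)/σ), assuming Python with scipy, whose norm.cdf plays the role of Φ; µ, σ, and x are arbitrary:

    # F(x) for Normal(mu, sigma) equals Phi((x - mu)/sigma).
    from scipy import stats

    mu, sigma = 10.0, 3.0
    X = stats.norm(loc=mu, scale=sigma)
    x = 12.5
    print(X.cdf(x), stats.norm.cdf((x - mu) / sigma))  # the two values agree

    Z = stats.norm()                                   # standardized Z = (X - mu)/sigma
    print(Z.mean(), Z.var())                           # 0 and 1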
Central Limit Theorem: if X1, X2, . . . is a sequence
of iid RVs with finite mean µ and finite variance σ^2, then
lim_{n→∞} P{(X1 + X2 + · · · + Xn − nµ)/(σ√n) < x} = Φ(x).
Note: this is often used in the form
P{|X̄ − µ| < x σ/√n} ≈ 2Φ(x) − 1 = 1 − α,
to compute a 100(1 − α)% confidence interval for µ, where X̄ = Σ_{i=1}^{n} Xi/n.
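A simulation sketch of this confidence-interval use of the CLT, assuming Python with numpy/scipy; exponential(1) data, n = 50, and the 95% level are illustrative choices:

    # Coverage of the CLT interval |Xbar - mu| < x * sigma / sqrt(n).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    mu, sigma, n = 1.0, 1.0, 50                       # exponential(1) data, sample size 50
    x = stats.norm.ppf(0.975)                         # solves 2*Phi(x) - 1 = 0.95

    samples = rng.exponential(1.0, size=(20_000, n))
    xbar = samples.mean(axis=1)
    covered = np.abs(xbar - mu) < x * sigma / np.sqrt(n)
    print(covered.mean())                             # close to 0.95, as the CLT predicts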
PROBABILITY REVIEW CONTINUED
Continuous RVs Continued
• Exponential RVs: pdf is f(x) = λe^(−λx), 0 < x < ∞,
cdf is F(x) = 1 − e^(−λx); with E[X] = 1/λ, Var(X) = 1/λ^2.
Exponential RVs are memoryless:
P{X > s + t | X > s} = P{X > t}, or
P{X > s + t} = P{X > s}P{X > t} = e^(−λs) e^(−λt).
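A simulation sketch of the memoryless property, assuming Python with numpy; λ, s, and t are arbitrary:

    # P{X > s + t | X > s} = P{X > t} for exponential X.
    import numpy as np

    rng = np.random.default_rng(3)
    lam, s, t = 0.5, 1.0, 2.0
    X = rng.exponential(scale=1 / lam, size=1_000_000)

    lhs = np.mean(X > s + t) / np.mean(X > s)          # conditional probability P{X > s+t | X > s}
    print(lhs, np.exp(-lam * t))                       # both approximate P{X > t} = e^(-lambda t)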
• Gamma RVs (with integer parameter n): pdf f(x) = λe^(−λx) (λx)^(n−1)/(n−1)!, 0 < x < ∞;
cdf F(x) = 1 − e^(−λx) Σ_{i=0}^{n−1} (λx)^i/i!;
with E[X] = n/λ, Var(X) = n/λ^2.
Poisson processes: N(t) is the number of events occurring in [0, t], with
N(0) = 0, events in disjoint intervals independent,
the distribution of N(s + t) − N(s) independent of s,
lim_{h→0} P{N(h) = 1}/h = λ, and lim_{h→0} P{N(h) ≥ 2}/h = 0.
These conditions imply N(t) is a Poisson RV with mean λt.
If Xi is the ith inter-arrival time, the Xi’s are iid exponential(λ) with
P{Σ_{i=1}^{n} Xi < t} = P{N(t) ≥ n} = Σ_{i=n}^{∞} e^(−λt) (λt)^i/i!.
Homogeneous processes have a rate λ independent of t;
nonhomogeneous processes have a rate λ(t) that depends on t.
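A simulation sketch tying these pieces together: the sum of n iid exponential inter-arrival times is gamma, and P{Σ Xi < t} = P{N(t) ≥ n}. Python with numpy/scipy is assumed; λ, n, and t are illustrative:

    # Sum of exponential inter-arrival times vs. the gamma cdf and the Poisson tail.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    lam, n, t = 2.0, 3, 1.5

    gaps = rng.exponential(scale=1 / lam, size=(100_000, n))   # iid inter-arrival times X1, ..., Xn
    Sn = gaps.sum(axis=1)                                      # arrival time of the nth event
    print(np.mean(Sn < t), stats.gamma(a=n, scale=1 / lam).cdf(t))   # empirical vs gamma cdf
    print(np.mean(Sn < t), stats.poisson(mu=lam * t).sf(n - 1))      # = P{N(t) >= n}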
PROBABILITY REVIEW CONTINUED
Conditional Expectation and Variance
E[X|Y = y] = Σ_x x P{X = x, Y = y}/P{Y = y}  (discrete)
= ∫ x f(x, y)dx / ∫ f(x, y)dx  (continuous);
conditional variance formula:
Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
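A simulation sketch of the conditional variance formula, assuming Python with numpy; the Poisson/normal hierarchy below is a hypothetical example chosen only so that E[X|Y] and Var(X|Y) are simple:

    # Var(X) = E[Var(X|Y)] + Var(E[X|Y]), with Y ~ Poisson(4) and X | Y = y ~ Normal(y, 1).
    import numpy as np

    rng = np.random.default_rng(5)
    Y = rng.poisson(4.0, size=500_000)
    X = rng.normal(loc=Y, scale=1.0)          # given Y = y, X has mean y and variance 1

    # Here E[X|Y] = Y and Var(X|Y) = 1, so the formula gives Var(X) = 1 + Var(Y).
    print(np.var(X), 1.0 + np.var(Y))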