1 Basic Probability laws
0 ≤ P(X = x) ≤ 1    (1)
where X is the random variable, and x is the value of the random variable.
∑_{i=1}^{n} P(X = x_i) = 1    (2)
where x_1, x_2, ···, x_n are mutually exclusive.
Mutually exclusive: one event precludes the occurrence of another event. Ex: in a single toss of a coin, H and T cannot both occur, so H and T are mutually exclusive events.
If x_1, x_2, ···, x_n are mutually exclusive, then
P(X = x_1 ∪ X = x_2 ∪ ··· ∪ X = x_n) = P(X = x_1) + P(X = x_2) + ··· + P(X = x_n)    (3)
The probability that an event in set A or in set B occurs is given by
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)    (4)
if A and B have something in common (not mutually exclusive). Otherwise it is
P(A ∪ B) = P(A) + P(B)    (5)
if A and B have nothing in common (mutually exclusive).
Independent events: If A and B are independent events, that is, the outcome of one does not affect the probability of the other, then
P(A ∩ B) = P(A) × P(B)    (6)
Ex: Tossing a coin twice and getting heads both times. The two tosses are independent trials, so the probability of getting two heads is 0.5 × 0.5 = 0.25.
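A minimal sketch (not from the original notes) checking the addition rule (4) and the multiplication rule (6) numerically; the die events A and B below are illustrative choices.

from fractions import Fraction

# Fair six-sided die; A = "even", B = "greater than 3" are illustrative event choices.
omega = set(range(1, 7))
A = {x for x in omega if x % 2 == 0}
B = {x for x in omega if x > 3}

def prob(event):
    # classical probability: favourable outcomes / total outcomes
    return Fraction(len(event), len(omega))

# Addition rule, equation (4): P(A or B) = P(A) + P(B) - P(A and B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Multiplication rule, equation (6): two independent fair coin tosses
p_two_heads = Fraction(1, 2) * Fraction(1, 2)
print(prob(A | B), p_two_heads)    # 2/3 1/4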
2 Random variable X
X takes on values x_i. The probability of x_i is P(X = x_i), also written P(x_i).
2.1 Discrete Random variable
pmf = probability mass function p(x_i) = P(X = x_i); cdf = cumulative distribution function F(x_i) = P(X ≤ x_i) = ∑_{j=0}^{i} p(x_j)
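A short sketch (not part of the original notes) building the pmf and cdf of a fair six-sided die; the die itself is an assumed running example.

from fractions import Fraction
from itertools import accumulate

# pmf of a fair six-sided die: p(x_i) = 1/6 for x_i in 1..6 (illustrative example)
values = list(range(1, 7))
pmf = {x: Fraction(1, 6) for x in values}

# cdf: F(x_i) = P(X <= x_i) = sum of p(x_j) for x_j <= x_i
cdf = dict(zip(values, accumulate(pmf[x] for x in values)))

print(cdf[3])             # 1/2
print(sum(pmf.values()))  # 1, matching equation (2)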
2.2 Continuous Random variable
pdf = probability density function f(x); cdf = cumulative distribution function F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt, so that P(a ≤ X ≤ b) = ∫_a^b f(x) dx
2.3 Expected Value of a Random variable
For a discrete random variable,
E(X) = ∑_{i=0}^{n} x_i p(x_i)    (7)
For a continuous random variable,
E(X) = ∫_{−∞}^{∞} x f(x) dx    (8)
2.4 Variance of a Random variable
Var(X) = E(X^2) − (E(X))^2    (9)
where, for the discrete and continuous cases respectively,
E(X^2) = ∑_{i=0}^{n} x_i^2 p(x_i)    (10)
E(X^2) = ∫_{−∞}^{∞} x^2 f(x) dx    (11)
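A minimal sketch (reusing the fair-die pmf assumed above) that evaluates equations (7), (9) and (10) directly.

from fractions import Fraction

# pmf of a fair six-sided die (illustrative assumption, as above)
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# Equation (7): E(X) = sum of x_i * p(x_i)
mean = sum(x * p for x, p in pmf.items())

# Equation (10): E(X^2) = sum of x_i^2 * p(x_i)
second_moment = sum(x**2 * p for x, p in pmf.items())

# Equation (9): Var(X) = E(X^2) - (E(X))^2
variance = second_moment - mean**2

print(mean, variance)   # 7/2 35/12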
3 Discrete Distributions
3.1 Binomial
x successes in n trials
P(X = x) = (nCx) p^x (1 − p)^{n−x}    (12)
where p is the probability of success (Ex. success could be defined as finding a defect in manufacturing).
The mean is np and the variance is np(1 − p). Please note: the distribution is for the number of
successes.
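As a quick numerical check (not from the original notes), the sketch below evaluates equation (12) with math.comb and verifies the mean np by direct summation; n = 10 and p = 0.2 are arbitrary illustrative values.

from math import comb

def binomial_pmf(x, n, p):
    # Equation (12): P(X = x) = (nCx) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.2                       # illustrative values
probs = [binomial_pmf(x, n, p) for x in range(n + 1)]

print(round(sum(probs), 10))                                       # 1.0
print(round(sum(x * q for x, q in enumerate(probs)), 10), n * p)   # 2.0 2.0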
3.2 Geometric
First success on the nth trial:
P(X = n) = p (1 − p)^{n−1}    (13)
Please note: the distribution is for the number of trials.
The mean is 1/p and the variance is (1 − p)/p^2.
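A small sketch (with p = 0.25 chosen arbitrarily) evaluating equation (13) and checking that the mean approaches 1/p when the series is truncated at a large n.

def geometric_pmf(n, p):
    # Equation (13): first success on trial n
    return p * (1 - p)**(n - 1)

p = 0.25                                   # illustrative success probability
mean = sum(n * geometric_pmf(n, p) for n in range(1, 2000))
print(round(mean, 6), 1 / p)               # 4.0 4.0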
3.3 Negative Binomial
xth success occurs on the nth trial:
P(X = n) = {(n − 1)C(x − 1)} p^x (1 − p)^{n−x}    (14)
This means there are x − 1 successes in the first n − 1 trials, followed by a success on trial n; there are n − x failures in total.
The mean is x/p and the variance is x(1 − p)/p^2.
The Negative Binomial is also a sum of x independent Geometric random variables. Please note: here the distribution is written for the number of trials n; some texts instead parameterize it by the number of failures, n − x.
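An illustrative check (the values x = 3 and p = 0.5 are assumptions) that equation (14) sums to 1 over n and that the mean matches x/p.

from math import comb

def neg_binomial_pmf(n, x, p):
    # Equation (14): xth success occurs on trial n (requires n >= x)
    return comb(n - 1, x - 1) * p**x * (1 - p)**(n - x)

x, p = 3, 0.5                                # illustrative values
trials = range(x, 200)
terms = [neg_binomial_pmf(n, x, p) for n in trials]
print(round(sum(terms), 10))                                        # 1.0
print(round(sum(n * t for n, t in zip(trials, terms)), 10), x / p)  # 6.0 6.0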
3.4 Hypergeometric
D defects among N items; x defects found in a sample of n items drawn from the N.
P(X = x) = {(N − D)C(n − x)} {(D)C(x)} / {(N)C(n)}    (15)
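A brief sketch of equation (15) using math.comb; the lot size N = 20, D = 4 defects, and sample size n = 5 are arbitrary illustrative numbers.

from math import comb

def hypergeometric_pmf(x, N, D, n):
    # Equation (15): x defects in a sample of n drawn from N items containing D defects
    return comb(N - D, n - x) * comb(D, x) / comb(N, n)

N, D, n = 20, 4, 5                           # illustrative values
probs = [hypergeometric_pmf(x, N, D, n) for x in range(min(D, n) + 1)]
print(round(sum(probs), 10))                 # 1.0
print(round(probs[0], 4))                    # probability of no defects in the sample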
3.5 Poisson
Applies when n is large and p is very small; λ = np. Ex: arrival processes.
P(X = x) = e^{−λ} λ^x / x!    (16)
mean = variance = λ
3.6 Poisson approximation to Binomial
When n is large and p is small, the Binomial(n, p) probabilities in equation (12) are closely approximated by a Poisson distribution with λ = np.
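A numerical sketch of the approximation (n = 1000 and p = 0.002 are illustrative assumptions), comparing equation (12) with equation (16) at λ = np.

from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    # Equation (12)
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    # Equation (16): P(X = x) = e^(-lam) lam^x / x!
    return exp(-lam) * lam**x / factorial(x)

n, p = 1000, 0.002                           # large n, small p (illustrative)
lam = n * p                                  # lambda = np = 2
for x in range(5):
    print(x, round(binomial_pmf(x, n, p), 5), round(poisson_pmf(x, lam), 5))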
4 Continuous Distributions
4.1 Exponential
f(x) = λ e^{−λx}    (17)
F(x) = ∫_0^x f(t) dt = 1 − e^{−λx}    (18)
mean = 1/λ, variance = 1/λ^2
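A quick check (λ = 0.5 is an illustrative choice) that the closed-form cdf in equation (18) agrees with a simple Riemann-sum integration of the pdf in equation (17).

from math import exp

lam, x_max = 0.5, 3.0                        # illustrative rate and upper limit

pdf = lambda x: lam * exp(-lam * x)          # equation (17)
cdf = lambda x: 1 - exp(-lam * x)            # equation (18)

# crude Riemann sum of the pdf from 0 to x_max
steps = 100_000
dx = x_max / steps
numeric = sum(pdf(i * dx) * dx for i in range(steps))

print(round(numeric, 4), round(cdf(x_max), 4))   # 0.7769 0.7769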
4.2 Normal
mean = µ, std. dev = σ, Normal = N(µ, σ), Standard Normal = Z(0,1)
Z = (X − µ)/σ    (19)
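A sketch of standardization via equation (19); µ = 100, σ = 15 and x = 120 are illustrative numbers, and the cdf is expressed through math.erf, which is one standard way to write the Normal cdf without external libraries.

from math import erf, sqrt

def standard_normal_cdf(z):
    # P(Z <= z) for Z ~ N(0, 1), expressed through the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma, x = 100, 15, 120                  # illustrative values
z = (x - mu) / sigma                         # equation (19)
print(round(z, 3), round(standard_normal_cdf(z), 4))   # 1.333 0.9088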
5 Conditional Probability
The probability of event A given that event B has already occurred is
P(A|B) = P(A and B) / P(B)    (20)
Similarly,
P(B|A) = P(A and B) / P(A)    (21)
Therefore, combining the above,
P(A|B) = P(B|A) P(A) / P(B)    (22)
By the law of total probability,
P(B) = P(B|A) P(A) + P(B|Ā) P(Ā)    (23)
where Ā is the complement of A, in other words (not in A) = Ā.
5.1 Bayes Theorem
Combining equations (22) and (23) yields Bayes' Theorem:
P(A|B) = P(B|A) P(A) / {P(B|A) P(A) + P(B|Ā) P(Ā)}    (24)
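A worked sketch of equation (24); the defect rate P(A) = 0.02 and the inspection accuracies are hypothetical numbers, not taken from the notes. Here A is "item is defective" and B is "inspection flags the item".

# Hypothetical inspection example for Bayes' Theorem, equation (24)
p_A = 0.02                 # P(A): prior probability an item is defective (assumed)
p_B_given_A = 0.95         # P(B|A): flagged given defective (assumed)
p_B_given_notA = 0.10      # P(B|Abar): false-alarm rate (assumed)

# Equation (23): total probability of being flagged
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Equation (24): probability the item is defective given it was flagged
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_B, 4), round(p_A_given_B, 4))   # 0.117 0.1624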