Conditional Probability

P(A/B) = P(A∩B) / P(B)

- Knowing that event B has occurred, S is reduced to B.

Figure 1: Venn diagram of events A, B, and A∩B inside the sample space S.
- Using relative frequency, with the experiment repeated n times:

(n_(A∩B)/n) / (n_B/n) → P(A∩B) / P(B) = P(A/B)
Note
- P(A∩B) = P(A/B)P(B) = P(B/A)P(A)
Similarly,
- P(A∩B∩C) = P(A/B∩C)P(B∩C)
           = P(A/B∩C)P(B/C)P(C)
Example:
In a binary communication system, suppose that P(A0) = 1-P, P(A1) = P and P(error) = ε. Find
P(Ai ∩ Bj), where i, j = 0, 1.
Sol.
P(error) = ε = P(B0/A1) = P(B1/A0)
P(A0∩B0) = P(B0/A0)P(A0) = (1-ε)(1-P)
P(A0∩B1) = P(B1/A0)P(A0) = ε(1-P)
P(A1∩B0) = P(B0/A1)P(A1) = εP
P(A1∩B1) = P(B1/A1)P(A1) = (1-ε)P
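As a quick numerical check, the four joint probabilities can be tabulated in Python. The values of P and ε are not given in the text, so the ones below are illustrative assumptions only.

```python
# Joint probabilities P(Ai ∩ Bj) for the binary channel.
# P (prior of sending 1) and eps (error probability) are assumed values.
P, eps = 0.4, 0.1

joint = {
    ("A0", "B0"): (1 - eps) * (1 - P),  # 0 sent, 0 received (correct)
    ("A0", "B1"): eps * (1 - P),        # 0 sent, flipped to 1
    ("A1", "B0"): eps * P,              # 1 sent, flipped to 0
    ("A1", "B1"): (1 - eps) * P,        # 1 sent, 1 received (correct)
}

total = sum(joint.values())  # the four joint probabilities cover all outcomes
```

Because the four events (Ai ∩ Bj) partition the sample space, `total` must equal 1.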
Total Probability theorem
Let S = B1 ∪ B2 ∪ …… ∪ Bn (mutually exclusive events).
B1, B2, ……….., Bn are called a partition of S.
Any event A = A∩S = A∩(B1∪B2∪B3…….∪Bn)
= (A∩B1) ∪ (A∩B2) ∪ (A∩B3) …… ∪ (A∩Bn)
P(A) = P(A/B1)P(B1) + ……………. + P(A/Bn)P(Bn)
This is useful when an experiment can be viewed as a sequence of two sub-experiments, as
shown in figure 2.
Figure 2: diagram of the partition B1, B2, B3, ……, Bn-1, Bn with event A overlapping the partition sets.
Example:
A factory produces a mix of good and bad chips. The lifetime of good chips follows an exponential
law with rate α, and that of bad chips an exponential law with rate 1000α.
Suppose a fraction (1-P) of chips are good. Find the probability that a randomly selected chip is still
functioning after time t.
Sol.
G = “chip is good”
B = “chip is bad”
C = “chip is functioning after time t”
P(C/G) = e^(-αt),  P(C/B) = e^(-1000αt)
Find P(C):
P(C) = P(C∩G) + P(C∩B)
     = P(C/G)P(G) + P(C/B)P(B)
     = e^(-αt)(1-P) + e^(-1000αt)P
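The total-probability computation can be sketched numerically. The values of α, t, and P are not specified in the text, so the ones below are assumptions for illustration.

```python
import math

# P(C) = P(C/G)P(G) + P(C/B)P(B), with assumed parameter values
alpha, t, P = 1.0, 0.01, 0.1          # illustrative only; not given in the text

p_good = math.exp(-alpha * t)         # P(C/G): good chip survives past t
p_bad = math.exp(-1000 * alpha * t)   # P(C/B): bad chip survives past t
p_c = p_good * (1 - P) + p_bad * P    # total probability P(C)
```

With these values the bad-chip term e^(-10) is negligible, so P(C) is dominated by the good chips.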
Bayes’ Rule
Let S = B1∪B2∪……………….∪Bn (partition).
Suppose that event A occurs; what is the probability of Bj?

P(Bj/A) = P(A∩Bj) / P(A) = P(A/Bj)P(Bj) / Σ_(k=1..n) P(A/Bk)P(Bk)
Bayes’ rule is useful for finding the “a posteriori” probability P(Bj/A) (after A is observed) in terms of the “a priori” probability P(Bj) (before the experiment is performed) and the occurrence of A.
Example:
In a binary communication system, find which input is more probable given that the receiver
output is 1. Assume P(A0) = P(A1) = 1/2 and P(error) = ε.
Sol.
P(A0/B1) = P(B1/A0)P(A0) / P(B1)
         = P(B1/A0)P(A0) / [P(B1/A0)P(A0) + P(B1/A1)P(A1)]
         = (ε/2) / [(ε/2) + ((1-ε)/2)]
         = ε
P(A1/B1) = P(B1/A1)P(A1) / P(B1)
         = ((1-ε)/2) / [(ε/2) + ((1-ε)/2)]
         = 1-ε
If ε < 1/2, then A1 is more probable.
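This posterior computation can be sketched as a short function, assuming the equal priors P(A0) = P(A1) = 1/2 from the example:

```python
def posteriors_given_B1(eps):
    """Return (P(A0/B1), P(A1/B1)) for equal priors P(A0) = P(A1) = 1/2."""
    num0 = eps * 0.5        # P(B1/A0) P(A0): 0 sent, flipped to 1
    num1 = (1 - eps) * 0.5  # P(B1/A1) P(A1): 1 sent, received correctly
    p_b1 = num0 + num1      # total probability P(B1)
    return num0 / p_b1, num1 / p_b1

p0, p1 = posteriors_given_B1(0.1)   # p0 ≈ eps, p1 ≈ 1 - eps
```

For any ε < 1/2, `p1 > p0`, matching the conclusion that A1 is more probable.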
Independence of events
Events A and B are independent if
P(A∩B) = P(A)P(B), or equivalently P(A/B) = P(A).
Example: Two numbers x and y are selected at random between 0 and 1. Let the events be
A = {x > 0.5}, B = {y < 0.5}, C = {x > y}.
Are A and B independent?
Are A and C independent?
Sol.
P(A) = 0.5 = P(B) = P(C)
P(A∩B) = 1/4
P(A∩C) = 3/8
P(A∩B) = 1/4 = P(A)P(B),
then A & B are independent.
P(A∩C) = 3/8 ≠ P(A)P(C) = 1/4,
then A & C are dependent.
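Both claims can be verified with a quick Monte Carlo sketch (the sample size and seed are arbitrary choices):

```python
import random

# Estimate P(A∩B) and P(A∩C) for A={x>0.5}, B={y<0.5}, C={x>y}
random.seed(0)
n = 200_000
count_a = count_ab = count_ac = 0
for _ in range(n):
    x, y = random.random(), random.random()
    in_a, in_b, in_c = x > 0.5, y < 0.5, x > y
    count_a += in_a
    count_ab += in_a and in_b
    count_ac += in_a and in_c

# Expect count_ab/n ≈ 1/4 = P(A)P(B), but count_ac/n ≈ 3/8 ≠ P(A)P(C)
```

The estimate for P(A∩B) lands near 1/4 (factorizes), while P(A∩C) lands near 3/8 (does not).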
Notes
(1) If P(A) > 0, P(B) > 0, and A and B are mutually exclusive (disjoint), then A and B cannot be
independent.
(2) If A and B are independent and mutually exclusive, then P(A) = 0 or P(B) = 0.
(3) A, B and C are (mutually) independent if:
P(A∩B) = P(A)P(B)
P(A∩C) = P(A)P(C)
P(B∩C) = P(B)P(C)    (these first three conditions alone give only pairwise independence)
P(A∩B∩C) = P(A)P(B)P(C)
(4) Independence is often assumed if events have no obvious physical relation.
Sequential experiments
- Many experiments can be decomposed into sequences of simpler sub-experiments, which
may or may not be independent.
Sequences of independent experiments: let A1, A2, ………, Ak be events associated with the
outcomes of k independent sub-experiments S1, S2, …….., Sk. Then
P(A1∩A2 …… ∩Ak) = P(A1)P(A2)……P(Ak)
Example:
Suppose 10 numbers are selected at random in [0, 1]. Find the probability that the first 5
numbers are < 1/4 and the last 5 are ≥ 1/2.
Sol.
Probability = (1/4)^5 (1/2)^5 = 3.05 × 10^-5
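Since the ten selections are independent, the probability factorizes into a one-line check:

```python
# First 5 draws each < 1/4 and last 5 draws each >= 1/2, all independent:
prob = (1 / 4) ** 5 * (1 / 2) ** 5   # = 2**-15 ≈ 3.05e-5
```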
The binomial law
A Bernoulli trial involves performing an experiment and noting whether event A occurs
(“success”) or not (“failure”).
Theorem: Let k be the number of successes in n independent Bernoulli trials; then the probability
of k successes is:
Pn(k) = C(n,k) p^k (1-p)^(n-k),  where k = 0, ………, n
Example:
Let k be the number of active speakers in a group of 8 independent speakers.
Suppose a speaker is active with probability p = 1/3.
Find the probability that the number of active speakers is greater than 6.
Sol.
Let k = number of active speakers.
P(k>6) = P(k=7) + P(k=8)
       = C(8,7)(1/3)^7(2/3)^1 + C(8,8)(1/3)^8(2/3)^0 = 0.00259
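The binomial tail above checks out numerically using the standard-library binomial coefficient:

```python
from math import comb

# P(k > 6) = P(k = 7) + P(k = 8) for n = 8 trials, p = 1/3
p = 1 / 3
prob = comb(8, 7) * p**7 * (1 - p) ** 1 + comb(8, 8) * p**8 * (1 - p) ** 0
# prob = 16/6561 + 1/6561 = 17/6561 ≈ 0.00259
```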
The multinomial probability law
Let the experiment be repeated n times, and let each outcome Bj occur kj times, for j = 1, ……, M
(so k1 + k2 + …… + kM = n). Then the probability of the vector (k1, k2, …………, kM) is
P(k1, k2, …………, kM) = [n! / (k1! k2! …….. kM!)] p1^k1 p2^k2 ………. pM^kM
Example:
Pick 10 phone numbers at random and note the last digit. What is the probability that you
obtain each of the digits 0 to 9 exactly once (in some order)?
Probability = [10! / (1! 1! …….. 1!)] (0.1)^10 = 3.6 × 10^-4
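This multinomial count is a one-line computation:

```python
from math import factorial

# 10 categories (digits 0-9), each occurring exactly once, p_j = 0.1 each:
# 10!/(1! ... 1!) * (0.1)^10
prob = factorial(10) * 0.1 ** 10   # ≈ 3.63e-4
```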
The Geometric probability law
- Here, the outcome is the number of independent Bernoulli trials until the occurrence of
the first success. S = {1, 2, …………}
- The probability P(m) that m trials are required means that the first m-1 trials result in
failures and the m’th trial is a success. Let Ai = “success in i’th trial”; then
P(m) = P(A1^c ∩ A2^c ∩ ……….. ∩ A(m-1)^c ∩ Am) = (1-p)^(m-1) p,  where m = 1, 2, ……….
- The probability that more than K trials are required before a success, writing q = 1-p:
P[{m>K}] = p Σ_(m=K+1..∞) q^(m-1) = p q^K Σ_(j=0..∞) q^j = p q^K · 1/(1-q) = q^K
Example:
Computer A sends a message to computer B. The probability of error is 0.1; if an error
occurs, B requests that A resend the message. What is the probability that a message is
transmitted more than twice?
Sol.
Each transmission is a Bernoulli trial with P(success) = 0.9 and P(failure) = 0.1.
P(m>2) = (1-p)^2 = (1-0.9)^2 = (0.1)^2 = 0.01
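The geometric tail formula P(m>K) = (1-p)^K gives the same answer directly:

```python
p_success = 0.9                   # message gets through on a given transmission
q = 1 - p_success                 # failure probability per transmission
prob_more_than_2 = q ** 2         # P(m > 2) = q**2 ≈ 0.01
```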