Probability 1

Random experiment: an experiment with more than one possible outcome. Outcomes vary in an unpredictable way when the experiment is repeated.

Sample space: denoted by S, the collection of all possible outcomes of the random experiment.

Events are subsets of the sample space; we usually denote them by A, B, C. Elementary events (outcomes) are single outcomes from the sample space, usually denoted by a, b, c.

Set operations (algebra on events) — [Venn diagram: two overlapping sets A and B]
Union of A and B: A ∪ B — A or B or both occur.
Intersection of A and B: A ∩ B — both A and B occur.
Complement: A′ (also written Aᶜ) — A does not occur.
Difference: A\B = A ∩ B′ — A occurs, B does not.

De Morgan's rules:
(A ∪ B)′ = A′ ∩ B′
(A ∩ B)′ = A′ ∪ B′
Operation order: intersection is done before union. Each rule chains with itself for more than two sets (e.g. (A ∪ B ∪ C)′ = A′ ∩ B′ ∩ C′), but the two rules cannot be mixed with each other.

Probability: P: {subsets of S} → [0, 1]

Axioms of probability:
1) P(A) ≥ 0 for each A
2) P(S) = 1
3) P(A ∪ B) = P(A) + P(B) if A and B are disjoint (mutually exclusive), i.e. A ∩ B = ∅ (the empty set)

Rules:
4) P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = P(A₁) + P(A₂) + … + P(Aₙ) if Aᵢ ∩ Aⱼ = ∅ for each pair i, j with i ≠ j
5) P(∅) = 0: since A ∪ ∅ = A and A ∩ ∅ = ∅, we get P(A) = P(A ∪ ∅) = P(A) + P(∅), therefore P(∅) = 0
6) P(A′) = 1 − P(A)
7) If A ⊆ B then P(B\A) = P(B ∩ A′) = P(B) − P(A)
8) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Example (p77, prob. 3.48). Know: P(A) = 0.35, P(B) = 0.73, P(A ∩ B) = 0.14.
a) A ∪ B = A ∪ (B\(A ∩ B)) and A ∩ (B\(A ∩ B)) = ∅, so
P(A ∪ B) = P(A ∪ (B\(A ∩ B))) = P(A) + P(B\(A ∩ B)) = P(A) + P(B) − P(A ∩ B) = 0.35 + 0.73 − 0.14 = 0.94
b) P(A ∩ B′) = P(A) − P(A ∩ B) = 0.35 − 0.14 = 0.21
c) P(A′ ∪ B′) = 1 − P((A′ ∪ B′)′) = 1 − P(A″ ∩ B″) = 1 − P(A ∩ B) = 1 − 0.14 = 0.86

Conditional probability: what are the chances of something happening, given that something else has happened?
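The rules above can be checked numerically. A minimal sketch (Python is my choice here, not part of the notes) using the numbers from example 3.48:

```python
# Numbers from example 3.48 (p77)
p_a, p_b, p_a_and_b = 0.35, 0.73, 0.14

# Rule 8 (inclusion-exclusion): P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_a_or_b = p_a + p_b - p_a_and_b

# Difference: P(A ∩ B') = P(A) - P(A ∩ B)
p_a_and_not_b = p_a - p_a_and_b

# De Morgan plus rule 6: P(A' ∪ B') = P((A ∩ B)') = 1 - P(A ∩ B)
p_not_a_or_not_b = 1 - p_a_and_b

print(p_a_or_b, p_a_and_not_b, p_not_a_or_not_b)
```

This reproduces the three answers 0.94, 0.21 and 0.86 from the worked example.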
P(A|B) means "the probability of A, given that B occurred".
Let n_A be the number of times A occurs in n experiments, n_B the number of times B occurs, and n_{A∩B} the number of times A and B both occur. Then
P(A|B) ≈ n_{A∩B}/n_B ≈ P(A ∩ B)/P(B).
Definition: P(A|B) = P(A ∩ B)/P(B), and P(A|B) = 0 if P(B) = 0.

Independence of events
Roughly: if P(A|B) = P(A), then A is independent of B. In that case
P(A|B) = P(A) = P(A|B′) and P(A′|B) = P(A′) = P(A′|B′).

Definition of independence: A and B are said to be independent if P(A ∩ B) = P(A)P(B).
Properties: P(A|B) = P(A) = P(A|B′) and P(A′|B) = P(A′) = P(A′|B′); similar identities hold for B with respect to prior knowledge of A. From P(A ∩ B) = P(A)P(B) (A and B independent) it follows that
P(A′ ∩ B) = P(A′)P(B), P(A ∩ B′) = P(A)P(B′), P(A′ ∩ B′) = P(A′)P(B′).

Example (3.62, p89): population of all individuals with a certain income. 58% invest in the money market, 25% invest in stocks, 19% invest in both. If you randomly pick a person who invests in the money market, what is the probability that they also invest in stocks? I.e., what is the probability that a person invests in stocks given that they invest in the money market?
Solution. Relevant events:
A: the person picked invests in the money market
B: the person picked invests in stocks
We know: P(A) = 0.58, P(B) = 0.25, P(A ∩ B) = 0.19.
We want: P(B|A) = P(A ∩ B)/P(A) = 0.19/0.58 ≈ 0.328.

Example: let A and B be mutually exclusive (disjoint), so A ∩ B = ∅ and hence P(A ∩ B) = 0, with P(A) > 0 and P(B) > 0. Question: are A and B independent? No, since P(A)P(B) ≠ 0 = P(A ∩ B).

Independence of more than two events. Three events A, B, C are independent if all of the following hold:
i) P(A ∩ B) = P(A)P(B)
ii) P(A ∩ C) = P(A)P(C)
iii) P(B ∩ C) = P(B)P(C)
iv) P(A ∩ B ∩ C) = P(A)P(B)P(C)
Properties (examples): P(A′ ∪ B | C) = P(A′ ∪ B) and P(C | A ∩ B′) = P(C). Similar definitions hold for more than three events.
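Example 3.62 and the independence test can be sketched in a few lines of Python (my illustration, not from the notes):

```python
# Example 3.62 (p89): A = invests in money market, B = invests in stocks
p_a, p_b, p_a_and_b = 0.58, 0.25, 0.19

# Conditional probability: P(B|A) = P(A ∩ B) / P(A)
p_b_given_a = p_a_and_b / p_a
print(round(p_b_given_a, 3))  # 0.328

# Independence check: A and B are independent iff P(A ∩ B) = P(A)P(B)
independent = abs(p_a_and_b - p_a * p_b) < 1e-12
print(independent)  # False: 0.19 differs from 0.58 * 0.25 = 0.145
```

So knowing the person invests in the money market raises the stock-investing probability from 0.25 to about 0.328, which is exactly why the two events are not independent.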
(More than three events won't be assessed.)

Total probability formula
Let A₁, A₂, …, Aₙ be a partition of the sample space S, i.e. Aᵢ ∩ Aⱼ = ∅ for each pair i, j with i ≠ j, and
S = ⋃_{i=1}^{n} Aᵢ = A₁ ∪ A₂ ∪ … ∪ Aₙ.
Let B be an event; it overlaps with some of the partition sets Aᵢ (it must, as they cover all of S):
P(B) = Σ_{i=1}^{n} P(B ∩ Aᵢ)
Remember: P(A|B) = P(A ∩ B)/P(B), so P(A ∩ B) = P(B)P(A|B) = P(A)P(B|A). Therefore
P(B) = Σ_{i=1}^{n} P(B ∩ Aᵢ) = Σ_{i=1}^{n} P(B|Aᵢ)P(Aᵢ)

Example (p90, Q3.73). Let D be the event of renting a car from agency D, E from agency E, F from agency F, and B the event that the rented car has bad tyres.
P(D) = 0.2, P(E) = 0.2, P(F) = 0.6; these are mutually exclusive and D ∪ E ∪ F = S, so they form a partition of S.
P(B|D) = 0.10, P(B|E) = 0.12, P(B|F) = 0.04 (e.g. 10% of cars from D have bad tyres).
Total probability formula:
P(B) = P(B|D)P(D) + P(B|E)P(E) + P(B|F)P(F) = 0.1×0.2 + 0.12×0.2 + 0.04×0.6 = 0.068,
the probability of getting bad tyres.

Q3.74: What is the probability that a rented car with bad tyres was rented from agency F?
P(F|B) = P(F ∩ B)/P(B) = P(B|F)P(F)/P(B) = (0.04 × 0.6)/0.068 ≈ 0.353

Bayes' formula
If A₁, A₂, …, Aₙ forms a partition of S, then
P(Aᵢ|B) = P(B|Aᵢ)P(Aᵢ) / Σ_{j=1}^{n} P(B|Aⱼ)P(Aⱼ)

Random variable (r.v.)
A random variable, say X, is a real function on a sample space: X: S → ℝ. Roughly, it maps outcomes onto real numbers.
Discrete r.v.: a finite (or countably infinite) number of possible values.
Continuous r.v.: the possible values cover an interval of ℝ.

Discrete r.v.s: p(x) = P(X = x), where x is a possible value of X. p(x) is called the probability mass function of X.
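The car-rental calculation (Q3.73/3.74) combines the total probability formula and Bayes' formula; a short Python sketch of the same arithmetic (my illustration, not from the notes):

```python
# Q3.73/3.74 (p90): partition {D, E, F} of rental agencies,
# B = the rented car has bad tyres
prior = {"D": 0.2, "E": 0.2, "F": 0.6}          # P(agency)
bad_given = {"D": 0.10, "E": 0.12, "F": 0.04}   # P(B | agency)

# Total probability: P(B) = sum_i P(B|A_i) P(A_i)
p_b = sum(bad_given[a] * prior[a] for a in prior)
print(round(p_b, 3))          # 0.068

# Bayes: P(F|B) = P(B|F) P(F) / P(B)
p_f_given_b = bad_given["F"] * prior["F"] / p_b
print(round(p_f_given_b, 3))  # 0.353
```

Note that although F supplies 60% of the cars, it accounts for only about 35% of the bad-tyre rentals, because its bad-tyre rate is the lowest.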
(The pmf can also be written p_X(x).)

Properties of the probability mass function:
1) p(x) ≥ 0 for all x
2) Σ over all x of p(x) = 1

Examples of important discrete r.v.s
1) Bernoulli r.v.: two possible outcomes. X has sample space S_X = {possible values of X} = {0, 1}.
p(1) = p, called the probability of "success"; p(0) = 1 − p, called the probability of "failure".
2) Indicator function (also a Bernoulli r.v.): divide the sample space into A (value 1) and A′ (value 0):
I_A(w) = 1 if w ∈ A, 0 if w ∉ A. Then P(I_A = 1) = P(A).

Binomial r.v.s
Let X₁, X₂, X₃, …, Xₙ be n independent Bernoulli r.v.s, each with probability of success p (Xᵢ ~ Bernoulli(p)). Let
Y = X₁ + X₂ + X₃ + … + Xₙ.
Y is the count of successes in n independent Bernoulli(p) trials. Definition: Y is called a binomial r.v., Y ~ Bin(n, p).

Find the pmf (probability mass function) of Y, p_Y(y) = P(Y = y):
p_Y(0) = P(all trials equal zero) = (1 − p)(1 − p)…(1 − p) = (1 − p)ⁿ
p_Y(1) = P(1 success and (n − 1) failures) = np(1 − p)ⁿ⁻¹
p_Y(k) = P(k successes and (n − k) failures) = (n choose k) pᵏ (1 − p)ⁿ⁻ᵏ, k = 0, 1, 2, 3, …, n

Example (p112, Q4.11): 13 individuals of one family attend a gathering; count how many have a cold. This is not modellable with a binomial, because the trials aren't independent: family members can give each other colds. A meeting of strangers who don't live near each other, etc., can be modelled with a binomial, because the individuals are independent.

A shop has 8 projectors, 2 of which are broken (but not marked). If we pick 2, what are the chances that each possible number of them is defective? Not modellable with a binomial, as picking without replacement changes the sample space after each pick.

Cumulative distribution function
Let Z be a r.v. with pmf p_Z(z):
F(z) = P(Z ≤ z), z ∈ ℝ
F(z) = Σ over all x ≤ z of p_Z(x)
(We will use tables.)

Example (Q4.26, p114): (for a very large sample it doesn't matter that sampling is without replacement) 2 out of 20 new buildings (10%) in a city violate the building code.
The building inspector checks 4 of them. Let Y = number of sampled buildings, out of 4, violating the building code; Y ~ Bin(4, 0.1).
a) What is the probability that none of the checked buildings violate the building code?
P(Y = 0) = (4 choose 0) 0.1⁰ (1 − 0.1)⁴ = 0.9⁴ ≈ 0.656
Similarly,
P(Y = 1) = (4 choose 1) 0.1¹ (1 − 0.1)³ = 4 × 0.1 × 0.9³ ≈ 0.292

Expected value of a discrete r.v.
(expected value = expectation = mean)
Let X be a discrete r.v. with pmf p_X(x). Definition:
E(X) = Σ over all x of x p_X(x)

Example:
x       p_X(x)
−1      0.2
 0      0.5
 2      0.3
E(X) = Σ x p_X(x) = (−1)(0.2) + 0(0.5) + 2(0.3) = 0.4

Examples:
X ~ Bernoulli(p): E(X) = 1·p + 0·(1 − p) = p
X ~ Bin(n, p): E(X) = Σ_{x=0}^{n} x (n choose x) pˣ (1 − p)ⁿ⁻ˣ = np

Long-run interpretation of E(X): the random experiment is repeated n times, and the values of X are recorded as x₁, x₂, …, xₙ. Then
(x₁ + x₂ + … + xₙ)/n → E(X) as n → ∞.

Functions of a discrete r.v.: let X be a discrete r.v. and Y = f(X); Y is therefore also a random variable. Find E(Y):
E(Y) = Σ over all y of y p_Y(y)
Proposition: E(Y) = Σ over all x of f(x) p_X(x)

Variance of a discrete r.v.
Var(X) = E[(X − E(X))²]
With g(x) = (x − E(X))², where E(X) is a constant (the expectation), we get
Var(X) = E[(X − E(X))²] = E(g(X)) = Σ over all x of g(x) p_X(x)

Example: X ~ Bernoulli(p):
Var(X) = Σ_{x=0}^{1} (x − E(X))² p_X(x) = (0 − p)²(1 − p) + (1 − p)²p = p(1 − p)

Expectation properties:
E(X₁ + X₂ + … + Xₙ) = E(X₁) + E(X₂) + … + E(Xₙ)
E(aX + b) = aE(X) + b
E[(X − E(X))²] = E(X²) − (E(X))²

Moments of r.v.s: E(Xᵏ) is called the k-th moment of X. E(X) is the first moment of X; E(X²) is the second moment of X.

Variance of a binomial:
Var(Y) = Σ_{x=0}^{n} (x − np)² (n choose x) pˣ (1 − p)ⁿ⁻ˣ = np(1 − p)

Poisson r.v.s
p_X(x) = P(X = x) = (λˣ/x!) e^(−λ), x ∈ S_X = {0, 1, 2, …}, where λ > 0; write X ~ Poi(λ).

Axiom 3 extended: if A₁, A₂, … are mutually exclusive, then
P(⋃_{i=1}^{∞} Aᵢ) = Σ_{i=1}^{∞} P(Aᵢ)

The Poisson pmf sums to 1:
Σ_{x=0}^{∞} (λˣ/x!) e^(−λ) = e^(−λ) Σ_{x=0}^{∞} λˣ/x! = e^(−λ) e^λ = 1

Maclaurin expansion: e^y = Σ_{k=0}^{∞} yᵏ/k!
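The binomial pmf, the Q4.26 probabilities, and the formulas E(Y) = np and Var(Y) = np(1 − p) can all be verified by direct summation over the pmf. A minimal Python sketch (my illustration, not part of the notes):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(Y = k) for Y ~ Bin(n, p): (n choose k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Q4.26 (p114): Y ~ Bin(4, 0.1), number of checked buildings violating the code
n, p = 4, 0.1
p0 = binom_pmf(0, n, p)   # 0.9^4 ≈ 0.656
p1 = binom_pmf(1, n, p)   # 4 * 0.1 * 0.9^3 ≈ 0.292
print(p0, p1)

# Check E(Y) = np and Var(Y) = np(1-p) directly from the pmf definitions
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))
var = sum((k - mean)**2 * binom_pmf(k, n, p) for k in range(n + 1))
print(mean, var)  # np = 0.4, np(1-p) = 0.36
```

Summing k·p_Y(k) and (k − np)²·p_Y(k) over k = 0, …, n is exactly the definition of expectation and variance used above; the closed forms np and np(1 − p) fall out of those sums.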
Properties: X ~ Poi(λ) has E(X) = λ:
E(X) = Σ_{x=0}^{∞} x p_X(x) = Σ_{x=0}^{∞} x (λˣ/x!) e^(−λ) = e^(−λ) Σ_{x=1}^{∞} λˣ/(x − 1)! = λ e^(−λ) Σ_{x=1}^{∞} λ^(x−1)/(x − 1)! = λ e^(−λ) e^λ = λ
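Both Poisson facts, that the pmf sums to 1 and that E(X) = λ, can be confirmed numerically by truncating the infinite sums at a cutoff where the tail is negligible. A sketch (λ = 2.5 is an arbitrary value picked for illustration):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poi(lam): lam^x e^(-lam) / x!"""
    return lam**x * exp(-lam) / factorial(x)

lam = 2.5  # arbitrary rate, chosen only for this demonstration

# Truncate the infinite sums at x = 100; for lam = 2.5 the tail beyond
# that is vanishingly small (the terms decay factorially).
total = sum(poisson_pmf(x, lam) for x in range(100))
mean = sum(x * poisson_pmf(x, lam) for x in range(100))
print(round(total, 6), round(mean, 6))  # 1.0 2.5
```

The first sum is the Maclaurin series of e^λ times e^(−λ), matching the normalization argument above; the second reproduces E(X) = λ from the derivation.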