Chapter 1- Set theory:
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
De Morgan's Laws:
(A ∪ B)′ = A′ ∩ B′
(A ∩ B)′ = A′ ∪ B′
P(B) = P(B ∩ A) + P(B ∩ A′)
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(B ∩ C) - P(C ∩ A) + P(A ∩ B ∩ C)
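These identities can be sanity-checked on small finite sets; the sets below are arbitrary examples, and inclusion-exclusion is stated here for counts of finite sets:

```python
# Arbitrary example sets; U is the universe used for complements.
A = {1, 2, 3, 4}
B = {3, 4, 5}
C = {4, 5, 6}
U = A | B | C | {7, 8}

def comp(S):
    """Complement S' relative to the universe U."""
    return U - S

# Distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# De Morgan's laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

# Inclusion-exclusion for counts
assert len(A | B) == len(A) + len(B) - len(A & B)
assert len(A | B | C) == (len(A) + len(B) + len(C)
                          - len(A & B) - len(B & C) - len(C & A)
                          + len(A & B & C))
```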
Independence:
P(A ∩ B) = P(A)P(B)
Chapter 2- Counting Techniques:
Permutations:
Order matters!!
P(n, k) = number of ordered samples of size k from n, without replacement.
P(5, 3) = 5 · 4 · 3 = 60
P(10, 2) = 10 · 9 = 90
Combinations:
Order doesnβt matter!!
C(n, k) = number of unordered samples of size k from n, using sampling without replacement.
C(5, 3) = (5 · 4 · 3)/(3 · 2 · 1) = 10
C(10, 2) = (10 · 9)/(2 · 1) = 45
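Python's math module has both counts built in; a quick check of the examples above:

```python
import math

# Ordered samples without replacement: P(n, k)
assert math.perm(5, 3) == 5 * 4 * 3 == 60
assert math.perm(10, 2) == 10 * 9 == 90

# Unordered samples without replacement: C(n, k)
assert math.comb(5, 3) == (5 * 4 * 3) // (3 * 2 * 1) == 10
assert math.comb(10, 2) == (10 * 9) // (2 * 1) == 45

# Relationship: each combination corresponds to k! orderings
assert math.perm(5, 3) == math.comb(5, 3) * math.factorial(3)
```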
Samples With Replacement:
N objects, sample size k
Number of ordered samples = N^k, so each particular sample has probability 1/N^k.
Distinguishable permutations:
How many distinct words can you make from the letters of "Mississippi"? 11!/(4! · 4! · 2!) = 34,650
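The Mississippi count, checked directly (11 letters: four i's, four s's, two p's, one m):

```python
import math

# Distinguishable permutations: 11! divided by the factorial of each letter's count
words = math.factorial(11) // (math.factorial(4) * math.factorial(4) * math.factorial(2))
assert words == 34650
```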
Chapter 3- Conditional Probability:
P(A|B) = 1 - P(A′|B)
P(A|B) = P(A ∩ B)/P(B)
P(A ∩ B) = P(A|B)P(B)
Bayesβ Theorem:
P(A|B) = P(A)P(B|A)/P(B)
P(B) = P(A)P(B|A) + P(A′)P(B|A′)
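A worked Bayes computation with hypothetical numbers (the 0.01/0.95/0.05 values below are made up for illustration):

```python
# Hypothetical numbers: event A (e.g. a condition), event B (e.g. a positive test)
p_A = 0.01             # P(A): prior
p_B_given_A = 0.95     # P(B|A)
p_B_given_notA = 0.05  # P(B|A')

# Law of total probability: P(B) = P(A)P(B|A) + P(A')P(B|A')
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_notA

# Bayes' Theorem: P(A|B) = P(A)P(B|A) / P(B)
p_A_given_B = p_A * p_B_given_A / p_B

assert abs(p_B - 0.059) < 1e-9
assert abs(p_A_given_B - 0.16101694915254237) < 1e-9
```

Note how a 95%-accurate test still gives only about a 16% posterior when the prior is 1%.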
Independence:
P(A|B) = P(A)
P(A ∩ B ∩ C) = P(A)P(B|A)P(C|A ∩ B)
Chapter 4- Random Variables:
Continuous random variables:
The probability of any single value is zero, since there are an infinite number of values.
Probability Density Function (pdf):
For a continuous X, f(x) is a density, not a probability: P(X = x) = 0, and probabilities come from integrating f. (For a discrete X, f(x) = P(X = x).)
P(a ≤ X ≤ b) = ∫_a^b f(x) dx = F(b) - F(a)
∫_{-∞}^{∞} f(x) dx = 1
Cumulative Distribution Function (cdf):
F(x) = P(X ≤ x)
P(X ≤ x) = ∫_{-∞}^{x} f(t) dt
P(X ≤ x) = P(X < x) (continuous case)
F(-∞) = 0
F(∞) = 1
Converting between pdf and cdf:
F(x) = ∫_{-∞}^{x} f(t) dt
f(x) = F′(x)
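A numerical check of the pdf/cdf relationship, using an exponential with θ = 2 as a stand-in (an arbitrary choice):

```python
import math

theta = 2.0
F = lambda x: 1 - math.exp(-x / theta)      # cdf
f = lambda x: math.exp(-x / theta) / theta  # pdf

# f(x) = F'(x): central difference approximates the derivative
x, h = 1.3, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)
assert abs(deriv - f(x)) < 1e-6

# F(b) - F(a) = integral of f from a to b (midpoint rule)
a, b, n = 0.5, 2.5, 10000
step = (b - a) / n
integral = sum(f(a + (i + 0.5) * step) for i in range(n)) * step
assert abs(integral - (F(b) - F(a))) < 1e-6
```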
Expectation and Variance:
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)
Var(aX - bY) = a²Var(X) + b²Var(Y) - 2ab Cov(X, Y)
σ(aX + b) = |a|σ(X)
E(aX + b) = aE(X) + b
Approximations of discrete random variables:
When using an integral to approximate a discrete case (which can only take integer values), you
need to change the limits.
P(a ≤ X ≤ b) ≈ P(a - 0.5 < Y < b + 0.5) = ∫_{a-0.5}^{b+0.5} f(y) dy
Chebyshevβs Inequality:
X is a random variable with mean μ and variance σ².
P(|X - μ| ≥ kσ) ≤ 1/k²
The probability that X is not within kσ of the mean is at most 1/k².
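A quick simulation check of the bound, using Uniform(0, 1) as an arbitrary example distribution:

```python
import math
import random

random.seed(0)

# Uniform(0, 1): mu = 0.5, sigma = sqrt(1/12)
mu, sigma = 0.5, math.sqrt(1 / 12)
k = 2

samples = [random.random() for _ in range(100000)]
freq = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)

# Chebyshev guarantees freq <= 1/k^2 = 0.25. For Uniform(0,1) with k = 2
# the event is actually impossible (2*sigma > 0.5), so freq is exactly 0 -
# the bound is valid but can be very loose.
assert freq <= 1 / k**2
assert freq == 0.0
```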
Percentiles:
P(X ≤ x_p) = F(x_p) = p
The 35th percentile x_0.35 satisfies:
P(X ≤ x_0.35) = F(x_0.35) = 0.35. Solve for x_0.35.
Moment generating function:
M_X(t) = E(e^(tX))
M_X(0) = 1
M_X′(0) = E(X)
M_X″(0) = E(X²)
Y = a + bX
M_Y(t) = e^(at) M_X(bt)
M_{X+Y}(t) = M_X(t) · M_Y(t) (for independent X and Y)
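The derivative properties can be checked numerically. Here the Bernoulli MGF M(t) = pe^t + q (from the Chapter 5 table) is differentiated by finite differences; p = 0.3 is an arbitrary choice:

```python
import math

# MGF of a Bernoulli(p): M(t) = p*e^t + q
p, q = 0.3, 0.7
M = lambda t: p * math.exp(t) + q

# M(0) = 1
assert abs(M(0) - 1) < 1e-12

# M'(0) = E(X) = p, via central difference
h = 1e-5
assert abs((M(h) - M(-h)) / (2 * h) - p) < 1e-6

# M''(0) = E(X^2) = p for a Bernoulli, via second central difference
h = 1e-4
assert abs((M(h) - 2 * M(0) + M(-h)) / h**2 - p) < 1e-4
```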
Mode:
The point where f(x) reaches its maximum. Set f′(x) = 0 (an interior maximum occurs where the derivative equals 0) and solve for x.
Median:
-If given the cdf, set F(x) = 0.5 and solve for x.
-If given the pdf, set ∫ (from the lower limit of the support to x) f(t) dt = 0.5 and solve for x.
-For split distributions, draw graphs or substitute in limits to decide where 0.5 falls.
Standard deviations:
βWhat percent of claims fall within one standard deviation of the mean?β
-Find E(X) and σ.
-Make the range [E(X) - σ, E(X) + σ].
-Add up the percentage of claims in that range.
Integrals:
∫_{-2}^{4} (|x|/10) dx = ∫_0^4 (x/10) dx - ∫_{-2}^{0} (x/10) dx
∫ x^(-1) dx = ln|x| + C
Chapter 5- Discrete Distributions:
Binomial (x successes in n trials):
Density: f(x) = C(n, x) p^x q^(n-x), x = 0, 1, 2, …, n
MGF: M_X(t) = (pe^t + q)^n
E(X) = np, Var(X) = npq

Negative Binomial (x failures before the k-th success):
Density: f(x) = C(x + k - 1, x) p^k q^x
MGF: M_X(t) = ((1 - qe^t)/p)^(-k)
E(X) = kq/p, Var(X) = kq/p²

Geometric (x failures before the first success):
Density: f(x) = q^x p
MGF: M_X(t) = ((1 - qe^t)/p)^(-1)
E(X) = q/p, Var(X) = q/p²

Hypergeometric (sample of n taken from a total of m = m1 + m2):
Density: f(x) = C(m1, x) C(m2, n - x)/C(m, n)
MGF: not used
E(X) = nm1/m, Var(X) = n(m1/m)(m2/m)((m - n)/(m - 1))

Poisson (λ is the rate of a rare event):
Density: f(x) = e^(-λ) λ^x / x!
MGF: M_X(t) = e^(λ(e^t - 1))
E(X) = λ, Var(X) = λ

Uniform (discrete on 1, 2, …, N):
Density: f(x) = 1/N
MGF: M_X(t) = e^t (e^(Nt) - 1)/(N(e^t - 1))
E(X) = (N + 1)/2, Var(X) = (N² - 1)/12

Bernoulli:
Density: f(x) = p^x q^(1-x), x = 0, 1
MGF: M_X(t) = pe^t + q
E(X) = p, Var(X) = pq

Poisson Approximation to the Binomial:
Density: f(x) = e^(-np) (np)^x / x!
good: n ≥ 20, p ≤ 0.05; great: n ≥ 100, np ≤ 10
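The binomial row can be checked against first principles; n = 10 and p = 0.3 are arbitrary choices:

```python
import math

n, p = 10, 0.3
q = 1 - p
pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

# The pmf sums to 1
assert abs(sum(pmf) - 1) < 1e-12

# E(X) = np and Var(X) = npq
mean = sum(x * pmf[x] for x in range(n + 1))
var = sum(x**2 * pmf[x] for x in range(n + 1)) - mean**2
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * q) < 1e-9

# MGF: (p*e^t + q)^n equals E(e^{tX}) computed from the pmf
t = 0.7
lhs = (p * math.exp(t) + q) ** n
rhs = sum(math.exp(t * x) * pmf[x] for x in range(n + 1))
assert abs(lhs - rhs) < 1e-9
```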
Chapter 6- Continuous Distributions:
Uniform (a, b):
Density: f(x) = 1/(b - a), a ≤ x ≤ b
CDF: F(x) = (x - a)/(b - a)
MGF: M_X(t) = (e^(bt) - e^(at))/(t(b - a))
E(X) = (a + b)/2, Var(X) = (b - a)²/12

Beta (a, b):
Density: f(x) = [(a + b - 1)!/((a - 1)!(b - 1)!)] x^(a-1) (1 - x)^(b-1), 0 ≤ x ≤ 1
CDF: no set formula
MGF: not worth it
E(X) = a/(a + b), Var(X) = ab/((a + b)²(a + b + 1))

Weibull (τ, θ):
Density: f(x) = (τ/θ)(x/θ)^(τ-1) e^(-(x/θ)^τ)
CDF: F(x) = 1 - e^(-(x/θ)^τ)
MGF: not worth it
E(X) = θ · (1/τ)!, Var(X): not worth it

Pareto (α, θ):
Density: f(x) = αθ^α/(x + θ)^(α+1)
CDF: F(x) = 1 - (θ/(x + θ))^α
MGF: not worth it
E(X) = θ/(α - 1), E(X^k) = θ^k k!/((α - 1)(α - 2)⋯(α - k)), Var(X): use E(X²) - E(X)²

Exponential (θ):
Density: f(x) = (1/θ) e^(-x/θ)
CDF: F(x) = 1 - e^(-x/θ)
MGF: M_X(t) = 1/(1 - θt)
E(X) = θ, E(X^k) = θ^k k!, Var(X) = θ²

Gamma (α, θ):
Density: f(x) = x^(α-1) e^(-x/θ)/(θ^α (α - 1)!)
MGF: M_X(t) = 1/(1 - θt)^α
E(X) = αθ, Var(X) = αθ²

Normal (μ, σ²):
Density: f(x) = (1/(σ√(2π))) e^(-(x - μ)²/(2σ²))
CDF: must be calculated with a table of values
MGF: M_X(t) = e^(μt + σ²t²/2)
E(X) = μ, Var(X) = σ²

Lognormal (μ, σ²) (parameters are not equal to the mean and variance):
CDF: must be calculated with a table of values (via ln X)
MGF: N/A
E(X) = e^(μ + σ²/2), Var(X) = e^(2μ + 2σ²) - e^(2μ + σ²)
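The exponential row can be verified by numerical integration; θ = 100 is an arbitrary choice, and the integrals are truncated at 20θ (tail on the order of e^-20):

```python
import math

theta = 100.0
f = lambda x: math.exp(-x / theta) / theta  # Exponential(theta) density

def integrate(g, a, b, n=100000):
    """Midpoint-rule integral of g over [a, b]."""
    step = (b - a) / n
    return sum(g(a + (i + 0.5) * step) for i in range(n)) * step

total = integrate(f, 0, 20 * theta)
mean = integrate(lambda x: x * f(x), 0, 20 * theta)
second = integrate(lambda x: x * x * f(x), 0, 20 * theta)

assert abs(total - 1) < 1e-4               # density integrates to 1
assert abs(mean - theta) < 1e-2            # E(X) = theta
assert abs(second - mean**2 - theta**2) < 1.0  # Var(X) = theta^2
```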
Chapter 7- Normal Distribution:
The normal distribution:
Has a complicated pdf and a cdf with no closed form. Use the table of values to look up the cdf for certain values.
The Standard Normal Distribution has μ = 0 and σ = 1.
X ~ N(μ, σ²)
Evaluating the standard normal:
P(Z ≤ x) = Φ(x)
P(Z < x) = 1 - P(Z > x)
P(Z > x) = P(Z ≤ -x) = Φ(-x) = 1 - Φ(x)
Φ(x) = 1 - Φ(-x)
P(Z ≤ -x) = Φ(-x) = 1 - Φ(x)
P(Z > -x) = P(Z < x) = Φ(x)
*Values in the table are for the STANDARD normal distribution with μ = 0 and σ = 1.*
What if μ = 20 and σ = 15?
X ~ N(20, 225)
P(X < x) = P(Z < (x - μ)/σ) = Φ((x - μ)/σ)
P(X < 12) = P(Z < (12 - 20)/15) = P(Z < -0.53) = 1 - Φ(0.53)
P(X > 5) = 1 - P(X < 5) = 1 - P(Z < (5 - 20)/15) = 1 - P(Z < -1) = 1 - (1 - Φ(1)) = Φ(1)
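In code, Φ can be computed from the error function, which lets us check the worked example (math.erf is in the standard library):

```python
import math

# Standard normal cdf via the error function
Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Symmetry: Phi(-x) = 1 - Phi(x)
assert abs(Phi(-0.53) - (1 - Phi(0.53))) < 1e-12

# The worked example: X ~ N(20, 15^2)
mu, sigma = 20, 15
p12 = Phi((12 - mu) / sigma)      # P(X < 12) = Phi(-8/15)
assert abs(p12 - (1 - Phi(8 / 15))) < 1e-12

p5 = 1 - Phi((5 - mu) / sigma)    # P(X > 5) = Phi(1)
assert abs(p5 - Phi(1)) < 1e-12
assert abs(Phi(1) - 0.8413) < 1e-3  # matches the table value
```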
Adding independent distributions:
X1 ~ N(30, 100), X2 ~ N(40, 150)
S = X1 + X2
S ~ N(70, 250)
X ~ N(μ, σ²) and Y = aX + b
Y ~ N(aμ + b, a²σ²)
Subtraction of distributions:
X1 ~ N(10, 20)
X2 ~ N(30, 10)
S = X1 - X2
S ~ N(μ1 - μ2, σ1² + σ2²) = N(-20, 30)
The Central Limit Theorem:
X1, X2, …, Xn are independent and identically distributed with mean μ and variance σ². When n is large (n ≥ 30), the sum X1 + X2 + ⋯ + Xn is approximately N(nμ, nσ²).
So even if X1, X2, …, Xn aren't normally distributed, we can use the normal distribution on the sum.
Sample mean:
X̄ = (1/n)(X1 + X2 + ⋯ + Xn)
X̄ ~ N(μ, σ²/n)
For n = 2: X̄ = (1/2)(X1 + X2), so X̄ ~ N(μ, (σ1² + σ2²)/2²)
Using normal distribution to estimate a discrete distribution:
Continuous can take any value, discrete can take only integers.
P(a < X < b) ≈ P(a - 0.5 < Y < b + 0.5)
If Y follows a normal distribution Y ~ N(μ, σ²), then:
P(a < X < b) ≈ Φ((b + 0.5 - μ)/σ) - Φ((a - 0.5 - μ)/σ)
Binomial:
Y ~ N(np, npq)
Poisson:
Y ~ N(λ, λ)
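A check of the continuity correction on a binomial; n = 40, p = 0.5, and the interval [18, 22] are arbitrary choices:

```python
import math

Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))

# X ~ Binomial(n = 40, p = 0.5); approximate P(18 <= X <= 22)
n, p = 40, 0.5
q = 1 - p
mu, sigma = n * p, math.sqrt(n * p * q)

# Exact probability from the binomial pmf
exact = sum(math.comb(n, k) * p**k * q**(n - k) for k in range(18, 23))

# Normal approximation with continuity correction:
# P(17.5 < Y < 22.5) for Y ~ N(np, npq)
approx = Phi((22.5 - mu) / sigma) - Phi((17.5 - mu) / sigma)

assert abs(exact - approx) < 0.01
```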
Lognormal Distribution:
-Parameters μ and σ are not the mean and standard deviation.
-If X follows a lognormal with parameters μ and σ², and Y = ln X, then Y follows a normal distribution with mean μ and standard deviation σ.
-If X follows a normal distribution with mean μ and variance σ², and Y = e^X, then Y follows a lognormal distribution with parameters μ and σ².
Chapter 8- Multivariate Distribution:
Joint pdf:
f_{X,Y}(x, y)
Find P(X = x) by adding row X = x in the table.
P(X = x) = f_X(x)
P(Y = y) = f_Y(y)
f_X(x | Y = y) = f_{X,Y}(x, y)/f_Y(y)
f(x, y) = f(y|x) f(x)
Independence:
f_{X,Y}(x, y) = f_X(x) f_Y(y) = product of the marginal probability functions
E(XY) = E(X)E(Y)
Joint continuous pdf:
f_{X,Y}(x, y) ≥ 0
Double integral:
Total probability (the volume under the surface) must equal 1:
∫_{-∞}^{∞} ∫_{-∞}^{∞} f_{X,Y}(x, y) dx dy = 1
Marginal Continuous probability functions:
f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dy
f_Y(y) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dx
E(X) = Σ x f_X(x) = ∫_{-∞}^{∞} x f_X(x) dx
E(X + Y) = E(X) + E(Y)
Moment generating function:
M_{X,Y}(s, t) = E(e^(sX + tY))
E(X | Y = y) = Σ x f_X(x | y)
Var(X | Y) = E(X² | Y) - E(X | Y)²
Covariance:
Cov(X, Y) = E(XY) - E(X)E(Y)
Var(X) = Cov(X, X)
Cov(X, Y) = Cov(Y, X)
Cov(aX + bY, cZ + dW) = ac Cov(X, Z) + ad Cov(X, W) + bc Cov(Y, Z) + bd Cov(Y, W)
Independence:
Cov(X, Y) = 0
Correlation Coefficient:
ρ = Cov(X, Y)/(σ_X σ_Y)
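These formulas can be exercised on a small joint table (the pmf below is made up so that X and Y are dependent):

```python
import math

# Made-up joint pmf for (X, Y) on {0,1} x {0,1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

# Cov(X,Y) = E(XY) - E(X)E(Y)
cov = E_XY - E_X * E_Y

var_X = sum(x**2 * p for (x, y), p in joint.items()) - E_X**2
var_Y = sum(y**2 * p for (x, y), p in joint.items()) - E_Y**2

# Correlation coefficient
rho = cov / (math.sqrt(var_X) * math.sqrt(var_Y))

assert abs(cov - 0.15) < 1e-9   # nonzero, so X and Y are not independent
assert abs(rho - 0.6) < 1e-9
```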
Chapter 9- Transformations of Random Variables:
Where X is a random variable, Y = g(X) is a function of X.
Method of Transformations:
f(y) = f_X(g⁻¹(y)) · |[g⁻¹(y)]′|
1. f_X(x) is usually given, as well as Y = g(X):
f_X(x) = (1/100) e^(-x/100), Y = 1.1X
2. Find g⁻¹(y) (aka solve for x in the second equation):
y = 1.1x, so x = y/1.1
3. Use the equation:
f(y) = f_X(g⁻¹(y)) · |[g⁻¹(y)]′| = (1/100) e^(-(y/1.1)/100) · (1/1.1) = (1/110) e^(-y/110)
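The worked example can be verified at a few sample points (the y values below are arbitrary):

```python
import math

# X is exponential with mean 100; Y = 1.1*X
f_X = lambda x: math.exp(-x / 100) / 100
f_Y = lambda y: math.exp(-y / 110) / 110   # density claimed by the method

# Method of transformations: f_Y(y) = f_X(y/1.1) * |d(y/1.1)/dy|
for y in (10.0, 55.0, 200.0, 431.0):
    assert math.isclose(f_X(y / 1.1) * (1 / 1.1), f_Y(y), rel_tol=1e-12)
```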
Method of Distribution Functions:
Use F_Y(y) = P(Y ≤ y) and F_X′(x) = f_X(x)
1. f_X(x) and Y = g(X) are given.
F_Y(y) = P(Y ≤ y)
Substitute Y: P(Y ≤ y) = P(1.1X ≤ y)
Solve for X: P(X ≤ y/1.1)
This means: F_X(y/1.1)
2. Use the cdf F_X(x) = 1 - e^(-x/100):
F_Y(y) = 1 - e^(-(y/1.1)/100) = 1 - e^(-y/110)
Notes on SOA 127 Packet:
Variance of the average of two variables:
Var((X1 + X2)/2) = (1/2²) Var(X1 + X2)
Variance of the average of n variables:
Var(Sn/n) = Var(Sn)/n² = n Var(X)/n² = Var(X)/n
In general, for x = t(y):
f(y) = f(t(y)) · |dt/dy| = f(t(y)) · |t′(y)|
Integration by parts:
∫_a^b u dv = uv |_a^b - ∫_a^b v du

Example 1:
f(x) = x e^x
∫_a^b x e^x dx = ∫ u dv, with u = x, dv = e^x dx, which implies du = dx, v = e^x
= uv |_a^b - ∫_a^b v du = x e^x |_a^b - ∫_a^b e^x dx

Example 2:
f(x) = x e^(-x/θ)
∫_a^b x e^(-x/θ) dx = ∫ u dv, with u = x, dv = e^(-x/θ) dx, which implies du = dx, v = -θ e^(-x/θ)
= uv |_a^b - ∫_a^b v du = -θx e^(-x/θ) |_a^b + ∫_a^b θ e^(-x/θ) dx

Example 3:
f(x) = (x/θ) e^(-x/θ)
∫_a^b (x/θ) e^(-x/θ) dx = ∫ u dv, with u = x, dv = (1/θ) e^(-x/θ) dx, which implies du = dx, v = -e^(-x/θ)
= uv |_a^b - ∫_a^b v du = -x e^(-x/θ) |_a^b + ∫_a^b e^(-x/θ) dx
Deductibles:
An insurance policy has a deductible of d. This means that if x is less than d, the payout is 0, and
if x is more than d, the payout is x-d.
Payout Y = 0 if x < d; Y = x - d if x ≥ d
E(Y) = ∫_d^∞ (x - d) f(x) dx
E(Y²) = ∫_d^∞ (x - d)² f(x) dx
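As a sketch, take a made-up exponential claim severity (mean θ) with deductible d; for that distribution the payout integral has the closed form E(Y) = θ e^(-d/θ), which a numerical integral should reproduce:

```python
import math

# Made-up example: exponential severity with mean theta, deductible d
theta, d = 100.0, 50.0
f = lambda x: math.exp(-x / theta) / theta

# E(Y) = integral from d to infinity of (x - d) f(x) dx,
# truncated at 30*theta (tail is on the order of e^-30)
a, b, n = d, 30 * theta, 100000
step = (b - a) / n
E_Y = sum((a + (i + 0.5) * step - d) * f(a + (i + 0.5) * step)
          for i in range(n)) * step

# Closed form for the exponential case
assert abs(E_Y - theta * math.exp(-d / theta)) < 1e-3
```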