Chapter 3 Discrete Random Variables
3.1 Definition
A random variable (X) assigns numbers to outcomes in the
sample space of an experiment.
Example
• Randomly choose an integer X between 0 and 10.
  The range of X is SX = {0, 1, 2, ..., 10}
• From Monday to Friday, record each day whether you watch (w) or do not watch (n) TV.
  A random variable related to this experiment is N, the total number of days you watch TV.
  The range of N is SN = {0, 1, 2, 3, 4, 5}
Random Variable
A random variable consists of an experiment with a probability
measure P[.] defined on a sample space S and a function that assigns a
real number to each outcome in the sample space of the experiment.
Discrete Random Variable
X is a discrete random variable if the range of X is a countable set
SX = {x1, x2, …, xn}
Example
The experiment is to observe the grade of ECE352. The sample
space is S = {A, B, B, A, C, B, B, A}. We use a function G(.) to map
this sample space into a random variable. For example, G(A) = 4.0,
G(B) = 3.0, G(C) = 2.0.
Outcomes:  A    B    B    A    C    B    B    A
G:         4.0  3.0  3.0  4.0  2.0  3.0  3.0  4.0
3.2 Probability Mass Function
Definition
The probability mass function (PMF) of the discrete random
variable X is
PX(x) = P[X = x]
Note that X = x is an event consisting of all outcomes of the
underlying experiment for which X(s) = x.
Example
My family orders a pizza for dinner. Andrew eats 1/6, Daniel eats 1/6, my wife eats 1/4, and I eat what is left over (5/12). What is the probability mass function?
A random variable X assigns a number to each family member, and the probability of each value is the fraction of the pizza that person (or group) ate:

family member:  me    wife   Andrew   Daniel
X:              100   100    9        8

PX(100) = 5/12 + 1/4 = 2/3
PX(9) = 1/6
PX(8) = 1/6

         ⎧ 2/3   x = 100
PX(x) =  ⎨ 1/6   x = 9
         ⎨ 1/6   x = 8
         ⎩ 0     otherwise
Theorem 3.1
For a discrete random variable X with PMF PX(x) and range SX:
(a) For any x, PX(x) ≥ 0. (everyone eats some amount or nothing, but no one adds pizza)
(b) ∑_{x∈SX} PX(x) = 1 (the family eats the whole pizza)
(c) For any event B ⊂ SX, the probability that X is in the set B is
    P[B] = ∑_{x∈B} PX(x) (the adults (100) eat 1/4 + 5/12)
Note: the whole family is SX = {8, 9, 100}; the adults (my wife and me) are B = {100}.
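Theorem 3.1 can be checked numerically on the pizza PMF. A minimal sketch in Python (the dictionary representation and variable names are ours, not from the text; Fraction keeps the arithmetic exact):

```python
from fractions import Fraction as F

# PMF of the pizza example: X = 100 (adults), 9 (Andrew), 8 (Daniel)
pmf = {100: F(2, 3), 9: F(1, 6), 8: F(1, 6)}

# (a) every probability is nonnegative
assert all(p >= 0 for p in pmf.values())

# (b) the probabilities sum to 1 (the family eats the whole pizza)
assert sum(pmf.values()) == 1

# (c) P[B] for the event B = {100}: an adult ate it, 1/4 + 5/12 = 2/3
B = {100}
P_B = sum(pmf[x] for x in B)
assert P_B == F(1, 4) + F(5, 12)
```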
3.3 Families of Discrete Random Variables
Bernoulli (p) Random Variable
X is a Bernoulli (p) random variable if the PMF of X has the form

         ⎧ 1 − p   x = 0
PX(x) =  ⎨ p       x = 1
         ⎩ 0       otherwise
Example 3.6
• Consider a coin and let it land on a table. Observe whether the side facing up is heads or tails. Let X be the number of heads observed.
• Select a student at random and find out her telephone number. Let X = 0 if the last digit is even; otherwise, X = 1.
• Observe one bit transmitted by a modem that is downloading a file from the internet. Let X be the value of the bit (0 or 1).
All three experiments lead to the probability mass function

         ⎧ 1/2   x = 0
PX(x) =  ⎨ 1/2   x = 1
         ⎩ 0     otherwise
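The three cases of the Bernoulli definition translate directly to code. A quick sketch (the helper name bernoulli_pmf is ours):

```python
def bernoulli_pmf(x, p):
    """PMF of a Bernoulli (p) random variable."""
    if x == 0:
        return 1 - p
    if x == 1:
        return p
    return 0.0  # otherwise

# The fair-coin case of Example 3.6: p = 1/2
assert bernoulli_pmf(0, 0.5) == 0.5
assert bernoulli_pmf(1, 0.5) == 0.5
assert bernoulli_pmf(7, 0.5) == 0.0
```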
Geometric (p) Random Variable
X is a geometric (p) random variable if the PMF of X has the form

         ⎧ p(1 − p)^(x−1)   x = 1, 2, ...
PX(x) =  ⎨
         ⎩ 0                otherwise

where the parameter p is in the range 0 < p < 1.
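Since p(1 − p)^(x−1) is a geometric series, the probabilities over x = 1, 2, ... sum to 1. The sketch below (our own helper, with an illustrative p = 0.2) checks this on a long truncation of the support:

```python
def geometric_pmf(x, p):
    """PMF of a geometric (p) random variable: first success on trial x."""
    if x >= 1:
        return p * (1 - p) ** (x - 1)
    return 0.0

# The probabilities sum to 1; the tail beyond x = 500 is negligible here
p = 0.2
total = sum(geometric_pmf(x, p) for x in range(1, 500))
assert abs(total - 1.0) < 1e-9
```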
Example 3.9
In a sequence of independent tests of integrated circuits, each circuit is rejected with probability p. Let X equal the number of tests up to and including the first test that results in a reject. What is the PMF of X?

(Tree diagram: each test is a reject (r) with probability p or an accept (a) with probability 1 − p; X = 1 if the first test rejects, X = 2 if the first accepts and the second rejects, X = 3 if the first two accept and the third rejects, and so on.)

         ⎧ p(1 − p)^(x−1)   x = 1, 2, ...
PX(x) =  ⎨
         ⎩ 0                otherwise
Binomial (n, p) Random Variable
X is a binomial (n, p) random variable if the PMF of X has the form

PX(x) = C(n, x) p^x (1 − p)^(n−x)

where C(n, x) = n!/(x!(n − x)!) is the binomial coefficient, the parameter p is in the range 0 < p < 1, and n is an integer such that n ≥ 1.
Example 3.11
In a sequence of independent tests of integrated circuits, each circuit is rejected with probability p. Let K equal the number of rejects in the n tests. Find the PMF PK(k).

PK(k) = C(n, k) p^k (1 − p)^(n−k)
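The binomial coefficient is available as math.comb in Python, so this PMF can be sketched directly (n = 10 tests and p = 0.1 are illustrative values, not from the text):

```python
from math import comb

def binomial_pmf(k, n, p):
    """PMF of a binomial (n, p) random variable."""
    if 0 <= k <= n:
        return comb(n, k) * p**k * (1 - p) ** (n - k)
    return 0.0

# Illustrative values: n = 10 tests, reject probability p = 0.1
n, p = 10, 0.1
# The PMF sums to 1 over k = 0, ..., n
assert abs(sum(binomial_pmf(k, n, p) for k in range(n + 1)) - 1.0) < 1e-12
# Probability of no rejects at all is (1 - p)^n
assert binomial_pmf(0, n, p) == (1 - p) ** n
```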
Discrete Uniform (k, l) Random Variable
X is a discrete uniform (k, l) random variable if the PMF of X has the form

         ⎧ 1/(l − k + 1)   x = k, k + 1, k + 2, ..., l
PX(x) =  ⎨
         ⎩ 0               otherwise

where the parameters k and l are integers such that k < l.
Example 3.16
Roll a fair die. The random variable N is the number of spots on the side facing up. Therefore, N is a discrete uniform (1, 6) random variable with PMF

         ⎧ 1/6   n = 1, 2, 3, 4, 5, 6
PN(n) =  ⎨
         ⎩ 0     otherwise
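The die gives a quick sanity check for the discrete uniform family; the helper below is our own sketch (x is assumed to be an integer):

```python
def uniform_pmf(x, k, l):
    """PMF of a discrete uniform (k, l) random variable (x an integer)."""
    return 1 / (l - k + 1) if k <= x <= l else 0.0

# Example 3.16: a fair die is discrete uniform (1, 6)
assert uniform_pmf(3, 1, 6) == 1 / 6
assert uniform_pmf(7, 1, 6) == 0.0
assert abs(sum(uniform_pmf(n, 1, 6) for n in range(1, 7)) - 1.0) < 1e-12
```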
3.4 Cumulative Distribution Function (CDF)
Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF) of random variable X is

FX(x) = P[X ≤ x]

• For any real number x, the CDF is the probability that the random variable X is no larger than x.
• All random variables have cumulative distribution functions, but only discrete random variables have probability mass functions.
• The CDF can be viewed as the "accumulation" (summation) of the probability mass function:

FY(n) = ∑_{j=1}^{n} PY(j)
Theorem 3.2
For any discrete random variable X with range SX = {x1, x2, ...} satisfying x1 ≤ x2 ≤ ...:
(a) FX(−∞) = 0 and FX(∞) = 1
(b) For all x′ ≥ x, FX(x′) ≥ FX(x)
(c) For xi ∈ SX and ε, an arbitrarily small positive number,
    FX(xi) − FX(xi − ε) = PX(xi)
(d) FX(x) = FX(xi) for all x such that xi ≤ x < xi+1
Theorem 3.3
For all b ≥ a,

FX(b) − FX(a) = P[a < X ≤ b]
Example 3.21

         ⎧ 1/4   x = 0
PX(x) =  ⎨ 1/2   x = 1
         ⎨ 1/4   x = 2
         ⎩ 0     otherwise

                     ⎧ 0     x < 0
FX(x) = P[X ≤ x] =   ⎨ 1/4   0 ≤ x < 1
                     ⎨ 3/4   1 ≤ x < 2
                     ⎩ 1     x ≥ 2
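Because the CDF accumulates the PMF, FX(x) for this example can be computed by summing PX over all sample values no larger than x. A minimal sketch (the dictionary representation is ours):

```python
def cdf_from_pmf(pmf, x):
    """F_X(x) = P[X <= x], accumulated from a discrete PMF given as a dict."""
    return sum(p for xi, p in pmf.items() if xi <= x)

# PMF of Example 3.21
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
assert cdf_from_pmf(pmf, -1) == 0.0    # x < 0
assert cdf_from_pmf(pmf, 0.5) == 0.25  # 0 <= x < 1
assert cdf_from_pmf(pmf, 1) == 0.75    # 1 <= x < 2
assert cdf_from_pmf(pmf, 5) == 1.0     # x >= 2

# Theorem 3.3: F_X(2) - F_X(0) = P[0 < X <= 2]
assert cdf_from_pmf(pmf, 2) - cdf_from_pmf(pmf, 0) == 0.75
```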
3.5 Averages and Expected Value
Median
A median, xmed, of random variable X is a number that satisfies

P[X ≥ xmed] ≥ 1/2 and P[X ≤ xmed] ≥ 1/2

Mode
A mode, xmod, of random variable X is a number that satisfies

PX(xmod) ≥ PX(x) for all x

Expected Value
The expected value of random variable X is

E[X] = μX = ∑_{x∈SX} x PX(x)
Example 3.23
There are 11 numbers: S = {4, 9, 5, 10, 8, 4, 7, 5, 5, 8, 7}.
Sorted: 4, 4, 5, 5, 5, 7, 7, 8, 8, 9, 10
Median: 7 (the middle of the sorted list)
Mean: (4 + 4 + 5 + 5 + 5 + 7 + 7 + 8 + 8 + 9 + 10)/11 = 72/11
Mode: 5 (the most popular value)
Expected value: E[X] = 4(2/11) + 5(3/11) + 7(2/11) + 8(2/11) + 9(1/11) + 10(1/11) = 72/11
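The three averages in this example can be reproduced with Python's standard statistics module:

```python
from statistics import mean, median, mode

S = [4, 9, 5, 10, 8, 4, 7, 5, 5, 8, 7]
print(median(S))  # 7: the middle of the 11 sorted values
print(mode(S))    # 5: the most popular value (appears 3 times)
print(mean(S))    # 72/11 ~ 6.545..., equal to the expected value
```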
Theorem 3.4
The Bernoulli (p) random variable X has expected value E[X] = p.
Theorem 3.5
The geometric (p) random variable X has expected value E[X] = 1/p.
Theorem 3.7
The binomial (n, p) random variable X has expected value E[X] = np.
The Pascal (k, p) random variable X has expected value E[X] = k/p.
The discrete uniform (k, l) random variable X has expected value E[X] = (k + l)/2.
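These formulas can be verified numerically from the PMFs themselves. The sketch below (the helper name expected_value and the parameter choices are ours) checks the Bernoulli, discrete uniform, and geometric cases, truncating the geometric sum at a large cutoff:

```python
def expected_value(pmf):
    """E[X] = sum over x of x * P_X(x), for a PMF given as a dict."""
    return sum(x * p for x, p in pmf.items())

# Bernoulli (p): E[X] = p  (Theorem 3.4)
p = 0.3
assert abs(expected_value({0: 1 - p, 1: p}) - p) < 1e-12

# Discrete uniform (k, l): E[X] = (k + l)/2  (here the fair die: 3.5)
k, l = 1, 6
die = {x: 1 / (l - k + 1) for x in range(k, l + 1)}
assert abs(expected_value(die) - (k + l) / 2) < 1e-12

# Geometric (p): E[X] = 1/p  (Theorem 3.5), on a long truncation
geo = {x: p * (1 - p) ** (x - 1) for x in range(1, 2000)}
assert abs(expected_value(geo) - 1 / p) < 1e-6
```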
3.6 Functions of a Random Variable
Each sample value y of a derived random variable Y is a mathematical function g(x) of a sample value x of another random variable X. We adopt the notation Y = g(X) to describe the relationship of the two random variables.
Theorem 3.9
For a discrete random variable X, the PMF of Y = g(X) is

PY(y) = ∑_{x: g(x)=y} PX(x)
Example 3.26
A parcel shipping company offers a charging plan: $1.00 for the first pound, $0.90 for the second pound, etc., down to $0.60 for the fifth pound, with rounding up for a fraction of a pound. For all packages between 6 and 10 pounds, the shipper charges $5.00 per package. (It will not accept shipments over 10 pounds.) Find a function Y = g(X) for the charge in cents for sending one package.

             ⎧ 105X − 5X²   X = 1, 2, 3, 4, 5
Y = g(X) =   ⎨
             ⎩ 500          X = 6, 7, 8, 9, 10

Suppose all packages weigh 1, 2, 3, or 4 pounds with equal probability. Find the PMF and expected value of Y, the shipping charge for a package.

         ⎧ 1/4   x = 1, 2, 3, 4
PX(x) =  ⎨
         ⎩ 0     otherwise

         ⎧ 1/4   y = 100, 190, 270, 340
PY(y) =  ⎨
         ⎩ 0     otherwise

E[Y] = (1/4)(100 + 190 + 270 + 340) = 225
Example 3.26 (continued)
Suppose the probability model for the weight in pounds X of a package is

         ⎧ 0.15   x = 1, 2, 3, 4
PX(x) =  ⎨ 0.1    x = 5, 6, 7, 8
         ⎩ 0      otherwise

For the pricing plan, what is the PMF and expected value of Y, the cost of shipping a package?
For three values of X (6, 7, 8), Y = 500, so PY(500) = PX(6) + PX(7) + PX(8) = 0.30.

         ⎧ 0.15   y = 100, 190, 270, 340
PY(y) =  ⎨ 0.10   y = 400
         ⎨ 0.30   y = 500
         ⎩ 0      otherwise

E[Y] = 0.15(100 + 190 + 270 + 340) + 0.1(400) + 0.3(500) = 325
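Theorem 3.9 is exactly the step used for y = 500 above: group the x values by g(x) and add their probabilities. A sketch for this pricing model (variable names are ours):

```python
from collections import defaultdict

def g(x):
    """Shipping charge in cents from Example 3.26."""
    return 105 * x - 5 * x**2 if x <= 5 else 500

# Weight PMF of the second model
px = {1: 0.15, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.1, 6: 0.1, 7: 0.1, 8: 0.1}

# Theorem 3.9: P_Y(y) = sum of P_X(x) over all x with g(x) = y
py = defaultdict(float)
for x, p in px.items():
    py[g(x)] += p

# y = 500 collects x = 6, 7, 8; y = 400 comes from x = 5 alone
assert abs(py[500] - 0.30) < 1e-12
assert abs(py[400] - 0.10) < 1e-12

# Expected value of the derived random variable
ey = sum(y * p for y, p in py.items())
assert abs(ey - 325) < 1e-9
```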
3.7 Expected Value of a Derived Random Variable
Theorem 3.10
Given a random variable X with PMF PX(x) and the derived random variable Y = g(X), the expected value of Y is

E[Y] = μY = ∑_{x∈SX} g(x) PX(x)
Example 3.29

         ⎧ 1/4   x = 1, 2, 3, 4
PX(x) =  ⎨
         ⎩ 0     otherwise

and

             ⎧ 105X − 5X²   X = 1, 2, 3, 4, 5
Y = g(X) =   ⎨
             ⎩ 500          X = 6, 7, 8, 9, 10

What is E[Y]?

E[Y] = ∑_{x=1}^{4} g(x) PX(x) = (1/4)(100 + 190 + 270 + 340) = 225
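Theorem 3.10 says the PMF of Y is not needed to get E[Y]: summing g(x)PX(x) over the range of X suffices. A sketch of this example (variable names are ours):

```python
def g(x):
    """Shipping charge in cents (Example 3.29)."""
    return 105 * x - 5 * x**2 if x <= 5 else 500

px = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}

# Theorem 3.10: E[Y] = sum of g(x) * P_X(x) -- no need to derive P_Y first
ey = sum(g(x) * p for x, p in px.items())
assert abs(ey - 225) < 1e-12  # (1/4)(100 + 190 + 270 + 340) = 225
```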
Theorem 3.11
For any random variable X,

E[X − μX] = 0

Theorem 3.12
For any random variable X,

E[aX + b] = aE[X] + b
3.8 Variance and Standard Deviation
The variance Var[X] measures the dispersion of sample values of X around the expected value E[X]. When we view E[X] as an estimate of X, Var[X] is the mean square error.
Variance

Var[X] = E[(X − μX)²]

Standard Deviation

σX = √Var[X]
Theorem 3.13
In the absence of observations, the minimum mean square error estimate of random variable X is

x̂ = E[X]

Theorem 3.14

Var[X] = E[X²] − μX² = E[X²] − (E[X])²
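Both sides of Theorem 3.14 can be checked against the definition on a concrete PMF. The sketch below uses the fair die of Example 3.16 (the helper name expected is ours; Var = 35/12 for a fair die follows from E[X²] = 91/6 and μ = 7/2):

```python
def expected(pmf, f=lambda x: x):
    """E[f(X)] for a PMF given as a dict."""
    return sum(f(x) * p for x, p in pmf.items())

# Fair-die PMF
pmf = {n: 1 / 6 for n in range(1, 7)}
mu = expected(pmf)

# Definition: Var[X] = E[(X - mu)^2]
var_def = expected(pmf, lambda x: (x - mu) ** 2)

# Theorem 3.14: Var[X] = E[X^2] - (E[X])^2
var_thm = expected(pmf, lambda x: x**2) - mu**2

assert abs(var_def - var_thm) < 1e-12
assert abs(var_def - 35 / 12) < 1e-12  # fair die: Var[X] = 35/12
```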
Moments
For random variable X:
(a) The nth moment is E[X^n]
(b) The nth central moment is E[(X − μX)^n]