Probabilistic Graphical Models
Inference Overview: Conditional Probability Queries
Daphne Koller
Inference in a PGM
• PGM encodes P(X1,…,Xn)
• Can be used to answer any query over the joint distribution
Conditional Probability Queries
• Evidence: E = e
• Query: a subset of variables Y
• Task: compute P(Y | E=e)
• Applications
– Medical/fault diagnosis
– Pedigree analysis
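The query task above can be sketched by brute-force enumeration over a tiny two-variable network (not from the lecture; variable names and CPD values are invented for illustration):

```python
# A toy network X1 -> X2 with binary variables; all numbers are hypothetical.
# Joint via the chain rule: P(X1, X2) = P(X1) * P(X2 | X1).
p_x1 = {0: 0.6, 1: 0.4}
p_x2_given_x1 = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}

def joint(x1, x2):
    return p_x1[x1] * p_x2_given_x1[(x1, x2)]

def query(evidence_x2):
    """Compute P(X1 | X2 = evidence_x2): sum the joint consistent with the
    evidence, then renormalize so the result is a distribution over X1."""
    unnorm = {x1: joint(x1, evidence_x2) for x1 in (0, 1)}
    z = sum(unnorm.values())  # this is P(X2 = evidence_x2)
    return {x1: p / z for x1, p in unnorm.items()}
```

Enumeration like this is exponential in the number of variables, which is why the algorithms later in the lecture push the summation inside the factor product.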
NP-Hardness
The following problems are all NP-hard:
• Given a PGM P, a variable X, and a value x ∈ Val(X), compute P(X=x)
  – Or even decide whether P(X=x) > 0
• Let ε < 0.5. Given a PGM P, a variable X, a value x ∈ Val(X), and an
  observation e ∈ Val(E), find a number p such that |P(X=x | E=e) − p| < ε
Sum-Product
[Figure: a network over variables C, I, D, G, S, L, H, J, illustrating the
sum-product computation over its factors]
Sum-Product
[Figure: a smaller network over variables A, B, C, D, illustrating the same
sum-product computation]
Evidence: Reduced Factors
[Figure: the A, B, C, D network again, with the factors reduced by the
observed evidence]
Evidence: Reduced Factors
[Figure: the C, I, D, G, S, L, H, J network with evidence I=i, H=h;
P(J, I=i, H=h) is computed by summing the product of the reduced factors
over the remaining variables]
Sum-Product
• Compute P(Y, E=e) as a sum over the product of the reduced factors
• Renormalize by the constant P(E=e) to obtain P(Y | E=e)
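The reduce-then-renormalize step can be sketched on a hypothetical two-factor chain A–B–C (smaller than the networks on the slides; factor values are invented):

```python
# Factors of a chain A - B - C as dicts from assignments to values.
# All numbers are hypothetical; variables are binary.
phi_ab = {(0, 0): 0.5, (0, 1): 0.8, (1, 0): 0.1, (1, 1): 0.3}
phi_bc = {(0, 0): 0.2, (0, 1): 0.6, (1, 0): 0.9, (1, 1): 0.4}

def query_a_given_c(c_obs):
    """P(A | C = c_obs): reduce phi_bc to the observed value of C,
    sum out B from the factor product, then renormalize over A."""
    unnorm = {}
    for a in (0, 1):
        # Reduction: only entries of phi_bc consistent with C=c_obs survive.
        unnorm[a] = sum(phi_ab[(a, b)] * phi_bc[(b, c_obs)] for b in (0, 1))
    z = sum(unnorm.values())  # proportional to P(C = c_obs)
    return {a: v / z for a, v in unnorm.items()}
```

Note that the evidence never requires new machinery: reducing each factor to the observed values and running the same sum-product computation is enough, which is the point of the "Evidence: Reduced Factors" slides.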
Algorithms: Conditional Probability
• Push summations into factor product
– Variable elimination
• Message passing over a graph
– Belief propagation
– Variational approximations
• Random sampling of instantiations
– Markov chain Monte Carlo (MCMC)
– Importance sampling
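The sampling route can be sketched with rejection sampling, the simplest member of that family (simpler than the MCMC and importance-sampling methods listed above); the network and numbers are the same hypothetical toy model, so the exact answer is P(X1=1 | X2=1) = 0.48 / 0.60 = 0.8:

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

p_x1 = 0.6                        # P(X1 = 1), hypothetical
p_x2_given_x1 = {0: 0.3, 1: 0.8}  # P(X2 = 1 | X1), hypothetical

def estimate(n=200_000):
    """Estimate P(X1=1 | X2=1) by forward sampling the network and
    rejecting every sample that disagrees with the evidence X2=1."""
    kept = hits = 0
    for _ in range(n):
        x1 = 1 if random.random() < p_x1 else 0
        x2 = 1 if random.random() < p_x2_given_x1[x1] else 0
        if x2 != 1:
            continue  # reject: sample inconsistent with the evidence
        kept += 1
        hits += x1
    return hits / kept
```

Rejection sampling degrades quickly as the evidence becomes less probable (most samples get thrown away), which is one motivation for the more sophisticated MCMC and importance-sampling algorithms the slide lists.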
Summary
• Conditional probability queries ask about a subset of variables given
  evidence on others
• Summing over factor product
• Evidence simply reduces the factors
• Many exact and approximate algorithms