Announcements
•  Project 3 is due March 14 (day before spring break) at 11:59 PM
•  Today:
   o  Presentations
   o  Probability
•  Next week:
   o  Bayes’ nets

Probability
Content adapted from Berkeley CS188. Credit: Dan Klein, Pieter Abbeel
Where are we?
•  We’re done with Part I: Search and Planning
•  Part II: Probabilistic Reasoning and Diagnosis
   o  Tracking objects
   o  Speech recognition
   o  Robot mapping
   o  Genetics
   o  Error-correcting codes
   o  … lots more!
•  Part III: Machine Learning

Today
•  Probability
   o  Random Variables
   o  Joint and Marginal Distributions
   o  Conditional Distributions
   o  Product Rule, Chain Rule, Bayes’ Rule
   o  Inference
   o  Independence
•  You’ll need all of this A LOT for the next few weeks, so make sure you understand it now!
Inference in Ghostbusters
•  A ghost is in the grid somewhere
•  Sensor readings tell us how close a square is to the ghost
   o  On the ghost: red
   o  1 or 2 away: orange
   o  3 or 4 away: yellow
   o  5+ away: green
•  Sensors are noisy, but we know P(Color | Distance)

   P(red | 3)    P(orange | 3)    P(yellow | 3)    P(green | 3)
   0.05          0.15             0.5              0.3

[Demo]
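Not part of the slides: a minimal Python sketch of how the distance-3 row of the sensor model above could be stored (the dictionary name is made up); the check just confirms the row is a valid distribution over colors.

    # P(Color | Distance = 3), using the numbers from the slide above.
    p_color_given_dist3 = {"red": 0.05, "orange": 0.15, "yellow": 0.5, "green": 0.3}

    # For a fixed distance, the conditional distribution over colors must sum to 1.
    assert abs(sum(p_color_given_dist3.values()) - 1.0) < 1e-9
    print(p_color_given_dist3["yellow"])  # 0.5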
Uncertainty
•  General situation:
   o  Observed variables (evidence): the agent knows certain things about the state of the world (e.g., sensor readings or symptoms)
   o  Unobserved variables: the agent needs to reason about other aspects (e.g., where an object is or what disease is present)
   o  Model: the agent knows something about how the known variables relate to the unknown variables
•  Probabilistic reasoning gives us a framework for managing our beliefs and knowledge
Random Variables
•  A random variable is some aspect of the world about which we (may) have uncertainty
   o  R = Is it raining?
   o  T = Is it hot or cold?
   o  D = How long will it take to drive to work?
   o  L = Where am I?
•  We denote random variables with capital letters
•  Like variables in a CSP, random variables have domains
   o  R in {true, false} (often written as {+r, -r})
   o  T in {hot, cold}
   o  D in [0, ∞)
   o  L in possible locations, maybe {(0,0), (0,1), …}
Probability Distributions
•  Unobserved variables have distributions
•  A distribution is a TABLE of probabilities of values

   P(T)                P(W)
   T      P            W        P
   hot    0.5          sun      0.6
   cold   0.5          rain     0.1
                       fog      0.3
                       meteor   0.0

•  A probability (lower-case value) is a single number, e.g., P(W = rain) = 0.1
•  Must have: P(X = x) ≥ 0 for every value x, and the entries sum to 1
•  Shorthand notation: P(hot) = P(T = hot), P(rain) = P(W = rain); OK if all domain entries are unique

Joint Distributions
•  A joint distribution over a set of random variables X1, X2, …, Xn specifies a real number for each assignment (or outcome):
   P(X1 = x1, X2 = x2, …, Xn = xn)
•  Must obey: P(x1, x2, …, xn) ≥ 0 and the entries sum to 1

   Distribution over T, W
   T      W      P
   hot    sun    0.4
   hot    rain   0.1
   cold   sun    0.2
   cold   rain   0.3

•  Size of a distribution over n variables with domain sizes d? d^n
   o  For all but the smallest distributions, impractical to write out!
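A minimal sketch (not from the slides; the name P_TW is made up) of the joint table above as a Python dictionary keyed by outcome, with the two constraints the slide lists checked explicitly:

    # Joint distribution P(T, W) from the table above, keyed by outcome.
    P_TW = {
        ("hot", "sun"): 0.4,
        ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2,
        ("cold", "rain"): 0.3,
    }

    # Must obey: every entry is non-negative and the entries sum to 1.
    assert all(p >= 0 for p in P_TW.values())
    assert abs(sum(P_TW.values()) - 1.0) < 1e-9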
Probabilistic Models
•  A probabilistic model is a joint distribution over a set of random variables
•  Probabilistic models:
   o  (Random) variables with domains
   o  Assignments are called outcomes
   o  Joint distributions: say whether assignments (outcomes) are likely
   o  Normalized: sum to 1.0
   o  Ideally: only certain variables directly interact
•  Constraint satisfaction problems:
   o  Variables with domains
   o  Constraints: state whether assignments are possible
   o  Ideally: only certain variables directly interact

   Distribution over T, W          Constraint over T, W
   T      W      P                 T      W
   hot    sun    0.4               hot    sun     T
   hot    rain   0.1               hot    rain    F
   cold   sun    0.2               cold   sun     F
   cold   rain   0.3               cold   rain    T

Events
•  An event is a set E of outcomes
•  From a joint distribution, we can calculate the probability of any event
   o  Probability that it’s hot AND sunny?
   o  Probability that it’s hot?
   o  Probability that it’s hot OR sunny?
•  Typically, the events we care about are partial assignments, like P(T = hot)

   T      W      P
   hot    sun    0.4
   hot    rain   0.1
   cold   sun    0.2
   cold   rain   0.3
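As a hedged sketch (the helper event_prob is made up, not course code), the probability of an event is the sum of the joint entries it contains; on this table the three questions above come out to 0.4, 0.5, and 0.7.

    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

    def event_prob(joint, event):
        # An event is a set of outcomes; its probability is the sum of their entries.
        return sum(p for outcome, p in joint.items() if outcome in event)

    print(event_prob(P_TW, {("hot", "sun")}))                                    # hot AND sunny
    print(event_prob(P_TW, {("hot", "sun"), ("hot", "rain")}))                   # hot
    print(event_prob(P_TW, {("hot", "sun"), ("hot", "rain"), ("cold", "sun")}))  # hot OR sunny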
Marginal Distributions
•  Marginal distributions are sub-tables which eliminate variables
•  Marginalization (summing out): combine collapsed rows by adding

   P(T, W)                  P(T)             P(W)
   T      W      P          T      P         W      P
   hot    sun    0.4        hot    0.5       sun    0.6
   hot    rain   0.1        cold   0.5       rain   0.4
   cold   sun    0.2
   cold   rain   0.3
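A minimal sketch of marginalization (the helper marginalize is made up): summing out one variable of the joint recovers the P(T) and P(W) columns above.

    from collections import defaultdict

    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

    def marginalize(joint, keep_index):
        # Combine (sum) all rows that agree on the kept variable's value.
        marginal = defaultdict(float)
        for outcome, p in joint.items():
            marginal[outcome[keep_index]] += p
        return dict(marginal)

    print(marginalize(P_TW, 0))  # P(T): hot 0.5, cold 0.5
    print(marginalize(P_TW, 1))  # P(W): sun 0.6, rain 0.4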
Conditional Distributions
•  Conditional distributions are probability distributions over some variables given fixed values of others

Conditional Probabilities
•  A simple relation between joint and conditional probabilities:
   P(a | b) = P(a, b) / P(b)
   o  In fact, this is taken as the definition of a conditional probability
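A tiny worked instance of that definition on the P(T, W) table from the earlier slides: P(W = sun | T = cold) = P(cold, sun) / P(T = cold).

    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

    p_cold = P_TW[("cold", "sun")] + P_TW[("cold", "rain")]  # P(T = cold) = 0.5
    p_sun_given_cold = P_TW[("cold", "sun")] / p_cold        # 0.2 / 0.5 = 0.4
    print(p_sun_given_cold)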
Normalization Trick
•  To normalize: (dictionary) to bring or restore to a normal condition; here, make all entries sum to ONE
•  Procedure:
   o  Step 1: Compute Z = sum over all entries
   o  Step 2: Divide every entry by Z
•  Why does this work? The sum of the selected entries is P(evidence)! (P(T = cold), here)
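A hedged sketch of the normalization trick on the same joint: select the entries consistent with the evidence T = cold, compute Z, and divide, which yields the whole conditional distribution P(W | T = cold).

    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

    # Select the entries consistent with the evidence T = cold.
    selected = {w: p for (t, w), p in P_TW.items() if t == "cold"}

    # Step 1: Z = sum over the selected entries (this is exactly P(T = cold)).
    Z = sum(selected.values())

    # Step 2: divide every entry by Z; the result sums to one.
    P_W_given_cold = {w: p / Z for w, p in selected.items()}
    print(P_W_given_cold)  # sun: 0.4, rain: 0.6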
Probabilistic Inference
•  Probabilistic inference: compute a desired probability from other known probabilities (e.g., conditional from joint)
•  We generally compute conditional probabilities
   o  P(on time | no reported accidents) = 0.90
   o  These represent the agent’s beliefs given the evidence
•  Probabilities change with new evidence
   o  P(on time | no accidents, 5 a.m.) = 0.95
   o  P(on time | no accidents, 5 a.m., raining) = 0.80
   o  Observing new evidence causes beliefs to be updated
Inference by Enumeration
•  General case:
   o  Evidence variables: E1 … Ek = e1 … ek
   o  Query* variable: Q
   o  Hidden variables: H1 … Hr
•  We want: P(Q | e1 … ek)
•  Procedure:
   o  Step 1: Select the entries consistent with the evidence
   o  Step 2: Sum out H to get the joint of the query and the evidence
   o  Step 3: Normalize
•  Obvious problems:
   o  Worst-case time complexity: O(d^n)
   o  Space complexity: O(d^n) to store the joint distribution
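A minimal sketch of the three-step procedure as a general function (the names and structure are my own, not course code); with only the two variables T and W there happen to be no hidden variables, but the same accumulation would sum them out.

    from collections import defaultdict

    VARS = ("T", "W")
    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}

    def infer_by_enumeration(joint, variables, query, evidence):
        idx = variables.index(query)
        unnormalized = defaultdict(float)
        for outcome, p in joint.items():
            # Step 1: keep only entries consistent with the evidence.
            if all(outcome[variables.index(v)] == val for v, val in evidence.items()):
                # Step 2: accumulating by query value sums out the hidden variables.
                unnormalized[outcome[idx]] += p
        # Step 3: normalize.
        Z = sum(unnormalized.values())
        return {val: p / Z for val, p in unnormalized.items()}

    print(infer_by_enumeration(P_TW, VARS, "W", {"T": "hot"}))  # sun: 0.8, rain: 0.2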
Quiz 1: Compute the Following Quantities:
•  P(sun)?
•  P(sun | winter)?
•  P(sun | winter, hot)?
The Product Rule
•  Sometimes we have conditional distributions, but want the joint distribution:
   P(x, y) = P(x | y) P(y)
•  Example:

   P(W)             P(D | W)                 P(D, W)
   W      P         D      W      P          D      W      P
   sun    0.8       wet    sun    0.1        wet    sun    0.08
   rain   0.2       dry    sun    0.9        dry    sun    0.72
                    wet    rain   0.7        wet    rain   0.14
                    dry    rain   0.3        dry    rain   0.06
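A hedged sketch of the product rule on the example above: multiplying each P(D | W) entry by the matching P(W) entry reproduces the joint column (0.08, 0.72, 0.14, 0.06).

    P_W = {"sun": 0.8, "rain": 0.2}
    P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
                   ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

    # Product rule: P(d, w) = P(d | w) * P(w).
    P_DW = {(d, w): p * P_W[w] for (d, w), p in P_D_given_W.items()}
    print(P_DW)  # matches the P(D, W) table above, up to float rounding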
Bayes’ Rule
•  Two ways to factor a joint distribution over two variables:
   P(x, y) = P(x | y) P(y) = P(y | x) P(x)
•  Dividing, we get:
   P(x | y) = P(y | x) P(x) / P(y)
•  Why is this at all helpful?
   o  Lets us build one conditional from its reverse
   o  Often one conditional is tricky, but the other one is simple
   o  Foundations of many systems we’ll see later (e.g., ASR, MT)
•  In the running for most important equation in AI!
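A small hedged example of Bayes’ rule using the product-rule numbers: from P(D | W) and P(W), build the reversed conditional P(W | D = wet).

    P_W = {"sun": 0.8, "rain": 0.2}
    P_D_given_W = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
                   ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

    # P(D = wet) by marginalizing P(wet, w) = P(wet | w) P(w): 0.08 + 0.14 = 0.22.
    p_wet = sum(P_D_given_W[("wet", w)] * P_W[w] for w in P_W)

    # Bayes' rule: P(w | wet) = P(wet | w) P(w) / P(wet).
    P_W_given_wet = {w: P_D_given_W[("wet", w)] * P_W[w] / p_wet for w in P_W}
    print(P_W_given_wet)  # sun ≈ 0.364, rain ≈ 0.636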
Inference with Bayes’ Rule
•  Example: Diagnostic probability from causal probability:
   P(cause | effect) = P(effect | cause) P(cause) / P(effect)
•  Example:
   o  M: meningitis, S: stiff neck
   o  Note: the posterior probability of meningitis is still very small
   o  Note: you should still get stiff necks checked out! Why?

Ghostbusters, Revisited
•  Let’s say we have two distributions:
   o  Prior distribution over ghost location: P(G)
      !  Let’s say this is uniform
   o  Sensor reading model: P(R | G)
      !  Given: we know what our sensors do
      !  R = reading color at (1,1)
      !  E.g., P(R = yellow | G = (1,1)) = 0.1
•  We can calculate the posterior distribution P(G | R) over ghost locations given a reading using Bayes’ rule:
   P(g | r) ∝ P(r | g) P(g)
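A hedged sketch of that posterior on a toy 2x2 grid. The uniform prior follows the slide; only the 0.1 value for P(R = yellow | G = (1,1)) comes from the slide, and the other likelihoods are invented purely for illustration.

    # Toy grid of ghost locations with a uniform prior P(G).
    locations = [(0, 0), (0, 1), (1, 0), (1, 1)]
    prior = {g: 1 / len(locations) for g in locations}

    # Hypothetical sensor model P(R = yellow | G = g); only the 0.1 at (1, 1) is from the slide.
    p_yellow_given_g = {(0, 0): 0.4, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.1}

    # Bayes' rule: P(g | yellow) is proportional to P(yellow | g) * P(g); then normalize.
    unnormalized = {g: p_yellow_given_g[g] * prior[g] for g in locations}
    Z = sum(unnormalized.values())
    posterior = {g: p / Z for g, p in unnormalized.items()}
    print(posterior)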
Independence
•  Two variables are independent in a joint distribution if:
   P(X, Y) = P(X) P(Y), i.e., for all x, y: P(x, y) = P(x) P(y)
   o  Says the joint distribution factors into a product of two simple ones
   o  Usually variables aren’t independent!
•  Can use independence as a modeling assumption
   o  Independence can be a simplifying assumption
   o  Empirical joint distributions: at best “close” to independent
   o  What could we assume for {Weather, Traffic, Cavity}?
•  Independence is like something from CSPs: what?

Example: Independence
•  Is P(T, W) equal to P(T) P(W)?

   P(T, W)                  P(T)            P(W)            P(T) P(W)
   T      W      P          T      P        W      P        T      W      P
   hot    sun    0.4        hot    0.5      sun    0.6      hot    sun    0.3
   hot    rain   0.1        cold   0.5      rain   0.4      hot    rain   0.2
   cold   sun    0.2                                        cold   sun    0.3
   cold   rain   0.3                                        cold   rain   0.2
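A small check of the comparison above (variable names made up): T and W are independent only if every joint entry equals the product of the marginals, which fails here, e.g., P(hot, sun) = 0.4 but P(hot) P(sun) = 0.3.

    P_TW = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1, ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
    P_T = {"hot": 0.5, "cold": 0.5}
    P_W = {"sun": 0.6, "rain": 0.4}

    independent = all(abs(P_TW[(t, w)] - P_T[t] * P_W[w]) < 1e-9 for t in P_T for w in P_W)
    print(independent)  # False: the joint does not factor into P(T) P(W)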
Example: Independence
•  N fair, independent coin flips:

   P(X1)          P(X2)          …       P(Xn)
   H    0.5       H    0.5               H    0.5
   T    0.5       T    0.5               T    0.5
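A last hedged sketch: under independence, the joint over n flips is just the product of the per-flip marginals, so any particular sequence has probability 0.5 ** n even though the full joint table has 2 ** n entries.

    from itertools import product

    n = 3
    P_flip = {"H": 0.5, "T": 0.5}

    # Joint over n independent flips: multiply the marginals (2 ** n entries).
    joint = {}
    for seq in product(P_flip, repeat=n):
        p = 1.0
        for outcome in seq:
            p *= P_flip[outcome]
        joint[seq] = p

    print(len(joint))              # 8 entries for n = 3
    print(joint[("H", "H", "T")])  # 0.125 = 0.5 ** 3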