Classifiers
Fujinaga
Bayes (optimal) Classifier (1)
• A priori probabilities: P(w1) and P(w2) [P(w1) + P(w2) = 1]
• Decision rule: given only P(w1) and P(w2),
decide w1 if P(w1) > P(w2);
the probability of error is then P(w2).
• Let x be the feature(s).
• Let P(x | wi) be the class (state)-conditional probability
distribution function (pdf) for x; i.e., the pdf for x given
that the state of nature is wi.
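The prior-only decision rule above can be sketched in a few lines; the prior values here are made-up examples, not from the slides:

```python
# Sketch of the prior-only decision rule: with no feature observed,
# always decide the class with the larger prior probability; the
# probability of error is then the other class's prior.
priors = {"w1": 0.7, "w2": 0.3}  # assumed example values; must sum to 1

decision = max(priors, key=priors.get)  # decide w1 since P(w1) > P(w2)
p_error = 1.0 - priors[decision]        # probability of error = P(w2)
print(decision, p_error)
```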
Bayes (optimal) Classifier (2)
• Assume we know P(wi) and P(x | wi),
and that we then observe the value of x.
• Using Bayes' Rule:
P(wi | x) = P(x | wi) P(wi) / P(x)
where P(x) = Σi P(x | wi) P(wi)
• Decide w1 if P(w1 | x) > P(w2 | x); in general, choose the class i
that maximizes P(wi | x), i.e., P(wi | x) / P(wj | x) > 1 for all j ≠ i.
(Maximum likelihood)
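The posterior computation above can be sketched as follows; the Gaussian class-conditional densities and all numeric values are illustrative assumptions, not from the slides:

```python
import math

def gaussian_pdf(x, mean, std):
    """Class-conditional density P(x | wi), modeled here as a 1-D Gaussian."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

priors = [0.5, 0.5]                # P(w1), P(w2); assumed equal
params = [(0.0, 1.0), (2.0, 1.0)]  # assumed (mean, std) for each class

def posterior(x):
    """Bayes' rule: P(wi | x) = P(x | wi) P(wi) / P(x)."""
    likelihoods = [gaussian_pdf(x, m, s) for m, s in params]
    evidence = sum(l * p for l, p in zip(likelihoods, priors))  # P(x)
    return [l * p / evidence for l, p in zip(likelihoods, priors)]

post = posterior(0.5)
decision = post.index(max(post))  # decide the class with the larger posterior
```

Note that, as the figure caption below observes, the posteriors sum to 1 at every x, because the evidence P(x) normalizes them.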
Bayes (optimal) Classifier (3)
A posteriori probabilities for a two-class decision problem. The red region on the x axis
depicts values of x for which you would decide 'apple', and the orange
region those for 'orange'. At every x, the posteriors must sum to 1.
Fisher’s Linear Discriminant
If Petal Width < 3.272 - 0.3252 × Petal Length, then Versicolor
If Petal Width > 3.272 - 0.3252 × Petal Length, then Virginica
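The linear discriminant rule above can be applied directly; the coefficients 3.272 and 0.3252 come from the slide, while the sample measurements below are made up for illustration:

```python
def fisher_rule(petal_length, petal_width):
    """Fisher's linear discriminant boundary from the slide."""
    boundary = 3.272 - 0.3252 * petal_length
    return "Versicolor" if petal_width < boundary else "Virginica"

# Example: boundary = 3.272 - 0.3252 * 4.5 ≈ 1.81, and 1.3 < 1.81
print(fisher_rule(4.5, 1.3))  # Versicolor
```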
Decision Tree
If Petal Length < 2.65, then Setosa
If Petal Length > 4.95, then Virginica
If 2.65 < Petal Length < 4.95 then
if Petal Width < 1.65 then Versicolor
if Petal Width > 1.65 then Virginica
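The tree above transcribes directly into nested conditionals; the thresholds come from the slide, and the test samples are illustrative:

```python
def iris_tree(petal_length, petal_width):
    """Decision-tree rules for the iris classes, as given on the slide."""
    if petal_length < 2.65:
        return "Setosa"
    if petal_length > 4.95:
        return "Virginica"
    # 2.65 <= petal_length <= 4.95: split on petal width
    return "Versicolor" if petal_width < 1.65 else "Virginica"

print(iris_tree(1.4, 0.2))  # Setosa
print(iris_tree(4.0, 1.2))  # Versicolor
```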