
National Yunlin University of Science and Technology
Boosting an Associative Classifier
Presenter: Chien-Shing Chen
Authors: Yanmin Sun, Yang Wang, Andrew K.C. Wong
2006, TKDE
Intelligent Database Systems Lab
N.Y.U.S.T.
I. M.
Outline
Motivation
Objective
Introduction
Weighting Strategies for Voting
Experiments
Conclusions
Personal Opinion
Motivation
Boosting is a general method for improving the performance of any learning algorithm, yet there is no reported work on boosting associative classifiers.
Objective
Describe three strategies for voting among multiple classifiers in boosting an HPWR classification system:
AdaBoost
evidence weight
hybrid
Analyze the features of these three strategies
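The AdaBoost strategy named above weights each classifier's vote by its weighted training error. A minimal sketch of that loop, assuming 1-D threshold stumps as stand-in base learners (the paper's actual base learner is HPWR, not stumps):

```python
import math

def adaboost(X, y, T=10):
    """Minimal AdaBoost sketch: X is a list of floats, y a list of +1/-1 labels.
    Base learners are 1-D threshold stumps, used here only for illustration."""
    n = len(X)
    w = [1.0 / n] * n                       # uniform sample weights
    ensemble = []                           # (alpha, threshold, polarity) triples
    for _ in range(T):
        # pick the stump (threshold, polarity) with the lowest weighted error
        best = None
        for thr in X:
            for pol in (+1, -1):
                preds = [pol if x >= thr else -pol for x in X]
                eps = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or eps < best[0]:
                    best = (eps, thr, pol, preds)
        eps, thr, pol, preds = best
        eps = max(eps, 1e-10)               # guard against log(0)
        alpha = 0.5 * math.log((1 - eps) / eps)   # vote weight from error eps
        ensemble.append((alpha, thr, pol))
        # reweight: misclassified samples gain weight, then renormalize
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps in the ensemble."""
    score = sum(a * (pol if x >= thr else -pol) for a, thr, pol in ensemble)
    return 1 if score >= 0 else -1
```

The key line is `alpha = 0.5 * log((1 - eps) / eps)`: a classifier with lower weighted error ε receives a larger vote weight, which is the "AdaBoost" strategy among the three.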
Weighting Strategies for Voting
Let ε denote the weighted training error at each iteration.
weight of evidence provided by x in favor of yᵢ as opposed to other values:
P(x ∩ yᵢ) / P(yᵢ)
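In count form, the ratio P(x ∩ yᵢ) / P(yᵢ) is the conditional probability P(x | yᵢ). A common weight-of-evidence formulation compares it against P(x | ¬yᵢ) on a log scale; the sketch below uses that log-odds form as an assumption, not necessarily HPWR's exact definition:

```python
import math

def weight_of_evidence(n_xy, n_y, n_x_noty, n_noty):
    """Log-odds evidence that pattern x favors class y over the other classes.
    P(x|y) is estimated from counts as n_xy / n_y, i.e. the P(x ∩ y) / P(y) ratio."""
    p_x_given_y = n_xy / n_y            # P(x | y)
    p_x_given_noty = n_x_noty / n_noty  # P(x | not-y)
    return math.log(p_x_given_y / p_x_given_noty)

# hypothetical counts: x occurs in 40 of 50 class-y records,
# but in only 5 of 50 records of the remaining classes
w = weight_of_evidence(40, 50, 5, 50)   # positive => x supports y
```

A positive weight means x occurs more often with y than with the other classes, so a rule matching x casts a stronger vote for y; this is the "evidence weight" voting strategy.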
[Figure: boosting iterations t = 1 and t = 2, showing examples x1 and x4]
Experiments
Personal Opinion
Drawback
lacks handling of the class level (predicting attributes)
Qualification
Application
any classification problem
Future Work
weight-of-evidence description
a fourth strategy