Introduction to Support Vector Machines for Data Mining
Mahdi Nasereddin, Ph.D.
Pennsylvania State University
School of Information Sciences and Technology
Agenda
• Introduction
• Support Vector Machines
• Preliminary Experimentation
• Conclusion
• Questions?
Data Mining Techniques
• Neural Networks
• Decision Trees
• Multivariate Adaptive Regression Splines (MARS)
• Rule Induction
• Nearest Neighbor Method and Discriminant Analysis
• Genetic Algorithms
• Support Vector Machines
Support Vector Machines
• First introduced by Vapnik and Chervonenkis in COLT-92
• Based on Statistical Learning Theory
• Basic Theory
• Applications
  • Classification
  • Regression
Successful Applications of SVMs
• Protein Structure Prediction
  http://www.cs.umn.edu/~hpark/papers/surface.pdf
• Intrusion Detection
  www.cs.nmt.edu/~IT
• Handwriting Recognition
• Detecting Steganography in Digital Images
  http://www.cs.dartmouth.edu/~farid/publications/ih02.html
Successful Applications of SVMs
• Breast Cancer Prognosis: Chemotherapy Effect on Survival Rate (Lee, Mangasarian and Wolberg, 2001)
• Particle and Quark-Flavour Identification in High Energy Physics
  (http://wwwrunge.physik.unifreiburg.de/preprints/EHEP9901.ps)
• Function Approximation
Support Vector Machines (Linearly separable case)
[Figures, three slides: 2-D scatter plots of linearly separable data]

Non-Linearly separable case
[Figure]
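The linearly separable case on the slides above can be sketched in code. This is an illustrative scikit-learn example (not from the talk; the data is a hypothetical stand-in for the slide's scatter plot):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two well-separated 2-D clusters, one per class
X = np.vstack([rng.normal([2.0, 2.0], 0.5, (20, 2)),
               rng.normal([8.0, 8.0], 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# A linear SVM finds the maximum-margin separating hyperplane;
# only the points nearest the boundary (the support vectors) define it.
clf = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors per class:", clf.n_support_)
print("training accuracy:", clf.score(X, y))
```

On separable data like this, the classifier fits the training set perfectly while using only a handful of the points as support vectors.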
SVM for Regression
• In the regression case, the goal is to construct a hyperplane that is close to as many points as possible.
• For both classification and regression, learning is done via quadratic programming, which has a single (global) optimum.
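The regression idea above can be sketched with an epsilon-SVR. This is an illustrative scikit-learn example, not the toolbox used in the talk; the data and parameters are assumptions:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(0, 10, (60, 1)), axis=0)
y = 0.5 * X.ravel() + 1.0 + rng.normal(0, 0.1, 60)  # noisy line, true slope 0.5

# epsilon sets the width of the "tube" around the hyperplane in which
# points incur no loss; the fit is a convex QP with one global optimum.
reg = SVR(kernel="linear", C=10.0, epsilon=0.2).fit(X, y)
print("recovered slope:", reg.coef_.ravel()[0])
```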
Strengths and Weaknesses of SVM
Strengths
• Training is relatively easy
  • No local optima, unlike in neural networks
• It scales relatively well to high-dimensional data
Weaknesses
• Needs a "good" kernel function
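The "good kernel" point can be made concrete: on XOR-style data no single hyperplane separates the classes, so a linear kernel fails where an RBF kernel succeeds. An illustrative sketch (not from the talk):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# XOR-style clusters: opposite corners share a class,
# so no single hyperplane can separate them
centers = np.array([[0, 0], [1, 1], [0, 1], [1, 0]], dtype=float)
X = np.vstack([rng.normal(c, 0.05, (50, 2)) for c in centers])
y = np.repeat([0, 0, 1, 1], 50)

linear = SVC(kernel="linear", C=10.0).fit(X, y)
# The RBF kernel implicitly maps the data to a space where it is separable
rbf = SVC(kernel="rbf", C=10.0).fit(X, y)
print("linear kernel accuracy:", linear.score(X, y))
print("rbf kernel accuracy:", rbf.score(X, y))
```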
Preliminary Experimentation: Forecasting GDP using Oil Prices (with F. Malik)
• Forecasting model
• Objective: to predict the Gross Domestic Product (GDP) for the next quarter using
  • Oil prices (including time lags)
  • GDP time series
Data Set
• We looked at quarterly oil prices and GDP data, January 1947 – December 2002
• Oil price data were obtained from the Bureau of Labor Statistics
• GDP data were obtained from the Bureau of Economic Analysis
• We used the growth rate of GDP and the growth rate of oil prices
Models
• Neural Networks
  • Back-propagation
  • One hidden layer
  • Delta rule was used for training
• LS-SVM (Van Gestel, 2001)
  • Matlab toolbox
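LS-SVM differs from the standard SVM in that the quadratic program is replaced by a single linear solve of the Suykens-style KKT system [0, 1^T; 1, K + I/gamma][b; alpha] = [0; y]. A minimal numpy sketch of that formulation (an assumption about the method, not the Matlab toolbox used in the talk):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """LS-SVM regression: one linear solve instead of a QP."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                                   # top row: [0, 1^T]
    A[1:, 0] = 1.0                                   # first column: [0; 1]
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                           # bias b, dual weights alpha

def lssvm_predict(Xtrain, b, alpha, Xnew, sigma=1.0):
    return rbf_kernel(Xnew, Xtrain, sigma) @ alpha + b

# Toy check on a smooth function
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
err = np.abs(lssvm_predict(X, b, alpha, X) - y).max()
print("max abs training error:", err)
```

Trading the QP for a linear system is what makes LS-SVM fast to train, at the cost of losing sparsity (every training point gets a nonzero alpha).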
Experimentation
• Created the training data to predict the last 40 quarters of GDP (test data)
• Trained the neural network and the SVM
• Used the models to predict GDP and calculated the prediction error
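The three steps above can be sketched end-to-end. The series below are synthetic stand-ins (the real quarterly growth rates came from the BLS and BEA, as the slides note), and the SVR and its parameters are illustrative assumptions, not the LS-SVM toolbox settings from the talk:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
n = 224                                   # roughly 224 quarters, 1947-2002
oil = rng.normal(0.0, 0.02, n)            # synthetic oil-price growth rates
gdp = 0.008 + 0.3 * np.roll(oil, 1) + rng.normal(0.0, 0.004, n)  # synthetic GDP growth

# Predict next-quarter GDP growth from this quarter's GDP growth
# plus current and lagged oil-price growth
X = np.column_stack([gdp[1:-1], oil[1:-1], oil[:-2]])
y = gdp[2:]

# Hold out the last 40 quarters as the test set, as in the slides
Xtr, Xte, ytr, yte = X[:-40], X[-40:], y[:-40], y[-40:]
model = SVR(kernel="rbf", C=1.0, epsilon=0.001).fit(Xtr, ytr)
mae = mean_absolute_error(yte, model.predict(Xte))
print("test MAE:", mae)
```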
Results

Model           | MAE
----------------|-------
Neural Network  | 0.0044
LS-SVM          | 0.0052
Good References
Introductions
• Martin Law, "An Introduction to Support Vector Machines"
• Andrew Moore, "Support Vector Machines" (www.cs.cmu.edu/~awm)
• N. Cristianini, www.support-vector.net/tutorial.html
In depth
• Support Vector Machines book, www.support-vector.net
Questions
• E-mail: [email protected]
• Presentation will be posted (by Friday) at http://www.bklv.psu.edu/faculty/nasereddin