5. Feature Extraction and Dimensionality Reduction



Feature Extraction (I)
Data Mining II
Year 2009-10
Lluís Belanche
Alfredo Vellido
Dimensionality reduction (1)
Dimensionality reduction (2)
Signal representation vs. classification
Principal Components Analysis (PCA)

* General goal: project the data onto a new subspace so that a maximum of relevant information is preserved.

* In PCA, the relevant information is variance (dispersion).
PCA Theory (1)
PCA Theory (2)
PCA Theory (3)
PCA Theory (4)
Algorithm for PCA
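The algorithm outlined on these slides can be sketched in a few lines of NumPy. This is an illustrative sketch, not the course's own code; the function name and variable names are my own. It follows the standard recipe: center the data, form the covariance matrix, eigendecompose it, and project onto the leading eigenvectors (the directions of maximum variance).

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the k leading principal components."""
    # Center the data: PCA analyzes dispersion around the mean.
    Xc = X - X.mean(axis=0)
    # Sample covariance matrix of the features.
    C = np.cov(Xc, rowvar=False)
    # Eigendecomposition; eigh is appropriate because C is symmetric.
    eigvals, eigvecs = np.linalg.eigh(C)
    # Sort components by decreasing variance (eigenvalue).
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Project onto the k leading principal directions.
    return Xc @ eigvecs[:, :k], eigvals[:k]
```

The variance of the i-th projected coordinate equals the i-th eigenvalue, which is exactly the "relevant information is variance" criterion from the slides.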
PCA examples (1)
PCA examples (2)
PCA examples (3)
PCA examples (4)
Two solutions: in which sense are they optimal?
1. In the signal representation sense
2. In the signal separation sense
3. In both
4. In none
Other approaches to FE

* Kernel PCA: perform PCA on Φ(x) instead of x, where K(x,y) = <Φ(x), Φ(y)> is a kernel
* ICA (Independent Component Analysis):
  - Seeks statistical independence of features (PCA only seeks uncorrelated features)
  - Equivalent to PCA iff the features are Gaussian
* Image and audio analysis bring their own methods:
  - Series expansion descriptors (from the DFT, DCT or DST)
  - Moment-based features
  - Spectral features
  - Wavelet descriptors
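Kernel PCA never computes Φ(x) explicitly: it eigendecomposes the (centered) kernel matrix instead of the covariance matrix. A minimal sketch with an RBF kernel, assuming the standard kernel-centering formula (function names and the gamma parameter are my own, not from the slides):

```python
import numpy as np

def rbf_kernel(X, gamma):
    # K(x, y) = exp(-gamma * ||x - y||^2), computed for all pairs of rows.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, k, gamma=1.0):
    """Project the n training points onto k kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix: equivalent to centering Phi(x) in feature space.
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose the centered kernel matrix (symmetric).
    eigvals, eigvecs = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1][:k]
    alphas, lambdas = eigvecs[:, order], eigvals[order]
    # Normalize so the implicit feature-space directions have unit norm.
    alphas = alphas / np.sqrt(lambdas)
    # Projections of the training points onto the k components.
    return Kc @ alphas
```

With a linear kernel this reduces to ordinary PCA; the nonlinear kernel is what lets kernel PCA capture structure that PCA's linear projections miss.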
Cao, J.J. et al. A comparison of PCA, KPCA and ICA for dimensionality reduction. Neurocomputing 55, pp. 321-336 (2003)