Special topics on text mining
[Part I: text classification]
Hugo Jair Escalante, Aurelio Lopez,
Manuel Montes and Luis Villaseñor
Multi-label text classification
Hugo Jair Escalante, Aurelio Lopez,
Manuel Montes and Luis Villaseñor
Most of this material was taken from: G. Tsoumakas, I. Katakis and I. Vlahavas. Mining multi-label data. Data Mining and Knowledge Discovery Handbook, Part 6, O. Maimon, L. Rokach (Eds.), Springer, 2nd edition, pp. 667-685, 2010.
Machine learning approach to TC
• Develop automated methods able to classify
documents with a certain degree of success
[Diagram: training documents (labeled) are fed to a learning machine (an algorithm), which produces a trained machine; the trained machine assigns a label to an unseen (test, query) document.]
What is a learning algorithm?
• A function:
f : ℝ^d → C,  C = {1, …, K}
• Given:
D = {(x_i, y_i)}_{i=1,…,N},  x_i ∈ ℝ^d;  y_i ∈ C
Binary vs multiclass classification
• Binary classification: each document can
belong to one of two classes.
f : ℝ^d → {−1, +1}
• Multiclass classification: each document can
belong to one of K classes.
f : ℝ^d → {1, …, K}
Classification algorithms
• (Some) classification algorithms for TC:
– Naïve Bayes
– K-Nearest Neighbors
– Centroid-based classification
– Decision trees
– Support Vector Machines
– Linear classifiers (including SVMs)
– Boosting, bagging and ensembles in general
– Random forest
– Neural networks
(Note: some of these methods were designed for binary classification problems.)
Linear models
• Classification of DNA micro-arrays
[Figure: 2-D scatter plot of samples x = (x_1, x_2); the separating hyperplane w·x + b = 0 divides the "Cancer" region (w·x + b > 0) from the "No cancer" region (w·x + b < 0); a query point "?" is classified by the side on which it falls.]
• Decision function: f(x) = w·x + b
Main approaches to multiclass
classification
• Single machine: Learning algorithms able to
deal with multiple classes (e.g., KNN, Naïve
Bayes)
• Combining the outputs of several binary
classifiers:
– One-vs-all: one classifier per class
– All-vs-all: one classifier per pair of classes
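As a sketch, the one-vs-all scheme can be written in a few lines of Python. The centroid-based binary learner below is a hypothetical stand-in for any real binary classifier (SVM, logistic regression, etc.); the helper names are my own:

```python
# One-vs-all multiclass classification built from binary scorers.
# The "binary learner" here is a toy centroid-difference linear scorer,
# standing in for any real binary classifier.

def train_binary(X, y):
    """Return a linear scoring function w.x + b from +-1-labeled data."""
    pos = [x for x, label in zip(X, y) if label == 1]
    neg = [x for x, label in zip(X, y) if label == -1]
    d = len(X[0])
    mp = [sum(x[j] for x in pos) / len(pos) for j in range(d)]  # positive centroid
    mn = [sum(x[j] for x in neg) / len(neg) for j in range(d)]  # negative centroid
    w = [mp[j] - mn[j] for j in range(d)]
    b = -sum(wj * (p + n) / 2 for wj, p, n in zip(w, mp, mn))   # midpoint offset
    return lambda x: sum(wj * xj for wj, xj in zip(w, x)) + b

def one_vs_all_fit(X, y, classes):
    """Train one binary scorer per class (class k vs the rest)."""
    return {k: train_binary(X, [1 if yi == k else -1 for yi in y]) for k in classes}

def one_vs_all_predict(models, x):
    """Predict the class whose scorer responds most strongly."""
    return max(models, key=lambda k: models[k](x))

# Toy 2-D data with three classes clustered in different corners.
X = [[0, 0], [0, 1], [5, 5], [5, 6], [0, 5], [1, 5]]
y = [1, 1, 2, 2, 3, 3]
models = one_vs_all_fit(X, y, classes=[1, 2, 3])
print(one_vs_all_predict(models, [5, 5.5]))  # → 2
```

An all-vs-all version would instead train one such scorer per pair of classes and predict by majority vote among the K(K−1)/2 pairwise decisions.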
Multilabel classification
• To which categories do these documents belong?
Multilabel classification
• A function:
f : ℝ^d → Z,  Z ⊆ L = {1, …, K}
• Given:
D = {(x_i, Z_i)}_{i=1,…,N},  x_i ∈ ℝ^d;  Z_i ⊆ L
Conventions
[Slide: the data matrix X = {x_ij} has m rows (instances x_i) and n columns (features), with a label vector y = {y_j} and a weight vector w; in the multilabel case the single label vector is replaced by a label matrix Z = {Z_j} with |L| columns.]
Slide taken from I. Guyon. Feature and Model Selection. Machine Learning Summer School, Ile de Re, France, 2008.
Multi-label classification
• Each instance can be associated with a set of
labels instead of a single one
• Specialized multilabel classification algorithms
must be developed
• How to deal with the multilabel classification
problem?
(Text categorization is perhaps the
dominant multilabel application)
Multilabel classifiers
• Transformation methods: Transform the
multilabel classification task into several
single-label problems
• Adaptation approaches: Modify learning
algorithms to support multilabel classification
problems
Transformation methods
• Copy transformation. Transforms the
multilabel instances into several single-label
ones
[Tables: the original ML problem, the transformed problem (unweighted copies), and the transformed problem (weighted copies)]
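A minimal sketch of the copy transformation, in both its unweighted and weighted variants (the toy documents and label ids are made up for illustration):

```python
# Copy transformation: each multilabel instance (x, {l1,...,lk}) becomes
# k single-label instances (x, l1), ..., (x, lk). In the weighted variant
# each copy additionally receives weight 1/k.

def copy_transform(data, weighted=False):
    """data: list of (instance, set_of_labels) pairs."""
    out = []
    for x, labels in data:
        for l in sorted(labels):
            if weighted:
                out.append((x, l, 1.0 / len(labels)))
            else:
                out.append((x, l))
    return out

ml_data = [("doc1", {1, 4}), ("doc2", {3})]
print(copy_transform(ml_data))
# → [('doc1', 1), ('doc1', 4), ('doc2', 3)]
print(copy_transform(ml_data, weighted=True))
# → [('doc1', 1, 0.5), ('doc1', 4, 0.5), ('doc2', 3, 1.0)]
```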
Transformation methods
• Select transformation. Replaces the multilabel
of each instance by a single one
[Tables: the original ML problem and the transformed problems under the Max, Min, Rand, and Ignore selection strategies]
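The selection strategies can be sketched as follows; label frequencies are counted over the whole data set, and the toy data are hypothetical:

```python
import random
from collections import Counter

# Select transformation: replace each label set by a single label.
# "max" keeps the globally most frequent of the instance's labels,
# "min" the least frequent, "rand" a random one;
# "ignore" simply drops the multilabel instances.

def select_transform(data, mode="max"):
    freq = Counter(l for _, labels in data for l in labels)
    out = []
    for x, labels in data:
        if mode == "ignore":
            if len(labels) == 1:
                out.append((x, next(iter(labels))))
        elif mode == "max":
            out.append((x, max(labels, key=lambda l: freq[l])))
        elif mode == "min":
            out.append((x, min(labels, key=lambda l: freq[l])))
        elif mode == "rand":
            out.append((x, random.choice(sorted(labels))))
    return out

data = [("d1", {1, 2}), ("d2", {2}), ("d3", {1, 2, 3})]
print(select_transform(data, "max"))     # label 2 is the most frequent overall
print(select_transform(data, "ignore"))  # → [('d2', 2)]
```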
Transformation methods
• Label power set. Considers each unique set of
labels in the ML problem as a single class
[Tables: the original ML problem and the transformed problem; each unique label set becomes one class]
Pruning can be applied
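A minimal label power set transformation, mapping each distinct label set to a fresh class id (toy data, hypothetical helper name):

```python
# Label power set: each distinct label combination in the training data
# becomes one class of a single-label multiclass problem.

def label_powerset(data):
    """Map each distinct label set to a fresh class id."""
    class_of = {}
    transformed = []
    for x, labels in data:
        key = frozenset(labels)
        if key not in class_of:
            class_of[key] = len(class_of) + 1
        transformed.append((x, class_of[key]))
    return transformed, class_of

data = [("d1", {1, 4}), ("d2", {3}), ("d3", {1, 4})]
transformed, classes = label_powerset(data)
print(transformed)  # → [('d1', 1), ('d2', 2), ('d3', 1)]
```

Pruning would correspond to dropping (or re-assigning) the classes in `class_of` that cover too few training instances.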
Transformation methods
• Binary relevance. Learns a different classifier for each label. Each classifier i is trained on the whole data set, taking examples of class i as positive and examples of all other classes (j ≠ i) as negative
[Tables: the original ML problem and the binary data sets generated by BR]
• How are labels assigned to new instances?
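A sketch of how BR builds its binary data sets, and of the usual prediction rule, which assembles the label set as the union of the labels whose binary classifier fires (the decision functions in the example are hypothetical):

```python
# Binary relevance: one binary problem per label; instances carrying the
# label are positives, all others negatives.

def br_datasets(data, label_space):
    """Build one (X, y) binary data set per label."""
    datasets = {}
    for l in label_space:
        X = [x for x, _ in data]
        y = [1 if l in labels else -1 for _, labels in data]
        datasets[l] = (X, y)
    return datasets

def br_predict(classifiers, x):
    """classifiers: dict label -> binary decision function returning +-1."""
    return {l for l, clf in classifiers.items() if clf(x) == 1}

data = [("d1", {1, 4}), ("d2", {3}), ("d3", {3, 4})]
ds = br_datasets(data, label_space=[1, 3, 4])
print(ds[4][1])  # → [1, -1, 1]
```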
Transformation methods
• Ranking by pairwise comparison. Learns a different classifier for each pair of labels.
[Tables: the original ML problem and the data sets generated by pairwise comparison]
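A sketch of the pairwise data set construction; instances that carry both labels of a pair, or neither, are uninformative for that pair and are dropped (toy data, hypothetical helper name):

```python
from itertools import combinations

# Ranking by pairwise comparison: one binary data set per pair of labels
# (i, j), keeping only instances that carry exactly one of the two labels.

def pairwise_datasets(data, label_space):
    datasets = {}
    for i, j in combinations(sorted(label_space), 2):
        X, y = [], []
        for x, labels in data:
            if (i in labels) != (j in labels):  # exactly one of the two
                X.append(x)
                y.append(i if i in labels else j)
        datasets[(i, j)] = (X, y)
    return datasets

data = [("d1", {1, 4}), ("d2", {3}), ("d3", {3, 4})]
ds = pairwise_datasets(data, label_space=[1, 3, 4])
print(ds[(3, 4)])  # d1 votes for 4, d2 for 3; d3 carries both and is dropped
```

At prediction time the pairwise decisions are aggregated into a ranking of the labels (e.g., by counting votes).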
Algorithm adaptation techniques
• Many variants, including
– Decision trees
– Boosting ensembles
– Probabilistic generative models
– KNN
– Support vector machines
Algorithm adaptation techniques
• MLkNN. For each test instance:
– Retrieve the k nearest neighbors of the
instance in the training set
– Compute the frequency of occurrence of each
label
– Assign a probability to each label and select the
labels for the test instance
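A simplified sketch of the MLkNN idea. The real algorithm (Zhang & Zhou, 2007) estimates prior and posterior label probabilities with Bayes' rule; this stripped-down version just thresholds each label's frequency among the k neighbors:

```python
from collections import Counter

# Simplified MLkNN-style prediction: assign every label whose frequency
# among the k nearest neighbors exceeds a threshold (here 1/2).
# This omits the Bayesian probability estimates of the full algorithm.

def euclidean(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def ml_knn_predict(train, x, k=3, threshold=0.5):
    # 1. retrieve the k nearest neighbors of x
    neighbors = sorted(train, key=lambda t: euclidean(t[0], x))[:k]
    # 2. frequency of each label among the neighbors
    freq = Counter(l for _, labels in neighbors for l in labels)
    # 3. keep the labels that occur often enough
    return {l for l, c in freq.items() if c / k > threshold}

train = [([0, 0], {1}), ([0, 1], {1, 2}), ([5, 5], {3}), ([5, 6], {3})]
print(ml_knn_predict(train, [0, 0.5], k=2))  # → {1} (label 2 is in only 1 of 2 neighbors)
```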
Feature selection in multilabel
classification
• An (almost) unstudied topic = opportunities
• Wrappers can be applied directly (define an objective function
to optimize based on a multilabel classifier)
[Flowchart: the original feature set enters a generation step that proposes a subset of features; the subset is evaluated; if the stopping criterion is not met (no), the process loops back to generation; once it is met (yes), the selected subset of features is validated.]
From M. Dash and H. Liu. http://www.comp.nus.edu.sg/~wongszec/group10.ppt
Feature selection in multilabel
classification
• An (almost) unstudied topic = opportunities
• Existing filter methods transform the
multilabel problem and apply standard filters
for feature selection
Statistics
• Label cardinality:
LC(D) = (1/m) Σ_{i=1}^{m} |L_i|
• Label density:
LD(D) = (1/m) Σ_{i=1}^{m} |L_i| / q
Evaluation of multilabel learning
• (New) conventions:
D = {(x_i, Y_i)}_{i=1,…,N},  x_i ∈ ℝ^d;  Y_i ⊆ L  (data set with its labels)
L = {λ_j : j = 1, …, q}  (set of labels)
Z_i ⊆ L  (predictions of a ML classifier for the instances in D)
Evaluation of multilabel learning
• Hamming loss:
HL = (1/N) Σ_{i=1}^{N} |Y_i Δ Z_i| / |L|,  where Δ is the symmetric difference
• Classification accuracy:
ACC = (1/N) Σ_{i=1}^{N} I(Z_i = Y_i),  I(true) = 1; I(false) = 0
Evaluation of multilabel learning
• Precision:
P = (1/N) Σ_{i=1}^{N} |Y_i ∩ Z_i| / |Z_i|
• Recall:
R = (1/N) Σ_{i=1}^{N} |Y_i ∩ Z_i| / |Y_i|
Evaluation of multilabel learning
• F1-measure:
F1 = (1/N) Σ_{i=1}^{N} 2 |Y_i ∩ Z_i| / (|Z_i| + |Y_i|)
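The example-based metrics can be sketched together, representing each Y_i and Z_i as a Python set; the definitions follow the usual example-based formulations (precision averages |Y∩Z|/|Z|, recall averages |Y∩Z|/|Y|):

```python
# Example-based evaluation metrics for multilabel classifiers.
# Y[i]: true label sets; Z[i]: predicted label sets; q = |L|.

def hamming_loss(Y, Z, q):
    return sum(len(y ^ z) for y, z in zip(Y, Z)) / (len(Y) * q)  # ^ = symmetric difference

def subset_accuracy(Y, Z):
    return sum(y == z for y, z in zip(Y, Z)) / len(Y)

def precision(Y, Z):
    return sum(len(y & z) / len(z) for y, z in zip(Y, Z)) / len(Y)

def recall(Y, Z):
    return sum(len(y & z) / len(y) for y, z in zip(Y, Z)) / len(Y)

def f1(Y, Z):
    return sum(2 * len(y & z) / (len(y) + len(z)) for y, z in zip(Y, Z)) / len(Y)

Y = [{1, 2}, {3}]
Z = [{1}, {3}]
print(subset_accuracy(Y, Z))  # → 0.5 (only the second prediction is exact)
print(precision(Y, Z))        # → 1.0 (every predicted label is correct)
print(recall(Y, Z))           # → 0.75 (label 2 was missed)
```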
Suggested readings
• G. Tsoumakas, I. Katakis, I. Vlahavas. Mining multi-label data. Data Mining and Knowledge Discovery Handbook, Part 6, O. Maimon, L. Rokach (Eds.), Springer, 2nd edition, pp. 667-685, 2010.
• G. Tsoumakas, I. Katakis. Multi-label classification: an overview. International Journal of Data Warehousing, 3(3), 1-13, 2007.
• M. Zhang, Z. Zhou. ML-kNN: A lazy learning approach to multi-label learning. Pattern Recognition, 40:2038-2048, 2007.
• M. Boutell, J. Luo, X. Shen, C. Brown. Learning multi-label scene classification. Pattern Recognition, 37:1757-1771, 2004.