Machine Learning for
Information Retrieval:
Neural Networks, Symbolic
Learning, and Genetic
Algorithms
Presented by: Prerak Sanghvi
Paper by: Hsinchun Chen
Artificial Intelligence Lab, University of Arizona
Journal of the American Society for Information Science, 1994
Source: Search on Google with key phrase “Text Classification Algorithms”
• To explain these algorithms, three examples
are discussed:
– The Backpropagation neural network
– The symbolic ID3 / ID5R algorithms
– Evolution-based genetic algorithms
With proper user-system interactions, these
methods can greatly complement the
prevailing full-text, keyword-based,
probabilistic, and knowledge-based
techniques.
Symbolic Learning and ID3
• Symbolic machine learning techniques can
be classified by their underlying learning
strategies: rote learning, learning by being
told, learning by analogy, learning from
examples, and learning from discovery
• The most promising of these is learning
from examples, since it involves concept
learning and relies on past experience.
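Learning from examples is what ID3 does: it builds a decision tree by repeatedly splitting the training set on the attribute with the highest information gain. A minimal sketch of that idea follows; the document-relevance attributes (`kw`, `long`) are a toy dataset invented here for illustration, not data from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def best_attribute(examples, labels, attributes):
    """Pick the attribute whose split yields the highest information gain."""
    base = entropy(labels)
    def gain(attr):
        remainder = 0.0
        for value in set(ex[attr] for ex in examples):
            subset = [lab for ex, lab in zip(examples, labels) if ex[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return base - remainder
    return max(attributes, key=gain)

def id3(examples, labels, attributes):
    """Recursively build a decision tree represented as nested dicts."""
    if len(set(labels)) == 1:
        return labels[0]                              # pure node: return the class
    if not attributes:
        return Counter(labels).most_common(1)[0][0]   # no splits left: majority vote
    attr = best_attribute(examples, labels, attributes)
    tree = {attr: {}}
    for value in set(ex[attr] for ex in examples):
        idx = [i for i, ex in enumerate(examples) if ex[attr] == value]
        tree[attr][value] = id3([examples[i] for i in idx],
                                [labels[i] for i in idx],
                                [a for a in attributes if a != attr])
    return tree

# Toy usage: 'kw' (query keyword present?) perfectly predicts relevance,
# so ID3 splits on it first and the tree needs only one level.
examples = [{'kw': 'yes', 'long': 'no'}, {'kw': 'yes', 'long': 'yes'},
            {'kw': 'no', 'long': 'yes'}, {'kw': 'no', 'long': 'no'}]
labels = ['relevant', 'relevant', 'irrelevant', 'irrelevant']
tree = id3(examples, labels, ['kw', 'long'])
```

ID5R (also mentioned above) differs mainly in being incremental: it updates an existing tree as new examples arrive rather than rebuilding from scratch.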
Neural Networks and
Backpropagation
• Backpropagation networks have been
extremely popular for their unique learning
capability
• Good convergence is obtained if sufficient
examples are provided
• Neural networks are important since they
appear to work in a large variety of domains
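The learning capability referred to above can be sketched concretely. The following is a minimal backpropagation example, assuming a tiny 2-2-1 sigmoid network and the XOR problem as a stand-in training set (chosen here for illustration, not taken from the paper): errors at the output are propagated backwards to adjust both weight layers by gradient descent.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """A 2-2-1 sigmoid network trained by backpropagation
    (gradient descent on squared error)."""

    def __init__(self):
        # Small random weights; each unit gets an extra bias weight.
        self.w_hid = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
        self.w_out = [random.uniform(-0.5, 0.5) for _ in range(3)]

    def forward(self, x):
        inp = list(x) + [1.0]                         # append bias input
        self.h = [sigmoid(sum(w * v for w, v in zip(ws, inp)))
                  for ws in self.w_hid]
        self.o = sigmoid(sum(w * v for w, v in zip(self.w_out, self.h + [1.0])))
        return self.o

    def train_step(self, x, target, lr=0.5):
        o = self.forward(x)
        # Output delta: error times the derivative of the sigmoid.
        d_out = (target - o) * o * (1.0 - o)
        # Hidden deltas: error propagated back through the output weights.
        d_hid = [self.w_out[j] * d_out * self.h[j] * (1.0 - self.h[j])
                 for j in range(2)]
        inp = list(x) + [1.0]
        h_in = self.h + [1.0]
        for j in range(3):
            self.w_out[j] += lr * d_out * h_in[j]
        for j in range(2):
            for k in range(3):
                self.w_hid[j][k] += lr * d_hid[j] * inp[k]
        return (target - o) ** 2
```

Training consists of presenting the example patterns repeatedly and calling `train_step` until the total squared error stops falling; this is the "sufficient examples" requirement noted above.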
Simulated Evolution and
Genetic Algorithms
• In such algorithms a population of individuals (potential
solutions) undergoes a sequence of unary (mutation) and
higher order (crossover) transformations
• These individuals strive for survival: a selection
(reproduction) scheme, biased towards selecting fitter
individuals, produces the individuals for the next
generation
• After some number of generations the program converges; the best remaining individual is taken to represent the optimal (or near-optimal) solution
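The mutation / crossover / selection loop described above can be sketched as follows. The fitness function here is "OneMax" (count the 1-bits in a string), a standard stand-in chosen for illustration; the paper's own fitness functions are IR-specific.

```python
import random

random.seed(1)

def fitness(bits):
    """OneMax: number of 1-bits; a stand-in for any scalar fitness measure."""
    return sum(bits)

def tournament(pop, k=3):
    """Selection biased toward fitter individuals: best of k random picks."""
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    """Higher-order transformation: single-point crossover of two parents."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    """Unary transformation: flip each bit with a small probability."""
    return [1 - b if random.random() < rate else b for b in bits]

def evolve(pop_size=30, length=20, generations=50):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = max(pop, key=fitness)          # elitism: carry the best forward
        children = [mutate(crossover(tournament(pop), tournament(pop)))
                    for _ in range(pop_size - 1)]
        pop = [elite] + children
    return max(pop, key=fitness)

best = evolve()
```

Because the elite individual is copied into each new generation, the best fitness never decreases, which is one common way of guaranteeing the convergence behaviour the slide describes.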
Comparisons
• ID3 was faster than a Backpropagation net, but the
Backpropagation net was more adaptive to noisy data sets
• Using batch learning, Backpropagation performed as well as
ID3 but was more noise-resistant
• The results indicated that genetic search is, at best, as
efficient as faster variants of the Backpropagation algorithm in
very small networks, but far less efficient in larger
networks. However, it was also shown that training the
Backpropagation network with domain-specific genetic operators,
instead of the conventional Backpropagation Delta
learning rule, improved performance