
Ciencia Ergo Sum
ISSN: 1405-0269
[email protected]
Universidad Autónoma del Estado de México
México
Toribio Luna, Primitivo; Alejo Eleuterio, Roberto; Valdovinos Rosas, Rosa María; Rodríguez Méndez, Benjamín Gonzalo
Training Optimization for Artificial Neural Networks
Ciencia Ergo Sum, vol. 17, no. 3, November-February 2010, pp. 313-317
Universidad Autónoma del Estado de México
Toluca, México
Available in: http://www.redalyc.org/articulo.oa?id=10415212010
Abstract
Nowadays, with their capacity to model complex problems, Artificial Neural Networks (NN) are very popular in the areas of
Pattern Recognition, Data Mining and Machine Learning. Nevertheless, their main disadvantage is the high computational cost
of the learning phase when large databases are used. This work analyzes the advantages of pre-processing the data sets in
order to reduce the computational cost and improve the NN convergence. Specifically, the Relative Neighbour Graph (RNG),
Gabriel Graph (GG) and k-NCN methods were evaluated. The experimental results show the feasibility and the multiple
advantages of these methodologies for solving the described problems.
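The pre-processing methods named above edit the training set before the network is trained, so that fewer (and cleaner) samples reach the learning phase. As one illustration of the idea, here is a minimal sketch of Gabriel-graph-based editing: two samples are Gabriel neighbours when no third sample lies inside the sphere whose diameter is the segment joining them, and a sample is discarded when most of its Gabriel neighbours belong to a different class. This is an assumed, illustrative editing rule written for clarity (brute-force O(n³) neighbour search), not the paper's exact implementation; all function names are hypothetical.

```python
import numpy as np

def gabriel_neighbors(X):
    """Boolean adjacency matrix: p and q are Gabriel neighbours iff no
    third point r satisfies d2(p,r) + d2(q,r) < d2(p,q), i.e. no point
    lies strictly inside the sphere with segment pq as diameter."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    adj = np.zeros((n, n), dtype=bool)
    for p in range(n):
        for q in range(p + 1, n):
            inside = any(
                d2[p, r] + d2[q, r] < d2[p, q]
                for r in range(n) if r != p and r != q
            )
            adj[p, q] = adj[q, p] = not inside
    return adj

def gabriel_edit(X, y):
    """Return indices of samples kept: those whose Gabriel neighbours
    mostly (>= 50%) share the sample's own class label."""
    adj = gabriel_neighbors(X)
    keep = []
    for i in range(len(X)):
        nbrs = np.where(adj[i])[0]
        if len(nbrs) == 0 or np.mean(y[nbrs] == y[i]) >= 0.5:
            keep.append(i)
    return np.array(keep)
```

For example, a mislabeled sample placed inside the opposite class's cluster loses most of its Gabriel neighbours to that cluster and is edited out, which is precisely how such graph-based filters shrink the training set handed to the NN.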
Keywords
Artificial neural networks, multilayer perceptron, radial basis function neural network, support vector machine, preprocessing data.