Abstract
The traditional back-propagation training algorithm (BP) is an iterative gradient descent algorithm designed to minimize the mean square error between the actual output of a multilayer feedforward perceptron and the desired output. It is highly accurate for most classification problems, but it is time-consuming and computationally intensive. An adaptive approach is proposed to reduce the number of iterations needed to train the neural network. The new method is applied to a multilayer network with one hidden layer to classify the letters A to J. A reduction of 25% in the number of iterations is achieved at a 98% classification rate. We also propose a confidence region (CR) based on the average and the standard deviation of the output-node values. A reduction of 75% in the number of iterations is achieved when the CR is used. Experimental results indicate that the adaptive approach combined with the confidence region is faster than the traditional BP training algorithm.
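The abstract does not spell out how the confidence region is constructed or applied; the following is a minimal sketch of one plausible reading, assuming the CR for each class is the interval mean ± k standard deviations of the correct output-node activations and that it serves as an early-stopping test during BP training. The function name `confidence_region_stop`, the width parameter `k`, and the use of the 98% classification rate as a threshold are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def confidence_region_stop(outputs, targets, k=2.0, target_rate=0.98):
    """Hypothetical confidence-region (CR) stopping test for BP training.

    outputs: (n_patterns, n_classes) array of output-node activations
    targets: (n_patterns,) array of true class indices
    k:       assumed half-width of the region, in standard deviations
    target_rate: classification rate to reach (98% in the paper)

    Training is judged finished when the classification rate meets the
    target and every correctly classified pattern's winning activation
    lies inside its class's region [mean - k*std, mean + k*std].
    """
    predicted = outputs.argmax(axis=1)
    correct = predicted == targets
    if correct.mean() < target_rate:
        return False

    inside = np.ones(outputs.shape[0], dtype=bool)
    for c in np.unique(targets):
        mask = correct & (targets == c)       # correct patterns of class c
        vals = outputs[mask, c]               # their winning activations
        if vals.size < 2:
            continue
        mu, sigma = vals.mean(), vals.std()
        if sigma == 0.0:
            continue
        inside[mask] = np.abs(vals - mu) <= k * sigma
    return bool(inside[correct].all())


# Illustrative usage with random activations for 10 classes (letters A-J)
rng = np.random.default_rng(0)
outputs = rng.random((5, 10))
targets = np.array([0, 1, 2, 3, 4])
print(confidence_region_stop(outputs, targets))
```

In practice this test would be evaluated once per training epoch, so that BP stops as soon as the criterion holds rather than running a fixed number of iterations.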
Original language | English
---|---
Pages (from-to) | 722-725
Number of pages | 4
Journal | Canadian Conference on Electrical and Computer Engineering
Volume | 2
Publication status | Published - 1994
Event | Proceedings of the 1994 Canadian Conference on Electrical and Computer Engineering. Part 2 (of 2) - Halifax, Canada. Duration: Sept. 25 1994 → Sept. 28 1994
ASJC Scopus Subject Areas
- Hardware and Architecture
- Electrical and Electronic Engineering