Document Type
Report
Date
12-1991
Keywords
Backpropagation, Modular Networks, Classification Problems, Multiclass Problems, Feedforward Networks
Language
English
Disciplines
Computer Sciences
Description/Abstract
One connectionist approach to the classification problem, which has gained popularity in recent years, is the use of backpropagation-trained feed-forward neural networks. In practice, however, we find that the rate of convergence of the net output error is especially slow when training networks for multi-class problems. In this paper, we show that while backpropagation reduces the Euclidean distance between the actual and desired output vectors, the difference between some components of these vectors actually increases in the first iteration. Furthermore, the magnitudes of subsequent weight changes in each iteration are very small, so that many iterations are required to compensate for the error increase in some components during the initial iterations. We describe a modular network architecture that improves the rate of learning for such classification problems. Our basic approach is to reduce a K-class problem to a set of K two-class problems, with a separately trained network for each of the K problems. We also present results from several experiments comparing the new approach with standard backpropagation, and find that speedups of about one order of magnitude can be obtained.
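The following is a minimal sketch, in Python with NumPy, of the one-network-per-class decomposition the abstract describes: each module is a small feed-forward net with a single sigmoid output, trained to separate one class from the rest, and classification takes the most confident module. The network sizes, learning rate, squared-error gradient-descent training, and toy data are illustrative assumptions, not the paper's experimental setup.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TwoClassNet:
    """One module: a single-hidden-layer net with one sigmoid output."""
    def __init__(self, n_in, n_hidden=8, lr=0.5, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2)

    def train_step(self, X, t):
        y = self.forward(X)
        # Backpropagate squared error through the sigmoid units.
        delta_out = (y - t) * y * (1 - y)
        delta_hid = np.outer(delta_out, self.W2) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ delta_out / len(X)
        self.b2 -= self.lr * delta_out.mean()
        self.W1 -= self.lr * X.T @ delta_hid / len(X)
        self.b1 -= self.lr * delta_hid.mean(axis=0)

def train_modular(X, labels, K, epochs=500):
    """Reduce a K-class problem to K two-class problems (class k vs. rest)."""
    modules = [TwoClassNet(X.shape[1], rng=np.random.default_rng(k))
               for k in range(K)]
    for k, net in enumerate(modules):
        t = (labels == k).astype(float)   # target 1 for class k, 0 otherwise
        for _ in range(epochs):
            net.train_step(X, t)
    return modules

def classify(modules, X):
    # Each separately trained module scores its own class; take the argmax.
    scores = np.stack([net.forward(X) for net in modules], axis=1)
    return scores.argmax(axis=1)

if __name__ == "__main__":
    # Toy 3-class data: three Gaussian clusters in the plane.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(90, 2)) + np.repeat(np.eye(3, 2) * 4, 30, axis=0)
    y = np.repeat(np.arange(3), 30)
    nets = train_modular(X, y, K=3)
    print("training accuracy:", (classify(nets, X) == y).mean())

Because each module sees a two-class target, no single weight update has to trade off errors across K output components, which is the source of the slow convergence identified in the abstract.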
Recommended Citation
Anand, Rangachari; Mehrotra, Kishan; Mohan, Chilukuri K.; and Ranka, Sanjay, "An Efficient Neural Algorithm for the Multiclass Problem" (1991). Electrical Engineering and Computer Science - Technical Reports. 136.
https://surface.syr.edu/eecs_techreports/136
Source
local
Additional Information
School of Computer and Information Science, Syracuse University, SU-CIS-91-40