Keywords

Backpropagation, Modular Networks, Classification Problems, Multiclass Problems, Feedforward Networks




Disciplines

Computer Sciences


One connectionist approach to the classification problem, which has gained popularity in recent years, is the use of backpropagation-trained feed-forward neural networks. In practice, however, we find that the rate of convergence of net output error is especially low when training networks for multi-class problems. In this paper, we show that while backpropagation will reduce the Euclidean distance between the actual and desired output vectors, the difference between some of the components of these vectors will actually increase in the first iteration. Furthermore, the magnitudes of subsequent weight changes in each iteration are very small, so that many iterations are required to compensate for the error increase in those components during the initial iterations. We describe a modular network architecture to improve the rate of learning for such classification problems. Our basic approach is to reduce a K-class problem to a set of K two-class problems, with a separately trained network for each of the K problems. We also present the results of several experiments comparing our new algorithm and approach with standard backpropagation, and find that speedups of about one order of magnitude can be obtained.
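The decomposition described above can be sketched as follows. This is an illustrative one-vs-rest sketch, not the paper's implementation: each "module" is a single logistic unit trained by gradient descent, standing in for a separately trained backprop network with one output; all function names and parameters are assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_binary_module(X, y, lr=0.5, epochs=500, seed=0):
    """Train one two-class module (a logistic unit as a stand-in for a
    small backprop-trained feed-forward network with a single output)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = sigmoid(X @ w + b)
        grad = out - y                     # gradient of the log loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def train_one_vs_rest(X, labels, num_classes):
    """Reduce a K-class problem to K two-class problems:
    module k learns to separate class k from all other classes."""
    return [train_binary_module(X, (labels == k).astype(float))
            for k in range(num_classes)]

def predict(modules, X):
    """Classify each input by the module with the strongest response."""
    scores = np.stack([sigmoid(X @ w + b) for w, b in modules], axis=1)
    return scores.argmax(axis=1)
```

Because each module sees a two-class target, the competing output components of a single K-output network no longer interfere with one another, which is the intuition behind the reported speedup.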

Additional Information

School of Computer and Information Science, Syracuse University, SU-CIS-91-40




