In this paper, we analyze the reason for the slow convergence of net output error when the backpropagation algorithm is used to train neural networks for two-class problems in which the numbers of exemplars for the two classes differ greatly. The slow convergence occurs because the negative gradient vector computed by backpropagation for an imbalanced training set does not initially point in a downhill direction for the class with the smaller number of exemplars. Consequently, in the initial iteration, the net error for the exemplars of this class increases significantly, and the subsequent rate of convergence of the net error is very low. We propose a modified technique for calculating a direction in weight space that is downhill for both classes. Using this algorithm, we have been able to accelerate learning on two-class classification problems by an order of magnitude.
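One way to realize such a direction can be sketched as follows; this is an illustrative construction, not necessarily the exact method of the report. If the per-class error gradients are first normalized to unit length and then summed, neither class can dominate by sheer exemplar count, and the negated sum has a negative inner product with each class gradient (it is downhill for both) unless the two gradients point in exactly opposite directions. The function and variable names below are hypothetical.

```python
import math

def unit(v):
    # Scale a gradient vector to unit length.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def combined_direction(grad_minority, grad_majority):
    # Normalize each class gradient so the majority class cannot
    # dominate, then negate the sum to obtain a descent direction.
    u = unit(grad_minority)
    w = unit(grad_majority)
    return [-(a + b) for a, b in zip(u, w)]

# Imbalanced case: the majority-class gradient is much larger in
# magnitude, so plain backpropagation would follow it almost exclusively.
g_min = [0.1, -0.2]
g_maj = [5.0, 4.0]

d = combined_direction(g_min, g_maj)
# d makes a negative inner product with each class gradient,
# so a small step along d reduces the error for both classes.
print(sum(a * b for a, b in zip(d, g_min)) < 0)  # True
print(sum(a * b for a, b in zip(d, g_maj)) < 0)  # True
```

The key property: for unit vectors u and w, (u + w) · u = 1 + u · w ≥ 0, with equality only when u and w are antiparallel, so the negated sum descends with respect to both class errors.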
Anand, Rangachari; Mehrotra, Kishan; Mohan, Chilukuri K.; and Ranka, Sanjay, "An Improved Algorithm for Neural Network Classification of Imbalanced Training Sets" (1991). Electrical Engineering and Computer Science Technical Reports. 104.