Keywords

artificial neural networks, predictive model, training sets, speed training, generalization

Abstract

Artificial neural networks provide an effective empirical predictive model for pattern classification. However, using complex neural networks to learn very large training sets is often problematic, imposing prohibitive time constraints on the training process. We present four practical methods for dramatically decreasing training time through dynamic stochastic sample presentation, a technique we call speed training. These methods are shown to be robust in retaining generalization accuracy over a diverse collection of real-world data sets. In particular, the SET technique achieves a training speedup of 4278% on a large OCR database with no detectable loss in generalization.
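
This record names the technique but does not spell out its rule. As one plausible reading of "dynamic stochastic sample presentation", the Python sketch below probabilistically skips weight updates for samples the network already classifies well, so later epochs concentrate backpropagation on the samples still in error. The skip rule, network architecture, and toy data here are illustrative assumptions, not the paper's SET method.

    # A minimal sketch of dynamic stochastic sample presentation.
    # NOTE: the skip rule below is an assumed illustration of the idea,
    # not the SET algorithm from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 2-class data with one-hot targets (a hypothetical stand-in
    # for the OCR database mentioned in the abstract).
    X = rng.normal(size=(200, 8))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    T = np.eye(2)[y]

    # Single-hidden-layer MLP trained with plain backpropagation.
    W1 = rng.normal(scale=0.1, size=(8, 16))
    W2 = rng.normal(scale=0.1, size=(16, 2))
    lr = 0.1

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(50):
        for i in rng.permutation(len(X)):
            x, t = X[i:i+1], T[i:i+1]
            h = sigmoid(x @ W1)          # forward pass
            o = sigmoid(h @ W2)
            err = np.max(np.abs(t - o))  # in [0, 1]
            # Stochastic presentation (assumed rule): the better a sample
            # is already learned (small err), the more likely we skip its
            # update, so late epochs touch mostly the hard samples.
            if rng.random() > err:
                continue
            # Standard backprop updates for the samples we keep.
            delta_o = (o - t) * o * (1 - o)
            delta_h = (delta_o @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ delta_o
            W1 -= lr * x.T @ delta_h

Under this reading, the speedup comes from avoiding the backward pass and weight updates for well-learned samples, which is consistent with the abstract's claim of faster training with little or no loss in generalization.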

Original Publication Citation

Andersen, T. L., Martinez, T. R., and Rimer, M. E., "Speed Training: Improving the Rate of Backpropagation Learning through Stochastic Sample Presentation", Proceedings of the International Joint Conference on Neural Networks IJCNN'01, pp. 2661-2666, 2001.

Document Type

Peer-Reviewed Article

Publication Date

2001-07-19

Permanent URL

http://hdl.lib.byu.edu/1877/2441

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
