Keywords

artificial neural network, oracle learning

Abstract

Often the best artificial neural network to solve a real-world problem is relatively complex. However, with the growing popularity of smaller computing devices (handheld computers, cellular telephones, automobile interfaces, etc.), there is a need for simpler models with comparable accuracy. The following research presents evidence that using a larger model as an oracle to train a smaller model on unlabeled data results in 1) a simpler model with acceptable accuracy and 2) improved results over standard training methods for a similarly sized smaller model. On automated spoken digit recognition, oracle learning produced an artificial neural network of half the size that 1) maintained accuracy comparable to the larger neural network, and 2) obtained up to a 25% decrease in error over standard training methods.
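The abstract describes the oracle-learning procedure only at a high level: a large trained network labels unlabeled data, and a smaller network is trained to reproduce the oracle's outputs. The sketch below is one plausible illustration of that idea, not the paper's implementation; the synthetic data, scikit-learn models, layer sizes, and use of soft targets via multi-output regression are all assumptions made for the example.

"""
Minimal sketch of oracle learning, assuming a scikit-learn setup:
a large "oracle" network labels unlabeled data with its output
activations, and a smaller network is trained to reproduce them.
All sizes, data, and hyperparameters are hypothetical.
"""
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier, MLPRegressor
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a spoken-digit task: 10 classes, 40 features.
X, y = make_classification(n_samples=6000, n_features=40, n_informative=30,
                           n_classes=10, random_state=0)
X_labeled, X_rest, y_labeled, y_rest = train_test_split(
    X, y, train_size=0.3, random_state=0)
X_unlabeled, X_test, _, y_test = train_test_split(
    X_rest, y_rest, test_size=0.3, random_state=0)

# 1. Train the larger oracle network on the labeled data.
oracle = MLPClassifier(hidden_layer_sizes=(200,), max_iter=500, random_state=0)
oracle.fit(X_labeled, y_labeled)

# 2. Have the oracle label the unlabeled data with its output activations
#    (soft targets) for the smaller network to reproduce.
soft_targets = oracle.predict_proba(X_unlabeled)

# 3. Train a smaller network (half the hidden units) to match the oracle.
student = MLPRegressor(hidden_layer_sizes=(100,), max_iter=1000, random_state=0)
student.fit(X_unlabeled, soft_targets)

# 4. Evaluate: the student's predicted class is the argmax of its outputs.
student_acc = accuracy_score(y_test, np.argmax(student.predict(X_test), axis=1))
oracle_acc = accuracy_score(y_test, oracle.predict(X_test))
print(f"oracle accuracy:  {oracle_acc:.3f}")
print(f"student accuracy: {student_acc:.3f}")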

Original Publication Citation

Menke, J., Peterson, A., Rimer, M., and Martinez, T. R., "Network Simplification through Oracle Learning", Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN'02, pp. 2482-2497, 2002.

Document Type

Peer-Reviewed Article

Publication Date

2002-05-17

Permanent URL

http://hdl.lib.byu.edu/1877/2430

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
