Keywords

Hopfield network, back-propagation, noisy associative training

Abstract

A new approach to training a generalized Hopfield network is developed and evaluated in this work. Both the weight symmetry constraint and the zero self-connection constraint are removed from the standard Hopfield network. Training is accomplished with Back-Propagation Through Time, using noisy versions of the memorized patterns; training in this way is referred to as Noisy Associative Training (NAT). NAT is evaluated on both random and correlated data, with a large number of training runs for each experiment; the data sets include uniformly distributed random patterns and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or Pseudo-Inverse training, and 13% fewer spurious memories on average. Typically, networks memorizing over 2N patterns (where N is the number of nodes in the network) are produced. Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or Pseudo-Inverse training: an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.
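As a rough illustration of the training procedure the abstract describes, the following is a minimal NumPy sketch of Noisy Associative Training: an unconstrained (asymmetric, nonzero-diagonal) Hopfield-style network is unrolled for a fixed number of update steps and trained by gradient descent to map bit-flipped versions of stored patterns back to the originals. The tanh units, squared-error loss, noise level, unroll depth, and learning rate here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16        # nodes in the network
P = 8         # patterns to memorize
STEPS = 5     # unrolled update steps (BPTT depth); an assumption
NOISE = 0.2   # per-bit flip probability for training inputs; an assumption
LR = 0.05     # learning rate; an assumption

patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Generalized Hopfield network: W is neither forced symmetric nor
# zero on the diagonal, matching the constraints removed in the paper.
W = rng.normal(0.0, 0.1, size=(N, N))
b = np.zeros(N)

def forward(x0, W, b):
    """Run STEPS synchronous tanh updates, keeping every state for BPTT."""
    states = [x0]
    for _ in range(STEPS):
        states.append(np.tanh(states[-1] @ W.T + b))
    return states

for _ in range(2000):
    # Noisy Associative Training: corrupt a stored pattern, then train
    # the unrolled network to settle back to the clean pattern.
    target = patterns[rng.integers(P)]
    noisy = np.where(rng.random(N) < NOISE, -target, target)

    states = forward(noisy, W, b)

    # Back-Propagation Through Time on a squared-error loss at the final step.
    dW, db = np.zeros_like(W), np.zeros_like(b)
    delta = states[-1] - target                  # dL/dy at the final state
    for t in range(STEPS, 0, -1):
        delta = delta * (1.0 - states[t] ** 2)   # back through tanh
        dW += np.outer(delta, states[t - 1])
        db += delta
        delta = delta @ W                        # to the previous state

    W -= LR * dW
    b -= LR * db

# Recall check: a pattern with a few flipped bits should be restored.
probe = patterns[0].copy()
probe[:3] *= -1
recalled = np.sign(forward(probe, W, b)[-1])
print("recall correct:", np.array_equal(recalled, patterns[0]))
```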

Original Publication Citation

Clift, F. and Martinez, T. R., "Improved Hopfield Networks by Training with Noisy Data", Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN'01, pp. 1138-1143, 2001.

Document Type

Peer-Reviewed Article

Publication Date

2001-07-19

Permanent URL

http://hdl.lib.byu.edu/1877/2426

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
