Keywords
neural networks, recurrently-connected, time delays, time constants
Abstract
Recurrently-connected spiking neural networks are difficult to use and understand because of the complex nonlinear dynamics of the system. Through empirical studies of spiking networks, we deduce several principles that are critical to success. Network parameters such as synaptic time delays, time constants, and connection probabilities have a significant impact on accuracy and can be adjusted to improve it. We show how to adjust these parameters to fit the type of problem.
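To make the abstract's three tunable parameters concrete, the following is a minimal illustrative sketch (not the authors' implementation) of a recurrently connected network of leaky integrate-and-fire neurons in which the synaptic time delay, the membrane time constant, and the recurrent connection probability are explicit arguments. All numeric values are assumed defaults chosen only for illustration.

import numpy as np

def simulate_recurrent_lif(n_neurons=100, steps=500, dt=1.0,
                           tau_m=20.0,      # membrane time constant in ms (assumed value)
                           syn_delay=5,     # synaptic delay in time steps (assumed value)
                           conn_prob=0.1,   # recurrent connection probability (assumed value)
                           weight=0.5, v_thresh=1.0, v_reset=0.0,
                           input_rate=0.02, seed=0):
    rng = np.random.default_rng(seed)
    # Random recurrent weight matrix: each directed connection exists with probability conn_prob.
    mask = rng.random((n_neurons, n_neurons)) < conn_prob
    np.fill_diagonal(mask, False)
    w = weight * mask
    v = np.zeros(n_neurons)                          # membrane potentials
    spike_buffer = np.zeros((syn_delay, n_neurons))  # circular queue of delayed spikes
    spikes = np.zeros((steps, n_neurons), dtype=bool)
    for t in range(steps):
        # Recurrent spikes arrive after syn_delay steps; external input is random Poisson-like drive.
        delayed = spike_buffer[t % syn_delay]
        i_ext = rng.random(n_neurons) < input_rate
        current = w @ delayed + i_ext
        # Leaky integration of the membrane potential, governed by tau_m.
        v += dt * (-v / tau_m) + current
        fired = v >= v_thresh
        v[fired] = v_reset
        spikes[t] = fired
        spike_buffer[t % syn_delay] = fired
    return spikes

if __name__ == "__main__":
    spikes = simulate_recurrent_lif()
    print("mean firing rate:", spikes.mean())

Varying tau_m, syn_delay, and conn_prob in a sketch like this changes how long input activity persists in the recurrent loop, which is the kind of parameter-accuracy relationship the paper studies empirically.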
Original Publication Citation
Eric Goodman and Dan Ventura, "Effectively Using Recurrently Connected Spiking Neural Networks", Proceedings of the International Joint Conference on Neural Networks, pp. 1542-1547, July 2005.
BYU ScholarsArchive Citation
Goodman, Eric and Ventura, Dan A., "Effectively Using Recurrently-Connected Spiking Neural Networks" (2005). Faculty Publications. 365.
https://scholarsarchive.byu.edu/facpub/365
Document Type
Peer-Reviewed Article
Publication Date
2005-07-01
Permanent URL
http://hdl.lib.byu.edu/1877/2522
Publisher
IEEE
Language
English
College
Physical and Mathematical Sciences
Department
Computer Science
Copyright Status
© 2005 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Copyright Use Information
http://lib.byu.edu/about/copyright/