Abstract

As neuromorphic computing gains traction, researchers are looking for learning schemes better suited to brain-like structures. Although many algorithms exist, such as surrogate gradient methods and Spike-Timing-Dependent Plasticity (STDP), they are either unsuitable for on-chip training or not biologically aligned. The goal of this thesis is to develop a learning algorithm that is both biologically aligned and suitable for on-chip training. Through the findings of our two papers, we introduce the concepts of synaptic clefts and neurotransmitters into a standard spiking neural network. We also design an error-propagation method that aligns with biology. With our new learning scheme, we achieve reduced memory consumption compared to traditional recurrent neural networks (RNNs) while maintaining comparable time consumption. Moreover, we outperform traditional RNNs on small temporal tasks. These findings demonstrate that our algorithm is capable of on-chip learning while remaining biologically aligned.

Degree

MS

College and Department

Computer Science; Computational, Mathematical, and Physical Sciences

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2025-02-14

Document Type

Thesis

Keywords

neuromorphic computing, chip-friendly design, low memory consumption, temporal learning, new weight update algorithm

Language

English
