Abstract

This thesis integrates biologically inspired mechanisms into machine learning to develop new tuning algorithms, gradient abstractions for depth-wise parallelism, and an original bias neuron design. We introduce neuromodulatory tuning, which uses neurotransmitter-inspired bias adjustments to enhance transfer learning in both spiking and non-spiking neural networks, substantially reducing the number of trained parameters while maintaining performance. We also propose an approach that decouples the backward pass of backpropagation using layer abstractions, inspired by feedback loops in biological systems, enabling depth-wise training parallelization. We further extend neuromodulatory tuning by designing spiking bias neurons that mimic dopamine neuron mechanisms, leading to the development of volumetric tuning. This method improves the fine-tuning of a small spiking neural network for EEG emotion classification, outperforming previous bias tuning methods. Overall, this thesis demonstrates the potential of leveraging discoveries from neuroscience to improve machine learning.
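The core idea behind bias-only tuning methods like those described above can be illustrated with a minimal sketch: pretrained weights are frozen, and only bias terms are updated during transfer. The function names and the toy one-unit regression task below are illustrative assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch of bias-only tuning: the pretrained weight w is
# frozen, and gradient descent adjusts only the bias b on the new task.

def forward(w, b, x):
    # single linear unit: y = w*x + b
    return w * x + b

def bias_tune(w, b, data, lr=0.1, epochs=100):
    """Fit the bias alone with squared-error gradient descent; w is frozen."""
    for _ in range(epochs):
        for x, target in data:
            y = forward(w, b, x)
            grad_b = 2 * (y - target)   # d/db of (y - target)**2
            b -= lr * grad_b            # only the bias is updated
    return b

# Pretrained weight, and a target task shifted upward by +1
w, b = 2.0, 0.0
data = [(x, 2.0 * x + 1.0) for x in (-1.0, 0.0, 1.0, 2.0)]
b = bias_tune(w, b, data)
print(round(b, 3))  # bias converges toward 1.0
```

Because only the single bias parameter is trained here, the trainable-parameter count is a small fraction of the full model, which is the source of the parameter savings the abstract refers to.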

Degree

MS

College and Department

Computational, Mathematical, and Physical Sciences; Computer Science

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2024-06-10

Document Type

Thesis

Handle

http://hdl.lib.byu.edu/1877/etd13241

Keywords

spiking neural networks, neuromorphic computing, dopamine-inspired learning structures, layer abstractions

Language

English
