Abstract

This thesis advances the theory of network specialization by characterizing the effect of specialization on the eigenvectors of a network. We prove explicit formulas for the eigenvectors of specialized graphs in terms of the eigenvectors of their parent graphs. The second portion of this thesis applies network specialization to learning problems. Our work focuses on training reservoir computers to mimic the Lorenz equations. We experiment with random graph, preferential attachment, and small-world topologies and demonstrate that the random removal of directed edges increases the predictive capability of a reservoir topology. We then create a new network model by growing networks via targeted application of the specialization model. This is accomplished iteratively by selecting the top-performing nodes within the reservoir computer and specializing them. Our generated topology outperforms all other topologies on average.
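For context, the following is a minimal sketch of the kind of setup the abstract refers to: an echo state network with a sparse random reservoir, trained by ridge regression to predict the Lorenz system and then run in closed loop. The language (Python with NumPy/SciPy), the reservoir size, density, spectral radius, and regularization are illustrative assumptions, not the implementation or values used in the thesis.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Lorenz system with the standard chaotic parameters.
    def lorenz(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = u
        return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

    # Training trajectory on the Lorenz attractor.
    dt = 0.02
    t_eval = np.arange(0.0, 60.0, dt)
    sol = solve_ivp(lorenz, (0.0, 60.0), [1.0, 1.0, 1.0], t_eval=t_eval)
    data = sol.y.T                                   # shape (T, 3)

    # Sparse random reservoir (an Erdos-Renyi-style directed topology).
    rng = np.random.default_rng(0)
    n, density, spectral_radius = 300, 0.02, 0.9     # illustrative values
    A = rng.uniform(-1.0, 1.0, (n, n)) * (rng.random((n, n)) < density)
    A *= spectral_radius / np.max(np.abs(np.linalg.eigvals(A)))
    W_in = rng.uniform(-0.5, 0.5, (n, 3))

    # Drive the reservoir with the training signal and record its states.
    states = np.zeros((len(data), n))
    for i in range(len(data) - 1):
        states[i + 1] = np.tanh(A @ states[i] + W_in @ data[i])

    # Linear readout fit by ridge regression: states[k] should predict data[k].
    washout, reg = 200, 1e-6
    X, Y = states[washout:], data[washout:]
    W_out = np.linalg.solve(X.T @ X + reg * np.eye(n), X.T @ Y).T

    # Closed-loop prediction: feed the readout back as the next input.
    r, u = states[-1].copy(), data[-1]
    predictions = []
    for _ in range(500):
        r = np.tanh(A @ r + W_in @ u)
        u = W_out @ r
        predictions.append(u)
    predictions = np.array(predictions)              # predicted Lorenz trajectory

In a setup like this, changing the reservoir topology, for example by removing directed edges or by specializing nodes as the thesis describes, amounts to modifying the adjacency matrix A before training.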

Degree

MS

College and Department

Physical and Mathematical Sciences; Mathematics

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2020-04-09

Document Type

Thesis

Handle

http://hdl.lib.byu.edu/1877/etd11098

Keywords

Complex networks, dynamical systems, reservoir computing, network growth, isospectral transformations, spectral graph theory, chaos

Language

English
