Abstract
Reservoir computing and "next-generation reservoir computing" (NG-RC), built on nonlinear vector autoregression, are algorithms for predicting a time series given past observations. Both algorithms have been successful in predicting some chaotic dynamical systems, but this success is not theoretically well understood. This paper considers the conditioning of both algorithms and the nature of the solutions found. With either algorithm, the least squares problem that is solved to give parameters as a function of the time series data is ill-conditioned, leading to parameter non-identifiability. Both algorithms fall into the class of "sloppy models." Standard regularization practice for either algorithm biases the solution toward parameters of minimal Euclidean norm. We illustrate, starting from the simpler case of linear autoregression, how the solutions thus found differ from the "true" parameters, and why this may result in improved forecasting performance.
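The abstract's claim about ill-conditioning and minimum-norm solutions can be illustrated with a minimal sketch (an assumed toy setup, not taken from the thesis): fitting a linear AR(2) model to data generated by the AR(1) rule x_{t+1} = 0.9 x_t. The two lagged columns are then exactly collinear, so the least squares problem is singular, the parameters are non-identifiable, and the pseudoinverse returns the minimum-Euclidean-norm solution rather than the "true" parameters (0.9, 0):

```python
import numpy as np

# Toy data from the AR(1) rule x_{t+1} = 0.9 * x_t (assumed illustration).
rho = 0.9
x = rho ** np.arange(50)  # x_t = 0.9**t

# Design matrix of lags [x_t, x_{t-1}] and one-step-ahead targets x_{t+1}.
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]

# The lag columns are proportional, so the problem is rank-deficient:
print(np.linalg.matrix_rank(X))  # 1

# The pseudoinverse selects the minimum-Euclidean-norm exact solution,
# spreading weight across both lags instead of the "true" (0.9, 0.0).
w = np.linalg.pinv(X) @ y
print(w)  # roughly [0.403, 0.448], not [0.9, 0.0]

# Yet one-step forecasts on this trajectory are still (numerically) exact,
# and the fitted parameter vector has smaller norm than the true one.
print(np.max(np.abs(X @ w - y)))
print(np.linalg.norm(w), np.linalg.norm([0.9, 0.0]))
```

The same phenomenon, at larger scale, is what the thesis analyzes for reservoir computing and NG-RC: many parameter vectors fit the training data equally well, and the regularized solution is the smallest of them, not the generating one.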
Degree
MS
College and Department
Computational, Mathematical, and Physical Sciences; Mathematics
Rights
https://lib.byu.edu/about/copyright/
BYU ScholarsArchive Citation
Jensen, Daniel, "The Conditioning of Reservoir Computing and NG-RC for Forecasting and the Effects of Temporal Overparameterization" (2025). Theses and Dissertations. 10794.
https://scholarsarchive.byu.edu/etd/10794
Date Submitted
2025-04-24
Document Type
Thesis
Handle
http://hdl.lib.byu.edu/1877/etd13663
Keywords
machine learning, recurrent neural networks, autoregression, difference equations
Language
English