Abstract
This dissertation presents oracle learning, a new paradigm for improving the training of machine learning algorithms. The main idea of oracle learning is that, instead of training directly on a set of data, a learning model is trained to approximate a given oracle's behavior on that data. This is beneficial in situations where an oracle is easier to obtain than it is to use at application time. It is shown that oracle learning can be applied to reduce the size of artificial neural networks more effectively, to take better advantage of domain experts by approximating them, and to adapt a problem more effectively to a machine learning algorithm.
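The core idea, relabeling data with an oracle's outputs and fitting a simpler student model to them, can be sketched as follows. This is a minimal hypothetical illustration, not code from the dissertation: the oracle is a stand-in function, and the student is a one-dimensional least-squares fit.

```python
# Sketch of oracle learning (hypothetical example): rather than train on
# the original labels, a simple student model is fit to reproduce an
# oracle's outputs on the data.

def oracle(x):
    # Stand-in for an oracle that is easy to obtain but costly to use at
    # application time (e.g., a large trained network or a domain
    # expert); here just a fixed function for illustration.
    return 2.0 * x + 1.0

def fit_to_oracle(xs):
    """Fit a 1-D linear student y = a*x + b to the oracle's labels
    via ordinary least squares."""
    ys = [oracle(x) for x in xs]  # relabel the data with the oracle
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# The student now approximates the oracle and can replace it at
# application time.
a, b = fit_to_oracle([0.0, 1.0, 2.0, 3.0])
```

In the dissertation the student is an artificial neural network and the oracle may be a larger network or a human expert; the linear fit above only illustrates the train-on-oracle-labels structure.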
Degree
PhD
College and Department
Physical and Mathematical Sciences; Computer Science
Rights
http://lib.byu.edu/about/copyright/
BYU ScholarsArchive Citation
Menke, Joshua Ephraim, "Improving Machine Learning Through Oracle Learning" (2007). Theses and Dissertations. 843.
https://scholarsarchive.byu.edu/etd/843
Date Submitted
2007-03-12
Document Type
Dissertation
Handle
http://hdl.lib.byu.edu/1877/etd1726
Keywords
neural networks, model approximation, oracle learning, Bradley-Terry, Bayesian inference
Language
English