This dissertation presents oracle learning, a new paradigm for improving the training of machine learning models. The main idea in oracle learning is that instead of training directly on a set of data, a learning model is trained to approximate a given oracle's behavior on that data. This is beneficial in situations where an oracle is easier to obtain than it is to use at application time. Oracle learning is shown to reduce the size of artificial neural networks more effectively, to take better advantage of domain experts by approximating them, and to adapt problems more effectively to a given machine learning algorithm.
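The core idea above can be illustrated with a minimal sketch (not the dissertation's actual method): here a hypothetical `oracle` function stands in for a large trained network or domain expert that is easy to query at training time but costly to use at application time, and a small linear "student" is fit to the oracle's outputs rather than to true labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled inputs available at training time.
X = rng.normal(size=(200, 3))

# Hypothetical oracle: stands in for a large trained network or a
# domain expert whose answers are cheap to collect during training.
def oracle(X):
    return np.tanh(X @ np.array([1.5, -2.0, 0.5]))

# Oracle learning: fit the oracle's outputs, not the original labels,
# so a much smaller model approximates the oracle's behavior.
targets = oracle(X)
Xb = np.hstack([X, np.ones((len(X), 1))])      # add a bias column
w, *_ = np.linalg.lstsq(Xb, targets, rcond=None)

# The compact linear student now mimics the oracle on this region.
approx = Xb @ w
mse = float(np.mean((approx - targets) ** 2))
print(mse)
```

The student here is deliberately simpler than the oracle, mirroring the dissertation's use of oracle learning to shrink neural networks while preserving the oracle's decisions.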
College and Department
Physical and Mathematical Sciences; Computer Science
BYU ScholarsArchive Citation
Menke, Joshua Ephraim, "Improving Machine Learning Through Oracle Learning" (2007). All Theses and Dissertations. 843.
Keywords
neural networks, model approximation, oracle learning, Bradley-Terry, Bayesian inference