Keywords
continuous optimization, inference, active sampling
Abstract
We recast the problem of unconstrained continuous evolutionary optimization as inference in a fixed graphical model. This approach allows us to address several pervasive issues in optimization, including the traditionally difficult problem of selecting the algorithm most appropriate for a given task. This is accomplished by placing a prior distribution over the expected class of functions, then employing inference together with intuitively defined utilities and costs to transform the evolutionary optimization problem into one of active sampling. This yields an approach to optimization that is optimal for each expressly stated function class. The resulting solution methodology can optimally navigate exploration-exploitation tradeoffs using well-motivated decision theory, while providing the process with a natural stopping criterion. Finally, the model naturally accommodates the expression of dynamic and noisy functions, setting it apart from most existing algorithms that address these issues as an afterthought. We demonstrate the characteristics and advantages of this algorithm formally and with examples.
Original Publication Citation
Christopher K. Monson, Kevin D. Seppi, and James L. Carroll. "A Utile Function Optimizer." In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2007), Singapore.
BYU ScholarsArchive Citation
Carroll, James; Monson, Christopher K.; and Seppi, Kevin, "A Utile Function Optimizer" (2007). All Faculty Publications. 946.
Physical and Mathematical Sciences
Copyright Use Information
© 2007 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.