Keywords

continuous optimization, inference, active sampling

Abstract

We recast the problem of unconstrained continuous evolutionary optimization as inference in a fixed graphical model. This approach allows us to address several pervasive issues in optimization, including the traditionally difficult problem of selecting an algorithm that is most appropriate for a given task. This is accomplished by placing a prior distribution over the expected class of functions, then employing inference and intuitively defined utilities and costs to transform the evolutionary optimization problem into one of active sampling. This allows us to pose an approach to optimization that is optimal for each expressly stated function class. The resulting solution methodology can optimally navigate exploration-exploitation tradeoffs using well-motivated decision theory, while providing the process with a natural stopping criterion. Finally, the model naturally accommodates the expression of dynamic and noisy functions, setting it apart from most existing algorithms that address these issues as an afterthought. We demonstrate the characteristics and advantages of this algorithm formally and with examples.
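To make the abstract's idea concrete, the following is a minimal illustrative sketch, not the authors' algorithm: it assumes an independent Gaussian prior over function values on a discretized domain (rather than the paper's graphical model), Gaussian observation noise, and an expected-improvement utility traded off against a fixed per-sample cost, which also supplies the natural stopping criterion the abstract mentions. All names and parameters here are hypothetical.

```python
# Hypothetical sketch of optimization as active sampling under a prior.
# Assumptions (not from the paper): independent Gaussian prior per grid
# point, Gaussian observation noise, expected-improvement utility vs. a
# fixed per-sample cost.
import math
import random

def active_sample_minimize(f, grid, prior_mean=0.0, prior_var=4.0,
                           noise_var=0.25, sample_cost=0.01, max_evals=50):
    """Minimize f over `grid` by repeatedly sampling the point whose
    expected improvement over the current best posterior mean is largest,
    stopping once no point's expected gain exceeds the sampling cost."""
    mean = {x: prior_mean for x in grid}
    var = {x: prior_var for x in grid}
    evals = 0
    while evals < max_evals:
        best_mean = min(mean.values())

        def expected_improvement(x):
            # Expected amount by which a Gaussian at x falls below best_mean.
            s = math.sqrt(var[x])
            if s == 0.0:
                return max(best_mean - mean[x], 0.0)
            z = (best_mean - mean[x]) / s
            pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
            cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
            return (best_mean - mean[x]) * cdf + s * pdf

        x, ei = max(((x, expected_improvement(x)) for x in grid),
                    key=lambda t: t[1])
        if ei < sample_cost:  # natural stopping criterion: not worth sampling
            break
        y = f(x) + random.gauss(0.0, math.sqrt(noise_var))  # noisy evaluation
        # Conjugate Gaussian posterior update at the sampled point.
        precision = 1.0 / var[x] + 1.0 / noise_var
        mean[x] = (mean[x] / var[x] + y / noise_var) / precision
        var[x] = 1.0 / precision
        evals += 1
    return min(mean, key=mean.get), evals
```

Because the utility is expected improvement and the cost is per sample, exploration (high-variance points) and exploitation (low-mean points) are traded off by the same decision-theoretic rule, and noisy observations are handled by the posterior update rather than as an afterthought.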

Original Publication Citation

Christopher K. Monson, Kevin D. Seppi, and James L. Carroll. "A Utile Function Optimizer." In Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2007), Singapore.

Document Type

Peer-Reviewed Article

Publication Date

2007-09-25

Permanent URL

http://hdl.lib.byu.edu/1877/2598

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
