Keywords

constructive induction, minimality, generalization, feature sets

Abstract

Constructive induction, the process of constructing new and useful features from existing ones, has been studied extensively in the literature. Because the number of possible higher-order features for any given learning problem is exponential in the number of input attributes (where the order of a feature is the number of attributes from which it is composed), the central problem in constructive induction is selecting which features to use from this exponentially large pool of candidates. For any chosen feature set, the desirable characteristics are minimality and generalization performance. This paper combines genetic algorithms with linear programming techniques to generate feature sets. The genetic algorithm searches for higher-order features while simultaneously minimizing the size of the feature set, in order to produce a feature set with good generalization accuracy. The selected features serve as inputs to a high order perceptron network, which is trained with an interior point linear programming method. Performance on a holdout set is used in conjunction with complexity penalization to ensure that the final feature set produced by the genetic algorithm does not overfit the training data.
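The abstract's overall loop can be illustrated with a minimal sketch. This is not the authors' implementation: it substitutes ordinary perceptron weight updates for the interior point linear programming step, uses a hypothetical toy dataset in which a second-order product feature makes the classes separable, and all names, parameters, and the penalty weight below are illustrative assumptions. It shows only the general idea of a genetic algorithm evolving bit masks over a pool of higher-order features, with fitness scored as holdout accuracy minus a complexity penalty on feature-set size.

```python
import random

# Toy data (assumed for illustration): the class is the sign of x1*x2,
# so the second-order product feature is what makes it separable.
random.seed(0)
points = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
data = [((x1, x2), 1 if x1 * x2 > 0 else 0) for x1, x2 in points]
train, holdout = data[:150], data[150:]

# Candidate pool: raw attributes plus all second-order products.
# A chromosome is a bit mask selecting a subset of this pool.
FEATURES = [lambda x: x[0], lambda x: x[1],
            lambda x: x[0] * x[0], lambda x: x[0] * x[1], lambda x: x[1] * x[1]]

def expand(x, mask):
    return [f(x) for f, bit in zip(FEATURES, mask) if bit]

def train_perceptron(rows, mask, epochs=20, lr=0.1):
    # Stand-in for the paper's interior point LP training of the
    # high order perceptron: plain perceptron updates.
    w, b = [0.0] * sum(mask), 0.0
    for _ in range(epochs):
        for x, y in rows:
            z = expand(x, mask)
            pred = 1 if sum(wi * zi for wi, zi in zip(w, z)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * zi for wi, zi in zip(w, z)]
            b += lr * err
    return w, b

def accuracy(rows, mask, w, b):
    hits = 0
    for x, y in rows:
        z = expand(x, mask)
        hits += y == (1 if sum(wi * zi for wi, zi in zip(w, z)) + b > 0 else 0)
    return hits / len(rows)

def fitness(mask, penalty=0.02):
    # Holdout accuracy with a complexity penalty on feature-set size,
    # so the GA favors small feature sets that still generalize.
    if not any(mask):
        return 0.0
    w, b = train_perceptron(train, mask)
    return accuracy(holdout, mask, w, b) - penalty * sum(mask)

# Simple GA: elitism, one-point crossover, occasional bit-flip mutation.
pop = [[random.randint(0, 1) for _ in FEATURES] for _ in range(20)]
for gen in range(15):
    scored = sorted(pop, key=fitness, reverse=True)
    nxt = scored[:4]                      # keep the elite unchanged
    while len(nxt) < len(pop):
        pa, pb = random.sample(scored[:10], 2)
        cut = random.randrange(1, len(FEATURES))
        child = pa[:cut] + pb[cut:]
        if random.random() < 0.2:
            i = random.randrange(len(FEATURES))
            child = child[:i] + [1 - child[i]] + child[i + 1:]
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
```

Under these assumptions the evolved mask tends to retain the product feature while the penalty term prunes redundant ones, mirroring the abstract's goal of a minimal feature set with good generalization.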

Original Publication Citation

Andersen, T. L. and Martinez, T. R., "Constructing High Order Perceptrons with Genetic Algorithms", Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN'98, pp. 1920-1925, 1998.

Document Type

Peer-Reviewed Article

Publication Date

1998-05-09

Permanent URL

http://hdl.lib.byu.edu/1877/2418

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
