Noisy fitness evaluation in genetic algorithms and the dynamics of learning

Magnus Rattray, Jon Shapiro

    Research output: Chapter in Book/Published conference output › Chapter

    Abstract

    A theoretical model is presented which describes selection in a genetic algorithm (GA) under a stochastic fitness measure and correctly accounts for finite population effects. Although this model describes a number of selection schemes, we only consider Boltzmann selection in detail here as results for this form of selection are particularly transparent when fitness is corrupted by additive Gaussian noise. Finite population effects are shown to be of fundamental importance in this case, as the noise has no effect in the infinite population limit. In the limit of weak selection we show how the effects of any Gaussian noise can be removed by increasing the population size appropriately. The theory is tested on two closely related problems: the one-max problem corrupted by Gaussian noise and generalization in a perceptron with binary weights. The averaged dynamics can be accurately modelled for both problems using a formalism which describes the dynamics of the GA using methods from statistical mechanics. The second problem is a simple example of a learning problem and by considering this problem we show how the accurate characterization of noise in the fitness evaluation may be relevant in machine learning. The training error (negative fitness) is the number of misclassified training examples in a batch and can be considered as a noisy version of the generalization error if an independent batch is used for each evaluation. The noise is due to the finite batch size and in the limit of large problem size and weak selection we show how the effect of this noise can be removed by increasing the population size. This allows the optimal batch size to be determined, which minimizes computation time as well as the total number of training examples required.
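    To make the selection scheme discussed in the abstract concrete, the following is a minimal sketch (in Python, not the authors' code) of Boltzmann selection acting on a one-max population whose fitness evaluations are corrupted by additive Gaussian noise. All function names and parameter values (beta, sigma, pop_size, mutation_rate) are illustrative assumptions rather than settings taken from the paper; the comment about larger populations averaging out the noise simply restates the weak-selection argument from the abstract.

    # Minimal sketch: Boltzmann selection on the noisy one-max problem.
    # Parameter values are illustrative assumptions, not from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_onemax(pop, sigma):
        """True fitness is the number of ones; each evaluation adds Gaussian noise."""
        return pop.sum(axis=1) + rng.normal(0.0, sigma, size=len(pop))

    def boltzmann_select(pop, noisy_fitness, beta):
        """Resample the population with probability proportional to exp(beta * fitness)."""
        w = np.exp(beta * (noisy_fitness - noisy_fitness.max()))  # stabilised softmax
        idx = rng.choice(len(pop), size=len(pop), p=w / w.sum())
        return pop[idx]

    def mutate(pop, rate):
        """Flip each bit independently with the given probability."""
        flips = rng.random(pop.shape) < rate
        return np.where(flips, 1 - pop, pop)

    # Illustrative run: in line with the paper's weak-selection argument,
    # increasing pop_size averages out the effect of the evaluation noise.
    n_bits, pop_size, beta, sigma, mutation_rate = 100, 200, 0.05, 5.0, 0.01
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for generation in range(200):
        pop = mutate(boltzmann_select(pop, noisy_onemax(pop, sigma), beta), mutation_rate)
    print("mean true fitness:", pop.sum(axis=1).mean())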
    Original language: English
    Title of host publication: Foundations of genetic algorithms: 4th workshop: revised papers
    Editors: Richard Belew, Michael Vose
    Place of publication: San Francisco
    Publisher: Morgan Kaufmann
    Pages: 117-139
    Number of pages: 23
    Volume: 4
    ISBN (Print): 1-55860-460-X
    Publication status: Published - 2 Apr 1998

    Bibliographical note

    Copyright of Elsevier. Available from Google Books.

    Keywords

    • Noisy fitness evaluation
    • Genetic algorithms
    • Dynamics of learning
