The dynamics of a genetic algorithm for a simple learning problem

Magnus Rattray, Jon Shapiro

    Research output: Contribution to journal › Article

    Abstract

    A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member every generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient training-batch size for each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although presenting independent patterns is in general a very inefficient use of training data, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.
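    The setting described in the abstract (a GA evolving binary perceptron weights, with every population member scored on a freshly drawn batch of random patterns each generation) can be sketched as a toy simulation. All specifics below, including the selection scheme, crossover operator, and parameter values, are illustrative assumptions rather than the algorithm analysed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N = 65          # number of binary weights (odd, so dot products are never zero)
    P = 40          # population size
    B = 20          # training-batch size per generation (fresh patterns each time)
    GENS = 60       # number of generations
    MUT = 1.0 / N   # per-weight flip probability (about one mutation per genome)

    teacher = rng.choice([-1, 1], size=N)       # target rule the population must learn
    pop = rng.choice([-1, 1], size=(P, N))      # population of binary-weight perceptrons

    def batch_fitness(pop, teacher, B):
        """Score each member on a fresh random batch: the fraction of
        patterns it classifies the same way as the teacher perceptron."""
        X = rng.choice([-1, 1], size=(B, N))
        y = np.sign(X @ teacher)
        preds = np.sign(X @ pop.T)              # shape (B, P)
        return (preds == y[:, None]).mean(axis=0)

    for g in range(GENS):
        # Fitness is a noisy estimate of generalization, because each
        # generation sees only B freshly drawn patterns.
        fit = batch_fitness(pop, teacher, B)

        # Weak (Boltzmann-style) selection: sample parents with probability
        # proportional to exp(beta * fitness).
        w = np.exp(5.0 * fit)
        pop = pop[rng.choice(P, size=P, p=w / w.sum())]

        # Uniform crossover with a random partner for each member.
        partners = rng.permutation(P)
        mask = rng.random(pop.shape) < 0.5
        pop = np.where(mask, pop, pop[partners])

        # Mutation: independently flip each weight with small probability.
        flips = rng.random(pop.shape) < MUT
        pop = np.where(flips, -pop, pop)

    # True generalization ability = overlap with the teacher's weights.
    overlap = (pop == teacher).mean(axis=1)
    print(f"best overlap with teacher after {GENS} generations: {overlap.max():.2f}")
    ```

    Because each batch is freshly drawn, the measured fitness fluctuates around the true generalization ability with a variance set by the batch size B; the paper's result is that this noise can be compensated by enlarging the population, which is what makes an optimal trade-off between B and P well defined.
    
    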
    Original language: English
    Pages (from-to): 7451-7473
    Number of pages: 23
    Journal: Journal of Physics A: Mathematical and General
    Volume: 29
    Issue number: 23
    DOI: 10.1088/0305-4470/29/23/013
    Publication status: Published - 7 Dec 1996

    Bibliographical note

    ©1996 IOP Publishing Ltd. After the Embargo Period, the full text of the Accepted Manuscript may be made available on the non-commercial repository for anyone with an internet connection to read and download. After the Embargo Period a CC BY-NC-ND 3.0 licence applies to the Accepted Manuscript, in which case it may then only be posted under that CC BY-NC-ND licence provided that all the terms of the licence are adhered to, and any copyright notice and any cover sheet applied by IOP is not deleted or modified.

    Keywords

    • genetic algorithms
    • statistical mechanics
    • binary weights
    • noise

    Cite this

    Rattray, Magnus ; Shapiro, Jon. / The dynamics of a genetic algorithm for a simple learning problem. In: Journal of Physics A: Mathematical and General. 1996 ; Vol. 29, No. 23. pp. 7451-7473.
    @article{53d1f2c47b7141158e045f90c70b0427,
    title = "The dynamics of a genetic algorithm for a simple learning problem",
    abstract = "A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member every generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient training-batch size for each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although presenting independent patterns is in general a very inefficient use of training data, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.",
    keywords = "genetic algorithms, statistical mechanics, binary weights, noise",
    author = "Magnus Rattray and Jon Shapiro",
    note = "{\circledC}1996 IOP Publishing Ltd. After the Embargo Period, the full text of the Accepted Manuscript may be made available on the non-commercial repository for anyone with an internet connection to read and download. After the Embargo Period a CC BY-NC-ND 3.0 licence applies to the Accepted Manuscript, in which case it may then only be posted under that CC BY-NC-ND licence provided that all the terms of the licence are adhered to, and any copyright notice and any cover sheet applied by IOP is not deleted or modified.",
    year = "1996",
    month = "12",
    day = "7",
    doi = "10.1088/0305-4470/29/23/013",
    language = "English",
    volume = "29",
    pages = "7451--7473",
    journal = "Journal of Physics A: Mathematical and General",
    issn = "0305-4470",
    publisher = "IOP Publishing Ltd.",
    number = "23",

    }

    The dynamics of a genetic algorithm for a simple learning problem. / Rattray, Magnus; Shapiro, Jon.

    In: Journal of Physics A: Mathematical and General, Vol. 29, No. 23, 07.12.1996, p. 7451-7473.

    TY - JOUR

    T1 - The dynamics of a genetic algorithm for a simple learning problem

    AU - Rattray, Magnus

    AU - Shapiro, Jon

    N1 - ©1996 IOP Publishing Ltd. After the Embargo Period, the full text of the Accepted Manuscript may be made available on the non-commercial repository for anyone with an internet connection to read and download. After the Embargo Period a CC BY-NC-ND 3.0 licence applies to the Accepted Manuscript, in which case it may then only be posted under that CC BY-NC-ND licence provided that all the terms of the licence are adhered to, and any copyright notice and any cover sheet applied by IOP is not deleted or modified.

    PY - 1996/12/7

    Y1 - 1996/12/7

    N2 - A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member every generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient training-batch size for each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although presenting independent patterns is in general a very inefficient use of training data, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.

    AB - A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member every generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient training-batch size for each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although presenting independent patterns is in general a very inefficient use of training data, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.

    KW - genetic algorithms

    KW - statistical mechanics

    KW - binary weights

    KW - noise

    UR - http://iopscience.iop.org/0305-4470/29/23/013/

    U2 - 10.1088/0305-4470/29/23/013

    DO - 10.1088/0305-4470/29/23/013

    M3 - Article

    VL - 29

    SP - 7451

    EP - 7473

    JO - Journal of Physics A: Mathematical and General

    JF - Journal of Physics A: Mathematical and General

    SN - 0305-4470

    IS - 23

    ER -