A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member every generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically, and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient size of training batch each generation. For independent patterns this choice also minimizes the total number of training patterns used. Although using independent patterns is in general a very inefficient use of training data, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.
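The training scheme summarized above can be illustrated with a minimal simulation. The sketch below is not the paper's statistical-mechanics formalism; it is a hypothetical, simplified GA (Boltzmann selection plus bit-flip mutation, no crossover) in which a binary-weight perceptron population is scored on a fresh batch of random patterns each generation, as the abstract describes. All parameter names (`N`, `P`, `B`, `BETA`) and the teacher-student setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64      # problem size: number of binary weights (assumed value)
P = 40      # population size (the quantity resized to cancel batch noise)
B = 20      # training-batch size per generation (the quantity optimized)
GENS = 50   # number of generations
BETA = 0.5  # selection strength; small BETA corresponds to weak selection

teacher = rng.choice([-1, 1], size=N)    # target rule to be learned
pop = rng.choice([-1, 1], size=(P, N))   # population of binary perceptrons

def batch_error(pop, patterns, labels):
    """Fraction of the batch each population member misclassifies."""
    preds = np.sign(pop @ patterns.T)    # shape (P, B)
    return np.mean(preds != labels, axis=1)

for g in range(GENS):
    # A *new* batch of independent random patterns every generation,
    # as in the case solved in the paper.
    patterns = rng.choice([-1, 1], size=(B, N))
    labels = np.sign(patterns @ teacher)
    err = batch_error(pop, patterns, labels)
    # Boltzmann selection on (noisy) training error.
    fitness = np.exp(-BETA * B * err)
    probs = fitness / fitness.sum()
    pop = pop[rng.choice(P, size=P, p=probs)].copy()
    # Bit-flip mutation at rate 1/N.
    flips = rng.random((P, N)) < 1.0 / N
    pop[flips] *= -1

# Mean teacher-student overlap as a proxy for generalization ability.
overlap = float(np.mean(pop @ teacher)) / N
```

Because each batch is finite, `err` is a noisy estimate of the true generalization error; the paper's result is that this noise can be compensated by enlarging `P`, after which `B` can be chosen for computational efficiency.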
Bibliographical note: ©1996 IOP Publishing Ltd. After the Embargo Period, the full text of the Accepted Manuscript may be made available on the non-commercial repository for anyone with an internet connection to read and download. After the Embargo Period a CC BY-NC-ND 3.0 licence applies to the Accepted Manuscript, in which case it may then only be posted under that CC BY-NC-ND licence provided that all the terms of the licence are adhered to, and any copyright notice and any cover sheet applied by IOP are not deleted or modified.
- genetic algorithms
- statistical mechanics
- binary weights