Thomas E. Davis
Evolutionary Computation (1993) 1 (3): 269–288.
Published: 01 September 1993
Abstract
This paper develops a theoretical framework for the simple genetic algorithm (combining the reproduction, mutation, and crossover operators) based on the asymptotic behavior of a nonstationary Markov chain model of the algorithm. The methodology borrows heavily from that of simulated annealing. We prove the existence of a unique asymptotic probability distribution (stationary distribution) for the Markov chain when the mutation probability is held at any constant nonzero value. We develop a Cramer's rule representation of the stationary distribution's components for all nonzero mutation probabilities and then extend the representation to show that the stationary distribution possesses a limit as the mutation probability goes to zero. Finally, we present a strong ergodicity bound on the mutation probability sequence that ensures that the nonstationary algorithm (which results from varying the mutation probability during algorithm execution) achieves the limit distribution asymptotically. Although the focus of this work is on a nonstationary algorithm in which the mutation probability is reduced asymptotically to zero via a schedule (in a fashion analogous to simulated annealing), the stationary distribution results (existence, Cramer's rule representation, and zero mutation probability limit) apply directly to conventional simple genetic algorithm implementations as well.
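To make the setting concrete, below is a minimal sketch of the kind of nonstationary simple genetic algorithm the abstract describes: a binary-string GA with fitness-proportional reproduction, one-point crossover, and bitwise mutation, where the mutation probability is lowered over generations via a schedule, in rough analogy to simulated annealing. The fitness function (OneMax), population size, and the specific schedule used here are illustrative assumptions, not the schedule or ergodicity bound derived in the paper.

```python
import random

STRING_LEN = 16
POP_SIZE = 20

def fitness(bits):
    # Placeholder objective: count of ones (OneMax).
    return sum(bits)

def select(population):
    # Fitness-proportional (roulette-wheel) reproduction; pick two parents.
    weights = [fitness(ind) + 1e-9 for ind in population]
    return random.choices(population, weights=weights, k=2)

def crossover(a, b):
    # One-point crossover.
    point = random.randint(1, STRING_LEN - 1)
    return a[:point] + b[point:]

def mutate(bits, p_m):
    # Bitwise mutation: flip each bit independently with probability p_m.
    return [1 - b if random.random() < p_m else b for b in bits]

def mutation_schedule(generation):
    # Illustrative decreasing schedule (an assumption, not the paper's bound);
    # the paper's result concerns how slowly such a sequence may approach zero
    # while the nonstationary Markov chain remains strongly ergodic.
    return 1.0 / (2.0 + generation)

def run(generations=200):
    population = [[random.randint(0, 1) for _ in range(STRING_LEN)]
                  for _ in range(POP_SIZE)]
    for g in range(generations):
        p_m = mutation_schedule(g)
        population = [mutate(crossover(*select(population)), p_m)
                      for _ in range(POP_SIZE)]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = run()
    print(best, fitness(best))
```

With the mutation probability held at a fixed nonzero value instead of a schedule, this is the stationary chain for which the abstract asserts a unique stationary distribution; the annealed version above corresponds to the nonstationary case analyzed via strong ergodicity.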