Amirhossein Rajabi
Evolutionary Computation (2023) 31 (1): 1–29.
Published: 01 March 2023
Abstract
Recently, a mechanism called stagnation detection was proposed that automatically adjusts the mutation rate of evolutionary algorithms when they encounter local optima. The so-called SD-(1+1) EA introduced by Rajabi and Witt (2022) adds stagnation detection to the classical (1+1) EA with standard bit mutation. This algorithm flips each bit independently with some mutation rate, and stagnation detection raises the rate when the algorithm is likely to have encountered a local optimum. In this article, we investigate stagnation detection in the context of the k-bit flip operator of randomized local search, which flips k bits chosen uniformly at random, and let stagnation detection adjust the parameter k. We obtain improved runtime results compared with the SD-(1+1) EA, amounting to a speedup of at least (1-o(1))·√(2πm), where m is the so-called gap size, that is, the distance to the next improvement. Moreover, we propose additional schemes that prevent infinite optimization times even if the algorithm misses a working choice of k due to unlucky events. Finally, we present an example where standard bit mutation still outperforms the k-bit flip operator with stagnation detection.
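The mechanism the abstract describes can be illustrated with a minimal sketch: a randomized local search that flips k bits chosen uniformly at random, resets k to 1 on every strict improvement, and raises k after a long run of unsuccessful steps. The phase-length threshold used here (roughly C(n, k)·ln n failures before increasing k) and the function names are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import math
import random

def k_bit_flip(x, k):
    """Flip k distinct bit positions of x, chosen uniformly at random."""
    y = x[:]
    for i in random.sample(range(len(x)), k):
        y[i] = 1 - y[i]
    return y

def sd_rls(fitness, n, max_evals=100_000):
    """Randomized local search with a simple stagnation-detection scheme.

    The flip strength k starts at 1 and grows after long phases without
    improvement; the threshold below is an illustrative choice, not the
    exact constant from Rajabi and Witt's analysis.
    """
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    k, fails = 1, 0
    for _ in range(max_evals):
        y = k_bit_flip(x, k)
        fy = fitness(y)
        if fy > fx:
            # strict improvement: accept and reset the flip strength
            x, fx = y, fy
            k, fails = 1, 0
        else:
            fails += 1
            # after about C(n, k) * ln n unsuccessful steps, an improving
            # k-bit flip is unlikely to exist, so increase k (capped at n)
            if fails > math.comb(n, k) * math.log(n):
                k, fails = min(k + 1, n), 0
    return x, fx

# usage: maximize OneMax (the number of one-bits) on 20 bits
best, value = sd_rls(sum, 20)
```

On a unimodal function like OneMax the scheme behaves like plain randomized local search, since every accepted step resets k to 1; the larger flip strengths only come into play once the search stagnates on a local optimum.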