Carola Doerr
Evolutionary Computation (2021) 29 (4): 521–541.
Published: 01 December 2021
Abstract
It seems very intuitive that for the maximization of the OneMax problem $\mathrm{OM}(x) := \sum_{i=1}^{n} x_i$ the best that an elitist unary unbiased search algorithm can do is to store a best-so-far solution and to modify it with the operator that yields the best possible expected progress in function value. This assumption has been implicitly used in several empirical works. In Doerr et al. (2020), it was formally proven that this approach is indeed almost optimal. In this work, we prove that drift maximization is not optimal. More precisely, we show that for most fitness levels between $n/2$ and $2n/3$ the optimal mutation strengths are larger than the drift-maximizing ones. This implies that the optimal RLS is more risk-affine than the variant maximizing the stepwise expected progress. We show similar results for the mutation rates of the classic (1+1) Evolutionary Algorithm (EA) and its resampling variant, the (1+1) EA$_{>0}$. As a result of independent interest, we show that the optimal mutation strengths, unlike the drift-maximizing ones, can be even.
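To make the setting concrete, here is a minimal Python sketch of OneMax and of an elitist (1+1)-type randomized local search whose mutation strength is supplied by a fitness-dependent schedule. The names (`flip_k_random_bits`, `strength_of`, `rls_with_strength_schedule`) are illustrative placeholders, not code from the paper; a drift-maximizing or optimal schedule would be plugged in via `strength_of`.

```python
import random

def onemax(x):
    """OneMax fitness: the number of 1-bits in the bit string x."""
    return sum(x)

def flip_k_random_bits(x, k):
    """Unary unbiased variation operator: flip exactly k distinct positions
    chosen uniformly at random (a 'mutation strength k' step)."""
    y = list(x)
    for i in random.sample(range(len(y)), k):
        y[i] = 1 - y[i]
    return y

def rls_with_strength_schedule(n, strength_of, max_iters=10**6):
    """Elitist (1+1)-type RLS sketch: keep the best-so-far solution and mutate
    it with the fitness-dependent mutation strength strength_of(current fitness).
    Returns the number of iterations until the optimum is found (or max_iters)."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = onemax(x)
    for t in range(max_iters):
        if fx == n:
            return t
        y = flip_k_random_bits(x, strength_of(fx))
        fy = onemax(y)
        if fy >= fx:  # elitist acceptance: never accept a strictly worse point
            x, fx = y, fy
    return max_iters

if __name__ == "__main__":
    # Classic RLS: always flip exactly one bit, independent of the fitness level.
    print(rls_with_strength_schedule(100, lambda f: 1))
```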
Evolutionary Computation (2017) 25 (4): 587–606.
Published: 01 December 2017
Abstract
Black-box complexity theory provides lower bounds for the runtime of black-box optimizers like evolutionary algorithms and other search heuristics and serves as an inspiration for the design of new genetic algorithms. Several black-box models covering different classes of algorithms exist, each highlighting a different aspect of the algorithms under consideration. In this work we add to the existing black-box notions a new elitist black-box model, in which algorithms are required to base all decisions solely on (the relative performance of) a fixed number of the best search points sampled so far. Our elitist model thus combines features of the ranking-based and the memory-restricted black-box models with an enforced usage of truncation selection. We provide several examples for which the elitist black-box complexity is exponentially larger than the respective complexities in all previous black-box models, thus showing that the elitist black-box complexity can be much closer to the runtime of typical evolutionary algorithms. We also introduce the concept of p-Monte Carlo black-box complexity, which measures the time it takes to optimize a problem with failure probability at most p. Even for small p, the p-Monte Carlo black-box complexity of a function class can be smaller by an exponential factor than its typically regarded Las Vegas complexity (which measures the expected time it takes to optimize).
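The ingredients of the elitist model can be illustrated with a short Python sketch: a (mu+1)-type loop that stores only the mu best points sampled so far, accesses fitness exclusively through pairwise comparisons, and applies truncation selection. This is an informal illustration of the model's restrictions under these assumptions, not the paper's formal framework; `elitist_blackbox_loop`, `sample_offspring`, and `compare` are hypothetical names.

```python
import functools
import random

def elitist_blackbox_loop(n, mu, sample_offspring, compare, budget=10**4):
    """Schematic (mu+1) elitist black-box algorithm:
    - memory restricted to the mu best search points sampled so far,
    - decisions based only on the relative order of stored points (compare
      returns a negative/zero/positive number, never an absolute fitness),
    - truncation selection: after each sample, the worst of mu+1 points is dropped."""
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(budget):
        y = sample_offspring(population)  # variation may only use the stored points
        population.append(y)
        population.sort(key=functools.cmp_to_key(compare), reverse=True)
        population.pop()  # truncation selection: discard the worst point
    return population[0]

if __name__ == "__main__":
    n = 50
    onemax = lambda x: sum(x)
    # The comparator is the only channel through which fitness information flows.
    compare = lambda a, b: onemax(a) - onemax(b)

    def mutate_best(pop):
        """One-bit mutation of the currently best stored point."""
        y = list(pop[0])
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return y

    best = elitist_blackbox_loop(n, mu=1, sample_offspring=mutate_best, compare=compare)
    print(onemax(best))
```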
Evolutionary Computation (2015) 23 (4): 641–670.
Published: 01 December 2015
Abstract
We analyze the unbiased black-box complexities of jump functions with small, medium, and large sizes of the fitness plateau surrounding the optimal solution. Among other results, we show that when the jump size is $(1/2 - \varepsilon)n$, that is, when only a small constant fraction of the fitness values is visible, then the unbiased black-box complexities for arities 3 and higher are of the same order as those for the simple OneMax function. Even for the extreme jump function, in which all but the two fitness values $n/2$ and $n$ are blanked out, polynomial-time mutation-based (i.e., unary unbiased) black-box optimization algorithms exist. This is quite surprising given that for the extreme jump function almost the whole search space (all but a $\Theta(n^{-1/2})$ fraction) is a plateau of constant fitness. To prove these results, we introduce new tools for the analysis of unbiased black-box complexities, for example, selecting the new parent individual not only by comparing the fitnesses of the competing search points but also by taking into account the (empirical) expected fitnesses of their offspring.
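For concreteness, the following Python sketch gives one common formalization of a jump function with plateau width `ell` and of the extreme jump function described in the abstract. The exact definitions used in the paper may differ in details, so this is an illustrative assumption rather than the paper's definition.

```python
def onemax(x):
    """Number of 1-bits in the bit string x."""
    return sum(x)

def jump(x, ell):
    """A jump function with plateau width ell (one common formalization):
    fitness values within distance ell of the optimum and, symmetrically,
    of the all-zeros string are blanked out to a constant plateau value 0."""
    n, om = len(x), onemax(x)
    if om == n:
        return n          # the unique optimum keeps its value
    if ell < om < n - ell:
        return om         # visible OneMax-like region
    return 0              # plateau of constant fitness

def extreme_jump(x):
    """Extreme jump: only the fitness values n/2 and n remain visible;
    every other search point lies on a plateau of constant fitness 0."""
    n, om = len(x), onemax(x)
    if om == n:
        return n
    if 2 * om == n:
        return n // 2
    return 0
```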