Martin S. Krejca
Journal Articles
Publisher: Journals Gateway
Evolutionary Computation (2021) 29 (4): 543–563.
Published: 01 December 2021
The Univariate Marginal Distribution Algorithm Copes Well with Deception and Epistasis

Abstract
In their recent work, Lehre and Nguyen (2019) show that the univariate marginal distribution algorithm (UMDA) needs time exponential in the parent population size to optimize the DeceptiveLeadingBlocks (DLB) problem. They conclude from this result that univariate EDAs have difficulties with deception and epistasis. In this work, we show that this negative finding is caused by the choice of the parameters of the UMDA. When the population sizes are chosen large enough to prevent genetic drift, then the UMDA optimizes the DLB problem with high probability with at most λ(n/2 + 2e ln n) fitness evaluations. Since an offspring population size λ of order n log n can prevent genetic drift, the UMDA can solve the DLB problem with O(n² log n) fitness evaluations. In contrast, for classic evolutionary algorithms no better runtime guarantee than O(n³) is known (which we prove to be tight for the (1+1) EA), so our result rather suggests that the UMDA can cope well with deception and epistasis. From a broader perspective, our result shows that the UMDA can cope better with local optima than many classic evolutionary algorithms; such a result was previously known only for the compact genetic algorithm. Together with the lower bound of Lehre and Nguyen, our result for the first time rigorously proves that running EDAs in the regime with genetic drift can lead to drastic performance losses.
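The UMDA sketched in this abstract maintains one frequency per bit position, samples an offspring population from these frequencies, and re-estimates each frequency from the selected individuals, clamped to the usual margins [1/n, 1 − 1/n]. A minimal illustrative sketch follows; the exact DLB formulation and the parameter choices here are assumptions for illustration, not taken from the article:

```python
import random

def dlb(x):
    # DeceptiveLeadingBlocks (one common formulation, assumed here): bits are
    # paired into blocks; each leading 11-block scores 2; the first non-11
    # block scores 1 if it is 00 (the deceptive part) and 0 otherwise.
    score = 0
    for i in range(0, len(x), 2):
        a, b = x[i], x[i + 1]
        if a == 1 and b == 1:
            score += 2
        else:
            score += 1 if (a == 0 and b == 0) else 0
            break
    return score

def umda(fitness, n, lam, mu, max_evals):
    """UMDA with truncation selection and frequency margins [1/n, 1 - 1/n].
    Counts one fitness evaluation per sampled individual."""
    p = [0.5] * n                       # frequency vector, one entry per bit
    evals, best = 0, None
    while evals < max_evals:
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]     # sample lambda offspring
        evals += lam
        pop.sort(key=fitness, reverse=True)
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        if fitness(best) == n:          # the all-ones string is optimal
            break
        sel = pop[:mu]                  # keep the mu best individuals
        for i in range(n):
            freq = sum(x[i] for x in sel) / mu
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)  # clamp to the margins
    return best, evals
```

The abstract's result says that choosing λ of order n log n (large enough to prevent genetic drift) makes this scheme succeed on DLB with O(n² log n) evaluations; with too small a population, frequencies drift to the margins and the run stalls.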
Evolutionary Computation (2016) 24 (2): 237–254.
Published: 01 June 2016
Robustness of Ant Colony Optimization to Noise

Abstract
Recently, ant colony optimization (ACO) algorithms have proven to be efficient in uncertain environments, such as noisy or dynamically changing fitness functions. Most of these analyses have focused on combinatorial problems such as path finding. We rigorously analyze an ACO algorithm optimizing linear pseudo-Boolean functions under additive posterior noise. We study noise distributions whose tails decay exponentially fast, including the classical case of additive Gaussian noise. Without noise, the classical (μ+1) EA outperforms any ACO algorithm, with smaller μ being better; however, in the case of large noise, the (μ+1) EA fails, even for high values of μ (which are known to help against small noise). In this article, we show that ACO is able to deal with arbitrarily large noise in a graceful manner; that is, as long as the evaporation factor is small enough, dependent on the variance of the noise and the dimension n of the search space, optimization will be successful. We also briefly consider the case of prior noise and prove that ACO can also efficiently optimize linear functions under this noise model.
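The mechanism the abstract relies on can be illustrated with a simple single-ant, MMAS-style sketch on bit strings: pheromones act as sampling probabilities, and each iteration they evaporate toward the current best-so-far solution at rate ρ. This is a hypothetical simplification for illustration, not the exact algorithm or noise handling analyzed in the article; the posterior-noise wrapper and all parameter names below are assumptions:

```python
import math
import random

def noisy(f, sigma):
    # Additive posterior Gaussian noise: the algorithm never sees f(x),
    # only a fresh sample of f(x) + N(0, sigma^2) at every evaluation.
    return lambda x: f(x) + random.gauss(0.0, sigma)

def mmas(f_noisy, n, rho, iterations):
    """Single-ant MMAS-style sketch with evaporation factor rho and
    pheromone borders [1/n, 1 - 1/n]."""
    tau = [0.5] * n                     # pheromones = P(bit i is sampled as 1)
    best = None
    for _ in range(iterations):
        x = [1 if random.random() < tau[i] else 0 for i in range(n)]
        # Posterior noise: every comparison uses fresh noisy evaluations,
        # so the stored best is re-evaluated rather than trusted.
        if best is None or f_noisy(x) >= f_noisy(best):
            best = x
        for i in range(n):              # evaporate toward the current best
            tau[i] = (1 - rho) * tau[i] + rho * best[i]
            tau[i] = min(max(tau[i], 1 / n), 1 - 1 / n)
    return best
```

A small ρ makes the pheromones a slowly moving average over many noisy comparisons, which is the graceful noise-averaging behavior the abstract attributes to ACO; a large ρ makes the sketch track each (possibly misleading) comparison immediately, much like an EA.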