Chao Qian: 1-3 of 3 results
Journal Article
Evolutionary Computation (2021) 29 (4): 463–490. Published: 01 December 2021.
Multiobjective Evolutionary Algorithms Are Still Good: Maximizing Monotone Approximately Submodular Minus Modular Functions
Abstract
As evolutionary algorithms (EAs) are general-purpose optimization algorithms, recent theoretical studies have tried to analyze their performance on general problem classes, with the goal of providing a general theoretical explanation of the behavior of EAs. In particular, a simple multiobjective EA, the GSEMO, has been shown to achieve good polynomial-time approximation guarantees for submodular optimization, where the objective function is only required to satisfy some properties and its explicit formulation is not needed. Submodular optimization has wide applications in diverse areas, and previous studies have considered the cases where the objective functions are monotone submodular, monotone non-submodular, or non-monotone submodular. To complement this line of research, this article studies the problem class of maximizing monotone approximately submodular minus modular functions (i.e., g - c) with a size constraint, where g is a non-negative monotone approximately submodular function and c is a non-negative modular function, so that the objective function (g - c) is non-monotone and non-submodular in general. In contrast to previous analyses, we prove that by optimizing the original objective function (g - c) and the size simultaneously, the GSEMO fails to achieve a good polynomial-time approximation guarantee. However, we also prove that by optimizing a distorted objective function and the size simultaneously, the GSEMO can still achieve the best-known polynomial-time approximation guarantee. Empirical studies on the applications of Bayesian experimental design and directed vertex cover show the excellent performance of the GSEMO.
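The algorithmic skeleton behind these results is compact enough to sketch. Below is a minimal, illustrative GSEMO in Python: solutions are bit strings, the two objectives are the function value and the subset size, and an archive keeps all mutually non-dominated solutions under bit-wise mutation. Note that the abstract's positive result runs the GSEMO on a distorted objective rather than on g - c itself; this sketch shows only the shared GSEMO skeleton, and the example g and c are invented for illustration, not taken from the paper.

import random

def gsemo(n, f, iters=10000):
    # GSEMO sketch: maximize f(x) while minimizing the subset size |x|.
    # f is any set function on length-n bit strings, e.g. a (g - c)-style objective.
    size = sum

    def weakly_dominates(a, b):
        # a is at least as good as b in both objectives.
        return f(a) >= f(b) and size(a) <= size(b)

    pop = [tuple([0] * n)]  # start from the empty subset
    for _ in range(iters):
        x = random.choice(pop)
        y = tuple(b ^ (random.random() < 1.0 / n) for b in x)  # flip each bit w.p. 1/n
        if not any(weakly_dominates(z, y) for z in pop):
            pop = [z for z in pop if not weakly_dominates(y, z)] + [y]
    return pop

# Illustrative g and c (assumptions, not from the paper): g is a square root
# of a modular function, hence monotone submodular; c is a linear (modular) cost.
w = [1.0, 2.0, 0.5, 1.5, 1.0]
cost = [0.3, 0.9, 0.1, 0.6, 0.4]
g = lambda x: sum(wi for wi, b in zip(w, x) if b) ** 0.5
c = lambda x: sum(ci for ci, b in zip(cost, x) if b)
archive = gsemo(5, lambda x: g(x) - c(x), iters=2000)
best_k2 = max((x for x in archive if sum(x) <= 2), key=lambda x: g(x) - c(x))

Given a size constraint k, one reports the best archived solution of size at most k, as in the last line above; running the same skeleton on a distorted objective only changes the function passed in as f.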
Journal Article
Evolutionary Computation (2018) 26 (2): 237–267. Published: 01 June 2018.
On the Effectiveness of Sampling for Evolutionary Optimization in Noisy Environments
Abstract
In real-world optimization tasks, the objective (i.e., fitness) function evaluation is often disturbed by noise due to a wide range of uncertainties. Evolutionary algorithms are often employed in noisy optimization, where reducing the negative effect of noise is a crucial issue. Sampling is a popular strategy for dealing with noise: to estimate the fitness of a solution, it evaluates the fitness multiple (k) times independently and then uses the sample average to approximate the true fitness. Sampling obviously brings the fitness estimation closer to the true value, but it also increases the estimation cost. Previous studies mainly focused on empirical analysis and the design of efficient sampling strategies, while the impact of sampling remained unclear from a theoretical viewpoint. In this article, we show, via rigorous running time analysis, that sampling can speed up noisy evolutionary optimization exponentially. For the (1+1)-EA solving the OneMax and LeadingOnes problems under prior (e.g., one-bit) or posterior (e.g., additive Gaussian) noise, we prove that, under a high noise level, the running time can be reduced from exponential to polynomial by sampling. The analysis also shows that a gap of one in the value of k used for sampling can lead to an exponential difference in the expected running time, cautioning for a careful selection of k. We further prove, using two illustrative examples, that sampling can be more effective for noise handling than parent populations and threshold selection, two strategies that have been shown to be robust to noise. Finally, we also show that sampling can be ineffective when noise does not have a negative impact.
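The sampling strategy analyzed here is simple to state in code. The following is a hedged sketch (not the paper's implementation) of a (1+1)-EA on OneMax under additive Gaussian noise, with fitness estimated as the average of k independent noisy evaluations; the noise model is one of the posterior models named in the abstract, and the parameter defaults are illustrative.

import random

def noisy_onemax(x, sigma):
    # True OneMax fitness plus additive Gaussian noise.
    return sum(x) + random.gauss(0.0, sigma)

def sampled_fitness(x, k, sigma):
    # Sampling: average k independent noisy evaluations; the estimator's
    # variance shrinks by a factor of k.
    return sum(noisy_onemax(x, sigma) for _ in range(k)) / k

def one_plus_one_ea(n, k, sigma=1.0, max_evals=200000):
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while sum(x) < n and evals < max_evals:
        y = [b ^ (random.random() < 1.0 / n) for b in x]  # standard bit mutation
        # Re-estimate parent and offspring with k samples each, keep the better.
        if sampled_fitness(y, k, sigma) >= sampled_fitness(x, k, sigma):
            x = y
        evals += 2 * k
    return x, evals

The abstract's caution applies directly to the parameter k here: under high noise, too small a k leaves the comparison in the loop essentially random, and the analysis shows that increasing k by just one can separate exponential from polynomial expected running time.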
Journal Article
Evolutionary Computation (2018) 26 (1): 1–41. Published: 01 March 2018.
Analyzing Evolutionary Optimization in Noisy Environments
Abstract
Many optimization tasks must be handled in noisy environments, where the exact evaluation of a solution cannot be obtained, only a noisy one. For the optimization of noisy tasks, evolutionary algorithms (EAs), a type of stochastic metaheuristic search algorithm, have been widely and successfully applied. Previous work mainly focused on the empirical study and design of EAs for optimization under noisy conditions, while the theoretical understanding is largely insufficient. In this study, we first investigate how noisy fitness can affect the running time of EAs. Two kinds of noise-helpful problems are identified, on which EAs run faster in the presence of noise, and thus the noise should not be handled. Second, on a representative noise-harmful problem, in which the noise has a strong negative effect, we examine two commonly employed mechanisms for dealing with noise in EAs: reevaluation and threshold selection. The analysis discloses that using these two strategies simultaneously is effective for one-bit noise but ineffective for asymmetric one-bit noise. Smooth threshold selection is then proposed, which is proved to be an effective strategy for further improving the noise tolerance on this problem. We then complement the theoretical analysis with experiments on synthetic problems as well as two combinatorial problems, the minimum spanning tree and the maximum matching. The experimental results agree with the theoretical findings and also show that the proposed smooth threshold selection handles noise better.
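For readers unfamiliar with the selection mechanisms discussed above, the following Python sketch contrasts hard threshold selection with a smoothed variant. The exact smoothing used in the paper is not reproduced here; the parameterization (threshold tau, boundary acceptance probability p) is illustrative only, and shows just the core idea that the hard accept/reject boundary is replaced by a probabilistic one.

import random

def accept_hard(f_parent, f_offspring, tau=1.0):
    # Threshold selection: accept only a clear improvement of at least tau,
    # so small noise-induced "gains" are filtered out.
    return f_offspring - f_parent >= tau

def accept_smooth(f_parent, f_offspring, tau=1.0, p=0.5):
    # Smoothed variant (sketch, assumed parameterization): gaps above tau are
    # always accepted, gaps exactly at tau only with probability p, smaller
    # gaps never.
    gap = f_offspring - f_parent
    if gap > tau:
        return True
    if gap == tau:
        return random.random() < p
    return False

With integer-valued noisy fitness, as under one-bit noise on pseudo-Boolean problems, the boundary case gap == tau occurs with positive probability, which is exactly where the smoothing takes effect.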