Anton Bouter
Journal Articles
Parameterless Gene-Pool Optimal Mixing Evolutionary Algorithms
Evolutionary Computation (2024) 32 (4): 371–397.
Published: 02 December 2024
Abstract
When it comes to solving optimization problems with evolutionary algorithms (EAs) in a reliable and scalable manner, detecting and exploiting linkage information, that is, dependencies between variables, can be key. In this paper, we present the latest version of, and propose substantial enhancements to, the gene-pool optimal mixing evolutionary algorithm (GOMEA): an EA explicitly designed to estimate and exploit linkage information. We begin by performing a large-scale search over several GOMEA design choices to understand what matters most and obtain a generally best-performing version of the algorithm. Next, we introduce a novel version of GOMEA, called CGOMEA, where linkage-based variation is further improved by filtering solution mating based on conditional dependencies. We compare our latest version of GOMEA, the newly introduced CGOMEA, and another contending linkage-aware EA, DSMGA-II, in an extensive experimental evaluation, involving a benchmark set of nine black-box problems that can be solved efficiently only if their inherent dependency structure is unveiled and exploited. Finally, in an attempt to make EAs more usable and resilient to parameter choices, we investigate the performance of different automatic population management schemes for GOMEA and CGOMEA, de facto making the EAs parameterless. Our results show that GOMEA and CGOMEA significantly outperform the original GOMEA and DSMGA-II on most problems, setting a new state of the art for the field.
Includes: Supplementary data
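The gene-pool optimal mixing operator described in the abstract above can be illustrated with a minimal sketch: each solution repeatedly receives the values of one linkage set (a group of dependent variables) from a randomly chosen donor in the population, and the change is kept only if the fitness does not decrease. The sketch below assumes a fixed block-wise linkage model and a toy deceptive trap function; GOMEA itself learns its linkage model (e.g., a linkage tree) from the population, so this is only an illustration of the mixing step, not the authors' implementation.

import random

BLOCK = 5  # size of each (assumed) linkage set


def fitness(x):
    # Deceptive trap function of order BLOCK: each block scores BLOCK when all
    # ones, otherwise BLOCK - 1 - (number of ones), which misleads blind search.
    total = 0
    for i in range(0, len(x), BLOCK):
        ones = sum(x[i:i + BLOCK])
        total += BLOCK if ones == BLOCK else BLOCK - 1 - ones
    return total


def gene_pool_optimal_mixing(solution, population, linkage_sets):
    # Copy each linkage set from a random donor and keep the change only if
    # the fitness does not decrease (neutral changes are accepted).
    current = list(solution)
    current_fit = fitness(current)
    for subset in linkage_sets:
        donor = random.choice(population)
        trial = list(current)
        for i in subset:
            trial[i] = donor[i]
        trial_fit = fitness(trial)
        if trial_fit >= current_fit:
            current, current_fit = trial, trial_fit
    return current


if __name__ == "__main__":
    random.seed(1)
    n, pop_size = 20, 200
    # Hypothetical linkage model: the true, non-overlapping blocks of the trap.
    linkage_sets = [list(range(i, i + BLOCK)) for i in range(0, n, BLOCK)]
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(30):
        population = [gene_pool_optimal_mixing(s, population, linkage_sets)
                      for s in population]
    print("best fitness:", max(fitness(s) for s in population), "optimum:", n)

With a linkage model that matches the true block structure, the trap blocks spread through the population and the optimum is typically reached; with variation that ignores the blocks, the deceptive structure pulls search away from it.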
Journal Articles
Achieving Highly Scalable Evolutionary Real-Valued Optimization by Exploiting Partial Evaluations
Evolutionary Computation (2021) 29 (1): 129–155.
Published: 01 March 2021
Abstract
It is known that to achieve efficient scalability of an Evolutionary Algorithm (EA), dependencies (also known as linkage) must be properly taken into account during variation. In a Gray-Box Optimization (GBO) setting, exploiting prior knowledge regarding these dependencies can greatly benefit optimization. We specifically consider the setting where partial evaluations are possible, meaning that the partial modification of a solution can be efficiently evaluated. Such problems are potentially very difficult, for example, non-separable, multimodal, and multiobjective. The Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) can effectively exploit partial evaluations, leading to a substantial improvement in performance and scalability. GOMEA was recently shown to be extendable to real-valued optimization through a combination with the real-valued estimation of distribution algorithm AMaLGaM. In this article, we definitively introduce the Real-Valued GOMEA (RV-GOMEA), and introduce a new variant, constructed by combining GOMEA with what is arguably the best-known real-valued EA, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Both variants of GOMEA are compared to L-BFGS and the Limited Memory CMA-ES (LM-CMA-ES). We show that both variants of RV-GOMEA achieve excellent performance and scalability in a GBO setting, which can be orders of magnitude better than that of EAs unable to efficiently exploit the GBO setting.
Includes: Supplementary data
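As a rough illustration of the partial evaluations exploited by RV-GOMEA, the sketch below assumes a gray-box objective that decomposes into a sum of subfunctions over small, known variable subsets (a hypothetical Rosenbrock-style chain). When variation changes only a few variables, the fitness can be updated by re-evaluating just the affected subfunctions rather than the whole solution. The decomposition, function names, and problem structure here are illustrative assumptions, not taken from the article.

import numpy as np


def sub(x, i):
    # Subfunction i depends only on variables i and i+1 (Rosenbrock chain term).
    return 100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2


def full_eval(x):
    # Full evaluation: sum over all n-1 subfunctions.
    return sum(sub(x, i) for i in range(len(x) - 1))


def partial_eval(x, f_old, new_values, indices):
    # Update the fitness after changing x[indices] by re-evaluating only the
    # subfunctions that depend on the changed variables.
    affected = set()
    for j in indices:
        if j - 1 >= 0:
            affected.add(j - 1)
        if j <= len(x) - 2:
            affected.add(j)
    f_new = f_old - sum(sub(x, i) for i in affected)
    x_new = x.copy()
    x_new[list(indices)] = new_values
    f_new += sum(sub(x_new, i) for i in affected)
    return x_new, f_new


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    f = full_eval(x)
    # Change two variables; the update touches at most 4 of the 999 subfunctions.
    x2, f2 = partial_eval(x, f, np.array([1.0, 1.0]), [10, 500])
    assert np.isclose(f2, full_eval(x2))
    print("partial and full evaluation agree:", f2)

Because the cost of such an update depends on the number of affected subfunctions rather than on the problem dimensionality, variation steps that modify small, dependent groups of variables become very cheap, which is the property the article's GBO experiments exploit.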