Table 2: Summary of the algorithms employed in this article, which were selected using ICARUS (Muñoz and Kirley, 2016) and the publicly available results from the BBOB sessions at the 2009 and 2010 GECCO conferences. Algorithm names are as used in the dataset descriptions available at http://coco.gforge.inria.fr/doku.php?id=algorithms.
BFGS: The MATLAB implementation of this quasi-Newton method, which is randomly restarted whenever a numerical error occurs. The Hessian matrix is iteratively approximated using forward finite differences, with a step size equal to the square root of the machine precision. All other parameters are at their defaults, except the function and step tolerances, which were set to and 0, respectively. (Ros, 2009a)
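The forward finite-difference Hessian approximation in the BFGS entry starts from finite-difference gradients with a step equal to the square root of the machine precision. A minimal sketch of that gradient approximation is below; the `sphere` test function and the standalone wiring are illustrative assumptions, not the setup of Ros (2009a):

```python
import numpy as np

def fd_gradient(f, x):
    """Forward finite-difference gradient with step size sqrt(machine precision)."""
    h = np.sqrt(np.finfo(float).eps)  # ~1.49e-8 for float64
    fx = f(x)
    g = np.empty_like(x)
    for i in range(len(x)):
        xh = x.copy()
        xh[i] += h           # perturb one coordinate at a time
        g[i] = (f(xh) - fx) / h
    return g

# Example: the gradient of the sphere function f(x) = sum(x**2) is 2*x.
sphere = lambda x: float(np.dot(x, x))
x = np.array([1.0, -2.0, 0.5])
g = fd_gradient(sphere, x)
print(g)  # close to [2.0, -4.0, 1.0]
```

The sqrt-of-machine-precision step balances truncation error (which shrinks with `h`) against floating-point cancellation in the difference (which grows as `h` shrinks).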
BIPOP-CMA-ES: A multistart CMA-ES variant with equal budgets for two interlaced restart strategies. After completing a first run with the default population size, the first strategy doubles the population size at each restart, while the second keeps a small population whose size is computed from the latest population size of the first strategy and an independent, uniformly distributed random number. All other parameters are at their default values. (Hansen, 2009)
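The interlacing of the two restart regimes can be sketched as a scheduler that, at each restart, picks whichever strategy has spent less of its budget. The budget proxy and the exact formula for the small population size below are assumed simplifications, not a faithful reimplementation of Hansen (2009):

```python
import random

def bipop_schedule(default_pop, n_restarts, rng=random.Random(0)):
    """Sketch of BIPOP restart bookkeeping. Strategy 1 doubles the
    population at each of its restarts; strategy 2 draws a small
    population using a uniform random number (assumed scaling)."""
    runs = [("first", default_pop)]          # first run: default population
    budget = {"large": 0, "small": 0}
    large_pop = default_pop
    for _ in range(n_restarts):
        if budget["large"] <= budget["small"]:
            large_pop *= 2                   # strategy 1: double the population
            pop, key = large_pop, "large"
        else:
            u = rng.random()                 # independent uniform random number
            # assumed form: interpolate between default and half the large pop
            pop = max(default_pop,
                      int(default_pop * (large_pop / (2 * default_pop)) ** (u * u)))
            key = "small"
        budget[key] += pop                   # crude proxy: evaluations ~ population
        runs.append((key, pop))
    return runs

sched = bipop_schedule(default_pop=10, n_restarts=6)
print(sched)
```

Balancing the two budgets is what makes the strategy robust: large populations help on multimodal functions, while the cheap small-population runs retain fast convergence on easier ones.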
LSstep: An axis-parallel line search method, effective only on separable functions. To find a new solution, it optimizes over each variable independently, keeping every other variable fixed. The STEP version of this method uses interval division: starting from an interval given by the lower and upper bounds of a variable, the interval is halved at each iteration. The next interval to sample is selected by its “difficulty,” i.e., an estimate of how hard it would be to improve the best-so-far solution by sampling from that interval. The difficulty measure is the coefficient of a quadratic function that must pass through both interval boundary points. (Pošík and Huyer, 2012)
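The axis-parallel, interval-halving idea can be sketched as follows. This is a deliberate simplification: the quadratic "difficulty" measure that STEP uses to rank intervals is omitted here (the better-sampled half is kept instead), so this is not Pošík and Huyer's algorithm, only the coordinate-wise halving skeleton:

```python
def halving_line_search(f, lo, hi, iters=40):
    """1-D interval halving: repeatedly halve the bracket, keeping the
    half whose inner sample is better. (STEP's quadratic difficulty
    ranking of candidate intervals is omitted in this sketch.)"""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        left, right = 0.5 * (lo + mid), 0.5 * (mid + hi)
        if f(left) < f(right):
            hi = mid                     # minimum lies in the left half
        else:
            lo = mid                     # minimum lies in the right half
    return 0.5 * (lo + hi)

def axis_parallel_search(f, x, bounds, sweeps=3):
    """Optimize each coordinate independently, holding the others fixed."""
    x = list(x)
    for _ in range(sweeps):
        for i, (lo, hi) in enumerate(bounds):
            def f_i(t, i=i):
                y = x.copy()
                y[i] = t
                return f(y)
            x[i] = halving_line_search(f_i, lo, hi)
    return x

# Separable test function with minimum at (1, -2)
res = axis_parallel_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                           [0.0, 0.0], [(-5, 5), (-5, 5)])
print(res)  # near [1, -2]
```

On a separable function each 1-D subproblem is independent, which is why one sweep per coordinate already lands near the optimum here; on non-separable functions this decomposition breaks down, matching the caveat in the description.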
Nelder–Doerr: A version of the Nelder–Mead algorithm that uses random restarts, resizing, and half-runs. In a resizing step, the current simplex is replaced by a “fat” simplex, which keeps the best vertex but relocates the remaining ones so that they all lie at the same average distance from the center of the simplex. Such steps are performed every 1000 iterations. In a half-run, the algorithm is stopped after a fixed number of iterations, and only the most promising half-runs are allowed to continue. (Doerr et al., 2009)
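The "fat simplex" resizing step described above can be sketched directly from its definition: keep the best vertex and rescale every other vertex along its direction from the centroid so that all relocated vertices sit at the average distance. This is a standalone illustration of that one step, not Doerr et al.'s implementation:

```python
import numpy as np

def fatten_simplex(vertices, f_values):
    """Resizing step: keep the best vertex; relocate the others so each
    lies at the same (average) distance from the simplex centroid."""
    vertices = np.asarray(vertices, dtype=float)
    best = int(np.argmin(f_values))          # best vertex is preserved
    centroid = vertices.mean(axis=0)
    dists = np.linalg.norm(vertices - centroid, axis=1)
    target = dists.mean()                    # common distance for the rest
    fat = vertices.copy()
    for i in range(len(vertices)):
        if i == best:
            continue
        d = vertices[i] - centroid
        n = np.linalg.norm(d)
        if n > 0:
            fat[i] = centroid + d * (target / n)  # rescale to target distance
    return fat

verts = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 2.0]])
fat = fatten_simplex(verts, f_values=[1.0, 3.0, 2.0])
print(fat)
```

Periodically "fattening" a simplex this way counteracts the degeneration of Nelder–Mead simplices into nearly flat shapes, which stalls progress.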
(1+1)-CMA-ES: A simple CMA-ES variant with one parent and one offspring, elitist selection, and random restarts. All other parameters are set to their defaults. (Auger and Hansen, 2009)
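The elitist one-parent, one-offspring scheme with random restarts can be sketched as a (1+1)-ES. Note the simplification: the covariance-matrix adaptation of the actual algorithm is omitted and replaced by a 1/5th-success-rule step size, so this illustrates only the selection and restart scheme, not Auger and Hansen (2009):

```python
import random

def one_plus_one_es(f, dim, bounds, evals=2000, rng=random.Random(1)):
    """Elitist (1+1) scheme: one offspring per iteration replaces the
    parent only if it is at least as good. Step size follows the 1/5th
    success rule; a random restart fires when the step size collapses.
    (Covariance adaptation of the real (1+1)-CMA-ES is omitted.)"""
    lo, hi = bounds
    def restart():
        return [rng.uniform(lo, hi) for _ in range(dim)], 0.3 * (hi - lo)
    parent, sigma = restart()
    f_parent = f(parent)
    best, f_best = parent[:], f_parent
    for _ in range(evals):
        child = [x + sigma * rng.gauss(0, 1) for x in parent]
        f_child = f(child)
        if f_child <= f_parent:          # elitist selection
            parent, f_parent = child, f_child
            sigma *= 1.5 ** 0.25         # success: grow the step size
        else:
            sigma /= 1.5 ** (0.25 / 4)   # failure: shrink (1/5th rule)
        if f_parent < f_best:
            best, f_best = parent[:], f_parent
        if sigma < 1e-12:                # step size collapsed: random restart
            parent, sigma = restart()
            f_parent = f(parent)
    return best, f_best

best, f_best = one_plus_one_es(lambda v: sum(x * x for x in v), dim=3, bounds=(-5, 5))
print(f_best)
```

Elitism makes each run monotone in the parent's objective value, which is why restarts are the only mechanism for escaping a basin once the step size has shrunk.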