Search results for "Kenneth De Jong" (1–2 of 2)
Journal Articles
Evolutionary Computation (2023) 31 (2): 73–79.
Published: 01 June 2023
Abstract
We reflect on 30 years of the journal Evolutionary Computation. Taking the papers published in the first volume in 1993 as a springboard, we, as the founding and current Editors-in-Chief, comment on the beginnings of the field, evaluate the extent to which the field has both grown and itself evolved, and provide our own perspectives on where the future lies.
Journal Articles
Evolutionary Computation (2018) 26 (1): 43–66.
Published: 01 March 2018
Abstract
Many real-world problems involve massive amounts of data. Under these circumstances, learning algorithms often become prohibitively expensive, making scalability a pressing issue. A common approach is to sample the data to reduce the size of the dataset and enable efficient learning; alternatively, one customizes the learning algorithms themselves to achieve scalability. In either case, the key challenge is to obtain algorithmic efficiency without compromising the quality of the results. In this article we discuss a meta-learning algorithm (PSBML) that combines concepts from spatially structured evolutionary algorithms (SSEAs) with concepts from ensemble and boosting methodologies to achieve the desired scalability. We present both theoretical and empirical analyses that show PSBML preserves a critical property of boosting: convergence to a distribution centered around the margin. We then present additional empirical analyses showing that this meta-level algorithm provides a general and effective framework that can be used in combination with a variety of learning classifiers. We perform extensive experiments, on both synthetic and real-world data, to investigate the trade-off between scalability and accuracy, as well as robustness to noise. These empirical results corroborate our theoretical analysis and demonstrate the potential of PSBML to achieve scalability without sacrificing accuracy.
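To make the combination of spatial structure and boosting concrete, the following is a minimal Python sketch of a PSBML-like training loop. It is an illustration of the ideas in the abstract, not the authors' reference implementation: the function and parameter names (psbml_sketch, n_cells, sample_size) are hypothetical, the 1-D ring topology and the confidence-based margin proxy are assumptions, and a scikit-learn decision tree stands in for whatever base classifier is plugged into the framework.

```python
# Hypothetical sketch of a PSBML-style meta-learner. Assumptions: a ring of
# cells as the spatial structure, classifier confidence on the true class as
# a proxy for distance to the margin, and local samples containing every
# class label.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def psbml_sketch(X, y, n_cells=8, n_epochs=5, sample_size=200, seed=0):
    rng = np.random.default_rng(seed)
    # Spatial structure: partition the data across the cells of a ring.
    cells = np.array_split(rng.permutation(len(X)), n_cells)
    models = []
    for _ in range(n_epochs):
        # Each cell trains a base classifier on its local sample.
        models = [DecisionTreeClassifier(max_depth=5).fit(X[c], y[c])
                  for c in cells]
        new_cells = []
        for i, c in enumerate(cells):
            # Candidate pool: this cell's sample plus its two ring neighbors'.
            pool = np.concatenate(
                [cells[(i - 1) % n_cells], c, cells[(i + 1) % n_cells]])
            # Boosting-like step: weight instances by how close they are to
            # the decision boundary (low confidence on the true class).
            proba = models[i].predict_proba(X[pool])
            true_col = np.searchsorted(models[i].classes_, y[pool])
            conf = proba[np.arange(len(pool)), true_col]
            w = (1.0 - conf) + 1e-3          # keep all weights positive
            w /= w.sum()
            new_cells.append(rng.choice(pool, size=min(sample_size, len(pool)),
                                        replace=False, p=w))
        cells = new_cells
    return models  # predict by majority vote over the cell classifiers
```

In this sketch, prediction would be a majority vote over the returned cell classifiers; the resampling step concentrates each cell's data near the decision boundary, which is the boosting property (convergence to a distribution centered around the margin) that the article analyzes. The convergence analysis itself is omitted here.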