Gilles Blanchard
Different Paradigms for Choosing Sequential Reweighting Algorithms
Neural Computation (2004) 16 (4): 811–836.
Published: 01 April 2004
Abstract
Analyses of the success of ensemble methods in classification have pointed out the important role played by the margin distribution function on the training and test sets. While it is acknowledged that one should generally try to achieve high margins on the training set, the precise shape of the empirical margin distribution function one should favor in practice is approached in different ways. We first present two competing philosophies for choosing the empirical margin profile: the minimax margin paradigm and the mean-and-variance paradigm. The best-known representative of the first paradigm is the AdaBoost algorithm, a philosophy that several authors have shown to be closely related to the principle of the support vector machine. We show that the second paradigm is very close in spirit to Fisher's linear discriminant (in a feature space). We construct two boosting-type algorithms, very similar in form, each dedicated to one of the two philosophies. By interpolating between them, we then derive a very simple family of iterative reweighting algorithms that can be understood as different trade-offs between the two paradigms. We argue from experiments that this allows suitable adaptivity to different classification problems, particularly in the presence of noise or of excessively complex base classifiers.
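To make the minimax margin paradigm concrete, below is a minimal sketch of AdaBoost-style sequential reweighting with decision stumps, together with the normalized margins that both paradigms reason about. This is an illustration only, not the interpolated family proposed in the article; the function names (train_stump, adaboost, margins) and the synthetic data are hypothetical.

import numpy as np

def train_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) stump
    # minimizing the weighted training error under weights w.
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, T=50):
    # Sequential reweighting: each round upweights the examples the
    # previous stumps misclassified, so the next stump focuses on them.
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(T):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-12)  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)  # exponential reweighting
        w /= w.sum()
        stumps.append((j, thr, pol))
        alphas.append(alpha)
    return stumps, np.array(alphas)

def margins(X, y, stumps, alphas):
    # Normalized margins y * F(x) / sum |alpha|, the quantity whose
    # empirical distribution the two paradigms shape differently.
    F = np.zeros(len(y))
    for (j, thr, pol), a in zip(stumps, alphas):
        F += a * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return y * F / np.abs(alphas).sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    stumps, alphas = adaboost(X, y, T=30)
    m = margins(X, y, stumps, alphas)
    print("min margin:", m.min(), "mean margin:", m.mean())

The two printed statistics correspond roughly to what the two paradigms target: the minimax margin paradigm (AdaBoost) tends to push up the minimum margin, while the mean-and-variance paradigm favors the mean and spread of the margin distribution.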