Masanori Kawakita
Neural Computation (2008) 20 (11): 2792–2838.
Published: 01 November 2008
Abstract
We propose a local boosting method for classification problems, borrowing an idea from the local likelihood method. Our proposal, local boosting, includes a simple localization device that keeps the method computationally feasible. We prove the Bayes risk consistency of local boosting in the framework of probably approximately correct (PAC) learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to estimation error and approximation error: both methods are Bayes-risk consistent if their approximation errors decrease to zero. Compared with ordinary boosting, local boosting may perform better by controlling the trade-off between estimation error and approximation error. Ordinary boosting with complicated base classifiers, or other strong classification methods such as kernel machines, may achieve classification performance comparable to local boosting with simple base classifiers such as decision stumps. Local boosting, however, has an advantage in interpretability: with simple base classifiers it offers a straightforward way to identify which features are informative and how their values contribute to the classification rule, albeit locally. Several numerical studies on real data sets confirm these advantages of local boosting.
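The abstract does not spell out the algorithm, so the following is only a minimal sketch of the general idea of localized boosting: AdaBoost with decision stumps in which each training example is additionally weighted by a kernel centred on the query point. The Gaussian kernel, its bandwidth, the stump search, and the names local_adaboost and fit_stump are illustrative assumptions, not the paper's exact method.

# Sketch of kernel-localized AdaBoost with decision stumps (illustrative only).
import numpy as np

def fit_stump(X, y, w):
    """Weighted decision stump: return (feature, threshold, polarity, weighted error)."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)
    for j in range(d):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def local_adaboost(X, y, x0, n_rounds=20, bandwidth=1.0):
    """Boost stumps on examples kernel-weighted around the query x0; return sign of F(x0)."""
    loc = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * bandwidth ** 2))  # locality weights
    w = loc / loc.sum()                                                  # initial boosting weights
    F = 0.0
    for _ in range(n_rounds):
        j, t, s, err = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)        # AdaBoost coefficient
        pred = np.where(X[:, j] <= t, s, -s)
        w *= np.exp(-alpha * y * pred)               # reweight misclassified examples upward
        w /= w.sum()
        F += alpha * (s if x0[j] <= t else -s)       # evaluate the stump at the query point
    return np.sign(F)

# Toy usage: two Gaussian blobs labelled -1 / +1, classify one query point.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
print(local_adaboost(X, y, x0=np.array([0.8, 0.8])))

Because the base learners are single-feature stumps, the fitted coefficients around a given query point indicate which features drive the local decision, which is the interpretability advantage the abstract refers to.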