Abstract
Minimizing bounds of leave-one-out errors is an important and efficient approach for support vector machine (SVM) model selection. Past research has focused on their use for classification rather than regression. In this letter, we derive various leave-one-out bounds for support vector regression (SVR) and discuss how they differ from their classification counterparts. Experiments demonstrate that the proposed bounds are competitive with Bayesian SVR for parameter selection. We also discuss the differentiability of leave-one-out bounds.
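To clarify the quantity such bounds estimate, the sketch below computes the exact leave-one-out (LOO) error of an epsilon-SVR by brute-force retraining and uses it to pick hyperparameters. This is illustrative only: the bounds derived in the letter replace this expensive retraining loop with a cheap surrogate. The use of scikit-learn, the synthetic data, and the grid values are assumptions for the example, not part of the original work.

```python
# Illustrative only: exact leave-one-out (LOO) error for epsilon-SVR.
# The paper's bounds approximate this quantity without retraining
# the model once per training point.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))                     # hypothetical data
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def loo_error(C, epsilon, gamma=1.0):
    """Mean absolute LOO residual: retrain with each point held out."""
    errs = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        model = SVR(C=C, epsilon=epsilon, gamma=gamma)
        model.fit(X[train_idx], y[train_idx])
        errs.append(abs(model.predict(X[test_idx])[0] - y[test_idx][0]))
    return float(np.mean(errs))

# Model selection: pick (C, epsilon) minimizing the LOO error.
# A bound-based method would minimize a cheap (possibly
# differentiable) upper bound on loo_error instead.
grid = [(C, eps) for C in (0.1, 1.0, 10.0) for eps in (0.01, 0.1)]
best = min(grid, key=lambda p: loo_error(*p))
print("selected (C, epsilon):", best)
```

Because each LOO evaluation here requires as many retrainings as there are training points, minimizing a bound on the LOO error, as the letter proposes, is far cheaper; differentiable bounds additionally allow gradient-based parameter search instead of a grid.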