In this paper we consider a nonparametric regression model that admits a mix of continuous and discrete regressors, some of which may in fact be redundant (that is, irrelevant). We show that, asymptotically, a data-driven least squares cross-validation method can remove irrelevant regressors. Simulations reveal that this “automatic dimensionality reduction” feature is very effective in finite-sample settings.
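To illustrate the mechanism, the following is a minimal sketch of least squares cross-validation for a Nadaraya-Watson estimator, not the paper's full mixed-regressor estimator: it uses only continuous regressors with product Gaussian kernels, and all names (`x1`, `x2`, `loo_fit`, `cv`) and the simulated design are illustrative assumptions. Cross-validation tends to drive the bandwidth on the irrelevant regressor toward its upper extreme, which is the "automatic dimensionality reduction" the abstract describes.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data (illustrative): x1 is relevant, x2 is irrelevant.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                  # does not enter y at all
y = np.sin(x1) + 0.2 * rng.normal(size=n)

def loo_fit(h):
    """Leave-one-out Nadaraya-Watson fit with a product Gaussian kernel."""
    h = np.abs(h) + 1e-6                 # keep bandwidths positive
    K = (np.exp(-0.5 * ((x1[:, None] - x1[None, :]) / h[0]) ** 2)
         * np.exp(-0.5 * ((x2[:, None] - x2[None, :]) / h[1]) ** 2))
    np.fill_diagonal(K, 0.0)             # leave observation i out of its own fit
    return K @ y / np.maximum(K.sum(axis=1), 1e-12)

def cv(h):
    """Least squares cross-validation objective: mean leave-one-out squared error."""
    return np.mean((y - loo_fit(h)) ** 2)

res = minimize(cv, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
h_opt = np.abs(res.x)
# h_opt[1], the bandwidth on the irrelevant x2, tends to be driven much
# larger than h_opt[0], effectively smoothing x2 out of the regression.
print(h_opt)
```

A large bandwidth makes the kernel for that regressor nearly flat, so the estimator ignores it; for discrete regressors the analogous smoothing parameter hits a bound that makes the kernel uniform across categories.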
Copyright by the President and Fellows of Harvard College and the Massachusetts Institute of Technology