Abstract
We study the finite-sample behavior of Lasso-based inference methods such as post-double Lasso and debiased Lasso. We show that these methods can exhibit substantial omitted variable biases (OVBs) because Lasso can fail to select relevant controls. This phenomenon can occur even when the coefficients are sparse and the sample size is large, and even larger than the number of controls. Relying on the existing asymptotic inference theory can therefore be problematic in empirical applications. We compare the Lasso-based inference methods to modern high-dimensional OLS-based methods and provide practical guidance.
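To fix ideas, the following is a minimal sketch of the post-double-Lasso procedure on simulated data. It assumes scikit-learn's LassoCV (cross-validated penalty choice) for the two selection steps, which is one common implementation choice rather than the specific penalty rule underlying the asymptotic theory; it is illustrative only and not the authors' implementation.

```python
# Minimal sketch of post-double Lasso on simulated data (illustrative assumptions throughout).
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
n, p = 500, 200                                         # sample size larger than number of controls
X = rng.standard_normal((n, p))                         # candidate controls
d = 0.5 * X[:, 0] + rng.standard_normal(n)              # treatment depends on one relevant control
y = 1.0 * d + 0.5 * X[:, 0] + rng.standard_normal(n)    # outcome; true treatment effect = 1

# Step 1: Lasso of y on X. Step 2: Lasso of d on X. Keep the union of selected controls.
sel_y = LassoCV(cv=5).fit(X, y).coef_ != 0
sel_d = LassoCV(cv=5).fit(X, d).coef_ != 0
selected = sel_y | sel_d

# Step 3: OLS of y on d and the selected controls gives the post-double-Lasso estimate.
Z = np.column_stack([d, X[:, selected]])
beta_hat = LinearRegression().fit(Z, y).coef_[0]
print(f"post-double-Lasso estimate of the treatment effect: {beta_hat:.3f}")

# If both Lasso steps miss a relevant control (here X[:, 0]), the final OLS step
# inherits an omitted variable bias -- the finite-sample issue the paper studies.
```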
© 2021 The President and Fellows of Harvard College and the Massachusetts Institute of Technology