Valid post-selection and post-regularization inference: An elementary, general approach, arXiv:1501.03430, 2015
Abstract. Here we present an expository, general analysis of valid post-selection or post-regularization inference about a low-dimensional target parameter, α, in the presence of a very high-dimensional nuisance parameter, η, which is estimated using modern selection or regularization methods. Our analysis relies on high-level, easy-to-interpret conditions that allow one to see clearly the structures needed for achieving valid post-regularization inference. Simple, readily verifiable sufficient conditions are provided for a class of affine-quadratic models. We rely on asymptotic statements, which dramatically simplifies the theory and helps highlight the structure of the problem. We focus our discussion on estimation and inference procedures based on the empirical analog of theoretical equations M(α, η) = 0 which identify α. Within this structure, we show that setting up such equations so that the orthogonality/immunization condition ∂_η M(α, η) = 0 holds at the true parameter values, coupled with plausible conditions on the smoothness of M and the quality of the estimator η̂, guarantees that inference for the main parameter α based on the testing or point-estimation methods discussed below will be regular despite selection or regularization biases occurring in the estimation of η. In particular, the estimator of α will often be uniformly consistent at the root-n rate and uniformly asymptotically normal even though the estimators η̂ will generally not be asymptotically linear and regular. The uniformity holds over large classes of models that do not impose highly implausible "beta-min" conditions. We also show that inference can be carried out by inverting tests formed from Neyman's C(α) (orthogonal score) statistics. As an application and illustration of these ideas, we provide an analysis of post-selection inference in linear models with many regressors and many instruments.
We conclude with a review of other developments in post-selection inference and argue that many of these developments can be viewed as special cases of the general framework of orthogonalized estimating equations.
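The immunization idea in the abstract can be illustrated numerically. The sketch below (not from the paper; the simulated partially linear model, the shrinkage factor, and all variable names are illustrative assumptions) compares a naive plug-in estimator of α, which inherits the bias of a deliberately shrunken nuisance estimate η̂, with an orthogonalized estimator that first residualizes the treatment d on the controls X. Because the residual is orthogonal to X in sample, the score for α satisfies the ∂_η M = 0 condition, and errors in η̂ do not propagate to α̂.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))
gamma = np.array([1.0, 0.5, 0.0, 0.0, 0.0])   # d depends on controls
eta = np.array([1.0, -0.5, 0.25, 0.0, 0.0])   # y depends on controls
alpha = 2.0                                   # target parameter
d = X @ gamma + rng.normal(size=n)            # treatment
y = alpha * d + X @ eta + rng.normal(size=n)  # outcome

# A crudely shrunken nuisance estimate, standing in for the bias that
# lasso-type selection or regularization introduces into eta-hat.
eta_hat = 0.3 * eta

# Naive plug-in: regress y - X @ eta_hat on d directly.
# The score here is NOT orthogonal to eta, so the shrinkage bias leaks in.
alpha_naive = d @ (y - X @ eta_hat) / (d @ d)

# Orthogonalized: residualize d on X first. v_hat is exactly orthogonal
# to the columns of X in sample, so the term involving (eta - eta_hat)
# drops out of the estimating equation for alpha.
gamma_hat = np.linalg.lstsq(X, d, rcond=None)[0]
v_hat = d - X @ gamma_hat
alpha_orth = v_hat @ (y - X @ eta_hat) / (v_hat @ d)

print(f"naive: {alpha_naive:.3f}, orthogonal: {alpha_orth:.3f}")
```

In this simulation the naive estimate is pulled noticeably away from α = 2 by the shrinkage in η̂, while the orthogonalized estimate stays close to it, which is the "regular despite regularization bias" behavior the abstract describes.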