(2010): “Sparse Models and Methods for Optimal Instruments with an Application to Eminent Domain,” arXiv working paper
Abstract

Cited by 55 (19 self)
Abstract. We develop results for the use of Lasso and Post-Lasso methods to form first-stage predictions and estimate optimal instruments in linear instrumental variables (IV) models with many instruments, p. Our results apply even when p is much larger than the sample size, n. We show that the IV estimator based on using Lasso or Post-Lasso in the first stage is root-n consistent and asymptotically normal when the first stage is approximately sparse; i.e., when the conditional expectation of the endogenous variables given the instruments can be well-approximated by a relatively small set of variables whose identities may be unknown. We also show the estimator is semiparametrically efficient when the structural error is homoscedastic. Notably, our results allow for imperfect model selection and do not rely upon the unrealistic “beta-min” conditions that are widely used to establish validity of inference following model selection. In simulation experiments, the Lasso-based IV estimator with a data-driven penalty performs well compared to recently advocated many-instrument-robust procedures. In an empirical example dealing with the effect of judicial eminent domain decisions on economic outcomes, the Lasso-based IV estimator outperforms an intuitive benchmark. Optimal instruments are conditional expectations. In developing the IV results, we estab …
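A minimal sketch of the idea in this abstract, under simulated data: fit the first stage E[x | Z] with a cross-validated Lasso (scikit-learn's LassoCV here, not the paper's data-driven penalty), then use the fitted values as the instrument in the second stage. All variable names and the data-generating process are our own illustration, not the paper's empirical example.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 500, 100            # sample size, number of candidate instruments
beta = 1.0                 # true structural coefficient

Z = rng.standard_normal((n, p))
# approximately sparse first stage: only 3 of the 100 instruments matter
pi = np.zeros(p)
pi[:3] = [1.0, 0.7, 0.5]
u = rng.standard_normal(n)              # structural error
v = 0.5 * u + rng.standard_normal(n)    # first-stage error, correlated with u
x = Z @ pi + v                          # endogenous regressor
y = beta * x + u

# First stage: Lasso estimate of the optimal instrument E[x | Z]
lasso = LassoCV(cv=5).fit(Z, x)
x_hat = lasso.predict(Z)

# Second stage: IV estimator using the Lasso fit as the instrument
beta_iv = (x_hat @ y) / (x_hat @ x)
beta_ols = (x @ y) / (x @ x)            # biased upward by the endogeneity
```

With this design, `beta_ols` drifts above the true value of 1 while `beta_iv` stays close to it, even though p = 100 instruments were offered and only a handful are relevant.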
(2006): “Estimation with Many Instrumental Variables,” mimeo
Abstract

Cited by 53 (5 self)
Using many valid instrumental variables has the potential to improve efficiency but makes the usual inference procedures inaccurate. We give corrected standard errors, an extension of Bekker (1994) to non-normal disturbances, that adjust for many instruments. We find that this adjustment is useful in empirical work, simulations, and in the asymptotic theory. Use of the corrected standard errors in t-ratios leads to an asymptotic approximation order that is the same when the number of instrumental variables grows as when the number of instruments is fixed. We also give a version of the Kleibergen (2002) weak-instrument statistic that is robust to many instruments.
Controlling for endogeneity with instrumental variables in strategic management research
, 2008
GMM with many moment conditions
 Econometrica
, 2006
Abstract

Cited by 25 (4 self)
This paper provides a first-order asymptotic theory for generalized method of moments (GMM) estimators when the number of moment conditions is allowed to increase with the sample size and the moment conditions may be weak. Examples in which these asymptotics are relevant include instrumental variable (IV) estimation with many (possibly weak or uninformative) instruments and some panel data models that cover moderate time spans and have correspondingly large numbers of instruments. Under certain regularity conditions, the GMM estimators are shown to converge in probability, but not necessarily to the true parameter, and conditions for consistent GMM estimation are given. A general framework for the GMM limit distribution theory is developed based on epi-convergence methods. Some illustrations are provided, including consistent GMM estimation of a panel model with time-varying individual effects, consistent limited information maximum likelihood estimation as a continuously updated GMM estimator, and consistent IV structural estimation using large numbers of weak or irrelevant instruments. Some simulations are reported.
(2004): ”Estimation and Testing Using Jackknife IV in Heteroskedastic Regression with Many Weak Instruments,” working paper
Abstract

Cited by 10 (1 self)
This paper develops Wald-type tests for general, possibly nonlinear restrictions in the context of heteroskedastic IV regression with many weak instruments. In particular, it is first shown that consistency and asymptotic normality can be obtained when estimating structural parameters using JIVE, even when errors exhibit heteroskedasticity of unknown form. This is not the case, however, with other well-known IV estimators, such as LIML, Fuller’s modified LIML, 2SLS, and B2SLS, which are shown to be inconsistent. Second, new covariance matrix estimators (and corresponding Wald test statistics) are proposed for JIVE, which are consistent even when instrument weakness is such that the rate of growth of the concentration parameter, rn, is slower than the rate of growth of the number of instruments, Kn, and possibly much slower than the sample size, n, provided that √Kn/rn → 0 as n → ∞. The primary advantage of our tests, relative to those proposed previously in the literature, is that one can test general nonlinear hypotheses, as opposed to simple null hypotheses of the form H0: β = β∗, where β∗ is the value of β under the null. We feel that this feature, taken together with the fact that the tests are robust to unconditional …
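The JIVE estimator the abstract refers to can be sketched in a few lines: replace each observation's first-stage fitted value with its leave-one-out counterpart, which removes the own-observation term that biases 2SLS when the instrument count is large. This is the textbook JIVE1 formula on simulated data, not the paper's heteroskedasticity-robust covariance estimator; the data-generating design is our own.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 400, 20
Z = rng.standard_normal((n, K))
pi = np.full(K, 0.3)                    # every instrument mildly relevant
u = rng.standard_normal(n)              # structural error
x = Z @ pi + 0.6 * u + rng.standard_normal(n)   # endogenous regressor
y = 1.0 * x + u                         # true structural coefficient = 1

# Projection (hat) matrix of the first-stage regression of x on Z
P = Z @ np.linalg.solve(Z.T @ Z, Z.T)
h = np.diag(P)                          # leverages h_ii

# JIVE1: leave-one-out fitted value for each observation
x_hat = P @ x
x_loo = (x_hat - h * x) / (1.0 - h)

# IV estimator using the leave-one-out predictions as instruments
beta_jive = (x_loo @ y) / (x_loo @ x)
```

Because `x_loo[i]` never uses observation i itself, the correlation between the constructed instrument and the structural error that accumulates in 2SLS as K grows is absent by construction.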
Asymptotic normality of single-equation estimators for the case with a large number of weak instruments
, 2003
Abstract

Cited by 9 (2 self)
This paper analyzes conditions under which various single-equation estimators are asymptotically normal in a simultaneous equations framework with many weak instruments. In particular, our paper adds to the many-instruments asymptotic normality literature, including papers …
NORMALIZATION IN ECONOMETRICS
Abstract

Cited by 6 (4 self)
The issue of normalization arises whenever two different values for a vector of unknown parameters imply the identical economic model. A normalization implies not just a rule for selecting which among equivalent points to call the maximum likelihood estimate (MLE), but also governs the topography of the set of points that go into a small-sample confidence interval associated with that MLE. A poor normalization can lead to multimodal distributions, disjoint confidence intervals, and very misleading characterizations of the true statistical uncertainty. This paper introduces an identification principle as a framework upon which a normalization should be imposed, according to which the boundaries of the allowable parameter space should correspond to loci along which the model is locally unidentified. We illustrate these issues with examples taken from mixture models, structural vector autoregressions, and cointegration models.
Small Concentration Asymptotics and Instrumental Variables Inference
, 2005
Abstract

Cited by 5 (0 self)
Poskitt and Skeels (2003) provide a new approximation to the sampling distribution of the IV estimator in a simultaneous equations model; the approximation is appropriate when the concentration parameter associated with the reduced-form model is small. A basic purpose of this paper is to provide the practitioner with easily implemented inferential tools based upon extensions to these small-concentration asymptotic results. We present various approximations to the sampling distribution of functions of the IV estimator based upon small-concentration asymptotics, and investigate hypothesis-testing procedures and confidence-region construction using these approximations. It is shown that the test statistics advanced are asymptotically pivotal and that the associated critical regions generate locally uniformly most powerful invariant tests. The confidence regions are also shown to be valid. The small-concentration asymptotic approximations lead to a nonstandard application of standard distributions, facilitating numerical implementation using commonly available software.
Aid and Growth  Have We Come Full Circle?
, 2009
Abstract

Cited by 4 (0 self)
The micro-macro paradox has been revived. Despite broadly positive evaluations at the micro and meso levels, recent literature has turned decidedly pessimistic with respect to the ability of foreign aid to foster economic growth. Policy implications, such as the complete cessation of aid to Africa, are being drawn on the basis of fragile evidence. This paper first assesses the aid-growth literature with a focus on recent contributions. The aid-growth literature is then framed, for the first time, in terms of the Rubin Causal Model, applied at the macroeconomic level. Our results show that aid has a positive and statistically significant causal effect on growth over the long run, with point estimates at levels suggested by growth theory. We conclude that aid remains an important tool for enhancing the development prospects of poor nations.