Results 1–10 of 53
Testing for Indeterminacy: An Application to U.S. Monetary Policy, 2003
Abstract

Cited by 130 (6 self)
This paper considers a prototypical monetary business cycle model for the U.S. economy, in which the equilibrium is undetermined if monetary policy is `passive'. In previous multivariate studies it has been common practice to restrict parameter estimates to values for which the equilibrium is unique. We show how the likelihood-based estimation of dynamic stochastic general equilibrium models can be extended to allow for indeterminacies and sunspot fluctuations. We construct …
Indirect inference and calibration of dynamic stochastic general equilibrium models
Journal of Econometrics, 2007
Abstract

Cited by 29 (0 self)
We advocate in this paper the use of a Sequential Partial Indirect Inference (SPII) approach, in order to account for the calibration practice in which dynamic stochastic general equilibrium (DSGE) models are studied only through their ability to reproduce some well-chosen moments. We stress that, despite a lack of statistical formalization, the controversial calibration methodology addresses a genuine issue about the consequences of misspecification in highly nonlinear and dynamic structural macro-models. Such likely misspecification is even more detrimental than for direct inference, since the misspecified model is used for building simulated paths. The only way to get robust estimators, and also to assess the model despite misspecification, consists in examining the structural model through a convenient and parsimonious instrumental model, which basically does not capture what goes wrong in the simulated paths. We argue that a well-driven SPII strategy might be seen as a rigorous calibrationist approach, one that captures both the advantages of calibration (accounting for structural “astatistical” ideas) and of the inferential approach (precise appraisal of loss functions and conditions of validity). This methodology should be useful for the empirical assessment of structural models such as those stemming from the Real Business Cycle theory or the asset pricing literature.
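The indirect-inference idea the paper builds on can be sketched with a toy model. Everything below is illustrative, not the paper's SPII procedure: the structural model is an AR(1) process, and the instrumental model is reduced to a single statistic, the lag-1 autocorrelation; the structural parameter is chosen so that simulated paths reproduce the data's statistic.

```python
import numpy as np

def simulate_ar1(rho, n, n_paths, rng):
    """Simulate n_paths AR(1) paths of length n: y_t = rho*y_{t-1} + e_t."""
    e = rng.standard_normal((n, n_paths))
    y = np.zeros((n, n_paths))
    for t in range(1, n):
        y[t] = rho * y[t - 1] + e[t]
    return y

def lag1_autocorr(y):
    # the instrumental model's single parameter: lag-1 autocorrelation
    return np.corrcoef(y[:-1], y[1:])[0, 1]

def indirect_inference(y_data, grid, n_sim=10, seed=0):
    target = lag1_autocorr(y_data)
    def distance(rho):
        # common random numbers keep the objective smooth across trial values
        sims = simulate_ar1(rho, len(y_data), n_sim, np.random.default_rng(seed))
        stats = [lag1_autocorr(sims[:, j]) for j in range(n_sim)]
        return (np.mean(stats) - target) ** 2
    return min(grid, key=distance)

rng = np.random.default_rng(42)
y = simulate_ar1(0.8, 1000, 1, rng)[:, 0]          # "data" with true rho = 0.8
rho_hat = indirect_inference(y, np.linspace(0.5, 0.95, 19))
```

A grid search stands in for a proper optimizer to keep the sketch short; the common-random-number trick (reseeding the generator at every trial value) is what makes the simulated objective comparable across the grid.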
DSGE Models in a Data-Rich Environment
NBER Working Paper 12772, National Bureau of Economic Research, 2005
Abstract

Cited by 22 (0 self)
Standard practice for the estimation of dynamic stochastic general equilibrium (DSGE) models maintains the assumption that economic variables are properly measured by a single indicator, and that all relevant information for the estimation is summarized by a small number of data series. However, recent empirical research on factor models has shown that information contained in large data sets is relevant for the evolution of important macroeconomic series. This suggests that conventional model estimates and inference based on estimated DSGE models are likely to be distorted. In this paper, we propose an empirical framework for the estimation of DSGE models that exploits the relevant information from a data-rich environment. This framework provides an interpretation of all information contained in a large data set, and in particular of the latent factors, through the lenses of a DSGE model. The estimation involves Bayesian Markov chain Monte Carlo (MCMC) methods extended so that the estimates can, in some cases, inherit the properties of classical maximum likelihood estimation. We apply this estimation approach to a state-of-the-art DSGE monetary model. Treating theoretical concepts of the model – such as output, inflation and employment – as partially observed, we show that the information from a large set of macroeconomic indicators is important for accurate estimation of the model. It also allows us to improve the forecasts of important economic variables.
Estimation of DSGE Models when the Data Are Persistent, 2007
Abstract

Cited by 13 (0 self)
Dynamic Stochastic General Equilibrium (DSGE) models are often solved and estimated under specific assumptions as to whether the exogenous variables are difference or trend stationary. However, even mild departures of the data generating process from these assumptions can severely bias the estimates of the model parameters. This paper proposes new estimators that do not require researchers to take a stand on whether shocks have permanent or transitory effects. These procedures have two key features. First, the same filter is applied to both the data and the model variables. Second, the filtered variables are stationary when evaluated at the true parameter vector. The estimators are approximately normally distributed not only when the shocks are mildly persistent, but also when they have near or exact unit roots. Simulations show that these robust estimators perform well especially when the shocks are highly persistent yet stationary. In such cases, linear detrending and first differencing are shown to yield biased or imprecise estimates.
How Much Inflation is Necessary to Grease the Wheels?, 2008
Abstract

Cited by 10 (0 self)
This paper studies Tobin's proposition that inflation "greases" the wheels of the labor market. The analysis is carried out using a simple dynamic stochastic general equilibrium model with asymmetric wage adjustment costs. Optimal inflation is determined by a benevolent government that maximizes the households' welfare. The Simulated Method of Moments is used to estimate the nonlinear model based on its second-order approximation. Econometric results indicate that nominal wages are downwardly rigid and that the optimal level of grease inflation for the U.S. economy is about 1.2 percent per year, with a 95% confidence interval ranging from 0.2 to 1.6 percent.
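The Simulated Method of Moments step can be illustrated on a deliberately simple model (lognormal data matched on its mean and variance, not the paper's wage-rigidity DSGE): simulate from the model at trial parameters, compute the same moments on the simulation as on the data, and minimize a quadratic distance between them.

```python
import numpy as np

def smm_objective(mu, sigma, data_moments, n, n_sim, rng):
    """Distance between data moments and moments of n_sim simulated samples."""
    draws = rng.lognormal(mu, sigma, size=(n_sim, n))
    sim_moments = np.array([draws.mean(), draws.var()])
    diff = sim_moments - data_moments
    return diff @ diff  # identity weighting matrix for simplicity

rng = np.random.default_rng(3)
data = rng.lognormal(0.5, 0.4, size=5000)   # "data" with true (mu, sigma) = (0.5, 0.4)
data_moments = np.array([data.mean(), data.var()])

best, best_obj = None, np.inf
for mu in np.linspace(0.2, 0.8, 13):
    for sigma in np.linspace(0.2, 0.6, 9):
        # fixed seed per trial: common random numbers across the grid
        obj = smm_objective(mu, sigma, data_moments,
                            n=5000, n_sim=20, rng=np.random.default_rng(0))
        if obj < best_obj:
            best, best_obj = (mu, sigma), obj
mu_hat, sigma_hat = best
```

An efficient implementation would use an optimal weighting matrix and a derivative-based optimizer; the grid search here only keeps the mechanics visible.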
Dynamic Identification of DSGE Models, 2009
Abstract

Cited by 6 (0 self)
This paper provides conditions for identifying the parameters of a DSGE model and presents its reduced form in the general case when the number of shocks does not equal the number of observed endogenous variables. Combining results from classical econometric theory with structural identification analysis in control theory, we establish an upper bound on the number of free parameters that can be estimated, and by implication, the minimum number of parameters that must be held fixed. The assumption that shocks are univariate instead of vector-autoregressive processes often imposes enough restrictions to satisfy the order condition. A rank condition is also developed to check whether there is a one-to-one mapping from the parameters of the optimizing model to the reduced form that induces the autocovariances. We show that identification can fail even in a simple stochastic growth model. Our conditions do not depend on the choice of the estimator and should be verified before estimation.
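A local rank condition of this kind can be checked numerically. The sketch below is a toy example, not the paper's DSGE setup: the "reduced form" is the vector of autocovariances of an AR(1) process y_t = rho*y_{t-1} + sigma*e_t, and the Jacobian of that moment map with respect to (rho, sigma) must have full column rank for local identification.

```python
import numpy as np

def autocovariances(theta, n_lags):
    """Model-implied autocovariances gamma_0, ..., gamma_{n_lags-1} of an AR(1)."""
    rho, sigma = theta
    gamma0 = sigma**2 / (1 - rho**2)
    return np.array([gamma0 * rho**k for k in range(n_lags)])

def jacobian(theta, n_lags, eps=1e-6):
    # two-sided finite differences of the moment map, one column per parameter
    cols = []
    for i in range(len(theta)):
        hi, lo = np.array(theta, float), np.array(theta, float)
        hi[i] += eps
        lo[i] -= eps
        cols.append((autocovariances(hi, n_lags)
                     - autocovariances(lo, n_lags)) / (2 * eps))
    return np.column_stack(cols)

theta = (0.9, 1.0)
# with gamma_0 and gamma_1, both parameters are locally identified ...
rank_two = np.linalg.matrix_rank(jacobian(theta, 2))
# ... but gamma_0 alone cannot separate rho from sigma
rank_one = np.linalg.matrix_rank(jacobian(theta, 1))
```

The second case fails the order condition (one moment, two parameters), which shows up as a rank-deficient Jacobian.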
Devaluations, output and the balance sheet effect: a structural econometric analysis, 2006
Indirect Likelihood Inference, 2011
Abstract

Cited by 5 (2 self)
Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic (for example, an initial estimator or a set of sample moments). We propose to (re)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator, which we refer to as a Bayesian Indirect Likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher-order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications, including dynamic and nonlinear panel data models, a structural auction model and two DSGE models, show that the proposed estimators indeed have attractive finite sample properties.
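The simulated MIL idea can be sketched in a toy setting (the exponential model, the grid search, and the Silverman bandwidth are all illustrative assumptions, not the paper's applications): the statistic Zn is the sample mean of exponential data, and the likelihood of Zn at each trial parameter is approximated by a kernel density estimate over simulated copies of Zn.

```python
import numpy as np

def gaussian_kde_at(x, samples, h):
    """Gaussian kernel density estimate of the statistic's density at x."""
    return np.mean(np.exp(-0.5 * ((x - samples) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def mil_estimate(z_obs, n, grid, n_sim=500, seed=0):
    best, best_like = None, -np.inf
    for rate in grid:
        rng = np.random.default_rng(seed)  # common random numbers across trials
        # simulated copies of the statistic Z_n at this trial rate
        z_sim = rng.exponential(1.0 / rate, size=(n_sim, n)).mean(axis=1)
        h = 1.06 * z_sim.std() * n_sim ** (-0.2)  # Silverman rule of thumb
        like = gaussian_kde_at(z_obs, z_sim, h)
        if like > best_like:
            best, best_like = rate, like
    return best

rng = np.random.default_rng(1)
data = rng.exponential(1.0 / 2.0, size=400)   # "data" with true rate 2.0
rate_hat = mil_estimate(data.mean(), len(data), np.linspace(1.0, 3.0, 41))
```

Unlike GMM, the criterion here is the (simulated) density of the statistic at its observed value, which is what lets the estimator exploit the statistic's full finite-sample distribution.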
Estimation of Dynamic Latent Variable Models Using Simulated Nonparametric Moments, 2009
Abstract

Cited by 4 (4 self)
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. The estimator is consistent and has the same asymptotic distribution as that of the infeasible GMM estimator based on the same moment conditions. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models. An application to weekly spot exchange rate data further illustrates use of the estimator.
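The kernel-smoothed conditional-moment idea can be sketched with an AR(1) toy model (the model, bandwidth, and instrument choice below are illustrative, not the paper's DLV applications): the conditional mean E[y_t | y_{t-1}] implied by the model is recovered by Nadaraya-Watson regression on one long simulated path, then used as a moment condition against the observed data.

```python
import numpy as np

def simulate_ar1(rho, n, rng):
    e = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + e[t]
    return y

def nw_conditional_mean(x_sim, y_sim, x_eval, h):
    """Nadaraya-Watson kernel regression of y_sim on x_sim, at points x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / h) ** 2)
    return (w @ y_sim) / w.sum(axis=1)

def moment_distance(rho, y_data, n_sim=10_000, h=0.25, seed=0):
    y_sim = simulate_ar1(rho, n_sim, np.random.default_rng(seed))
    # model-implied conditional mean, evaluated at the observed lags
    m_hat = nw_conditional_mean(y_sim[:-1], y_sim[1:], y_data[:-1], h)
    # moment condition: data residuals, instrumented by the lagged observation
    return np.mean((y_data[1:] - m_hat) * y_data[:-1]) ** 2

rng = np.random.default_rng(7)
y_data = simulate_ar1(0.6, 1000, rng)          # "data" with true rho = 0.6
grid = np.linspace(0.3, 0.9, 25)
rho_hat = grid[np.argmin([moment_distance(r, y_data) for r in grid])]
```

The point of the kernel step is that the conditional mean never has to be simulated conditional on the data: one unconditional simulation per trial parameter is enough.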
Testing for Weak Identification in Possibly Nonlinear Models, 2010
Abstract

Cited by 4 (0 self)
In this paper we propose a chi-square test for identification. Our proposed test statistic is based on the distance between two shrinkage extremum estimators. The two estimators converge in probability to the same limit when identification is strong, and their asymptotic distributions are different when identification is weak. The proposed test is consistent not only for the alternative hypothesis of no identification but also for the alternative of weak identification, which is confirmed by our Monte Carlo results. We apply the proposed technique to test whether the structural parameters of a representative Taylor-rule monetary policy reaction function are identified.