Results 1 - 10 of 120
Bayesian analysis of DSGE models
- Econometric Reviews
, 2007
"... This paper reviews Bayesian methods that have been developed in recent years to estimate and evaluate dynamic stochastic general equilibrium (DSGE) models. We consider the estimation of linearized DSGE models, the evaluation of models based on Bayesian model checking, posterior odds comparisons, and ..."
Abstract
-
Cited by 130 (5 self)
This paper reviews Bayesian methods that have been developed in recent years to estimate and evaluate dynamic stochastic general equilibrium (DSGE) models. We consider the estimation of linearized DSGE models, the evaluation of models based on Bayesian model checking, posterior odds comparisons, and comparisons to vector autoregressions, as well as the nonlinear estimation based on a second-order accurate model solution. These methods are applied to data generated from correctly specified and misspecified linearized DSGE models, and a DSGE model that was solved with a second-order perturbation method. (JEL C11, C32, C51, C52)
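For reference, the posterior odds comparisons mentioned here rest on the standard Bayesian identity (generic notation, not taken from the paper):

\[
\frac{P(M_1 \mid Y)}{P(M_2 \mid Y)} \;=\; \frac{P(M_1)}{P(M_2)} \cdot \frac{p(Y \mid M_1)}{p(Y \mid M_2)},
\qquad
p(Y \mid M_i) \;=\; \int p(Y \mid \theta_i, M_i)\, p(\theta_i \mid M_i)\, d\theta_i,
\]

so that competing DSGE specifications (or a DSGE model and a VAR) are ranked by their marginal likelihoods scaled by the prior model probabilities.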
Trading costs and returns for US equities: estimating effective costs from daily data
- Journal of Finance
, 2009
"... The effective cost of trading is usually estimated from transaction-level data. This study proposes a Gibbs estimate that is based on daily closing prices. In a validation sample, the daily Gibbs estimate achieves a correlation of 0.965 with the transactionlevel estimate. When the Gibbs estimates ar ..."
Abstract
-
Cited by 117 (1 self)
The effective cost of trading is usually estimated from transaction-level data. This study proposes a Gibbs estimate that is based on daily closing prices. In a validation sample, the daily Gibbs estimate achieves a correlation of 0.965 with the transaction-level estimate. When the Gibbs estimates are incorporated into asset pricing specifications over a long historical sample (1926 to 2006), the results suggest that effective cost (as a characteristic) is positively related to stock returns. The relation is strongest in January, but it appears to be distinct from size effects. Investigations into the role of liquidity and transaction costs in asset pricing must generally confront the fact that while many asset pricing tests make use of U.S. equity returns from 1926 onward, the high-frequency data used to estimate trading costs are usually not available prior to 1983. Accordingly, most studies either limit the sample to the post-1983 period of common coverage or use the longer historical sample with liquidity proxies estimated from daily data. This paper falls into the latter group. Specifically, I propose a new approach to estimating the effective cost of trading and the common variation in this cost. These estimates are then used in conventional asset pricing specifications with a view to ascertaining the role of trading costs as a characteristic in explaining expected returns.
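As a rough sketch of the framework behind such daily-data estimators (the basic Roll-type setup, not necessarily the paper's exact specification), the effective cost c enters observed closing prices through a latent efficient price and unobserved trade directions:

\[
p_t = m_t + c\, q_t, \qquad m_t = m_{t-1} + u_t, \qquad q_t \in \{-1, +1\},
\]

so that \(\Delta p_t = c\,\Delta q_t + u_t\); a Gibbs sampler can then alternate between drawing the latent trade directions \(q_t\) and the parameters \((c, \sigma_u^2)\) from their conditional posteriors.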
Computational Experiments and Reality
, 1999
"... This study explores three alternative econometric interpretations of dynamic, stochastic general equilibrium (DSGE) models. (1) A strong econometric interpretation takes the model literally and directly produces a likelihood function for observed prices and quantities. It is widely recognized that u ..."
Abstract
-
Cited by 50 (0 self)
This study explores three alternative econometric interpretations of dynamic, stochastic general equilibrium (DSGE) models. (1) A strong econometric interpretation takes the model literally and directly produces a likelihood function for observed prices and quantities. It is widely recognized that under this interpretation, most DSGE models are rejected using classical econometrics and assigned zero probability in a Bayesian approach. (2) A weak econometric interpretation commonly made in the calibration literature confines attention to only a few functions of observed prices and interest rates and evaluates a model on its predictive distribution for these functions. This approach is equivalent to a Bayesian prior predictive analysis, developed by Box (1980) and predecessors. This study shows that the weak interpretation retains the implications of the strong interpretation, and therefore DSGEs fare no better under this approach. (3) Under a minimal econometric interpretation, DSGEs provide only prior distributions for specified population moments. When coupled with an econometric model (e.g., a vector autoregression) that includes the same moments, DSGEs may be compared and used for inference using conventional Bayesian methods. This interpretation extends and formalizes an approach suggested by DeJong, Ingram and Whiteman (1996). All three interpretations are illustrated using models of the equity premium, and it is shown that the conclusions from a minimal interpretation differ substantially from those under a weak interpretation. This revision was prepared for the DYNARE Conference, CEPREMAP, Paris, September 4-5, 2006. It is work in progress. Comments welcome. Please do not cite or quote without the author's permission.
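In generic notation, the prior predictive analysis invoked under interpretation (2) evaluates the observed values of the chosen functions of the data, say z, against their implied prior predictive density

\[
p(z) = \int p(z \mid \theta)\, p(\theta)\, d\theta,
\]

where \(p(\theta)\) is the prior the DSGE model places on its parameters and z collects the moments (e.g., equity-premium statistics) singled out by the calibration exercise.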
Non-stationary Hours in a DSGE Model
- Journal of Money, Credit and Banking
, 2007
"... The time series fit of dynamic stochastic general equilibrium (DSGE) models often suffers from restrictions on the long-run dynamics that are at odds with the data. This paper modifies a stochastic growth model by incorporating permanent labor supply shocks that can generate a unit root in hours wor ..."
Abstract
-
Cited by 35 (4 self)
The time series fit of dynamic stochastic general equilibrium (DSGE) models often suffers from restrictions on the long-run dynamics that are at odds with the data. This paper modifies a stochastic growth model by incorporating permanent labor supply shocks that can generate a unit root in hours worked. Using Bayesian methods we estimate the standard specification in which hours worked are stationary and a modified version with permanent labor supply shocks. If firms can freely adjust labor inputs, the data support the latter specification. Once we introduce frictions in terms of labor adjustment costs, the overall time series fit improves and the model specification in which labor supply shocks and hours worked are stationary is preferred.
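As a stylized illustration (not the paper's exact specification), a permanent labor supply shock of the kind described can be introduced as a random walk in the preference weight on leisure,

\[
\log B_t = \log B_{t-1} + \varepsilon_{B,t}, \qquad \varepsilon_{B,t} \sim N(0, \sigma_B^2),
\]

which transmits a unit root to equilibrium hours worked unless adjustment frictions dampen its pass-through.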
Time-Varying Parameter VAR Model with Stochastic Volatility: An Overview of Methodology and Empirical Applications
- IMES Discussion Paper Series, Institute for Monetary and Economic Studies, Bank of Japan
, 2011
"... You can download this and other papers at the IMES Web site: ..."
Abstract
-
Cited by 18 (5 self)
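No abstract is available in this listing; for orientation, the canonical time-varying parameter VAR with stochastic volatility that such overviews cover takes a form along the lines of (generic notation, not necessarily the paper's):

\[
y_t = X_t \beta_t + A_t^{-1} \Sigma_t\, \epsilon_t, \qquad
\beta_t = \beta_{t-1} + u_t, \qquad
\log \sigma_{i,t} = \log \sigma_{i,t-1} + \eta_{i,t},
\]

with the drifting coefficients \(\beta_t\), the free elements of the lower-triangular \(A_t\), and the log volatilities on the diagonal of \(\Sigma_t\) all following random walks and estimated jointly by MCMC.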
Robust Priors in Nonlinear Panel Data Models
"... Many approaches to estimation of panel models are based on an average or integrated likelihood that assigns weights to di erent values of the individual e ects. Fixed e ects, random e ects, and Bayesian approaches all fall in this category. We provide a characterization of the class of weights (or p ..."
Abstract
-
Cited by 17 (0 self)
Many approaches to estimation of panel models are based on an average or integrated likelihood that assigns weights to different values of the individual effects. Fixed effects, random effects, and Bayesian approaches all fall in this category. We provide a characterization of the class of weights (or priors) that produce estimators that are first-order unbiased. We show that such bias-reducing weights will depend on the data in general unless an orthogonal reparameterization or an essentially equivalent condition is available. Two intuitively appealing weighting schemes are discussed. We argue that asymptotically valid confidence intervals can be read from the posterior distribution of the common parameters when N and T grow at the same rate. Next, we show that random effects estimators are not bias reducing in general and discuss important exceptions. Moreover, the bias depends on the Kullback-Leibler distance between the population distribution of the effects and its best approximation in the random effects family. Finally, we show that in general standard random effects estimation of marginal effects is inconsistent for large T, whereas the posterior mean of the marginal effect is large-T consistent, and we provide conditions for bias reduction. Some examples and Monte Carlo experiments illustrate the results.
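In generic notation, the integrated (or average) likelihood referred to in the abstract weights out the individual effects,

\[
L_i(\theta) = \int f(y_i \mid \alpha_i, \theta)\, \pi(\alpha_i)\, d\alpha_i,
\]

and the characterization concerns which weight functions \(\pi\) (which, per the abstract, generally must depend on the data) deliver estimators of the common parameters \(\theta\) that are first-order unbiased.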
Wearout effects of different advertising themes . . .
, 2007
"... Models of advertising response implicitly assume that the entire advertising budget is spent on disseminating one message. In practice, managers use different themes of advertising (for example, price advertisements versus product advertisements) and within each theme they employ different versions ..."
Abstract
-
Cited by 15 (1 self)
Models of advertising response implicitly assume that the entire advertising budget is spent on disseminating one message. In practice, managers use different themes of advertising (for example, price advertisements versus product advertisements) and within each theme they employ different versions of an advertisement. In this study, we evaluate the dynamic effects of different themes of advertising that have been employed in a campaign. We develop a model that jointly considers the effects of wearout as well as that of forgetting in the context of an advertising campaign that employs five different advertising themes. We quantify the differential wearout effects across the different themes of advertising and examine the interaction effects between the different themes using a Bayesian dynamic linear model (DLM). Such a response model can help managers decide on the optimal allocation of resources across the portfolio of ads as well as better manage their scheduling. We develop a model to show how our response model parameters can be used to improve the effectiveness of advertising budget allocation across different themes. We find that a reallocation of resources across different themes according to our model results in a significant improvement in demand.
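The Bayesian dynamic linear model mentioned in the abstract has, in its standard observation/state form (the paper's exact specification is not reproduced here),

\[
y_t = F_t' \theta_t + v_t, \quad v_t \sim N(0, V_t), \qquad
\theta_t = G_t \theta_{t-1} + w_t, \quad w_t \sim N(0, W_t),
\]

where the state vector \(\theta_t\) would carry the time-varying effectiveness of each advertising theme, with wearout and forgetting entering through the evolution matrix \(G_t\) and the state noise.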
The wages of BMI: Bayesian analysis of a skewed treatment-response model with nonparametric endogeneity
- Journal of Applied Econometrics
, 2008
"... We generalize the specifications used in previous studies of the effect of body mass index (BMI) on earnings by allowing the potentially endogenous BMI variable to enter the log wage equation nonparametrically. We introduce a Bayesian posterior simula-tor for fitting our model that permits a nonpara ..."
Abstract
-
Cited by 11 (1 self)
We generalize the specifications used in previous studies of the effect of body mass index (BMI) on earnings by allowing the potentially endogenous BMI variable to enter the log wage equation nonparametrically. We introduce a Bayesian posterior simulator for fitting our model that permits a nonparametric treatment of the endogenous BMI variable, flexibly accommodates skew in the BMI distribution, and whose implementation requires only Gibbs steps. Using data from the 1970 British Cohort Study, our results indicate the presence of nonlinearities in the relationships between BMI and log wages that differ across men and women, and also suggest the importance of unobserved confounding for our sample of males. We thank and acknowledge the UK Data Archive (University of Essex, Colchester, UK) for use of data from the 1970 British Cohort Study. They bear no responsibility for the analysis or interpretation of this data. All errors are, of course, our own.
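A stylized two-equation sketch of the kind of treatment-response setup described (hypothetical notation, not the paper's exact model) is

\[
\log w_i = g(b_i) + x_i'\beta + u_i, \qquad b_i = z_i'\gamma + v_i,
\]

with the BMI function \(g(\cdot)\) treated nonparametrically, \((u_i, v_i)\) allowed to be correlated so that BMI \(b_i\) is endogenous, and the distribution of \(v_i\) permitted to be skewed; the abstract's point is that the posterior simulator for this kind of structure can be implemented with Gibbs steps only.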
Nonlinear Panel Data Analysis
- Annual Review of Economics
, 2011
"... Nonlinear panel data models arise naturally in economic applications, yet their analysis is challenging. Here we provide a progress report on some recent advances in the area. We start by reviewing the properties of random-effects likelihood approaches. We emphasize a link with Bayesian computation ..."
Abstract
-
Cited by 9 (0 self)
Nonlinear panel data models arise naturally in economic applications, yet their analysis is challenging. Here we provide a progress report on some recent advances in the area. We start by reviewing the properties of random-effects likelihood approaches. We emphasize a link with Bayesian computation and Markov Chain Monte Carlo, which provides a convenient approach to estimation and inference. Relaxing parametric assumptions on the distribution of individual effects raises serious identification problems. In discrete choice models, common parameters and average marginal effects are generally set-identified. The availability of continuous outcomes, however, provides opportunities for point-identification. We end the paper by reviewing recent progress on non-fixed-T approaches. In panel applications where the time dimension is not negligible relative to the size of the cross-section, it makes sense to view the estimation problem as one of time-series finite-sample bias. Several perspectives on bias reduction are now available. We review their properties, with a special emphasis on random-effects methods. JEL codes: C23.
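As a minimal, self-contained illustration of the Bayesian-computation link emphasized in the abstract (a toy Gaussian random-effects panel, not any model from the paper), a Gibbs sampler can alternate between the individual effects and the common parameters:

    import numpy as np

    # Toy Gaussian random-effects panel (hypothetical illustration):
    #   y_{it} = alpha_i + eps_{it},  alpha_i ~ N(mu, tau^2),  eps_{it} ~ N(0, sigma^2)
    # Priors: flat on mu, weak inverse-gamma IG(2, 2) on tau^2 and sigma^2.
    rng = np.random.default_rng(0)

    N, T = 200, 8
    alpha_true = rng.normal(1.0, 0.7, N)
    y = alpha_true[:, None] + rng.normal(0.0, 1.0, (N, T))

    mu, tau2, sigma2 = 0.0, 1.0, 1.0
    draws = []
    for it in range(2000):
        # alpha_i | rest: conjugate normal update combining T observations with the N(mu, tau^2) weight
        prec = T / sigma2 + 1.0 / tau2
        mean = (y.sum(axis=1) / sigma2 + mu / tau2) / prec
        alpha = rng.normal(mean, np.sqrt(1.0 / prec))

        # mu | rest: normal posterior under a flat prior
        mu = rng.normal(alpha.mean(), np.sqrt(tau2 / N))

        # tau^2 | rest: inverse-gamma, sampled as b / Gamma(a)
        tau2 = (2.0 + 0.5 * np.sum((alpha - mu) ** 2)) / rng.gamma(2.0 + N / 2.0)

        # sigma^2 | rest: inverse-gamma
        sigma2 = (2.0 + 0.5 * np.sum((y - alpha[:, None]) ** 2)) / rng.gamma(2.0 + N * T / 2.0)

        if it >= 500:  # keep post-burn-in draws
            draws.append((mu, np.sqrt(tau2), np.sqrt(sigma2)))

    print(np.mean(draws, axis=0))  # posterior means of (mu, tau, sigma); roughly (1.0, 0.7, 1.0)

The same alternating-conditional logic carries over to nonlinear models, with Metropolis-Hastings steps replacing the conjugate draws where closed-form conditionals are unavailable.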