Results 1 - 10 of 448
The Dynamic Effects of Neutral and Investment-Specific Technology Shocks
- Journal of Political Economy
Abstract - Cited by 184 (2 self)
The neoclassical growth model is used to identify the short-run effects of two technology shocks. Neutral shocks affect the production of all goods homogeneously, and investment-specific shocks affect only investment goods. The paper finds that previous estimates, based on considering only neutral technical change, substantially understate the effects of technology shocks. When investment-specific technical change is taken into account, the two technology shocks combined account for 40-60% of the fluctuations in output and hours at business cycle frequencies. The two shocks also account for more than 50% of the forecast error of output and hours over an eight-year horizon. The investment-specific shocks account for the majority of these short-run effects. This paper is a substantial revision to “Technology Shocks Matter.” Thanks to Lisa Barrow, Lawrence ...
From “Hindu Growth” to Productivity Surge: The Mystery of the Indian Growth Transition
- IMF Working Paper No. 04/77 (Washington: International Monetary Fund)
, 2004
Abstract - Cited by 112 (11 self)
This Working Paper should not be reported as representing the views of the IMF. The views expressed in this Working Paper are those of the author(s) and do not necessarily represent those of the IMF or IMF policy. Working Papers describe research in progress by the author(s) and are published to elicit comments and to further debate. This paper explores the causes of India’s productivity surge around 1980, more than a decade before serious economic reforms were initiated. Trade liberalization, expansionary demand, a favorable external environment, or improved agricultural performance did not play a role. We find evidence that the trigger may have been an attitudinal shift by the government in the early 1980s, which, unlike the reforms of the 1990s, was pro-business rather than pro-market in character, favoring the interests of existing business rather than new entrants or consumers. A relatively small shift elicited a large productivity response because India was far away from its income possibility frontier. Registered manufacturing, which had been built up in previous decades, played an important role in determining which states took advantage of the changed environment.
Market Timing and Return Prediction Under Model Instability
- Journal of Empirical Finance
, 2002
Abstract - Cited by 63 (10 self)
Despite mounting empirical evidence to the contrary, the literature on the predictability of stock returns almost uniformly assumes a time-invariant relationship between state variables and returns. In this paper we propose a two-stage approach for forecasting financial return series that are subject to breaks. The first stage adopts a reversed ordered CUSUM (ROC) procedure to determine in real time when the most recent break occurred. In the second stage, post-break data are used to estimate the parameters of the forecasting model. We compare this approach to existing alternatives for dealing with parameter instability, such as the Bai-Perron method and the time-varying parameter model. An out-of-sample forecasting experiment demonstrates considerable gains in market timing precision from adopting the proposed two-stage forecasting method.
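The first stage of the procedure described in this abstract can be illustrated with a stylized reversed-ordered CUSUM on a mean-only model. This is a simplified sketch of the idea, not the authors' exact ROC procedure; the function name and the mean-only specification are ours, though the 0.948 boundary constant is the familiar 5% CUSUM value:

```python
import numpy as np

def most_recent_break(y, crit=0.948):
    """Stage one (stylized): reverse the series, accumulate standardized
    recursive residuals of a mean-only model, and flag the first crossing
    of a linear CUSUM boundary. Returns an estimated date of the most
    recent break in original time, or None if no crossing occurs."""
    z = y[::-1]                       # reversed ordering: newest data first
    n = len(z)
    # recursive residuals: each observation minus the mean of those before it
    resid = np.array([z[t] - z[:t].mean() for t in range(1, n)])
    sigma = resid.std(ddof=1)
    cusum = np.cumsum(resid) / (sigma * np.sqrt(n - 1))
    bound = crit * (1.0 + 2.0 * np.arange(1, n) / (n - 1))
    hits = np.nonzero(np.abs(cusum) > bound)[0]
    if hits.size == 0:
        return None                   # no break detected
    return int(n - 2 - hits[0])       # map the crossing back to original time
```

Stage two would then estimate the forecasting model on `y[bk + 1:]` only. Note the crossing necessarily lags the true break a little in reversed time, so the post-break window is conservative.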
Dealing with Structural Breaks
- in Palgrave Handbook of Econometrics
, 2006
Abstract - Cited by 62 (8 self)
This chapter is concerned with methodological issues related to estimation, testing and computation in the context of structural changes in linear models. A central theme of the review is the interplay between structural changes and unit roots, and methods to distinguish between the two. The topics covered are: methods related to estimation and inference about break dates for single equations with or without restrictions, with extensions to multi-equation systems where allowance is also made for changes in the variability of the shocks; tests for structural changes, including tests for single or multiple changes, tests valid with unit root or trending regressors, and tests for changes in the trend function of a series that can be integrated or trend-stationary; testing for a unit root versus trend-stationarity in the presence of structural changes in the trend function; testing for cointegration in the presence of structural changes; and issues related to long memory and level shifts. Our focus is on the conceptual issues about the frameworks adopted and the assumptions imposed as they relate to potential applicability. We also highlight the potential problems that can occur with commonly used methods and recent work that has been done to overcome them.
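As a concrete instance of the single-break tests surveyed in this chapter, a sup-F (Quandt-Andrews style) statistic takes the largest Chow-type F over all admissible break dates. The sketch below is illustrative Python, not code from the handbook; the trimming fraction and names are ours, and the statistic's non-standard critical values are not computed here:

```python
import numpy as np

def sup_f(y, X, trim=0.15):
    """Sup-F statistic for a single break in a linear regression:
    the largest Chow-type F over all admissible break dates, with
    `trim` trimming at both ends. Returns (statistic, break date)."""
    n, k = X.shape

    def ssr(Xs, ys):                  # sum of squared OLS residuals
        b, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        e = ys - Xs @ b
        return float(e @ e)

    ssr_restricted = ssr(X, y)        # no-break (restricted) fit
    lo, hi = int(trim * n), int((1 - trim) * n)
    best_f, best_t = -np.inf, None
    for t in range(lo, hi):           # Chow F at each candidate date t
        ssr_split = ssr(X[:t], y[:t]) + ssr(X[t:], y[t:])
        f = ((ssr_restricted - ssr_split) / k) / (ssr_split / (n - 2 * k))
        if f > best_f:
            best_f, best_t = f, t
    return best_f, best_t
```

The maximizing date doubles as a least-squares estimate of the break location, which is why trimming at both ends is needed to keep each sub-sample estimable.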
Disaggregate Evidence of the Persistence of Consumer Price Inflation
- Journal of Applied Econometrics
, 2006
Abstract - Cited by 46 (0 self)
Todd Clark is a vice president and economist at the Federal Reserve Bank of Kansas City. The views expressed herein are those of the author and do not necessarily reflect the views of the ...
The predictive content of the output gap for inflation: resolving in-sample and out-of-sample evidence
- Journal of Money, Credit, and Banking
, 2006
Abstract - Cited by 38 (5 self)
Todd Clark is vice president and economist at the Federal Reserve Bank of Kansas City and ...
Pruned Dynamic Programming for Optimal Multiple Change-Point Detection
- arXiv preprint arXiv:1004.0887
, 2010
Abstract - Cited by 35 (6 self)
Multiple change-point detection models assume that the observed data is a realization of an independent random process affected by K − 1 abrupt changes, called change-points, at unknown positions. For off-line detection, a dynamic programming (DP) algorithm retrieves the K − 1 change-points minimizing the quadratic loss and reduces the complexity from Θ(n^K) to Θ(Kn²), where n is the number of observations. The quadratic complexity in n still restricts the use of such an algorithm to small or intermediate values of n. We propose a pruned DP algorithm that recovers the optimal solution. We demonstrate that at worst the complexity is in O(Kn²) time and O(Kn) space, and is therefore at worst equivalent to the classical DP. We show empirically that the run-time of our proposed algorithm is drastically reduced compared to the classical DP algorithm. More precisely, our algorithm is able to process a million points in a matter of minutes, compared to several days with the classical DP algorithm. Moreover, the principle of the proposed algorithm can be extended to other convex losses (for example the Poisson loss), and as the algorithm processes one observation after the other, it could be adapted for on-line problems.
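The classical Θ(Kn²) recursion that this abstract takes as its baseline can be sketched in a few lines. This is the unpruned baseline, not the authors' pruned algorithm, and the function name is ours; cumulative sums make each segment cost O(1):

```python
import numpy as np

def segment_dp(y, K):
    """Classical O(K n^2) dynamic program: best segmentation of y into K
    segments under quadratic loss, using cumulative sums so each segment
    cost is O(1). Returns the sorted K - 1 change-point positions."""
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def cost(i, j):                   # quadratic loss of fitting y[i:j] by its mean
        m = (s1[j] - s1[i]) / (j - i)
        return (s2[j] - s2[i]) - (j - i) * m * m

    INF = float("inf")
    dp = np.full((K + 1, n + 1), INF)   # dp[k, j]: best loss for y[:j] in k segments
    arg = np.zeros((K + 1, n + 1), dtype=int)
    dp[0, 0] = 0.0
    for k in range(1, K + 1):
        for j in range(k, n + 1):       # last segment is y[i:j]
            for i in range(k - 1, j):
                if dp[k - 1, i] == INF:
                    continue
                c = dp[k - 1, i] + cost(i, j)
                if c < dp[k, j]:
                    dp[k, j], arg[k, j] = c, i
    cps, j = [], n                      # backtrack the optimal split points
    for k in range(K, 1, -1):
        j = arg[k, j]
        cps.append(int(j))
    return sorted(cps)
```

The pruned version keeps the same recursion but discards candidate split points i that can be proven never to be optimal again, which is what brings the practical run-time down.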
Econometric Computing with HC and HAC Covariance Matrix Estimators
- Journal of Statistical Software
, 2004
Abstract - Cited by 34 (2 self)
This introduction to the R package sandwich is a (slightly) modified version of Zeileis (2004), published in the Journal of Statistical Software. Data described by econometric models typically contains autocorrelation and/or heteroskedasticity of unknown form and for inference in such models it is essential to use covariance matrix estimators that can consistently estimate the covariance of the model parameters. Hence, suitable heteroskedasticity-consistent (HC) and heteroskedasticity and autocorrelation consistent (HAC) estimators have been receiving attention in the econometric literature over the last 20 years. To apply these estimators in practice, an implementation is needed that preferably translates the conceptual properties of the underlying theoretical frameworks into computational tools. In this paper, such an implementation in the package sandwich in the R system for statistical computing is described and it is shown how the suggested functions provide reusable components that build on readily existing functionality and how they can be integrated easily into new inferential procedures or applications. The toolbox contained in sandwich is extremely flexible and comprehensive, including specific functions for the most important HC and HAC estimators from the econometric literature. Several real-world data sets are used to illustrate how the functionality can be integrated into applications.
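The HC estimators described here have a compact closed form: (X′X)⁻¹ [X′ diag(û²) X] (X′X)⁻¹, with HC1 adding an n/(n − k) degrees-of-freedom adjustment. Below is a NumPy sketch of that formula, intended to mirror what `sandwich::vcovHC(model, type = "HC1")` computes in R; it is our own sketch under that assumption, not code from the package:

```python
import numpy as np

def ols_hc1(X, y):
    """OLS coefficients with an HC1 sandwich covariance:
    (X'X)^{-1} [X' diag(u^2) X] (X'X)^{-1}, scaled by n/(n - k),
    where u are the OLS residuals."""
    n, k = X.shape
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    u = y - X @ beta                      # OLS residuals
    bread = np.linalg.inv(X.T @ X)        # the "bread" of the sandwich
    meat = X.T @ (X * (u ** 2)[:, None])  # the "meat": X' diag(u^2) X
    vcov = n / (n - k) * bread @ meat @ bread
    return beta, vcov
```

Robust standard errors are then `np.sqrt(np.diag(vcov))`; the point estimates are unchanged, only the covariance differs from the classical σ²(X′X)⁻¹.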
Model-Based Recursive Partitioning
- Journal of Computational and Graphical Statistics
, 2008
Abstract - Cited by 34 (17 self)