Results 1–10 of 69
The properties of automatic Gets modelling
, 2003
Cited by 62 (24 self)
Abstract:
We describe some recent developments in PcGets, and consider their impact on its performance across different (unknown) states of nature. We discuss the consistency of its selection procedures, and examine the extent to which model selection is non-distortionary at relevant sample sizes. The problems posed in judging performance on collinear data are noted. We also describe how PcGets has been extended to assist non-experts in model formulation, handle more variables than observations and tackle nonlinear models.
Searching for the causal structure of a vector autoregression
 Oxford Bulletin of Economics and Statistics
, 2003
Cited by 55 (2 self)
Abstract:
We provide an accessible introduction to graph-theoretic methods for causal analysis. Building on the work of Swanson and Granger (Journal of the American Statistical Association, Vol. 92, pp. 357–367, 1997), and generalizing to a larger class of models, we show how to apply graph-theoretic methods to selecting the causal order for a structural vector autoregression (SVAR). We evaluate the PC (causal search) algorithm in a Monte Carlo study. The PC algorithm uses tests of conditional independence to select among the possible causal orders – or at least to reduce the admissible causal orders to a narrow equivalence class. Our findings suggest that graph-theoretic methods may prove to be a useful tool in the analysis of SVARs.
I. The problem of causal order
Drawing on recent work on the graph-theoretic analysis of causality, we propose and evaluate a statistical procedure for identifying the contemporaneous causal order of a structural vector autoregression. *We thank Marcus Cuda for his help with programming and computational design, Derek Stimel and Ryan Brady for able research assistance, and Oscar Jorda, Stephen Perez, and the participants ...
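The core primitive of the PC algorithm described above is a test of conditional independence. A minimal sketch of that primitive is given below; this is not the authors' implementation, and the chain example and names are illustrative assumptions. For Gaussian data, conditional independence reduces to a zero partial correlation, which can be tested with Fisher's z-transform.

```python
# Hypothetical sketch of the conditional-independence test at the heart
# of the PC algorithm: partial correlation via OLS residuals, tested
# with Fisher's z-transform. Not the paper's actual code.
import numpy as np
from math import atanh, sqrt, erf

def partial_corr(data, i, j, cond):
    """Partial correlation of columns i and j given the columns in cond,
    computed from residuals of OLS regressions on the conditioning set."""
    if cond:
        Z = np.column_stack([data[:, c] for c in cond])
        Z = np.column_stack([Z, np.ones(len(data))])  # intercept
        ri = data[:, i] - Z @ np.linalg.lstsq(Z, data[:, i], rcond=None)[0]
        rj = data[:, j] - Z @ np.linalg.lstsq(Z, data[:, j], rcond=None)[0]
    else:
        ri, rj = data[:, i], data[:, j]
    return np.corrcoef(ri, rj)[0, 1]

def ci_test(data, i, j, cond, alpha=0.05):
    """True if x_i and x_j are judged conditionally independent given cond."""
    n = len(data)
    r = partial_corr(data, i, j, cond)
    z = sqrt(n - len(cond) - 3) * abs(atanh(r))   # Fisher z statistic
    p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))    # two-sided normal p-value
    return p > alpha

# Toy causal chain x0 -> x1 -> x2: x0 and x2 are dependent marginally
# but independent once x1 is conditioned on.
rng = np.random.default_rng(1)
x0 = rng.standard_normal(2000)
x1 = 0.8 * x0 + rng.standard_normal(2000)
x2 = 0.8 * x1 + rng.standard_normal(2000)
data = np.column_stack([x0, x1, x2])

print(ci_test(data, 0, 2, []))   # dependent marginally -> False
print(ci_test(data, 0, 2, [1]))  # typically judged independent given x1
```

A PC-style search would apply such tests over growing conditioning sets to prune edges and orient the remaining causal skeleton.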
Truth and Robustness in Cross-country Growth Regressions
 Oxford Bulletin of Economics and Statistics
, 2004
Cited by 47 (3 self)
Abstract:
... an earlier draft. We also thank Orley Ashenfelter for his help in getting this project off the ground. The work of Levine and Renelt (1992) and Sala-i-Martin (1997a, b), which attempted to test the robustness of various determinants of growth rates of per capita GDP among countries using two variants of Edward Leamer’s extreme-bounds analysis, is re-examined. In a realistic Monte Carlo experiment in which the universe of potential determinants is drawn from those in Levine and Renelt’s study, both versions of the extreme-bounds analysis are evaluated for their ability to recover the true specification. Levine and Renelt’s method is shown to have low size and extremely low power: nothing is robust; while Sala-i-Martin’s method is shown to have high size and high power: it is undiscriminating. Both methods are compared to a cross-sectional version of the general-to-specific search methodology associated with the LSE approach to econometrics. The general-to-specific method is shown to have size near the nominal size and high power. Sala-i-Martin’s method and the general-to-specific method are then applied to the actual data from the original two studies. The results are consistent with the Monte Carlo results and suggest that the factors that most affect differences of growth rates are ones that are beyond the control of policymakers.
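The extreme-bounds procedure evaluated above can be sketched roughly as follows; this is an illustrative toy with invented data and names, not the code used in the cited studies. The outcome is regressed on a focus variable plus every small subset of controls, and the focus variable is called robust when its extreme bounds (min and max of the coefficient plus or minus two standard errors) share a sign.

```python
# Hedged sketch of Leamer-style extreme-bounds analysis: sweep over
# control subsets, record the extreme bounds of the focus coefficient.
# Data-generating process and variable names are illustrative assumptions.
from itertools import combinations
import numpy as np

def ols_beta_se(X, y, col=0):
    """OLS coefficient and standard error for one column of X."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - k)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[col, col])
    return beta[col], se

def extreme_bounds(y, focus, controls, max_controls=2):
    """Extreme bounds of the focus coefficient over small control subsets."""
    lo, hi = np.inf, -np.inf
    idx = range(controls.shape[1])
    for m in range(max_controls + 1):
        for subset in combinations(idx, m):
            X = np.column_stack([focus, controls[:, list(subset)],
                                 np.ones(len(y))])
            b, se = ols_beta_se(X, y)
            lo, hi = min(lo, b - 2 * se), max(hi, b + 2 * se)
    return lo, hi   # "robust" if lo and hi share a sign

rng = np.random.default_rng(2)
n = 400
controls = rng.standard_normal((n, 4))
focus = rng.standard_normal(n)
y = 1.0 * focus + 0.5 * controls[:, 0] + rng.standard_normal(n)

lo, hi = extreme_bounds(y, focus, controls)
print(lo, hi)   # both bounds positive here: the focus variable is robust
```

The paper's Monte Carlo point is that this robustness criterion can be either too strict (nothing survives) or too loose (everything survives), depending on the variant used.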
New developments in automatic general-to-specific modelling
 In B. P. Stigum (Ed.), Econometrics and the Philosophy of Economics, 379–419
, 2003
Cited by 41 (16 self)
Abstract:
We consider the analytic basis for PcGets, an Ox package implementing automatic general-to-specific (Gets) modelling for linear, dynamic regression models. PcGets mimics the theory of reduction, whereby empirical models arise by commencing from a general congruent specification, which is then simplified to a minimal representation consistent with the desired selection criteria and the data evidence. We discuss the properties of PcGets, since results to date suggest that model selection can be relatively non-distortionary, even when the mechanism is unknown, and contrast Gets with possible alternatives.
Linear models, smooth transition autoregressions, and neural networks for forecasting macroeconomic time series: A reexamination
 International Journal of Forecasting
, 2005
Cited by 31 (2 self)
Abstract:
In this paper, we examine the forecast accuracy of linear autoregressive, smooth transition autoregressive (STAR), and neural network (NN) time series models for 47 monthly macroeconomic variables of the G7 economies. Unlike previous studies that typically consider multiple but fixed model specifications, we use a single but dynamic specification for each model class. The point forecast results indicate that the STAR model generally outperforms linear autoregressive models. It also improves upon several fixed STAR models, demonstrating that careful specification of nonlinear time series models is of crucial importance. The results for neural network models are mixed in the sense that at long forecast horizons, an NN model obtained using Bayesian regularization produces more accurate forecasts than a corresponding model specified using the specific-to-general approach. Reasons for this outcome are discussed.
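The STAR model class compared above lets the autoregressive coefficient move smoothly between two regimes as a logistic function of the lagged level. A minimal simulation and one-step forecast sketch follows; the parameter values and function names are illustrative assumptions, not those estimated in the paper.

```python
# Minimal sketch of a logistic STAR (LSTAR) model:
#   y_t = [phi1*(1-G) + phi2*G] * y_{t-1} + e_t,
#   G(s) = 1 / (1 + exp(-gamma*(s - c)))   (logistic transition)
# All parameter values are illustrative assumptions.
import numpy as np

def lstar_simulate(n, phi1=0.9, phi2=-0.5, gamma=5.0, c=0.0, seed=0):
    """Simulate n observations from the LSTAR law above."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        G = 1.0 / (1.0 + np.exp(-gamma * (y[t - 1] - c)))
        y[t] = (phi1 * (1 - G) + phi2 * G) * y[t - 1] + rng.standard_normal()
    return y

def lstar_one_step_forecast(y_last, phi1=0.9, phi2=-0.5, gamma=5.0, c=0.0):
    """Point forecast of y_{t+1} given y_t under the same LSTAR law."""
    G = 1.0 / (1.0 + np.exp(-gamma * (y_last - c)))
    return (phi1 * (1 - G) + phi2 * G) * y_last

y = lstar_simulate(500)
print(lstar_one_step_forecast(y[-1]))
```

The "dynamic specification" the authors advocate amounts to re-selecting the transition variable and lag structure each period rather than fixing them once for the whole forecast exercise.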
General-to-specific reductions of Vector Autoregressive Processes
 Econometric Studies – A Festschrift in Honour of Joachim Frohn
, 2001
Cited by 23 (17 self)
Abstract:
Unrestricted reduced-form vector autoregressive (VAR) models have become a dominant research strategy in empirical macroeconomics since the Sims (1980) critique of traditional macroeconometric modeling. They are, however, subject to the curse of dimensionality. In this paper we propose general-to-specific reductions of VAR models and consider computer-automated model selection algorithms embodied in PcGets (see Krolzig and Hendry, 2000) for doing so. Starting from the unrestricted VAR, standard testing procedures eliminate statistically insignificant variables, with diagnostic tests checking the validity of reductions, ensuring a congruent final selection. Since joint selection and diagnostic testing eludes theoretical analysis, we evaluate the proposed strategy by simulation. The Monte Carlo experiments show that PcGets recovers the DGP specification from a large unrestricted VAR model with size and power close to those obtained when commencing from the DGP itself. The application of the proposed reduction strategy to a US monetary system demonstrates the feasibility of PcGets for the analysis of large macroeconomic data sets.
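The elimination step described above can be caricatured with a single-path backward search; PcGets itself runs multi-path searches with diagnostic tests, so the sketch below is a deliberately simplified assumption, with invented data and names.

```python
# Hypothetical single-path sketch of a general-to-specific (Gets)
# reduction: repeatedly drop the least significant regressor until
# every remaining |t| exceeds a critical value. PcGets is far richer
# (multi-path search, diagnostic checks); this is illustrative only.
import numpy as np

def ols_t_stats(X, y):
    """OLS coefficients and t-statistics."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - k)
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

def gets_reduce(X, y, names, t_crit=1.96):
    """Drop the smallest-|t| regressor until all |t| > t_crit."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        _, t = ols_t_stats(X[:, keep], y)
        worst = int(np.argmin(np.abs(t)))
        if abs(t[worst]) > t_crit:
            break
        keep.pop(worst)
    return [names[i] for i in keep]

rng = np.random.default_rng(0)
n = 500
X = rng.standard_normal((n, 5))   # x0..x4 candidate regressors
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(n)  # DGP: x0, x2

selected = gets_reduce(X, y, ["x0", "x1", "x2", "x3", "x4"])
print(selected)   # the true regressors x0 and x2 survive the reduction
```

The paper's simulation question is precisely how often such a search recovers the DGP variables, and at what cost in falsely retained ones, when the starting point is a large unrestricted VAR.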
A Flexible Tool for Model Building: the Relevant Transformation of the Inputs Network Approach (RETINA)
 Oxford Bulletin of Economics and Statistics
, 2003
Cited by 18 (3 self)
Abstract:
A new method, called the Relevant Transformation of the Inputs Network Approach, is proposed as a tool for model building. It is designed around flexibility (with nonlinear transformations of the predictors of interest), selective search within the range of possible models, out-of-sample forecasting ability and computational simplicity. In tests on simulated data, it shows both a high rate of successful retrieval of the data generating process, which increases with the sample size, and good performance relative to other alternative procedures. A telephone service demand model is built to show how the procedure applies to real data.
Empirical Econometric Modelling
 PcGive
, 2007
Cited by 17 (6 self)
Abstract:
The theory of reduction explains the origins of empirical models, by delineating all the steps involved in mapping from the actual data generation process (DGP) in the economy – far too complicated and high-dimensional ever to be completely modeled – to an empirical model thereof. Each reduction step involves a potential loss of information from: aggregating, marginalizing, conditioning, approximating, and truncating, leading to a ‘local’ DGP which is the actual generating process in the space of variables under analysis. Tests of losses from many of the reduction steps are feasible. Models that show no losses are deemed congruent; those that explain rival models are called encompassing. The main reductions correspond to well-established econometric concepts (causality, exogeneity, invariance, innovations, etc.) which are the null hypotheses of the misspecification tests, so the theory has considerable excess content. General-to-specific (Gets) modelling seeks to mimic reduction by commencing from a general congruent specification that is simplified to a minimal representation consistent with the desired criteria and the data evidence (essentially represented by the local DGP). However, in small data samples, model selection is difficult. We reconsider model selection from a computer-automation ...
General-to-Specific Model Selection Procedures for Structural Vector Autoregressions
 Oxford Bulletin of Economics and Statistics
, 2003
Revisiting useful approaches to data-rich macroeconomic forecasting
 Federal Reserve Bank of New York
, 2008
Cited by 13 (3 self)
Abstract:
This paper presents preliminary findings and is being distributed to economists and other interested readers solely to stimulate discussion and elicit comments. The views expressed in this paper are those of the authors and are not necessarily reflective of views at the Federal Reserve Bank of New York or the Federal Reserve System. Any errors or omissions are the responsibility of the authors.