Results 1–10 of 59
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
 In Bayesian Statistics
, 1992
Abstract

Cited by 583 (14 self)
Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model with informative priors, and in the Tobit-censored regression model.
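The convergence problem this abstract describes can be made concrete with a small numerical sketch. The batch-means estimator below is only a crude stand-in for the spectral-density-at-zero estimator the paper actually develops; the AR(1) chain and all function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def numerical_standard_error(chain, num_batches=20):
    """Batch-means estimate of the Monte Carlo standard error of the
    sample mean of a correlated chain. Each batch mean is roughly
    independent when batches are long, so the spread of batch means
    reflects the chain's autocorrelation."""
    n = len(chain) // num_batches * num_batches
    batch_means = np.asarray(chain[:n]).reshape(num_batches, -1).mean(axis=1)
    return batch_means.std(ddof=1) / np.sqrt(num_batches)

# An AR(1) chain mimicking autocorrelated Gibbs output.
rng = np.random.default_rng(0)
x = np.empty(10_000)
x[0] = 0.0
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()

nse = numerical_standard_error(x)
# The naive i.i.d. formula understates the error for a correlated chain.
naive = x.std(ddof=1) / np.sqrt(len(x))
```

Because the draws are positively autocorrelated, the batch-means error estimate comes out several times larger than the naive i.i.d. formula, which is exactly the gap the paper's diagnostics are designed to quantify.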
Using simulation methods for Bayesian econometric models: Inference, development and communication
 Econometric Review
, 1999
Abstract

Cited by 356 (19 self)
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models. *This paper was originally prepared for the Australasian meetings of the Econometric Society in Melbourne, Australia.
Error Bands for Impulse Responses
 Econometrica
, 1999
Abstract

Cited by 172 (5 self)
We show how to correctly extend known methods for generating error bands in reduced-form VARs to overidentified models. We argue that the conventional pointwise bands common in the literature should be supplemented with measures of shape uncertainty, and we show how to generate such measures. We focus on bands that characterize the shape of the likelihood. Such bands are not classical confidence regions. We explain that classical confidence regions mix information about parameter location with information about model fit, and hence can be misleading as summaries of the implications of the data for the location of parameters. Because classical confidence regions also present conceptual and computational problems in multivariate time series models, we suggest that likelihood-based bands, rather than approximate confidence bands based on asymptotic theory, be standard in reporting results for this type of model.
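The conventional pointwise bands the authors criticize are easy to state in code, which makes the contrast with their likelihood-characterizing bands concrete. Everything below (the AR(1) impulse response, the parameter values, the function name) is an illustrative sketch, not the authors' procedure.

```python
import numpy as np

def pointwise_bands(irf_draws, coverage=0.68):
    """Conventional pointwise bands: at each horizon, take quantiles
    across posterior draws of the impulse response. These summarize
    horizon-by-horizon location but say nothing about the joint shape
    of the response, which is the paper's central objection."""
    lo = (1.0 - coverage) / 2.0
    return np.quantile(irf_draws, [lo, 0.5, 1.0 - lo], axis=0)

# Illustrative posterior: uncertainty about an AR(1) coefficient rho
# induces uncertainty about the impulse response rho**h at horizon h.
rng = np.random.default_rng(0)
rhos = rng.normal(0.9, 0.03, size=1000)
horizons = np.arange(16)
irf_draws = rhos[:, None] ** horizons[None, :]   # shape (draws, horizons)
lo_band, median, hi_band = pointwise_bands(irf_draws)
```

Note that a path running along `lo_band` at every horizon need not resemble any single posterior draw, which is one way of seeing why the paper asks for supplementary measures of shape uncertainty.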
Monte Carlo simulation and numerical integration
, 1996
Abstract

Cited by 78 (13 self)
not be cited or distributed without the author's permission. Section 7 of this draft is incomplete. Suggestions and corrections will be gratefully received and acknowledged.
Methods for Approximating Integrals in Statistics with Special Emphasis on Bayesian Integration Problems
 Statistical Science
Abstract

Cited by 49 (5 self)
This paper is a survey of the major techniques and approaches available for the numerical approximation of integrals in statistics. We classify these into five broad categories, namely: asymptotic methods, importance sampling, adaptive importance sampling, multiple quadrature, and Markov chain methods. Each method is discussed, giving an outline of the basic supporting theory and particular features of the technique. Conclusions are drawn concerning the relative merits of the methods based on the discussion and their application to three examples. The following broad recommendations are made. Asymptotic methods should only be considered in contexts where the integrand has a dominant peak with approximate ellipsoidal symmetry. Importance sampling, and preferably adaptive importance sampling, based on a multivariate Student-t should be used instead of asymptotic methods in such a context. Multiple quadrature, and in particular subregion adaptive integration, are the algorithms of choice for...
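The survey's recommendation of importance sampling with a heavy-tailed Student-t proposal can be sketched in a few lines. The toy "posterior" below is a standard normal so the answers are known; the target, sample size, and function names are assumptions for illustration only.

```python
import math
import numpy as np

def log_t_pdf(x, df):
    """Log density of a standard Student-t with df degrees of freedom."""
    c = (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
         - 0.5 * math.log(df * math.pi))
    return c - (df + 1) / 2 * np.log1p(x * x / df)

def importance_sample(log_target, n=200_000, df=5, seed=0):
    """Self-normalized importance sampling with a Student-t proposal,
    the heavy-tailed proposal family the survey recommends. Returns the
    estimated first and second moments and the effective sample size."""
    rng = np.random.default_rng(seed)
    draws = rng.standard_t(df, size=n)
    log_w = log_target(draws) - log_t_pdf(draws, df)
    w = np.exp(log_w - log_w.max())      # stabilize before normalizing
    w /= w.sum()
    mean = np.sum(w * draws)
    second = np.sum(w * draws**2)
    ess = 1.0 / np.sum(w**2)             # effective sample size
    return mean, second, ess

# Toy target: a standard normal, so E[theta] = 0 and E[theta^2] = 1.
log_target = lambda x: -0.5 * x**2
mean, second, ess = importance_sample(log_target)
```

Because the t(5) proposal has heavier tails than the normal target, the importance weights stay bounded and the effective sample size remains a large fraction of `n`; reversing the roles (light-tailed proposal, heavy-tailed target) is the classic failure mode the survey warns against.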
Bayesian Inference in Asset Pricing Tests
, 1990
Abstract

Cited by 31 (4 self)
We test the mean-variance efficiency of a given portfolio using a Bayesian framework. Our test is more direct than Shanken's (1987b), because we impose a prior on all the parameters of the multivariate regression model. The approach is also easily adapted to other problems. We use Monte Carlo numerical integration to accurately evaluate 90-dimensional integrals. Posterior-odds ratios are calculated for 12 industry portfolios from 1926–1987. The sensitivity of the inferences to the prior is investigated by using three different distributions. The probability that the given portfolio is mean-variance efficient is small for a range of plausible priors.
Resampling Algorithms for Particle Filters: A Computational Complexity Perspective
Abstract

Cited by 24 (2 self)
Newly developed resampling algorithms for particle filters suitable for real-time implementation are described and their analysis is presented. The new algorithms reduce the complexity of both hardware and DSP realization by addressing common issues such as decreasing the number of operations and memory accesses. Moreover, the algorithms allow for the use of higher sampling frequencies by overlapping the resampling step in time with the other particle filtering steps. Since resampling is not dependent on any particular application, the analysis is appropriate for all types of particle filters that use resampling. The performance of the algorithms is evaluated on particle filters applied to bearings-only tracking and joint detection and estimation in wireless communications. We have demonstrated that the proposed algorithms reduce the complexity without performance degradation. Key words: particle filters, resampling, computational complexity, sequential implementation
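To see why resampling complexity matters, consider systematic resampling, a standard low-cost scheme from this literature (a generic textbook version, not necessarily one of the paper's new algorithms): a single uniform draw places N evenly spaced pointers on the cumulative sum of the weights, giving one O(N) pass with sequential memory access.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling. `weights` must be normalized to sum to 1.
    One random number serves all N particles, and the scan over the
    cumulative weights is a single sequential pass, which is what makes
    this scheme friendly to hardware/DSP implementations."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cum = np.cumsum(weights)
    cum[-1] = 1.0                      # guard against floating-point round-off
    return np.searchsorted(cum, positions)

rng = np.random.default_rng(1)
weights = np.array([0.5, 0.25, 0.125, 0.125])
idx = systematic_resample(weights, rng)
# The number of copies of particle i is always floor(n*w_i) or
# ceil(n*w_i); here particle 0 (weight 0.5, n = 4) is kept exactly twice.
```

The bounded copy counts also mean lower resampling variance than plain multinomial resampling, at the same or lower cost.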
Architectures for Efficient Implementation of Particle Filters
, 2004
Abstract

Cited by 22 (0 self)
Particle filters are sequential Monte Carlo methods that are used in numerous problems where time-varying signals must be presented in real time and where the objective is to estimate various unknowns of the signal and/or detect events described by the signals. The standard solutions of such problems in many applications are based on Kalman filters or extended Kalman filters. In situations when the problems are nonlinear or the noise that distorts the signals is non-Gaussian, the Kalman filters provide a solution that may be far from optimal. Particle filters are an intriguing alternative to Kalman filters due to their excellent performance in very difficult problems, including communications, signal processing, navigation, and computer vision. Hence, particle filters have been the focus of wide research recently, and an immense literature can be found on their theory. Most of these works recognize the complexity and computational intensity of these filters, but there has been no effort directed toward the implementation of these filters in hardware. The objective of this dissertation is to develop, design, and build efficient hardware for particle filters, and thereby bring them closer to practical applications. The fact that particle filters outperform most of the traditional filtering methods in many complex practical scenarios, coupled with the challenges related to decreasing their computational complexity and improving real-time performance, makes this work worthwhile.
Nonlinear and Non-Gaussian State-Space Modeling with Monte Carlo Techniques: A Survey and Comparative Study
 In Rao, C., & Shanbhag, D. (Eds.), Handbook of Statistics
, 2000
Abstract

Cited by 22 (4 self)
Since Kitagawa (1987) and Kramer and Sorenson (1988) proposed the filter and smoother using numerical integration, numerous nonlinear and/or non-Gaussian state estimation methods have been developed. Numerical integration becomes extremely computer-intensive in the higher-dimensional cases of the state vector. To address this problem, sampling techniques such as Monte Carlo integration with importance sampling, resampling, rejection sampling, Markov chain Monte Carlo, and so on are utilized, since they can be easily applied to multidimensional cases. Thus, in the last decade, several kinds of nonlinear and non-Gaussian filters and smoothers have been proposed using various computational techniques. The objective of this paper is to introduce the nonlinear and non-Gaussian filters and smoothers which can be applied to any nonlinear and/or non-Gaussian case. Moreover, by Monte Carlo studies, each procedure is compared using the root mean square error criterion.
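A minimal worked instance of the sampling-based filters this survey compares is the bootstrap (sampling/importance resampling) filter, sketched here on the nonlinear benchmark model that is standard in this literature. The particle count, seeds, and function names are illustrative assumptions, not the survey's own setup.

```python
import numpy as np

def bootstrap_filter(ys, n_particles=2000, seed=0):
    """Bootstrap particle filter for the classic nonlinear benchmark:
        x_t = 0.5 x_{t-1} + 25 x_{t-1}/(1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t
        y_t = x_t^2 / 20 + w_t,   v_t ~ N(0, 10), w_t ~ N(0, 1).
    Returns the filtered posterior mean of x_t at each time step."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(5.0), n_particles)
    means = []
    for t, y in enumerate(ys, start=1):
        # 1. Propagate each particle through the state equation.
        x = (0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)
             + rng.normal(0.0, np.sqrt(10.0), n_particles))
        # 2. Weight particles by the likelihood of the new observation.
        log_w = -0.5 * (y - x**2 / 20.0) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))
        # 3. Resample to fight weight degeneracy.
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(means)

# Simulate T = 50 observations from the same model and run the filter.
rng = np.random.default_rng(42)
T, xt = 50, rng.normal(0.0, np.sqrt(5.0))
ys = np.empty(T)
for t in range(1, T + 1):
    xt = (0.5 * xt + 25.0 * xt / (1.0 + xt**2) + 8.0 * np.cos(1.2 * t)
          + rng.normal(0.0, np.sqrt(10.0)))
    ys[t - 1] = xt**2 / 20.0 + rng.normal()
filtered_means = bootstrap_filter(ys)
```

Because the observation `y_t = x_t^2/20 + w_t` is both nonlinear and sign-blind, a Kalman or extended Kalman filter handles this model poorly, which is why it is the usual test case for the Monte Carlo filters the survey reviews.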
Priors for macroeconomic time series and their application
, 1992
Abstract

Cited by 20 (1 self)
This paper takes up Bayesian inference in a general trend-stationary model for macroeconomic time series with independent Student-t disturbances. The model is linear in the data, but nonlinear in parameters. An informative but non-conjugate family of prior distributions for the parameters is introduced, indexed by a single parameter which can be readily elicited. The main technical contribution is the construction of posterior moments, densities, and odds ratios using a six-step Gibbs sampler. Mappings from the index parameter of the family of prior distributions to posterior moments, densities, and odds ratios are developed for several of the Nelson-Plosser time series. These mappings show that the posterior distribution is not even approximately Gaussian, and indicate the sensitivity of the posterior odds ratio in favor of difference stationarity to the choice of the prior distribution.
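The mechanics of a Gibbs sampler like the six-step one mentioned here can be illustrated with a toy two-block version for a normal model with unknown mean and variance under a standard noninformative prior. This is a pedagogical stand-in, not the paper's sampler (which handles Student-t errors and trend parameters); all names and numbers below are assumptions.

```python
import numpy as np

def gibbs_normal(y, n_iter=5000, seed=0):
    """Two-block Gibbs sampler for y_i ~ N(mu, s2) under the prior
    p(mu, s2) proportional to 1/s2. Alternates draws from the two full
    conditionals:
        mu | s2, y ~ N(ybar, s2/n)
        s2 | mu, y ~ Inv-Gamma(n/2, sum((y - mu)^2)/2)
    Returns an (n_iter, 2) array of (mu, s2) draws."""
    rng = np.random.default_rng(seed)
    n, ybar = len(y), float(np.mean(y))
    mu, s2 = ybar, float(np.var(y))
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        mu = rng.normal(ybar, np.sqrt(s2 / n))
        # Inverse-gamma draw via b / Gamma(a, 1).
        s2 = np.sum((y - mu) ** 2) / 2.0 / rng.gamma(n / 2.0)
        draws[i] = mu, s2
    return draws

# Simulated data with known truth: mu = 3, s2 = 4.
y = np.random.default_rng(7).normal(3.0, 2.0, size=200)
draws = gibbs_normal(y)
```

Each full conditional is a standard distribution, so every sweep is cheap; the paper's six-step sampler applies the same alternating-conditional idea to a harder, non-conjugate posterior where the joint distribution cannot be sampled directly.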