Results 1–10 of 101
Bayesian Model Assessment In Factor Analysis
, 2004
"... Factor analysis has been one of the most powerful and flexible tools for assessment of multivariate dependence and codependence. Loosely speaking, it could be argued that the origin of its success rests in its very exploratory nature, where various kinds of datarelationships amongst the variable ..."
Abstract

Cited by 103 (10 self)
Factor analysis has been one of the most powerful and flexible tools for assessment of multivariate dependence and codependence. Loosely speaking, it could be argued that the origin of its success rests in its very exploratory nature, where various kinds of data relationships amongst the variables under study can be iteratively verified and/or refuted. Bayesian inference in factor analytic models has received renewed attention in recent years, partly due to computational advances but also partly due to applied focuses generating factor structures, as exemplified by recent work in financial time series modeling. The focus of our current work is on exploring questions of uncertainty about the number of latent factors in a multivariate factor model, combined with methodological and computational issues of model specification and model fitting. We explore reversible jump MCMC methods that build on sets of parallel Gibbs sampling-based analyses to generate suitable empirical proposal distributions and that address the challenging problem of finding efficient proposals in high-dimensional models. Alternative MCMC methods based on bridge sampling are discussed, and these fully Bayesian MCMC approaches are compared with a collection of popular model selection methods in empirical studies.
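As a minimal reminder of the model class this abstract concerns (the constants below are illustrative, not from the paper): a k-factor model writes each p-vector observation as y = Λf + ε with latent f ~ N(0, I) and diagonal idiosyncratic variance Ψ, which implies cov(y) = ΛΛᵀ + Ψ. A short simulation checks that identity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative k-factor model: y = Lambda @ f + e, with latent factors
# f ~ N(0, I_k) and idiosyncratic noise e ~ N(0, Psi), Psi diagonal.
p, k, n = 6, 2, 200_000
Lambda = rng.normal(size=(p, k))
Psi = np.diag(rng.uniform(0.1, 0.5, size=p))

f = rng.normal(size=(n, k))
e = rng.multivariate_normal(np.zeros(p), Psi, size=n)
Y = f @ Lambda.T + e

# The model implies cov(y) = Lambda Lambda^T + Psi; the sample covariance
# of simulated data should be close to that for large n.
implied = Lambda @ Lambda.T + Psi
sample = np.cov(Y, rowvar=False)
print(np.max(np.abs(sample - implied)))  # small for large n
```

The uncertainty the paper addresses is over k itself, which the factorization ΛΛᵀ + Ψ does not reveal directly; hence the reversible jump machinery.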
Bayesian Dynamic Factor Models and Portfolio Allocation
 Journal of Business and Economic Statistics
, 2000
"... This article is available in electronic form on the ISDS web site, http://www.stat.duke.edu ..."
Abstract

Cited by 100 (7 self)
This article is available in electronic form on the ISDS web site, http://www.stat.duke.edu
Bayesian dynamic factor models and variance matrix discounting for portfolio allocation
 Journal of Business and Economic Statistics
, 2000
"... We discuss the development of dynamic factor models for multivariate nancial time series, and the incorporation of stochastic volatility components for latent factor processes. Bayesian inference and computation is developed and explored in a study of the dynamic factor structure of daily spot excha ..."
Abstract

Cited by 62 (9 self)
We discuss the development of dynamic factor models for multivariate financial time series, and the incorporation of stochastic volatility components for latent factor processes. Bayesian inference and computation is developed and explored in a study of the dynamic factor structure of daily spot exchange rates for a selection of international currencies. The models are direct generalisations of univariate stochastic volatility models, and represent specific varieties of models recently discussed in the growing multivariate stochastic volatility literature. We also discuss connections and comparisons with the much simpler method of dynamic variance discounting that, for over a decade, has been a standard approach in applied financial econometrics in the Bayesian forecasting world. We review empirical findings in applying these models to the exchange rate series, including aspects of model performance in dynamic portfolio allocation. We conclude with comments on the potential practical utility of structured factor models and future potential developments and model extensions.
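The univariate building block the abstract mentions can be sketched as follows (parameter values are illustrative, not the paper's): log-variance h_t follows a stationary AR(1) and returns are r_t = exp(h_t/2)·ε_t, which produces serially uncorrelated returns with autocorrelated squares (volatility clustering):

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal univariate stochastic-volatility simulation: log-variance h_t
# follows a stationary AR(1); returns are r_t = exp(h_t / 2) * eps_t.
T, mu, phi, sig = 50_000, -1.0, 0.95, 0.2
h = np.empty(T)
h[0] = mu + sig / np.sqrt(1 - phi**2) * rng.normal()  # draw from stationary dist
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sig * rng.normal()
r = np.exp(h / 2) * rng.normal(size=T)

def acf1(x):
    """Lag-1 sample autocorrelation."""
    x = x - x.mean()
    return float((x[:-1] * x[1:]).mean() / (x * x).mean())

# Returns themselves are serially uncorrelated, but squared returns are
# autocorrelated: the volatility-clustering signature.
print(acf1(r), acf1(r**2))  # first near 0, second clearly positive
```

The paper's models drive several series at once through latent factor processes of this kind, rather than one h_t per series.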
Lightweight emulators for multivariate deterministic functions
 Forthcoming in the Journal of Computational and Graphical Statistics
, 2007
"... An emulator is a statistical model of a deterministic function, to be used where the function itself is too expensive to evaluate withintheloop of an inferential calculation. Typically, emulators are deployed when dealing with complex functions that have large and heterogeneous input and output sp ..."
Abstract

Cited by 42 (8 self)
An emulator is a statistical model of a deterministic function, to be used where the function itself is too expensive to evaluate within the loop of an inferential calculation. Typically, emulators are deployed when dealing with complex functions that have large and heterogeneous input and output spaces: environmental models, for example. In this challenging situation we should be sceptical about our statistical models, no matter how sophisticated, and adopt approaches that prioritise interpretative and diagnostic information, and the flexibility to respond. This paper presents one such approach, candidly rejecting the standard smooth Gaussian process approach in favour of a fully Bayesian treatment of multivariate regression which, by permitting sequential updating, allows for very detailed predictive diagnostics. It is argued directly and by illustration that the incoherence of such a treatment (which does not impose continuity on the model outputs) is more than compensated for by the wealth of available information, and the possibilities for generalisation.
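The sequential-updating property that makes observation-by-observation diagnostics cheap can be illustrated with a deliberately minimal stand-in (my own sketch, not the paper's emulator): conjugate Bayesian linear regression with known noise variance, where processing observations one at a time in natural parameters reproduces the batch posterior exactly:

```python
import numpy as np

rng = np.random.default_rng(2)

# Conjugate Bayesian linear regression, prior beta ~ N(0, I), known noise
# variance. Posterior tracked in natural parameters (precision, precision-
# weighted mean) so each observation is a cheap rank-one update.
d, n, noise_var = 3, 100, 0.5
X = rng.normal(size=(n, d))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=np.sqrt(noise_var), size=n)

P = np.eye(d)              # posterior precision, starts at the prior
b = np.zeros(d)            # precision-weighted posterior mean
for xi, yi in zip(X, y):   # one rank-one update per observation
    P += np.outer(xi, xi) / noise_var
    b += xi * yi / noise_var
seq_mean = np.linalg.solve(P, b)

# Batch posterior from all n observations at once: identical.
batch_P = np.eye(d) + X.T @ X / noise_var
batch_mean = np.linalg.solve(batch_P, X.T @ y / noise_var)
print(np.allclose(seq_mean, batch_mean))  # True
```

Between updates, the current posterior yields a predictive distribution for the next output, which is exactly where the paper's predictive diagnostics live.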
Bayesian Inference in Asset Pricing Tests
, 1990
"... We test the meanvariance efficiency of a given portfolio using a Bayesian framework. Our test is more direct than Shanken's (1987b), because we impose a prior on all the parameters of the multivariate regression model. The approach is also easily adapted to other problems. We use Monte Carlo n ..."
Abstract

Cited by 31 (4 self)
We test the mean-variance efficiency of a given portfolio using a Bayesian framework. Our test is more direct than Shanken's (1987b), because we impose a prior on all the parameters of the multivariate regression model. The approach is also easily adapted to other problems. We use Monte Carlo numerical integration to accurately evaluate 90-dimensional integrals. Posterior-odds ratios are calculated for 12 industry portfolios from 1926–1987. The sensitivity of the inferences to the prior is investigated by using three different distributions. The probability that the given portfolio is mean-variance efficient is small for a range of plausible priors.
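The core computational step, Monte Carlo integration of a marginal likelihood over the prior, can be shown in a toy problem where the answer is available in closed form (a hypothetical one-dimensional stand-in for the paper's 90-dimensional integrals): y_i ~ N(θ, 1) with prior θ ~ N(0, 1), so marginally y ~ N(0, I + 11ᵀ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy marginal likelihood: y_i ~ N(theta, 1), prior theta ~ N(0, 1).
n = 5
y = rng.normal(loc=0.4, size=n)

# Exact value: marginally y ~ N(0, I + 11^T), with det = 1 + n and
# Sigma^{-1} = I - 11^T / (1 + n).
exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(1 + n)
         - 0.5 * (y @ y - y.sum() ** 2 / (1 + n)))

# Simple Monte Carlo over the prior: p(y) ~= average over theta ~ N(0, 1)
# of prod_i N(y_i | theta, 1).
theta = rng.normal(size=200_000)
loglik = (-0.5 * n * np.log(2 * np.pi)
          - 0.5 * ((y[None, :] - theta[:, None]) ** 2).sum(axis=1))
mc = np.log(np.mean(np.exp(loglik)))
print(float(exact), float(mc))  # the two values should be close
```

Posterior-odds ratios are then ratios of such marginal likelihoods under competing restrictions; in 90 dimensions the naive prior-sampling estimator above would be far too crude, which is why the paper's numerical integration needs care.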
The choice of variables in multivariate regression: a nonconjugate Bayesian decision theory approach
, 1999
"... INTRODUCTION Choice of regressor variables in linear regression has attracted considerable attention in the literature, from forward, backward and stepwise regression, model choice criteria such as Akaike's information criterion, to Bayesian techniques. We will focus on the Bayesian Univers ..."
Abstract

Cited by 28 (2 self)
INTRODUCTION: Choice of regressor variables in linear regression has attracted considerable attention in the literature, from forward, backward and stepwise regression, through model choice criteria such as Akaike's information criterion, to Bayesian techniques. We will focus on the Bayesian decision theory framework, first given by Lindley (1968) for univariate multiple regression, where costs attach to the inclusion of regressor variables. Here it is required to predict a future vector observation Y_f comprising r components. Predictions are judged by quadratic loss, to which is added a cost penalty on the regressor variables, x_f
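The decision-theoretic criterion can be caricatured as follows (a deliberately non-Bayesian sketch with made-up constants, standing in for the paper's posterior computations: expected quadratic loss is replaced by a held-out mean squared error): score each subset of regressors by estimated predictive loss plus a cost c per included variable, then choose the minimising subset:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)

# Data where only the first three of five regressors matter.
n, p, c = 400, 5, 0.05
X = rng.normal(size=(n, p))
beta = np.array([1.5, -1.0, 0.8, 0.0, 0.0])
y = X @ beta + rng.normal(size=n)

train, val = slice(0, 200), slice(200, 400)

def score(S):
    """Estimated predictive quadratic loss plus a per-variable cost."""
    S = list(S)
    if not S:
        return float(np.mean(y[val] ** 2))
    coef, *_ = np.linalg.lstsq(X[train][:, S], y[train], rcond=None)
    mse = float(np.mean((y[val] - X[val][:, S] @ coef) ** 2))
    return mse + c * len(S)

subsets = [S for k in range(p + 1) for S in combinations(range(p), k)]
best = min(subsets, key=score)
print(best)  # expect the three truly relevant variables (0, 1, 2)
```

In the paper, the loss estimate comes from the posterior predictive distribution of Y_f rather than a validation split, but the cost-versus-loss trade-off being optimised is the same shape.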
Posterior simulation and Bayes factors in panel count data models
 Journal of Econometrics 86 (1998), 33–54
, 1998
"... This paper is concerned with the problems of posterior simulation and model choice for Poisson panel data models with multiple random effects. Efficient algorithms based on Markov chain Monte Carlo methods for sampling the posterior distribution are developed. A new parameterization of the random e¤ ..."
Abstract

Cited by 26 (2 self)
This paper is concerned with the problems of posterior simulation and model choice for Poisson panel data models with multiple random effects. Efficient algorithms based on Markov chain Monte Carlo methods for sampling the posterior distribution are developed. A new parameterization of the random effects and fixed effects is proposed and compared with a parameterization in common use, and computation of marginal likelihoods and Bayes factors via Chib's (1995) method is also considered. The methods are illustrated with two real data applications involving large samples and multiple random effects.
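For orientation, the model class (not the paper's algorithm; all constants below are illustrative) is a Poisson panel with a unit-level random intercept a_i ~ N(0, τ²). The unit likelihood integrates over a_i; here plain Monte Carlo over the random-effect prior performs that one-dimensional integral:

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(6)

# Poisson panel: y_it ~ Poisson(exp(a_i + beta * x_it)), a_i ~ N(0, tau^2).
N, T, beta, tau = 30, 6, 0.5, 0.7
x = rng.normal(size=(N, T))
a = rng.normal(scale=tau, size=N)
y = rng.poisson(np.exp(a[:, None] + beta * x))

def unit_loglik(i, beta, tau, draws=20_000):
    """Monte Carlo estimate of log p(y_i), integrating out a_i."""
    aa = rng.normal(scale=tau, size=draws)[:, None]
    lam = np.exp(aa + beta * x[i])                    # draws x T Poisson rates
    lp = ((y[i] * np.log(lam) - lam).sum(axis=1)
          - sum(lgamma(v + 1) for v in y[i]))         # log Poisson pmf product
    return np.log(np.mean(np.exp(lp)))

ll = sum(unit_loglik(i, beta, tau) for i in range(N))
print(ll)  # marginal log-likelihood at the true parameters
```

The paper's MCMC samples (β, τ, a) jointly and obtains marginal likelihoods by Chib's identity rather than by this brute-force integration, which is shown only to fix ideas.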
Reference analysis
 In Handbook of Statistics 25
, 2005
"... This chapter describes reference analysis, a method to produce Bayesian inferential statements which only depend on the assumed model and the available data. Statistical information theory is used to define the reference prior function as a mathematical description of that situation where data would ..."
Abstract

Cited by 24 (3 self)
This chapter describes reference analysis, a method to produce Bayesian inferential statements which only depend on the assumed model and the available data. Statistical information theory is used to define the reference prior function as a mathematical description of that situation where data would best dominate prior knowledge about the quantity of interest. Reference priors are not descriptions of personal beliefs; they are proposed as formal consensus prior functions to be used as standards for scientific communication. Reference posteriors are obtained by formal use of Bayes theorem with a reference prior. Reference prediction is achieved by integration with a reference posterior. Reference decisions are derived by minimizing a reference posterior expected loss. An information theory based loss function, the intrinsic discrepancy, may be used to derive reference procedures for conventional inference problems in scientific investigation, such as point estimation, region estimation and hypothesis testing.
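A one-parameter special case fixes ideas (this is the standard textbook fact, not the chapter's general construction): for a Bernoulli parameter the reference prior coincides with the Jeffreys prior Beta(1/2, 1/2), so s successes in n trials give the reference posterior Beta(s + 1/2, n − s + 1/2):

```python
# Reference (= Jeffreys) analysis for a Bernoulli parameter.
# With s successes in n trials, the reference posterior is
# Beta(s + 1/2, n - s + 1/2); the numbers below are illustrative.
s, n = 7, 20
a, b = s + 0.5, n - s + 0.5

post_mean = a / (a + b)   # reference posterior mean
mle = s / n               # maximum-likelihood estimate, for comparison
print(post_mean, mle)
```

The posterior mean is pulled only slightly away from the MLE, reflecting the prior's role as a formal consensus standard rather than a statement of belief.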
A Bayesian Approach to Blind Source Separation
, 1999
"... This paper adopts a Bayesian statistical approach and a linear synthesis model ..."
Abstract

Cited by 23 (8 self)
This paper adopts a Bayesian statistical approach and a linear synthesis model
A Bayesian Approach To Source Separation
 in Proceedings of The Nineteenth International Conference on Maximum Entropy and Bayesian Methods
, 1999
"... Source separation is one of the signal processing's main emerging domain. Many techniques such as maximum likelihood (ML), Infomax, cumulant matching, estimating function, etc. have been used to address this difficult problem. Unfortunately, up to now, many of these methods could not account co ..."
Abstract

Cited by 22 (5 self)
Source separation is one of signal processing's main emerging domains. Many techniques, such as maximum likelihood (ML), Infomax, cumulant matching and estimating functions, have been used to address this difficult problem. Unfortunately, up to now, many of these methods could not account completely for noise on the data, for differing numbers of sources and sensors, for lack of spatial independence, or for time correlation of the sources. Recently, the Bayesian approach has been used to push these limitations of the conventional methods further back. This paper proposes a unifying approach to source separation based on Bayesian estimation. We first show that this approach makes it possible to explain the major known source separation techniques as special cases. Then we propose new methods based on maximum a posteriori (MAP) estimation, either to estimate the sources directly, or the mixing matrices, or even both. Key words: source separation, Bayesian estimation.
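The simplest MAP case can be written down in closed form (a sketch under assumptions the paper relaxes: known mixing matrix A, i.i.d. Gaussian sources and noise). With y = As + n, s ~ N(0, I), n ~ N(0, σ²I), the MAP source estimate is the ridge/Wiener solution:

```python
import numpy as np

rng = np.random.default_rng(5)

# Noisy linear mixing y = A s + n with Gaussian source and noise priors.
# MAP estimate: s_hat = (A^T A / s2 + I)^{-1} A^T y / s2.
m, k, s2 = 8, 3, 0.01          # sensors, sources, noise variance
A = rng.normal(size=(m, k))    # mixing matrix, assumed known here
s = rng.normal(size=k)
y = A @ s + rng.normal(scale=np.sqrt(s2), size=m)

s_hat = np.linalg.solve(A.T @ A / s2 + np.eye(k), A.T @ y / s2)
print(np.abs(s_hat - s).max())  # small when the noise is weak
```

The paper's contribution lies in the harder settings: unknown mixing matrix, non-Gaussian or time-correlated sources, and mismatched numbers of sources and sensors, where no such closed form exists.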