Results 1–10 of 37
Understanding predictive information criteria for Bayesian models
, 2013
"... We review the Akaike, deviance, and WatanabeAkaike information criteria from a Bayesian perspective, where the goal is to estimate expected outofsampleprediction error using a biascorrectedadjustmentofwithinsampleerror. Wefocusonthechoicesinvolvedinsettingupthese measures, and we compare them i ..."
Abstract

Cited by 14 (2 self)
We review the Akaike, deviance, and Watanabe-Akaike information criteria from a Bayesian perspective, where the goal is to estimate expected out-of-sample prediction error using a bias-corrected adjustment of within-sample error. We focus on the choices involved in setting up these measures, and we compare them in three simple examples, one theoretical and two applied. The contribution of this paper is to put all these information criteria into a Bayesian predictive context and to better understand, through small examples, how these methods can apply in practice.
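The criteria reviewed above all reduce to functions of a pointwise log-likelihood matrix evaluated over posterior draws. As a minimal sketch of the WAIC computation (the function name and toy input are illustrative, not taken from the paper):

```python
import numpy as np

def waic(log_lik):
    """WAIC from an (S draws x n observations) pointwise log-likelihood matrix.

    lppd penalized by the posterior variance of the pointwise log-likelihood,
    reported on the deviance scale (multiplied by -2).
    """
    # log pointwise predictive density: log of the posterior-mean likelihood per point
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    # effective number of parameters: sum of posterior variances of the log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)
```

With draws from an actual fit, the same matrix is also the starting point for the DIC and AIC-style comparisons the paper makes.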
Performance of Bayesian Model Selection Criteria for Gaussian Mixture Models.
, 2009
"... ..."
(Show Context)
A Bayesian Poisson Vector Autoregression Model
, 2011
"... Multivariate count models are rare in political science, despite the presence of many count time series. This article develops a new Bayesian Poisson vector autoregression (BaPVAR) model that can characterize endogenous dynamic counts with no restrictions on the contemporaneous correlations. Impuls ..."
Abstract

Cited by 2 (1 self)
Multivariate count models are rare in political science, despite the presence of many count time series. This article develops a new Bayesian Poisson vector autoregression (BaPVAR) model that can characterize endogenous dynamic counts with no restrictions on the contemporaneous correlations. Impulse responses, decomposition of the forecast errors, and dynamic multiplier methods for the effects of exogenous covariate shocks are illustrated for the model. Two full illustrations of the model, its interpretations, and results are presented. The first example is a dynamic model that reanalyzes the patterns and predictors of superpower rivalry events. The second example applies the model to analyze the dynamics of transnational terrorist targeting decisions between 1968 and 2008. The latter example's results have direct implications for contemporary policy about terrorists' targeting that are both novel and innovative in the study of terrorism. This study was funded by the US Department of Homeland Security (DHS) through the Center for Risk
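The dynamics described here can be imitated with a generic log-linear Poisson autoregression. The sketch below is only that generic flavour; the intercepts, coefficient matrix, and log1p lag transform are illustrative assumptions, not the BaPVAR specification of the article:

```python
import numpy as np

rng = np.random.default_rng(1)
k, T = 2, 200                          # two count series, 200 periods
c = np.array([0.5, 0.3])               # intercepts
A = np.array([[0.4, 0.1],              # cross-series autoregressive coefficients
              [0.2, 0.3]])             # on the lagged (damped) log counts

y = np.zeros((T, k), dtype=np.int64)
y[0] = [1, 1]
for t in range(1, T):
    # conditional mean: log-linear in the damped lagged counts
    lam = np.exp(c + A @ np.log1p(y[t - 1]))
    y[t] = rng.poisson(lam)
```

Impulse responses could then be traced by shocking one series and re-simulating; the sublinear log1p feedback keeps the simulated counts stable.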
theory and application
"... Statistical models are the traditional choice to test scientific theories when observations, processes or boundary conditions are subject to stochasticity. Many important systems in ecology and biology, however, are difficult to capture with statistical models. Stochastic simulation models offer an ..."
Abstract

Cited by 2 (0 self)
Statistical models are the traditional choice to test scientific theories when observations, processes or boundary conditions are subject to stochasticity. Many important systems in ecology and biology, however, are difficult to capture with statistical models. Stochastic simulation models offer an alternative, but they were hitherto associated with a major disadvantage: their likelihood functions usually cannot be calculated explicitly, and thus it is difficult to couple them to well-established statistical theory such as maximum likelihood and Bayesian statistics. A number of new methods, among them Approximate Bayesian Computing and Pattern-Oriented Modelling, bypass this limitation. These methods share three main principles: aggregation of simulated and observed data via summary statistics, likelihood approximation based on the summary statistics, and efficient sampling. We discuss principles as well as advantages and caveats of these methods, and demonstrate their potential for integrating stochastic simulation models into a unified framework for statistical modelling.
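The three shared principles listed in this abstract (summary statistics, likelihood approximation, sampling) are most visible in plain rejection ABC. A toy sketch, with the model, summary statistic, and tolerance all chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.normal(2.0, 1.0, size=50)        # "observed" data with unknown mean
s_obs = y_obs.mean()                          # summary statistic

accepted = []
for _ in range(20000):
    theta = rng.uniform(-5, 5)                # draw a parameter from the prior
    y_sim = rng.normal(theta, 1.0, size=50)   # simulate; no likelihood is ever evaluated
    if abs(y_sim.mean() - s_obs) < 0.05:      # keep draws whose summary matches
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))     # approximate posterior mean
```

Accepted draws approximate the posterior of theta; shrinking the tolerance trades acceptance rate for accuracy, which is where the efficient-sampling principle comes in.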
Inherent difficulties of non-Bayesian likelihood-based inference, as revealed by an examination of a recent book by Aitkin (with a reply from the author)
 Statistics & Risk Modeling
"... For many decades, statisticians have made attempts to prepare the Bayesian omelette without breaking the Bayesian eggs; that is, to obtain probabilistic likelihoodbased inferences without relying on informative prior distributions. A recent example is Murray Aitkin’s recent book, Statistical Infere ..."
Abstract

Cited by 1 (1 self)
For many decades, statisticians have made attempts to prepare the Bayesian omelette without breaking the Bayesian eggs; that is, to obtain probabilistic likelihood-based inferences without relying on informative prior distributions. A recent example is Murray Aitkin's book, Statistical Inference, which presents an approach to statistical hypothesis testing based on comparisons of posterior distributions of likelihoods under competing models. Aitkin develops and illustrates his method using some simple examples of inference from iid data and two-way tests of independence. We analyze in this note some consequences of the inferential paradigm adopted therein, discussing why the approach is incompatible with a Bayesian perspective and why we do not find it relevant for applied work.
Bayesian inference for generalized linear mixed models
 doi:10.1093/biostatistics/kxp053
"... Generalized linear mixed models (GLMMs) continue to grow in popularity due to their ability to directly acknowledge multiple levels of dependency and model different data types. For small sample sizes especially, likelihoodbased inference can be unreliable with variance components being particular ..."
Abstract
Generalized linear mixed models (GLMMs) continue to grow in popularity due to their ability to directly acknowledge multiple levels of dependency and model different data types. For small sample sizes especially, likelihood-based inference can be unreliable, with variance components being particularly difficult to estimate. A Bayesian approach is appealing but has been hampered by the lack of a fast implementation, and the difficulty in specifying prior distributions, with variance components again being particularly problematic. Here, we briefly review previous approaches to computation in Bayesian implementations of GLMMs and illustrate in detail the use of integrated nested Laplace approximations in this context. We consider a number of examples, carefully specifying prior distributions on meaningful quantities in each case. The examples cover a wide range of data types, including those requiring smoothing over time and a relatively complicated spline model for which we examine our prior specification in terms of the implied degrees of freedom. We conclude that Bayesian inference is now practically feasible for GLMMs and provides an attractive alternative to likelihood-based approaches such as penalized quasi-likelihood. As with likelihood-based approaches, great care is required in the analysis of clustered binary data, since approximation strategies may be less accurate for such data.
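The Laplace step underlying integrated nested Laplace approximations replaces a posterior by a Gaussian centred at its mode, with variance taken from the curvature there. A one-parameter sketch of that idea, using a Poisson rate with a conjugate Gamma prior so the exact posterior is available for comparison (all data and prior values are illustrative):

```python
import numpy as np
from scipy import optimize, stats

y = np.array([3, 5, 4, 6, 2])      # Poisson counts
a, b = 2.0, 1.0                    # Gamma(shape=a, rate=b) prior on the rate

def neg_log_post(lam):
    # unnormalized negative log posterior: Poisson likelihood times Gamma prior
    return -((a - 1 + y.sum()) * np.log(lam) - (b + len(y)) * lam)

res = optimize.minimize_scalar(neg_log_post, bounds=(1e-6, 50.0), method="bounded")
mode = res.x                                   # posterior mode, analytically 3.5 here
curv = (a - 1 + y.sum()) / mode**2             # negative second derivative at the mode
laplace = stats.norm(loc=mode, scale=1.0 / np.sqrt(curv))       # Laplace approximation
exact = stats.gamma(a + y.sum(), scale=1.0 / (b + len(y)))      # conjugate posterior
```

INLA nests such Gaussian approximations over a grid of hyperparameter values, which is the fast alternative to MCMC the abstract refers to.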
unknown title
, 2012
"... An application of Bayesian growth mixture modelling to estimate infection incidences from repeated serological tests ..."
Abstract
An application of Bayesian growth mixture modelling to estimate infection incidences from repeated serological tests
Longitudinal mixed-effects models for latent cognitive function
 Statistical Modelling 2015; 15(4): 366–387
, 2013
"... Abstract: A mixedeffects regression model with a bentcable changepoint predictor is formulated to describe potential decline of cognitive function over time in the older population. For the individual trajectories, cognitive function is considered to be a latent variable measured through an item ..."
Abstract
A mixed-effects regression model with a bent-cable change-point predictor is formulated to describe potential decline of cognitive function over time in the older population. For the individual trajectories, cognitive function is considered to be a latent variable measured through an item response theory model given longitudinal test data. Individual-specific parameters are defined for both cognitive function and the rate of change over time, using the change-point predictor for nonlinear trends. Bayesian inference is used, where the Deviance Information Criterion and the L-criterion are investigated for model comparison. Special attention is given to the identifiability of the item response parameters. Item response theory makes it possible to use dichotomous and polytomous test items, and to take into account missing data and survey-design change during follow-up. This will be illustrated in an application where data stem from the Cambridge City over-75s Cohort Study. Key words: bent-cable, change point, cognition, growth-curve model, item response theory (IRT), longitudinal data analysis
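The bent-cable predictor mentioned in this abstract joins two linear phases through a quadratic bend of half-width gamma around the change point tau; in the standard parameterization the trend is f(t) = b0 + b1*t + b2*q(t). A direct sketch (the coefficient values below are illustrative, not from the paper):

```python
import numpy as np

def bent_cable(t, b0, b1, b2, tau, gamma):
    """Bent-cable trend: linear before tau-gamma, quadratic bend, linear after tau+gamma."""
    q = np.where(t < tau - gamma, 0.0,
        np.where(t > tau + gamma, t - tau,
                 (t - tau + gamma) ** 2 / (4.0 * gamma)))
    return b0 + b1 * t + b2 * q

# incoming slope b1, outgoing slope b1 + b2, smooth transition around tau
t = np.linspace(0.0, 10.0, 101)
f = bent_cable(t, 1.0, 0.5, -1.0, 5.0, 2.0)
```

In the hierarchical model, b0 through tau would carry individual-specific random effects, which is what makes the trajectories subject-level.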