Results 1–10 of 15
Bayesian computing with INLA: New features
Computational Statistics & Data Analysis, 2013
Bayesian Modeling with Gaussian Processes using the GPstuff Toolbox, 2014
Abstract

Cited by 12 (1 self)
Gaussian processes (GPs) are powerful tools for probabilistic modeling. They can be used to define prior distributions over latent functions in hierarchical Bayesian models. The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function. Inference can then be conducted directly in function space by evaluating or approximating the posterior process. Despite their attractive theoretical properties, GPs pose practical challenges in implementation. GPstuff is a versatile collection of computational tools for GP models, for MATLAB and Octave on Linux and Windows. It includes, among other things, various inference methods, sparse approximations and tools for model assessment. In this work, we review these tools and demonstrate the use of GPstuff in several models.
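The "prior over functions defined implicitly by the mean and covariance function" can be sketched numerically. Below is a minimal Python example (using NumPy rather than GPstuff's MATLAB/Octave interface; the squared-exponential covariance and its hyperparameters are illustrative assumptions, not GPstuff defaults) that builds a covariance matrix and draws one sample function from the implied zero-mean GP prior.

```python
import numpy as np

def sq_exp_cov(x1, x2, variance=1.0, lengthscale=0.5):
    """Squared-exponential covariance: k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)          # input locations
K = sq_exp_cov(x, x)                   # prior covariance of the latent function
jitter = 1e-9 * np.eye(len(x))         # numerical stabiliser for the Cholesky factor
L = np.linalg.cholesky(K + jitter)
f = L @ rng.standard_normal(len(x))    # one draw from the zero-mean GP prior
```

Shrinking the lengthscale makes the sampled functions wigglier; increasing the variance scales their amplitude, which is exactly the "smoothness and variability" controlled by the covariance function.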
Animal models and Integrated Nested Laplace Approximations, 2011
Abstract

Cited by 4 (0 self)
Animal models are generalized linear mixed models (GLMMs) used in evolutionary biology and animal breeding to identify the genetic component of traits. Integrated Nested Laplace Approximation (INLA) is a methodology for fast, non-sampling-based Bayesian inference in hierarchical Gaussian Markov models. In this paper we demonstrate that the INLA methodology can be used for many versions of Bayesian animal models. We analyse animal models with Gaussian, binomial and Poisson likelihoods using INLA, for both synthetic case studies and house sparrow population case studies. Inference results are compared with results from Markov chain Monte Carlo (MCMC) methods. We also introduce an R package, AnimalINLA, for easy and fast inference in Bayesian animal models using INLA.
Think continuous: Markovian Gaussian models in spatial statistics, 2011
Abstract

Cited by 3 (0 self)
Gaussian Markov random fields (GMRFs) are frequently used as computationally efficient models in spatial statistics. Unfortunately, it has traditionally been difficult to link GMRFs with the more traditional Gaussian random field models, as the Markov property is difficult to deploy in continuous space. Following the pioneering work of Lindgren et al. (2011), we expound on the link between Markovian Gaussian random fields and GMRFs. In particular, we discuss the theoretical and practical aspects of fast computation with continuously specified Markovian Gaussian random fields, as well as the advantages they offer in terms of clear, parsimonious and interpretable models of anisotropy and non-stationarity.
Penalising model component complexity: A principled, practical approach to constructing priors. Submitted, 2015
Abstract

Cited by 2 (0 self)
Setting prior distributions on model parameters, or attributing uncertainty to them, is a difficult issue in applied Bayesian statistics. Although the prior distribution should ideally encode the user's prior knowledge about the parameters, this level of knowledge transfer seems unattainable in practice, and applied statisticians are forced to search for a “default” prior. Despite the development of objective priors, which are available explicitly only for a small number of highly restricted model classes, the applied statistician has few practical guidelines to follow when choosing priors. An easy way out of this dilemma is to reuse the prior choices of others, with an appropriate reference. In this paper, we introduce a new concept for constructing prior distributions. We exploit the natural nested structure inherent to many model components, which defines the model component as a flexible extension of a base model. Proper priors are defined to penalise the complexity induced by deviating from the simpler base model, and are formulated after the input of a user-defined scaling parameter for that model component. These priors are invariant to reparameterisations, have a natural connection to Jeffreys' priors, are designed to support Occam's razor, and seem to have excellent robustness properties, all of which are highly desirable and allow us to use this approach to define default prior distributions.
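For the best-known special case of this construction, the precision τ of a Gaussian random effect, penalising the distance d(τ) = τ^(-1/2) from the base model (no random effect, i.e. infinite precision) with an exponential rate λ yields a closed-form density. The sketch below assumes this standard case; the rate λ is an arbitrary illustrative value standing in for the user-defined scaling parameter.

```python
import numpy as np

lam = 1.5  # stands in for the user-defined scaling parameter (illustrative)

def pc_prior_precision(tau, lam):
    """PC prior density for a precision tau: exponential(lam) on the
    distance d(tau) = tau**-0.5, transformed back to the tau scale."""
    return 0.5 * lam * tau ** -1.5 * np.exp(-lam * tau ** -0.5)

# Equivalent sampling route: draw the distance d ~ Exp(lam), set tau = d**-2.
rng = np.random.default_rng(1)
d = rng.exponential(scale=1.0 / lam, size=200_000)
tau = d ** -2.0

# Analytic survivor function implied by the change of variables:
# P(tau > t) = P(d < t**-0.5) = 1 - exp(-lam * t**-0.5)
t = 4.0
empirical = np.mean(tau > t)
analytic = 1.0 - np.exp(-lam * t ** -0.5)
```

The heavy right tail of the resulting density keeps large precisions (i.e. the base model) plausible, which is how the prior shrinks toward the simpler model.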
Gastrointestinal Stromal Tumor: A Method for Optimizing the Timing of CT Scans in the Follow-up of Cancer Patients. Radiology, in press.
Abstract

Cited by 2 (1 self)
In this supplementary file, we describe in detail how to apply Gaussian processes (GPs) in non-homogeneous Poisson process survival analysis with interval-censored data. This statistical methodology is applied in the paper “Gastrointestinal stromal tumor: a method for optimizing the timing of CT scans in the follow-up of cancer patients”. For individual i, where i = 1, ..., n, we have a survival time y_i (possibly right or interval censored) with a censoring indicator δ_i, where δ_i = 0 if the ith observation is uncensored and δ_i = 1 if the observation is right or interval censored. For an interval-censored survival time, y_i is known to fall into an interval [y_{i,lo}, y_{i,up}]. The traditional approach to analyzing continuous time-to-event data is to assume the Cox proportional hazards function (Cox, 1972)

h_i(t) = h_0(t) exp(x_i^T β),   (1)

where h_0 is the unspecified baseline hazard rate, x_i is the d × 1 vector of covariates for the ith patient, and β is the vector of regression coefficients. The matrix X = [x_1, ..., x_n]^T of size n × d includes all covariate observations.
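Equation (1) says covariates act multiplicatively on a shared baseline hazard, so the hazard ratio between two patients is exp((x_i - x_j)^T β) regardless of h_0(t). A small sketch (covariate values, coefficients, and the baseline hazard are made up for illustration):

```python
import numpy as np

beta = np.array([0.7, -0.3])   # regression coefficients (illustrative)
x_i = np.array([1.0, 2.0])     # covariates for patient i
x_j = np.array([0.0, 2.0])     # covariates for patient j

def hazard(t, x, beta, h0=lambda t: 0.1 * t):
    """Cox proportional hazards: h(t) = h0(t) * exp(x^T beta).
    h0 is an arbitrary illustrative baseline hazard."""
    return h0(t) * np.exp(x @ beta)

# The baseline hazard cancels in the ratio, leaving exp((x_i - x_j)^T beta).
ratio = hazard(3.0, x_i, beta) / hazard(3.0, x_j, beta)
```

Evaluating the ratio at any other time t gives the same value, which is the "proportional hazards" property.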
Statistical Appendix: Cox survival analysis using Gaussian process priors
Abstract
Statistical Appendix: Cox survival analysis using Gaussian process priors. In this supplementary file, we describe in detail how to apply Gaussian processes (GPs) in Cox survival analyses using the proportional hazards model. This statistical methodology is applied in the paper “Stratification of the risk for gastrointestinal stromal tumour recurrence after surgery: a combined analysis of ten population-based cohorts”. For individual i, where i = 1, ..., n, we have an observed survival time y_i (possibly right censored) with a censoring indicator δ_i, where δ_i = 0 if the ith observation is uncensored and δ_i = 1 if the observation is right censored. The traditional approach to analyzing continuous time-to-event data is to assume the Cox proportional hazards function [1]

h_i(t) = h_0(t) exp(x_i^T β),   (1)

where h_0 is the unspecified baseline hazard rate, x_i is the d × 1 vector of covariates for the ith patient, and β is the vector of regression coefficients. The matrix X = [x_1, ..., x_n]^T of size n × d includes all covariate observations. The Cox model with a linear predictor can be extended to a more general form to enable, for example, additive and nonlinear effects of covariates [2,3]. We extend the proportional hazards model as

h_i(t) = exp(log(h_0(t)) + η_i(x_i)),   (2)

where the linear predictor is replaced with the latent predictor η_i depending on the covariates x_i. By assuming a Gaussian process prior [4] over η = (η_1, ..., η_n)^T, smooth nonlinear effects of continuous covariates are possible, and if there are dependencies between covariates, the GP can model these interactions implicitly. A zero-mean GP prior is set for η, which results in the zero-mean multivariate Gaussian distribution

p(η | X) = N(0, C(X, X)),   (3)

where C(X, X) is the n × n covariance matrix whose elements are given by the covariance function of the GP.
The covariance function defines the smoothness and scale properties of the latent function, and we choose a sum of a constant and a non-stationary neural network covariance function [5]:

c(x_i, x_j) = σ_c + (2/π) sin^{-1}(…)
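The covariance expression above is cut off in this listing; a common full form of the neural-network (arcsine) covariance applies the arcsine to a normalised inner product of bias-augmented inputs. The sketch below assumes that standard form plus the constant term; σ_c, the weight variance, and the data are illustrative, not necessarily the exact parameterisation the appendix uses. It builds C(X, X) from equation (3) and draws one latent predictor η.

```python
import numpy as np

def nn_cov(X1, X2, sigma_c=0.1, sigma_w=1.0):
    """Constant term plus a neural-network (arcsine) covariance.
    Inputs are augmented with a bias: x_tilde = (1, x)."""
    A1 = np.hstack([np.ones((X1.shape[0], 1)), X1])
    A2 = np.hstack([np.ones((X2.shape[0], 1)), X2])
    S = sigma_w * np.eye(A1.shape[1])        # diagonal weight-prior covariance
    num = 2.0 * A1 @ S @ A2.T
    d1 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A1, S, A1)
    d2 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A2, S, A2)
    return sigma_c + (2.0 / np.pi) * np.arcsin(num / np.sqrt(np.outer(d1, d2)))

rng = np.random.default_rng(2)
X = rng.standard_normal((30, 2))             # covariates for n = 30 patients
C = nn_cov(X, X)                             # C(X, X) as in equation (3)
eta = np.linalg.cholesky(C + 1e-8 * np.eye(30)) @ rng.standard_normal(30)
```

Unlike a stationary kernel, this covariance depends on the inputs' positions relative to the origin, which is what makes the latent effects non-stationary.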
NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET
"... Extending INLA to a class of near-Gaussian latent models by ..."