Results 1 – 5 of 5
RANDOMIZE-THEN-OPTIMIZE: A METHOD FOR SAMPLING FROM POSTERIOR DISTRIBUTIONS IN NONLINEAR INVERSE PROBLEMS
Cited by 2 (1 self)
Abstract. High-dimensional inverse problems present a challenge for Markov chain Monte Carlo (MCMC)-type sampling schemes. Typically, they rely on finding an efficient proposal distribution, which can be difficult for large-scale problems, even with adaptive approaches. Moreover, the autocorrelations of the samples typically increase with dimension, which leads to the need for long sample chains. We present an alternative method for sampling from posterior distributions in nonlinear inverse problems, when the measurement error and prior are both Gaussian. The approach computes a candidate sample by solving a stochastic optimization problem. In the linear case, these samples are directly from the posterior density, but this is not so in the nonlinear case. We derive the form of the sample density in the nonlinear case, and then show how to use it within both a Metropolis–Hastings and importance sampling framework to obtain samples from the posterior distribution of the parameters. We demonstrate, with various small- and medium-scale problems, that randomize-then-optimize can be efficient compared to standard adaptive MCMC algorithms.
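The linear-Gaussian case the abstract mentions, where each stochastic-optimization solve yields an exact posterior draw, can be sketched in a few lines: perturb the data and the prior mean with fresh Gaussian noise, then solve the resulting regularized least-squares problem. This is a minimal illustration, not the paper's code; the problem sizes, `sigma`, `lam`, and the helper `rto_sample` are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem (hypothetical sizes): y = A @ theta + noise
n_obs, n_par = 20, 5
A = rng.standard_normal((n_obs, n_par))
sigma, lam = 0.1, 1.0                      # noise std, prior std
theta_true = rng.standard_normal(n_par)
y = A @ theta_true + sigma * rng.standard_normal(n_obs)

def rto_sample():
    """One randomize-then-optimize draw: randomize the data and the
    (zero) prior mean, then solve the regularized least-squares problem."""
    y_pert = y + sigma * rng.standard_normal(n_obs)    # perturbed data
    prior_pert = lam * rng.standard_normal(n_par)      # perturbed prior mean
    # argmin ||A t - y_pert||^2 / sigma^2 + ||t - prior_pert||^2 / lam^2,
    # written as one stacked least-squares system
    M = np.vstack([A / sigma, np.eye(n_par) / lam])
    b = np.concatenate([y_pert / sigma, prior_pert / lam])
    return np.linalg.lstsq(M, b, rcond=None)[0]

samples = np.array([rto_sample() for _ in range(4000)])

# Analytic posterior for the linear-Gaussian case, for comparison:
# the sample mean should approach C @ A.T @ y / sigma^2
C = np.linalg.inv(A.T @ A / sigma**2 + np.eye(n_par) / lam**2)
post_mean = C @ A.T @ y / sigma**2
```

In this linear setting the optimality condition of each solve is a linear map of the Gaussian perturbations, which is why the draws land exactly on the posterior; the nonlinear case needs the Metropolis–Hastings or importance-weight correction the abstract describes.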
A Computational Framework for the Solution of Infinite-Dimensional Bayesian Statistical Inverse Problems with Application to Global Seismic Inversion
, 2015
ACCELERATING MCMC WITH ACTIVE SUBSPACES
Abstract. The Markov chain Monte Carlo (MCMC) method is the computational workhorse for Bayesian inverse problems. However, MCMC struggles in high-dimensional parameter spaces, since its iterates must sequentially explore a high-dimensional space for accurate inference. This struggle is compounded in physical applications when the nonlinear forward model is computationally expensive. One approach to accelerate MCMC is to reduce the dimension of the state space. Active subspaces are an emerging set of tools for dimension reduction. When applied to MCMC, the active subspace separates a low-dimensional subspace that is informed by the data from its orthogonal complement that is constrained by the prior. With this information, one can run the sequential MCMC on the active variables while sampling independently according to the prior on the inactive variables. We provide a theoretical bound on the Hellinger distance between the true posterior and its approximation with the active subspace. And we demonstrate the active subspace-accelerated MCMC on two computational examples: (i) a two-dimensional parameter space with a quadratic forward model and one-dimensional active subspace and (ii) a 100-dimensional parameter space with a PDE-based forward model and a two-dimensional active subspace.
Key words. MCMC, active subspaces, dimension reduction
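A two-dimensional example in the spirit of (i) can be sketched as follows: estimate the dominant eigenvector of the gradient outer-product matrix to find the active direction, run Metropolis on the single active coordinate, and draw the inactive coordinate fresh from the prior at every step. All problem details here (the direction `w`, data value, and noise level) are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-D problem: the likelihood depends on x only through w @ x,
# so the active subspace is the 1-D span of w.
w = np.array([3.0, 1.0]) / np.sqrt(10.0)
data, noise = 0.5, 0.1

def log_like(x):
    return -0.5 * ((w @ x) ** 2 - data) ** 2 / noise**2

def grad_log_like(x):
    r = (w @ x) ** 2 - data
    return -r / noise**2 * 2 * (w @ x) * w

# Estimate the active subspace from the gradient outer-product matrix,
# averaged over prior (N(0, I)) samples.
xs = rng.standard_normal((500, 2))
grads = np.array([grad_log_like(x) for x in xs])
_, V = np.linalg.eigh(grads.T @ grads / len(grads))
active, inactive = V[:, -1], V[:, 0]       # top eigenvector and its complement

def logpost_active(a):
    # standard-normal prior on the active coordinate plus the likelihood
    return -0.5 * a**2 + log_like(a * active)

# Metropolis on the active variable; inactive drawn fresh from the prior.
a, lp, chain = 0.0, logpost_active(0.0), []
for _ in range(5000):
    prop = a + 0.5 * rng.standard_normal()
    lp_prop = logpost_active(prop)
    if np.log(rng.random()) < lp_prop - lp:
        a, lp = prop, lp_prop
    b = rng.standard_normal()              # inactive coordinate ~ prior
    chain.append(a * active + b * inactive)
chain = np.array(chain)
```

Because the likelihood here is exactly constant along the inactive direction, the posterior factorizes and the independent prior draws on `b` introduce no approximation; in general the Hellinger bound the abstract mentions controls the error of this factorization.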
Dimension-independent likelihood-informed MCMC
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
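DILI's operator-weighted proposals generalize the much simpler preconditioned Crank–Nicolson (pCN) move, the standard example of a proposal that is well defined on function space and hence discretization-invariant: the proposal preserves the Gaussian prior, so the acceptance ratio involves only the likelihood misfit and does not degrade as the mesh is refined. The sketch below shows plain pCN, not DILI itself, on a hypothetical 100-dimensional problem; the observation operator `H`, data value, and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

d = 100                                    # discretization dimension
H = np.ones(d) / d                         # hypothetical observation: mean of the field
y_obs, noise = 1.0, 0.2

def neg_log_like(x):
    """Misfit functional Phi(x); the N(0, I) prior is handled by the proposal."""
    return 0.5 * ((H @ x - y_obs) / noise) ** 2

# pCN proposal: x' = sqrt(1 - beta^2) x + beta xi, xi ~ N(0, I).
# It leaves the prior invariant, so the accept ratio involves only Phi.
beta, x = 0.3, np.zeros(d)
phi = neg_log_like(x)
obs_trace = []
for i in range(20000):
    prop = np.sqrt(1 - beta**2) * x + beta * rng.standard_normal(d)
    phi_prop = neg_log_like(prop)
    if np.log(rng.random()) < phi - phi_prop:
        x, phi = prop, phi_prop
    if i >= 2000:                          # discard burn-in
        obs_trace.append(H @ x)

# For this linear-Gaussian setup the posterior mean of H @ x is
# (y_obs / noise**2) / (1 / (H @ H) + 1 / noise**2) = 0.2,
# which the chain average of obs_trace should approach.
```

DILI replaces the isotropic `beta` by an operator that takes large, Hessian-informed steps in the likelihood-informed directions and pCN-like steps in the prior-dominated complement.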