Results 1–10 of 11
Composite Gaussian process models for emulating expensive functions
Ann. Appl. Stat., 2012
"... ar ..."
Massively parallel approximate Gaussian process regression
"... We explore how the bigthree computing paradigms—symmetric multiprocessor (SMP), graphical processing units (GPUs), and cluster computing—can together be brought to bear on largedata Gaussian processes (GP) regression problems via a careful implementation of a newly developed local approximation s ..."
Abstract

Cited by 4 (3 self)
We explore how the big-three computing paradigms—symmetric multiprocessor (SMP), graphical processing units (GPUs), and cluster computing—can together be brought to bear on large-data Gaussian process (GP) regression problems via a careful implementation of a newly developed local approximation scheme. Our methodological contribution focuses primarily on GPU computation, as this requires the most care and also provides the largest performance boost. However, in our empirical work we study the relative merits of all three paradigms to determine how best to combine them. The paper concludes with two case studies. One is a real-data fluid-dynamics computer experiment which benefits from the local nature of our approximation; the second is a synthetic example designed to find the largest data set for which (accurate) GP emulation can be performed on a commensurate predictive set in under an hour. Key words: emulator, nonparametric regression, graphical processing unit, symmetric multiprocessor, cluster computing, big data, computer experiment
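The local approximation this abstract describes can be sketched in a few lines: each predictive location gets its own small GP fit to nearby design points, and the per-location fits are independent, which is what makes SMP/GPU/cluster parallelism straightforward. A minimal illustration, not the paper's implementation; the isotropic Gaussian kernel, fixed lengthscale, nugget, and neighborhood size are all assumptions of this sketch:

```python
import numpy as np

def local_gp_predict(X, y, x_star, n=50, lengthscale=0.3, nugget=1e-6):
    # Take the n nearest design points to x_star (the "local design").
    idx = np.argsort(np.linalg.norm(X - x_star, axis=1))[:n]
    Xn, yn = X[idx], y[idx]

    def kern(A, B):
        # Isotropic Gaussian correlation (assumed form, fixed lengthscale).
        D2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-D2 / (2 * lengthscale ** 2))

    K = kern(Xn, Xn) + nugget * np.eye(n)
    kx = kern(Xn, x_star[None, :])[:, 0]
    mean = kx @ np.linalg.solve(K, yn)          # zero-mean GP predictive mean
    var = 1.0 + nugget - kx @ np.linalg.solve(K, kx)
    return mean, var
```

Because each call touches only its own n-by-n system, a grid of predictive locations can be farmed out across threads, GPU blocks, or cluster nodes with no communication between them.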
MOVING LEAST SQUARES REGRESSION FOR HIGH DIMENSIONAL SIMULATION METAMODELING
"... Interpolation and smoothing methods form the basis of simulation metamodeling. In high dimensional metamodeling problems, larger numbers of design points are needed to build an accurate metamodel. This paper introduces a procedure to implement a smoothing method called Moving Least Squares regressio ..."
Abstract
Interpolation and smoothing methods form the basis of simulation metamodeling. In high-dimensional metamodeling problems, larger numbers of design points are needed to build an accurate metamodel. This paper introduces a procedure to implement a smoothing method called Moving Least Squares regression in high-dimensional metamodeling problems with a large number of design points. We test the procedure with two queueing examples: a multi-product M/G/1 queue and a multi-product Jackson network.
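The idea behind Moving Least Squares is to refit a weighted low-order polynomial at every prediction point, with weights that decay with distance from that point. A minimal sketch; the Gaussian weight kernel, the fixed bandwidth, and the linear basis are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def mls_predict(X, y, x_star, bandwidth=0.2):
    # Distance-decaying weights centred at the prediction point.
    w = np.exp(-np.sum((X - x_star) ** 2, axis=1) / (2 * bandwidth ** 2))
    # Local linear basis, shifted so the intercept is the fit at x_star.
    A = np.hstack([np.ones((len(X), 1)), X - x_star])
    sw = np.sqrt(w)
    # Weighted least squares via the square-root-weight trick.
    beta, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    return beta[0]
```

Unlike a global fit, the coefficients are recomputed for each `x_star`, which is what "moving" refers to; the cost per prediction grows with the number of design points, motivating the high-dimensional, large-design procedure the paper develops.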
Three Doubly Orthogonal Sudoku Latin Squares
"... • Rule: Fill in the missing entries subject to each row, column and threebythree subsquare containing no duplicated numbers. • Invented by Howard Garns in the 1970’s. • Became widespread in Japan during the 1980’s. • Now is played by millions of people across the globe. 3 ..."
Abstract
• Rule: Fill in the missing entries so that each row, column, and three-by-three subsquare contains no duplicated numbers.
• Invented by Howard Garns in the 1970s.
• Became widespread in Japan during the 1980s.
• Now played by millions of people across the globe.
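The rule above is straightforward to verify mechanically. A small checker, assuming the grid is a completed 9×9 array of the digits 1–9:

```python
def is_sudoku_solution(grid):
    """Check that each row, column and 3x3 subsquare of a completed
    9x9 grid contains the digits 1..9 with no duplicates."""
    digits = set(range(1, 10))
    rows = [set(r) for r in grid]
    cols = [set(c) for c in zip(*grid)]
    boxes = [
        {grid[3 * br + i][3 * bc + j] for i in range(3) for j in range(3)}
        for br in range(3) for bc in range(3)
    ]
    return all(group == digits for group in rows + cols + boxes)
```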
Speeding up neighborhood search in local Gaussian process prediction
"... Recent implementations of local approximate Gaussian process models have pushed computational boundaries for nonlinear, nonparametric prediction problems, particularly when deployed as emulators for computer experiments. Their flavor of spatially independent computation accommodates massive paral ..."
Abstract
Recent implementations of local approximate Gaussian process models have pushed computational boundaries for nonlinear, nonparametric prediction problems, particularly when deployed as emulators for computer experiments. Their flavor of spatially independent computation accommodates massive parallelization, meaning that they can handle designs two or more orders of magnitude larger than previously. However, accomplishing that feat can still require massive supercomputing resources. Here we aim to ease that burden. We study how predictive variance is reduced as local designs are built up for prediction. We then observe how the exhaustive and discrete nature of an important search subroutine involved in building such local designs may be overly conservative. Rather, we suggest that searching the space radially, i.e., continuously along rays emanating from the predictive location of interest, is a far thriftier alternative. Our empirical work demonstrates that ray-based search yields predictors with accuracy comparable to exhaustive search, but in a fraction of the time—bringing a supercomputer implementation back onto the desktop. Key words: approximate kriging, nonparametric regression, nearest neighbor, sequential design of experiments, active learning, big data
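The radial idea can be sketched as follows: instead of scoring every remaining design point, score only points found along a handful of rays from the predictive location, snapping each ray point to its nearest design point. This 2-D sketch uses fixed rays and fixed radii as an illustrative assumption; the actual method optimises continuously along each ray:

```python
import numpy as np

def ray_search_candidates(X, x_star, n_rays=8, radii=(0.05, 0.1, 0.2, 0.4)):
    # Points along rays emanating from the predictive location (2-D).
    angles = np.linspace(0, 2 * np.pi, n_rays, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    ray_pts = np.concatenate([x_star + r * dirs for r in radii])
    # Snap each ray point to the nearest design point in X.
    d = np.linalg.norm(ray_pts[:, None, :] - X[None, :, :], axis=2)
    return np.unique(d.argmin(axis=1))
```

The payoff is that the candidate set has size at most `n_rays × len(radii)` regardless of how large the full design is, whereas exhaustive search scores all N remaining points at every step.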
and Management Sciences
"... We provide a new approach to approximate emulation of large computer experiments. By focusing expressly on desirable properties of the predictive equations, we derive a family of local sequential design schemes that dynamically define the support of a Gaussian process predictor based on a local sub ..."
Abstract
We provide a new approach to approximate emulation of large computer experiments. By focusing expressly on desirable properties of the predictive equations, we derive a family of local sequential design schemes that dynamically define the support of a Gaussian process predictor based on a local subset of the data. We further derive expressions for fast sequential updating of all needed quantities as the local designs are built up iteratively. Then we show how independent application of our local design strategy across the elements of a vast predictive grid facilitates a trivially parallel implementation. The end result is a global predictor able to take advantage of modern multicore architectures, while at the same time allowing for a nonstationary modeling feature as a bonus. We demonstrate our method on two examples utilizing designs sized in the thousands, and tens of thousands of data points. Comparisons are made to the method of compactly supported covariances. Key words: sequential design, sequential updating, active learning, surrogate model, emulator, compactly supported covariance, local kriging neighborhoods
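A bare-bones version of such a scheme: seed the local design with a few nearest neighbours, then greedily add whichever candidate most reduces the GP predictive variance at the location of interest. This sketch recomputes the variance from scratch at every step; the fast sequential-updating expressions the abstract mentions replace that recomputation with cheap rank-one updates. The kernel form, lengthscale, and design sizes are illustrative assumptions:

```python
import numpy as np

def kern(A, B, ls=0.3):
    # Isotropic Gaussian correlation with a fixed, assumed lengthscale.
    D2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-D2 / (2 * ls ** 2))

def pred_var(X, idx, x_star, ls=0.3, nugget=1e-6):
    # GP predictive variance at x_star given the sub-design X[idx].
    K = kern(X[idx], X[idx], ls) + nugget * np.eye(len(idx))
    kx = kern(X[idx], x_star[None, :], ls)[:, 0]
    return 1.0 + nugget - kx @ np.linalg.solve(K, kx)

def greedy_local_design(X, x_star, n0=5, n=15, n_cand=100):
    # Seed with the n0 nearest neighbours of x_star ...
    order = np.argsort(np.linalg.norm(X - x_star, axis=1))
    chosen = list(order[:n0])
    cands = list(order[n0:n0 + n_cand])
    # ... then greedily add the variance-minimising candidate.
    while len(chosen) < n:
        best = min(cands, key=lambda j: pred_var(X, chosen + [j], x_star))
        chosen.append(best)
        cands.remove(best)
    return chosen
```

Because each predictive location builds its own local design independently, running this across a grid of locations is the trivially parallel step the abstract describes.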
Submitted to the Annals of Statistics
PRINCIPLES OF EXPERIMENTAL DESIGN FOR GAUSSIAN PROCESS EMULATORS OF DETERMINISTIC COMPUTER EXPERIMENTS
"... Abstract Computer experiments have become ubiquitous in science and engineering. Commonly, runs of these simulations demand considerable time and computing, making experimental design extremely important in gaining high quality information with limited time and resources. Principles of experimenta ..."
Abstract
Computer experiments have become ubiquitous in science and engineering. Commonly, runs of these simulations demand considerable time and computing, making experimental design extremely important in gaining high quality information with limited time and resources. Principles of experimental design are proposed and justified which ensure high nominal, numeric, and parameter estimation accuracy for Gaussian process emulation of deterministic simulations. The space-filling properties “small fill distance” and “large separation distance” are only weakly conflicting and ensure well-controlled nominal, numeric, and parameter estimation error, while nonstationarity requires a greater density of experimental inputs in regions of the input space with more quickly decaying correlation. This work will provide scientists and engineers with robust and practically useful overarching principles for selecting combinations of simulation inputs with high information content.
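The two space-filling quantities named above are easy to compute for a given design. A sketch, with the fill distance approximated over a finite reference grid (an assumption of this illustration; the true fill distance is a supremum over the continuous input space):

```python
import numpy as np

def fill_distance(X, grid):
    """Max over the reference grid of the distance to the nearest design
    point: small fill distance means no large empty regions."""
    d = np.linalg.norm(grid[:, None, :] - X[None, :, :], axis=2)
    return d.min(axis=1).max()

def separation_distance(X):
    """Min pairwise distance between design points: large separation
    means no near-duplicate runs, which cause ill-conditioned kernels."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    return d.min()
```

A design that keeps fill distance small while keeping separation distance large is exactly the "weakly conflicting" combination the abstract argues controls nominal, numeric, and parameter-estimation error together.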