Results 1-10 of 59
Gaussian processes for machine learning
, 2003
Abstract

Cited by 720 (2 self)
We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperparameters using the marginal likelihood. We explain the practical advantages of Gaussian Processes and end with conclusions and a look at the current trends in GP work.
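The regression model this abstract introduces can be written down in a few lines. The following is a minimal sketch, not the authors' code: it assumes a squared-exponential covariance with fixed hyperparameters and a fixed noise level, and computes the GP posterior mean and variance with a Cholesky factorization.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf_kernel(X_test, X_test)) - np.sum(v ** 2, axis=0)
    return mean, var

X = np.array([0.0, 1.0, 2.0])   # training inputs
y = np.sin(X)                   # training targets
mu, var = gp_predict(X, y, np.array([1.0]))
```

In the full treatment the abstract describes, the lengthscale, signal variance and noise level would be learned by maximizing the log marginal likelihood rather than fixed as here.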
Gaussian Process Priors with Uncertain Inputs - Application to Multiple-Step Ahead Time Series Forecasting
, 2003
Abstract

Cited by 77 (15 self)
We consider the problem of multi-step ahead prediction in time series analysis using the nonparametric Gaussian process model. k-step ahead forecasting of a discrete-time nonlinear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form y_t = f(y_{t-1}, ..., y_{t-L}), the prediction of y at time t + k is based on the point estimates of the previous outputs. In this paper, we show how, using an analytical Gaussian approximation, we can formally incorporate the uncertainty about intermediate regressor values, thus updating the uncertainty on the current prediction.
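The baseline this abstract starts from, repeated one-step ahead prediction with point estimates fed back into the regressor, can be sketched as follows. This is a hypothetical illustration, not the paper's code: a toy linear map stands in for a trained one-step model, and the analytical Gaussian approximation that propagates uncertainty through the regressors is deliberately omitted.

```python
import numpy as np

def multi_step_forecast(one_step_model, history, k, L=2):
    """Naive k-step-ahead forecast: feed point predictions back as inputs.

    one_step_model maps the last L outputs to a point prediction of the
    next output; the paper's GP treatment additionally propagates the
    predictive variance through these regressors, which is omitted here.
    """
    window = list(history[-L:])
    preds = []
    for _ in range(k):
        y_next = float(one_step_model(np.array(window)))
        preds.append(y_next)
        window = window[1:] + [y_next]   # slide the regressor window
    return preds

# A toy linear "dynamic system" standing in for a trained one-step model.
model = lambda w: 0.5 * w[-1] + 0.25 * w[-2]
print(multi_step_forecast(model, [1.0, 2.0], k=3))  # → [1.25, 1.125, 0.875]
```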
Accelerating Evolutionary Algorithms with Gaussian Process Fitness Function Models
 IEEE Transactions on Systems, Man and Cybernetics
, 2004
Abstract

Cited by 53 (2 self)
We present an overview of evolutionary algorithms that use empirical models of the fitness function to accelerate convergence, distinguishing between Evolution Control and the Surrogate Approach. We describe the Gaussian process model and propose using it as an inexpensive fitness function surrogate. Implementation issues such as efficient and numerically stable computation, exploration vs. exploitation, local modeling, multiple objectives and constraints, and failed evaluations are addressed. Our resulting Gaussian Process Optimization Procedure (GPOP) clearly outperforms other evolutionary strategies on standard test functions as well as on a real-world problem: the optimization of stationary gas turbine compressor profiles.
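As a rough illustration of the pre-selection style of evolution control this abstract distinguishes, the sketch below screens offspring with a cheap model and spends true evaluations only on the most promising few. It is hypothetical code, not GPOP: a nearest-neighbour lookup stands in for the Gaussian process surrogate, and the objective is a one-dimensional sphere function.

```python
import random

random.seed(0)

def true_fitness(x):
    """Expensive objective (sphere function stand-in), to be minimized."""
    return x * x

def nearest_neighbour_surrogate(archive):
    """Cheapest possible model: predict the fitness of the closest
    already-evaluated point (a GP would interpolate smoothly instead)."""
    def predict(x):
        return min(archive, key=lambda p: abs(p[0] - x))[1]
    return predict

def surrogate_assisted_search(x0, generations=10, n_offspring=10, n_true=2):
    archive = [(x0, true_fitness(x0))]
    parent = x0
    for _ in range(generations):
        model = nearest_neighbour_surrogate(archive)
        offspring = [parent + random.gauss(0.0, 0.5) for _ in range(n_offspring)]
        # Pre-selection: rank all offspring by the cheap model, then spend
        # true (expensive) evaluations only on the most promising few.
        for cand in sorted(offspring, key=model)[:n_true]:
            archive.append((cand, true_fitness(cand)))
        parent = min(archive, key=lambda p: p[1])[0]
    return parent

best = surrogate_assisted_search(3.0)
```

The design point this illustrates is the budget split: every candidate is scored by the model, but only n_true of n_offspring candidates per generation cost a real evaluation.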
Dependent Gaussian processes
 In NIPS
, 2004
Abstract

Cited by 50 (0 self)
Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
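The convolution construction this abstract proposes is easy to demonstrate numerically: two outputs driven by a shared white-noise source, each with its own smoothing kernel, are dependent by construction. The sketch below is an assumption-laden illustration (discrete grid, Gaussian kernels, arbitrarily chosen widths), not the paper's parameterisation.

```python
import numpy as np

np.random.seed(0)

def smoothing_kernel(t, width):
    """Gaussian smoothing kernel h(t); each output gets its own width."""
    return np.exp(-0.5 * (t / width) ** 2)

n, dt = 400, 0.05
x = np.arange(n) * dt
w = np.random.randn(n)            # the shared white-noise source
taus = x - x[n // 2]              # kernel support centred at zero

# Two outputs: the same noise convolved with differently-shaped kernels.
y1 = dt * np.convolve(w, smoothing_kernel(taus, 0.3), mode="same")
y2 = dt * np.convolve(w, smoothing_kernel(taus, 1.0), mode="same")

corr = np.corrcoef(y1, y2)[0, 1]  # dependent outputs, by construction
```

Because both outputs are linear functionals of the same Gaussian source, their sample correlation is substantial even though their smoothness differs.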
Healing the relevance vector machine through augmentation
 In Proc. of ICML 22
, 2005
Abstract

Cited by 15 (1 self)
The Relevance Vector Machine (RVM) is a sparse approximate Bayesian kernel method. It provides full predictive distributions for test cases. However, the predictive uncertainties have the unintuitive property that they get smaller the further you move away from the training cases. We give a thorough analysis. Inspired by the analogy to non-degenerate Gaussian Processes, we suggest augmentation to solve the problem. The purpose of the resulting model, RVM*, is primarily to corroborate the theoretical and experimental analysis. Although RVM* could be used in practical applications, it is no longer a truly sparse model. Experiments show that sparsity comes at the expense of worse predictive distributions. Bayesian inference based on Gaussian Processes (GPs) has become widespread in the machine learning community. However, their naïve applicability is marred by computational constraints. A number of recent publications have addressed this issue by means of sparse approximations, although ideologically sparseness is at variance with Bayesian principles. In this paper we view sparsity purely as a way to achieve computational convenience, and not as under other non-Bayesian paradigms where sparseness itself is seen as a means to ensure good generalization.
Gaussian Process Implicit Surfaces
Abstract

Cited by 13 (0 self)
Many applications in computer vision and computer graphics require the definition of curves and surfaces. Implicit surfaces [7] are a popular choice for this because they are smooth, can be appropriately constrained by known geometry, and require no special treatment for topology changes. Given a scalar function f: R^d → R, one can define a manifold S of dimension d - 1 wherever f(x) passes through a certain value (e.g., 0): S_0 = {x ∈ R^d : f(x) = 0}. (1) In this paper we introduce Gaussian processes (GPs) to this area by deriving a covariance function equivalent to the thin plate spline regularizer [2], in which smoothness of a function f(x) is encouraged by the energy E(f) = ∫ (∇^T ∇ f(x))^2 dx. (2)
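The level-set definition in equation (1) can be illustrated with a hand-picked implicit function; here a unit circle, so d = 2 and the manifold is a curve. This fragment is only an illustration of the definition, since the paper's contribution is to model f itself with a GP.

```python
import numpy as np

def f(x):
    """Implicit function of a unit circle: negative inside, positive
    outside, and exactly zero on the curve itself."""
    x = np.asarray(x, dtype=float)
    return float(np.dot(x, x) - 1.0)

def on_surface(x, tol=1e-9):
    """Membership in the zero level set S_0 = {x in R^d : f(x) = 0}."""
    return abs(f(x)) < tol

print(on_surface([1.0, 0.0]))  # → True  (on the circle)
print(on_surface([0.0, 0.0]))  # → False (interior point)
```

The sign convention (negative inside, positive outside) is what makes such an f useful for constraining a surface by known interior and exterior points.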
Incremental Gaussian Processes
Abstract

Cited by 10 (3 self)
In this paper, we consider Tipping's relevance vector machine (RVM) [1] and formalize an incremental training strategy as a variant of the expectation-maximization (EM) algorithm that we call subspace EM. Working with ...
Multiple-step ahead prediction for nonlinear dynamic systems - A Gaussian Process treatment with propagation of the uncertainty
, 2003
Abstract

Cited by 10 (2 self)
We consider the problem of multi-step ahead prediction in time series analysis using the nonparametric Gaussian process model. k-step ahead forecasting of a discrete-time nonlinear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form y_t = f(y_{t-1}, ..., y_{t-L}), the prediction of y at time t + k is based on the estimates ŷ_{t+k-1}, ..., ŷ_{t+k-L} of the previous outputs.
Accelerating Evolutionary Algorithms Using Fitness Function Models
 Proc. Workshops Genetic and Evolutionary Computation Conference
, 2003
Abstract

Cited by 9 (0 self)
An optimization procedure using empirical models as an approximation of expensive functions is presented. The model is trained on the current set of evaluated solutions and can be used to search for promising solutions.