Results 1–10 of 272,076
Equivalent kernels for smoothing splines
J. Integral Equations Appl., 2006
Cited by 4 (2 self)
"... To Ken Atkinson on the occasion of his 65th birthday ..."

Understanding Gaussian Process Regression Using the Equivalent Kernel
Cited by 4 (0 self)
"... The equivalent kernel [1] is a way of understanding how Gaussian process regression works for large sample sizes based on a continuum limit. In this paper we show how to approximate the equivalent kernel of the widely-used squared exponential (or Gaussian) kernel and related kernels. ..."

Using the equivalent kernel to understand Gaussian process regression
In: NIPS, 2005
Cited by 16 (1 self)
"... The equivalent kernel [1] is a way of understanding how Gaussian process regression works for large sample sizes based on a continuum limit. In this paper we show (1) how to approximate the equivalent kernel of the widely-used squared exponential (or Gaussian) kernel and related kernels, and (2) how ..."

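The Gaussian process regression setup these two papers analyse can be sketched in a few lines of numpy. The posterior mean is a weighted sum of the training targets, and the weight function h(x) = k*(x)ᵀ(K + σ²I)⁻¹ is exactly what the equivalent-kernel analysis approximates in the large-sample limit. This is an illustrative sketch (function names and parameter values are my own, not from the cited papers):

```python
import numpy as np

def se_kernel(x1, x2, lengthscale=1.0):
    # Squared exponential (Gaussian) covariance between two 1-D point sets.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, lengthscale=1.0, noise=0.1):
    # K is the train-train covariance plus the noise variance on the diagonal.
    K = se_kernel(x_train, x_train, lengthscale) + noise * np.eye(len(x_train))
    k_star = se_kernel(x_test, x_train, lengthscale)
    # The posterior mean is a linear smoother: each prediction is a weighted
    # sum of y_train, with weights k_star @ K^{-1} (the "equivalent kernel"
    # describes the continuum limit of this weight function).
    return k_star @ np.linalg.solve(K, y_train)
```

With dense training data the weight function concentrates around the test point, which is why the smoother behaves like a fixed kernel regression estimator for large samples.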
Derivation Of Equivalent Kernel For General Spline Smoothing: A Systematic Approach
Cited by 7 (0 self)
"... We consider first the spline smoothing nonparametric estimation with variable smoothing parameter and arbitrary design density function and show that the corresponding equivalent kernel can be approximated by the Green's function of a certain linear differential operator. ..."

Learning the Kernel Matrix with Semidefinite Programming
2002
Cited by 780 (22 self)
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. ..."

Nonlinear component analysis as a kernel eigenvalue problem
1996
Cited by 1554 (85 self)
"... We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all ..."

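The idea in that abstract — principal components in a high-dimensional feature space computed implicitly through a kernel — reduces to an eigenvalue problem on the centered kernel matrix. A minimal numpy sketch (an illustration of the general technique, with an RBF kernel and parameter names of my own choosing):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    # RBF (Gaussian) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Center the kernel matrix, which corresponds to centering the
    # (implicit) feature vectors in feature space.
    n = len(X)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Nonlinear principal components come from the top eigenvectors of Kc.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so each column holds the projections of the data
    # onto one principal component in feature space.
    return vecs * np.sqrt(np.maximum(vals, 0))
```

The feature map is never computed explicitly; only the n×n kernel matrix is needed, which is the point the abstract is making.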
The pyramid match kernel: Discriminative classification with sets of image features
In: ICCV, 2005
Cited by 546 (29 self)
"... Discriminative learning is challenging when examples are sets of features, and the sets vary in cardinality and lack any sort of meaningful ordering. Kernel-based classification methods can learn complex decision boundaries, but a kernel over unordered set inputs must somehow solve for correspondences ..."

On µ-kernel construction
Symposium on Operating System Principles, 1995
Cited by 424 (25 self)
"... From a software-technology point of view, the µ-kernel concept is superior to large integrated kernels. On the other hand, it is widely believed that (a) µ-kernel based systems are inherently inefficient and (b) they are not sufficiently flexible. Contradictory to this belief, we show and support by doc ..."

Scheduler Activations: Effective Kernel Support for the User-Level Management of Parallelism
ACM Transactions on Computer Systems, 1992
Cited by 475 (21 self)
"... Threads are the vehicle for concurrency in many approaches to parallel programming. Threads separate the notion of a sequential execution stream from the other aspects of traditional UNIX-like processes, such as address spaces and I/O descriptors. The objective of this separation is to make the expression and control of parallelism sufficiently cheap that the programmer or compiler can exploit even fine-grained parallelism with acceptable overhead. Threads can be supported either by the operating system kernel or by user-level library code in the application address space, but neither approach has ..."

Mean shift: A robust approach toward feature space analysis
In: PAMI, 2002
Cited by 2375 (40 self)
"... A general nonparametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure, the mean shift. We prove for discrete data the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and thus its utility in detecting the modes of the density. The equivalence of the mean shift procedure to the Nadaraya–Watson estimator from kernel regression and the robust M ..."

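The recursive mean shift procedure that the abstract describes is short enough to sketch directly: move a query point to the kernel-weighted mean of the data around it, and repeat until it settles at a stationary point (mode) of the density estimate. A minimal sketch with a Gaussian kernel (bandwidth and iteration count are illustrative choices of mine, not from the paper):

```python
import numpy as np

def mean_shift_mode(points, start, bandwidth=1.0, n_iter=50):
    # Recursive mean shift: each step replaces x with the Gaussian-kernel
    # weighted mean of the data points, which climbs the kernel density
    # estimate toward the nearest mode.
    points = np.asarray(points, dtype=float)
    x = np.asarray(start, dtype=float)
    for _ in range(n_iter):
        d2 = np.sum((points - x) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)
        x = (w[:, None] * points).sum(axis=0) / w.sum()
    return x
```

Running the procedure from every data point and grouping the points whose trajectories converge to the same mode is what delineates the arbitrarily shaped clusters mentioned in the abstract.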