Results 1-10 of 194
Interpolation of Scattered Data: Distance Matrices and Conditionally Positive Definite Functions
 Constructive Approximation
, 1986
Abstract

Cited by 359 (3 self)
Among other things, we prove that multiquadric surface interpolation is always solvable, thereby settling a conjecture of R. Franke.
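The solvability result summarized above admits a direct numerical illustration. The following is a minimal sketch (not from the paper; the shape parameter `c`, the function name, and the test points are illustrative choices): it builds the multiquadric matrix A_ij = sqrt(||x_i - x_j||^2 + c^2) for distinct scattered points and solves A w = f. Micchelli's theorem guarantees A is nonsingular whenever the points are distinct, so the solve always succeeds.

```python
import numpy as np

def multiquadric_interpolate(points, values, c=1.0):
    """Return weights w and an evaluator for s(x) = sum_j w_j * sqrt(||x - x_j||^2 + c^2)."""
    points = np.asarray(points, dtype=float)
    diffs = points[:, None, :] - points[None, :, :]
    # Multiquadric interpolation matrix; nonsingular for distinct points.
    A = np.sqrt(np.sum(diffs**2, axis=-1) + c**2)
    w = np.linalg.solve(A, np.asarray(values, dtype=float))

    def s(x):
        r2 = np.sum((np.asarray(x, dtype=float) - points)**2, axis=-1)
        return np.dot(w, np.sqrt(r2 + c**2))

    return w, s

# Arbitrary distinct scattered points and data values:
pts = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.7, 0.3]]
vals = [1.0, 2.0, 0.5, 1.5]
w, s = multiquadric_interpolate(pts, vals)
```

At the nodes the interpolant reproduces the data exactly, which is the practical content of the solvability claim.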
A Theory of Networks for Approximation and Learning
 Laboratory, Massachusetts Institute of Technology
, 1989
Abstract

Cited by 235 (24 self)
Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multidimensional function, that is, solving the problem of hypersurface reconstruction. From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. This paper considers the problems of an exact representation and, in more detail, of the approximation of linear and nonlinear mappings in terms of simpler functions of fewer variables. Kolmogorov's theorem concerning the representation of functions of several variables in terms of functions of one variable turns out to be almost irrelevant in the context of networks for learning. We develop a theoretical framework for approximation based on regularization techniques that leads to a class of three-layer networks that we call Generalized Radial Basis Functions (GRBF), since they are mathematically related to the well-known Radial Basis Functions, mainly used for strict interpolation tasks. GRBF networks are not only equivalent to generalized splines, but are also closely related to pattern recognition methods such as Parzen windows and potential functions and to several neural network algorithms, such as Kanerva's associative memory, backpropagation and Kohonen's topology preserving map. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage. The paper introduces several extensions and applications of the technique and discusses intriguing analogies with neurobiological data.
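The regularization-network idea in the abstract above can be sketched concretely. The snippet below is an illustrative assumption, not the paper's implementation: a Gaussian radial-basis fit where the centers are simply the data points and a ridge-style term lam stabilizes the solve; the width `sigma`, the weight `lam`, and the function name are all invented for illustration. With lam near zero it reduces to strict RBF interpolation, which is the limiting case the abstract contrasts with.

```python
import numpy as np

def grbf_fit(centers, values, sigma=1.0, lam=1e-3):
    """Fit coefficients c of f(x) = sum_j c_j exp(-||x - t_j||^2 / (2 sigma^2))
    by solving the regularized linear system (G + lam * I) c = y."""
    X = np.asarray(centers, dtype=float)
    y = np.asarray(values, dtype=float)
    d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    G = np.exp(-d2 / (2.0 * sigma**2))          # Gaussian Gram matrix
    c = np.linalg.solve(G + lam * np.eye(len(X)), y)

    def f(x):
        r2 = np.sum((np.asarray(x, dtype=float) - X)**2, axis=-1)
        return np.exp(-r2 / (2.0 * sigma**2)) @ c

    return f

# With a tiny lam the fit nearly interpolates the training data:
f = grbf_fit([[0.0], [1.0], [2.0]], [0.0, 1.0, 0.0], sigma=0.5, lam=1e-8)
```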
The connection between regularization operators and support vector kernels
, 1998
Abstract

Cited by 175 (40 self)
In this paper a correspondence is derived between regularization operators used in regularization networks and support vector kernels. We prove that the Green’s Functions associated with regularization operators are suitable support vector kernels with equivalent regularization properties. Moreover, the paper provides an analysis of currently used support vector kernels in the view of regularization theory and corresponding operators associated with the classes of both polynomial kernels and translation invariant kernels. The latter are also analyzed on periodical domains. As a byproduct we show that a large number of radial basis functions, namely conditionally positive definite
The kernel trick for distances
 TR MSR-2000-51, Microsoft Research
, 2000
Abstract

Cited by 114 (0 self)
A method is described which, like the kernel trick in support vector machines (SVMs), lets us generalize distance-based algorithms to operate in feature spaces, usually nonlinearly related to the input space. This is done by identifying a class of kernels which can be represented as norm-based distances in Hilbert spaces. It turns out that common kernel algorithms, such as SVMs and kernel PCA, are really distance-based algorithms and can be run with that class of kernels, too. As well as providing a useful new insight into how these algorithms work, the present work can form the basis for conceiving new algorithms.
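The core identity behind the abstract above is that any positive definite kernel induces a feature-space distance, ||Φ(x) − Φ(y)||² = k(x,x) − 2k(x,y) + k(y,y), computable from kernel values alone. A minimal sketch (the Gaussian kernel choice and all function names here are illustrative assumptions, not the report's code) kernelizes a distance-based algorithm, 1-nearest-neighbor, this way:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel, a standard positive definite choice."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.exp(-gamma * np.sum((x - y)**2))

def kernel_distance_sq(x, y, k):
    """Squared feature-space distance ||Phi(x) - Phi(y)||^2,
    computed entirely through kernel evaluations."""
    return k(x, x) - 2.0 * k(x, y) + k(y, y)

def kernel_nearest_neighbor(query, points, labels, k=rbf_kernel):
    """1-NN run in feature space: only kernel values are needed."""
    d = [kernel_distance_sq(query, p, k) for p in points]
    return labels[int(np.argmin(d))]
```

The same substitution turns any algorithm that touches the data only through pairwise distances into a feature-space algorithm.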
Metric cotype
, 2005
Abstract

Cited by 41 (19 self)
We introduce the notion of metric cotype, a property of metric spaces related to a property of normed spaces, called Rademacher cotype. Apart from settling a long-standing open problem in metric geometry, this property is used to prove the following dichotomy: a family of metric spaces F is either almost universal (i.e., contains any finite metric space with any distortion > 1), or there exists α > 0, and arbitrarily large n-point metrics whose distortion when embedded in any member of F is at least Ω((log n)^α). The same property is also used to prove strong non-embeddability theorems of L_q into L_p when q > max{2, p}. Finally we use metric cotype to obtain a new type of isoperimetric inequality on the discrete torus.
Jensen-Shannon divergence and Hilbert space embedding.
 In Proceedings of the International Symposium on Information Theory (ISIT 2004).
, 2004
Euclidean Quotients of Finite Metric Spaces
Abstract

Cited by 40 (19 self)
This paper is devoted to the study of quotients of finite metric spaces. The basic type of question we ask is: given a finite metric space M and α ≥ 1, what is the largest quotient of (a subset of) M which well embeds into Hilbert space? We obtain asymptotically tight bounds for these questions, and prove that they exhibit phase transitions. We also study the analogous problem for embeddings into ℓ_p, and the particular case of the hypercube.
Kernel methods on the Riemannian manifold of symmetric positive definite matrices.
 In CVPR,
, 2013