Results 1–10 of 15,221
Diffusion kernels on graphs and other discrete input spaces
in: Proceedings of the 19th International Conference on Machine Learning, 2002
"... The application of kernel-based learning algorithms has, so far, largely been confined to real-valued data and a few special data types, such as strings. In this paper we propose a general method of constructing natural families of kernels over discrete structures, based on the matrix exponentiation idea. In particular, we focus on generating kernels on graphs, for which we propose a special class of exponential kernels called diffusion kernels, which are based on the heat equation and can be regarded as the discretization of the familiar Gaussian kernel of Euclidean space."
Cited by 223 (5 self)
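The construction the abstract describes — a kernel obtained by exponentiating a matrix derived from the graph — can be sketched in a few lines. This is a minimal illustration, not code from the paper: `diffusion_kernel` and the path-graph example are hypothetical names, and the generator here is taken to be H = A − D, the negative graph Laplacian.

```python
import numpy as np

def diffusion_kernel(adjacency, beta=0.5):
    """Diffusion kernel K = exp(beta * H) on an undirected graph,
    where H = A - D is the negative graph Laplacian."""
    A = np.asarray(adjacency, dtype=float)
    D = np.diag(A.sum(axis=1))
    H = A - D
    # H is symmetric, so its matrix exponential can be computed from
    # the eigendecomposition: exp(beta*H) = V exp(beta*Lambda) V^T.
    eigvals, eigvecs = np.linalg.eigh(beta * H)
    return eigvecs @ np.diag(np.exp(eigvals)) @ eigvecs.T

# Example: a 4-node path graph 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
K = diffusion_kernel(A, beta=0.5)
```

The result is symmetric positive definite by construction, and nearby nodes on the graph receive larger kernel values than distant ones, mirroring the Gaussian kernel's behaviour in Euclidean space.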
Partitioning Input Space for Control Learning
"... This paper considers the effect of input-space partitioning on reinforcement learning for control. In many such learning systems, the input space is partitioned by the system designer. However, input-space partitioning could be learned. Our objective is to compare learned and programmed input-space ..."
Cited by 1 (0 self)
Active Learning in Discrete Input Spaces
in: Proceedings of the 34th Interface Symposium, 2002
"... Traditional design of experiments (DOE) from the statistics literature focuses on optimizing an output parameter over a space of continuous input parameters. Here we consider DOE, or active learning, for discrete input spaces. A trivial example of this is the k-armed bandit problem, which is the ..."
Cited by 11 (0 self)
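The k-armed bandit problem the abstract mentions can be made concrete with a standard epsilon-greedy strategy — a minimal sketch, not the paper's method; `epsilon_greedy_bandit` and the reward model (Gaussian rewards with known means for testing) are hypothetical.

```python
import random

def epsilon_greedy_bandit(true_means, steps=5000, epsilon=0.1, seed=0):
    """Play a k-armed bandit with epsilon-greedy exploration.
    Rewards are Gaussian around each arm's true mean.
    Returns per-arm estimated means and pull counts."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k
    estimates = [0.0] * k
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(k)                            # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])   # exploit
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        # incremental running-mean update of the arm's estimate
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

estimates, counts = epsilon_greedy_bandit([0.2, 0.5, 0.9])
```

After enough pulls the best arm dominates the play counts — the discrete-input analogue of allocating experimental budget where it is most informative.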
Input Space Versus Feature Space in Kernel-Based Methods
in: IEEE Transactions on Neural Networks, 1999
"... This paper collects some ideas targeted at advancing our understanding of the feature spaces associated with support vector (SV) kernel functions. We first discuss the geometry of feature space. In particular, we review what is known about the shape of the image of input space under the feature space ..."
Cited by 130 (3 self)
A theory of shape by space carving
in: Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV99), volume I, pages 307–314, Los Alamitos, CA, 1999
"... In this paper we consider the problem of computing the 3D shape of an unknown, arbitrarily shaped scene from multiple photographs taken at known but arbitrarily distributed viewpoints. By studying the equivalence class of all 3D shapes that reproduce the input photographs, we prove the existence of a ..."
Cited by 566 (14 self)
Fisher Discriminant Analysis With Kernels
1999
"... A nonlinear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick which allows the efficient computation of Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) nonlinear decision function in input space. Large scale simulations demonstrate the competitiveness of our approach."
Cited by 503 (18 self)
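The kernel trick for Fisher's discriminant can be sketched for the two-class case: the discriminant direction is expressed as a combination of kernel columns, found by solving a regularized within-class scatter system. This is an illustrative sketch under those assumptions, not the paper's exact algorithm; `kernel_fda` and the synthetic data are hypothetical.

```python
import numpy as np

def kernel_fda(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant via the kernel trick.
    Returns each training point's projection onto the discriminant
    direction in feature space."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(y)
    # class means of the kernel columns (feature-space means via K)
    m = [K[:, y == c].mean(axis=1) for c in (0, 1)]
    # within-class scatter expressed through the kernel matrix
    N = np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    # regularized solve for the expansion coefficients alpha
    alpha = np.linalg.solve(N + reg * np.eye(n), m[1] - m[0])
    return K @ alpha

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (40, 2)),
               rng.normal(3, 1, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
proj = kernel_fda(X, y)
```

The projection is linear in feature space but, through the RBF kernel, nonlinear in input space — the correspondence the abstract describes.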
Learning the Kernel Matrix with Semi-Definite Programming
2002
"... Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive definite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space. Classical model selection ..."
Cited by 775 (21 self)
Nonlinear component analysis as a kernel eigenvalue problem
, 1996
"... We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all ..."
Cited by 1573 (83 self)
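The method sketched in the abstract amounts to an eigendecomposition of the centered kernel matrix. A minimal sketch with an RBF kernel follows; `kernel_pca` and the parameter choices are illustrative assumptions, not the paper's code.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA: nonlinear principal components obtained as the
    leading eigenvectors of the doubly centered kernel matrix."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.ones((n, n)) / n
    # double centering: centers the implicit feature-space data
    Kc = K - one @ K - K @ one + one @ K @ one
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # eigh returns ascending eigenvalues; take the largest ones
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas   # projections of the training points

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2)
```

The eigenvalue problem is n x n in the number of samples, so the feature space itself (possibly infinite-dimensional) is never materialized — the point of the integral-operator formulation.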
Mean shift: A robust approach toward feature space analysis
in: PAMI, 2002
"... A general nonparametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure, the mean shift. We prove for discrete data the convergence ..."
Cited by 2395 (37 self)
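The basic mean-shift module the abstract refers to is a short iteration: move a point to the average of its neighbors within a window until it stops moving, which carries it to a local density mode. A minimal flat-kernel sketch, with hypothetical names and synthetic two-cluster data:

```python
import numpy as np

def mean_shift(points, x, bandwidth=1.0, iters=50):
    """Flat-kernel mean shift: repeatedly replace x with the mean of
    all points within `bandwidth`, converging toward a density mode."""
    points = np.asarray(points, dtype=float)
    x = np.asarray(x, dtype=float)
    for _ in range(iters):
        dists = np.linalg.norm(points - x, axis=1)
        window = points[dists <= bandwidth]
        if len(window) == 0:
            break
        new_x = window.mean(axis=0)
        if np.linalg.norm(new_x - x) < 1e-6:   # converged
            break
        x = new_x
    return x

# Two well-separated Gaussian clusters; a start near the first
# cluster should converge to that cluster's mode.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal([0, 0], 0.3, (100, 2)),
                 rng.normal([5, 5], 0.3, (100, 2))])
mode = mean_shift(pts, [0.8, 0.8], bandwidth=1.0)
```

Running the iteration from every data point and grouping the resulting modes yields arbitrarily shaped clusters without assuming their number or form in advance.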
Estimating the Support of a High-Dimensional Distribution
1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified ν between 0 and 1. We propose ..."
Cited by 783 (29 self)
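To make the problem statement concrete — not the paper's algorithm, which is an SVM-type method — here is a crude stand-in that thresholds a kernel density estimate so that a target fraction ν of the training sample falls outside the region. All names and parameters are hypothetical.

```python
import numpy as np

def quantile_support_region(X, nu=0.1, bandwidth=0.5):
    """Toy support estimator: score points with a Gaussian KDE and
    choose the threshold so a fraction nu of the training points
    falls outside the region {x : kde(x) >= threshold}."""
    X = np.asarray(X, dtype=float)

    def kde(queries):
        d2 = ((queries[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth**2)).mean(axis=1)

    scores = kde(X)
    threshold = np.quantile(scores, nu)   # nu-quantile of train scores
    return kde, threshold

rng = np.random.default_rng(3)
X = rng.normal(0, 1, (200, 2))
kde, thr = quantile_support_region(X, nu=0.1)
inside = kde(X) >= thr
```

On the training sample the region captures roughly a 1 − ν fraction by construction; the paper's contribution is doing this with a "simple" (regularized) region and generalization guarantees for test points.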