Results 1 - 10 of 21,450
Distance metric learning, with application to clustering with side-information
In Advances in Neural Information Processing Systems 15, 2002
Cited by 818 (13 self)
"... Many algorithms rely critically on being given a good metric over their inputs. For instance, data can often be clustered in many 'plausible' ways, and if a clustering algorithm such as K-means initially fails to find one that is meaningful to a user, the only recourse may be for ... In this paper, we present an algorithm that, given examples of similar (and, if desired, dissimilar) pairs of points in R^n, learns a distance metric over R^n that respects these relationships. Our method is based on posing metric learning as a convex optimization problem, which allows ..."
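Several of the entries in this listing learn Mahalanobis-style metrics of the form d_A(x, y) = sqrt((x - y)^T A (x - y)) for a positive semidefinite matrix A. A minimal pure-Python sketch of evaluating such a distance; the matrices below are illustrative choices, not ones learned by any of the cited algorithms:

```python
import math

def mahalanobis(x, y, A):
    """Distance d_A(x, y) = sqrt((x - y)^T A (x - y)) for a PSD matrix A."""
    d = [xi - yi for xi, yi in zip(x, y)]
    # Quadratic form (x - y)^T A (x - y), computed with plain lists.
    quad = sum(d[i] * A[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))
    return math.sqrt(quad)

# With A = identity, this reduces to the ordinary Euclidean distance.
I = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis([0.0, 0.0], [3.0, 4.0], I))  # 5.0

# A diagonal A re-weights features: here the second coordinate counts 4x.
W = [[1.0, 0.0], [0.0, 4.0]]
print(mahalanobis([0.0, 0.0], [3.0, 4.0], W))
```

Metric learning amounts to choosing A (subject to PSD constraints) so that similar pairs get small distances and dissimilar pairs get large ones.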
Online distance metric learning . . .
, 2011
"... Tracking an object without any prior information regarding its appearance is a challenging problem. Modern tracking algorithms treat tracking as a binary classification problem between the object class and the background class. The binary classifier can be learned offline, if a specific object model is available, or online, if there is no prior information about the object's appearance. In this paper, we propose the use of online distance metric learning in combination with nearest neighbor classification for object tracking. We assume that the previous appearances of the object ..."
Distance metric learning for large margin nearest neighbor classification
In NIPS, 2006
Cited by 695 (14 self)
"... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
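Large-margin metric learning as in the entry above is typically evaluated by plugging the learned distance into kNN classification. A minimal kNN sketch with a pluggable distance function; plain Euclidean distance stands in here for a learned metric:

```python
import math
from collections import Counter

def knn_predict(train, query, k, dist):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; `dist` is any distance function,
    e.g. a learned Mahalanobis metric.
    """
    neighbors = sorted(train, key=lambda p: dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

euclid = math.dist  # stand-in for a learned metric

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b")]
print(knn_predict(train, (0.2, 0.1), k=3, dist=euclid))  # a
```

Swapping `euclid` for a learned distance changes which neighbors are "nearest", which is exactly the lever the cited method optimizes.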
Distance Metric Learning with Kernels
In Proceedings of the International Conference on Artificial Neural Networks, 2003
Cited by 39 (1 self)
"... In this paper, we propose a feature weighting method that works in both the input space and the kernel-induced feature space. It assumes only the availability of similarity (dissimilarity) information, and the number of parameters in the transformation does not depend on the number of features. Besides feature weighting, it can also be regarded as performing nonparametric kernel adaptation. Experimental results on both toy and real-world datasets show promising results."
Hamming Distance Metric Learning
Cited by 36 (3 self)
"... Motivated by large-scale multimedia applications we propose to learn mappings from high-dimensional data to binary codes that preserve semantic similarity. Binary codes are well suited to large-scale applications as they are storage efficient and permit exact sub-linear kNN search. The framework is ..."
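One reason binary codes, as in the Hamming entry above, are so cheap at search time: the Hamming distance between two codes is just the popcount of their bitwise XOR. A minimal sketch:

```python
def hamming(a: int, b: int) -> int:
    """Number of bit positions where two binary codes differ."""
    return bin(a ^ b).count("1")

# 0b1011 and 0b0010 differ in bits 0 and 3.
print(hamming(0b1011, 0b0010))  # 2
```

On modern hardware this is a couple of machine instructions per pair, which is what makes exhaustive or sub-linear kNN search over millions of codes practical.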
Bayesian active distance metric learning
In UAI, 2007
Cited by 16 (2 self)
"... Distance metric learning is an important component for many tasks, such as statistical classification and content-based image retrieval. Existing approaches for learning distance metrics from pairwise constraints typically suffer from two major problems. First, most algorithms only offer point estim ..."
Distance Metric Learning Revisited
In Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD)
Cited by 2 (0 self)
"... The success of many machine learning algorithms (e.g. nearest neighbor classification and k-means clustering) depends on the representation of the data as elements in a metric space. Learning an appropriate distance metric from data is usually superior to the default Euclidean dist ..."
Adaptive Distance Metric Learning for Clustering
"... A good distance metric is crucial for unsupervised learning from high-dimensional data. To learn a metric without any constraint or class label information, most unsupervised metric learning algorithms appeal to projecting observed data onto a low-dimensional manifold, where geometric relationships ..."
Regularized Distance Metric Learning: Theory and Algorithm
Cited by 33 (2 self)
"... In this paper, we examine the generalization error of regularized distance metric learning. We show that with appropriate constraints, the generalization error of regularized distance metric learning could be independent from the dimensionality, making it suitable for handling high dimensional data. ..."