Results 1 - 10 of 3,511
Adaptive Metric Nearest Neighbor Classification
- IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000
Cited by 104 (4 self)
"... Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-squared distance analysis to compute a flexible metric for producing neighborhoods that are highly adaptive to query locations. Neighborhoods are elongated along less relevant feature dimensions ..."
Discriminant adaptive nearest neighbor classification
- 1995
Cited by 321 (1 self)
"... Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant ..."
Distance metric learning for large margin nearest neighbor classification
- In NIPS, 2006
Cited by 695 (14 self)
"... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
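As a minimal illustration of the Mahalanobis distances such metric-learning methods produce, the sketch below evaluates d_M(x, y) = sqrt((x − y)^T M (x − y)) for a hand-picked positive semidefinite matrix M; the matrices here are illustrative assumptions, not metrics trained by the cited semidefinite program.

```python
import math

def mahalanobis(x, y, M):
    """Distance under metric matrix M: sqrt((x - y)^T M (x - y))."""
    d = [a - b for a, b in zip(x, y)]
    # Compute M @ d, then the quadratic form d^T (M d).
    Md = [sum(M[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return math.sqrt(sum(di * mi for di, mi in zip(d, Md)))

# With M = identity this reduces to ordinary Euclidean distance.
I = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis([0, 0], [3, 4], I))  # 5.0

# A diagonal M re-weights feature dimensions (here the first axis counts more).
W = [[4.0, 0.0], [0.0, 1.0]]
print(mahalanobis([0, 0], [3, 4], W))  # sqrt(4*9 + 16) ≈ 7.211
```

Learning M (rather than fixing it by hand, as here) is exactly what distinguishes these methods from plain kNN.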
Nearest neighbor queries
- ACM SIGMOD Record, 1995
Cited by 592 (1 self)
"... A frequently encountered type of query in Geographic Information Systems is to find the k nearest neighbor objects to a given point in space. Processing such queries requires substantially different search algorithms than those for location or range queries. In this paper we present an efficient branch-and-bound R-tree traversal algorithm to find the nearest neighbor object to a point, and then generalize it to finding the k nearest neighbors. We also discuss metrics for an optimistic and a pessimistic search ordering strategy as well as for pruning. Finally, we present the results ..."
An Optimal Algorithm for Approximate Nearest Neighbor Searching in Fixed Dimensions
- ACM-SIAM Symposium on Discrete Algorithms, 1994
Cited by 984 (32 self)
"... Consider a set S of n data points in real d-dimensional space, R^d, where distances are measured using any Minkowski metric. In nearest neighbor searching we preprocess S into a data structure, so that given any query point q ∈ R^d, the closest point of S to q can be reported quickly. Given any po ..."
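The Minkowski metrics mentioned in this abstract form the family L_p(x, y) = (Σ_i |x_i − y_i|^p)^(1/p), covering Manhattan (p = 1) and Euclidean (p = 2) distance as special cases. A small sketch; the example points are illustrative:

```python
def minkowski(x, y, p=2.0):
    """L_p distance: (sum |x_i - y_i|^p)^(1/p).
    p=1 gives Manhattan distance, p=2 gives Euclidean distance."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

print(minkowski([0, 0], [3, 4], p=1))  # 7.0  (Manhattan)
print(minkowski([0, 0], [3, 4], p=2))  # 5.0  (Euclidean)
```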
Flexible Metric Nearest Neighbor Classification
- 1994
Cited by 133 (2 self)
"... The K-nearest-neighbor decision rule assigns an object of unknown class to the plurality class among the K labeled "training" objects that are closest to it. Closeness is usually defined in terms of a metric distance on the Euclidean space with the input measurement variables as axes. The ..."
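The K-nearest-neighbor decision rule described in this abstract can be sketched in a few lines. The Euclidean metric and the tiny two-cluster toy dataset below are illustrative assumptions, not taken from the cited paper:

```python
from collections import Counter
import math

def knn_classify(query, points, labels, k=3):
    """Assign `query` the plurality class among its k nearest training points,
    using Euclidean distance on the raw input variables."""
    order = sorted(range(len(points)), key=lambda i: math.dist(query, points[i]))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D example: two well-separated clusters of labeled points.
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.1)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_classify((0.3, 0.3), X, y, k=3))  # plurality vote among the 3 nearest
```

The flexible- and adaptive-metric papers in these results replace the fixed `math.dist` call with a metric that varies with the query location.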
Comparison of discrimination methods for the classification of tumors using gene expression data
- Journal of the American Statistical Association, 2002
Cited by 770 (6 self)
"... A reliable and precise classification of tumors is essential for successful diagnosis and treatment of cancer. cDNA microarrays and high-density oligonucleotide chips are novel biotechnologies increasingly used in cancer research. By allowing the monitoring of expression levels in cells for thousand ... gene expression data is an important aspect of this novel approach to cancer classification. This article compares the performance of different discrimination methods for the classification of tumors based on gene expression data. The methods include nearest-neighbor classifiers, linear discriminant ..."
Data Structures and Algorithms for Nearest Neighbor Search in General Metric Spaces
- 1993
Cited by 358 (5 self)
"... We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidean space, or where the dimensionality of a Euclidean representation is very high. Also relevant are high-dim ..."
Metric Learning for Nearest Neighbor Classification
"... We develop methods for constructing a weighted metric that improves the performance of k-nearest neighbor (KNN) classifiers. KNN is known to be highly flexible, but can be somewhat inefficient and unstable. By incorporating a parametrically optimized metric into KNN, global dimension reduc ..."
Adaptive Kernel Metric Nearest Neighbor Classification
- In Proceedings of the Sixteenth International Conference on Pattern Recognition, 2002
Cited by 17 (0 self)
"... Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest ne ..."