Results 1–10 of 653,027
An adaptive nearest neighbor classification algorithm for data streams
In PKDD, 2005
"... Abstract. In this paper, we propose an incremental classification algorithm which uses a multiresolution data representation to find adaptive nearest neighbors of a test point. The algorithm achieves excellent performance by using small classifier ensembles where approximation error bounds are guar ..."
Cited by 9 (1 self)
Nearest Neighbor Classification with a Local Asymmetrically Weighted Metric
1996
"... This paper introduces a new local asymmetric weighting scheme for the nearest neighbor classification algorithm. It is shown both with theoretical arguments and computer experiments that good compression rates can be achieved outperforming the accuracy of the standard nearest neighbor classification ..."
Cited by 2 (2 self)
Nearest Neighbor Queries
1995
"... A frequently encountered type of query in Geographic Information Systems is to find the k nearest neighbor objects to a given point in space. Processing such queries requires substantially different search algorithms than those for location or range queries. In this paper we present an efficient bra ..."
Cited by 594 (1 self)
Discriminant Adaptive Nearest Neighbor Classification
1994
"... Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to ameliorate this curse of dimensionality. We use a local linear discriminant an ..."
Cited by 322 (1 self)
A Classification Learning Algorithm
"... Presence of irrelevant features is a fact of life in many real-world applications of classification learning. Although nearest-neighbor classification algorithms have emerged as a promising approach to machine learning tasks with their high predictive accuracy, they are adversely affected by the ..."
Distance metric learning for large margin nearest neighbor classification
In NIPS, 2006
"... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
Cited by 685 (15 self)
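As a rough illustration of the classification step described in this abstract (not the semidefinite program that learns the metric, which is the paper's actual contribution), kNN under a given Mahalanobis matrix M can be sketched as follows; the function name and the toy majority-vote rule are my own:

```python
import numpy as np

def mahalanobis_knn_predict(X_train, y_train, x, M, k=3):
    """Majority-vote kNN under the Mahalanobis distance
    d_M(a, b)^2 = (a - b)^T M (a - b), for a given PSD matrix M."""
    diffs = X_train - x                                   # (n, d) differences to the query
    sq_dists = np.einsum("nd,de,ne->n", diffs, M, diffs)  # squared Mahalanobis distances
    nearest = np.argsort(sq_dists)[:k]                    # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                      # most common label among them
```

With M set to the identity matrix this reduces to plain Euclidean kNN; the learned M is what pulls same-class neighbors together and pushes other classes a margin away.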
Approximate Nearest Neighbors: Towards Removing the Curse of Dimensionality
1998
"... The nearest neighbor problem is the following: Given a set of n points P = {p1, ..., pn} in some metric space X, preprocess P so as to efficiently answer queries which require finding the point in P closest to a query point q ∈ X. We focus on the particularly interesting case of the d-dimens ..."
Cited by 1017 (40 self)
"... , there has been some interest in the approximate nearest neighbors problem, which is: Find a point p ∈ P that is an ε-approximate nearest neighbor of the query q in that for all p′ ∈ P, d(p, q) ≤ (1 + ε)d(p′, q). We present two algorithmic results for the approximate version that significantly ..."
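The ε-approximate condition quoted in this snippet can be checked directly by brute force; this small sketch (function name mine, Euclidean metric assumed) just restates the definition and is not the sublinear-query data structure the paper contributes:

```python
import numpy as np

def is_eps_approx_nn(P, q, p, eps):
    """True iff d(p, q) <= (1 + eps) * d(p', q) for every p' in P,
    i.e. p is an epsilon-approximate nearest neighbor of q."""
    nearest_dist = np.linalg.norm(P - q, axis=1).min()  # exact NN distance to q
    return np.linalg.norm(p - q) <= (1 + eps) * nearest_dist
```

Any exact nearest neighbor trivially satisfies the test for every eps ≥ 0; the point of the relaxation is that a candidate only slightly farther than the true nearest neighbor also passes, which is what allows faster search in high dimensions.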
A Hybrid Genetic Algorithm for Classification
In Proceedings of the Twelfth International Joint Conference on Artificial Intelligence (pp. 645–650), 1991
"... In this paper we describe a method for hybridizing a genetic algorithm and a k nearest neighbors classification algorithm. We use the genetic algorithm and a training data set to learn real-valued weights associated with individual attributes in the data set. We use the k nearest neighbors algorithm ..."
Cited by 45 (0 self)
An Optimal Algorithm for Approximate Nearest Neighbor Searching in Fixed Dimensions
In ACM-SIAM Symposium on Discrete Algorithms, 1994
"... Consider a set S of n data points in real d-dimensional space, R^d, where distances are measured using any Minkowski metric. In nearest neighbor searching we preprocess S into a data structure, so that given any query point q ∈ R^d, the closest point of S to q can be reported quickly. Given any po ..."
Cited by 983 (32 self)
When Is "Nearest Neighbor" Meaningful?
In Int. Conf. on Database Theory, 1999
"... We explore the effect of dimensionality on the "nearest neighbor" problem. We show that under a broad set of conditions (much broader than independent and identically distributed dimensions), as dimensionality increases, the distance to the nearest data point approaches the distance ..."
Cited by 402 (1 self)