Results 1 – 10 of 4,630
A k-Nearest Neighbor Based Algorithm for Multilabel Classification
"... In multilabel learning, each instance in the training set is associated with a set of labels, and the task is to output a label set whose size is unknown a priori for each unseen instance. In this paper, a multilabel lazy learning approach named ML-kNN is presented, which is derived from the traditional k-nearest neighbor (kNN) algorithm. In detail, for each new instance, its k nearest neighbors are first identified. Then, according to the label sets of these neighboring instances, the maximum a posteriori (MAP) principle is utilized to determine ..."
K-Nearest Neighbor Classification on Spatial Data Streams Using P-Trees
, 2002
"... Classification of spatial data has become important due to the fact that there are huge volumes of spatial data now available, holding a wealth of valuable information. In this paper we consider the classification of spatial data streams, where the training dataset changes often. New training data arrive ... For that reason kNN is called a lazy classifier. kNN is extremely simple to implement and lends itself to a wide variety of variations. The traditional k-nearest neighbor classifier finds the k nearest neighbors based on some distance metric by finding the distance of the target data point from ..."
Cited by 33 (12 self)
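The lazy, full-scan classifier this snippet describes can be sketched directly. The data, function names, and the choice of Euclidean distance with majority voting below are illustrative, not taken from the paper:

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance.

    `train` is a list of (point, label) pairs; each point is a
    tuple of floats. All names here are illustrative.
    """
    # Compute the distance from the query to every training point.
    # This full scan at query time is what makes plain kNN "lazy":
    # no work happens until a query arrives.
    dists = sorted((math.dist(p, query), label) for p, label in train)
    # Majority vote among the k closest labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_classify(train, (0.2, 0.1), k=3))  # → a
```

Because all distances are recomputed per query, the cost is linear in the training set size, which is exactly the problem the P-tree and index-based approaches in this listing try to avoid.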
Profiles and Fuzzy k-Nearest Neighbor Algorithm for Protein Secondary Structure Prediction
 IN PROC. OF THE THIRD ASIA PACIFIC BIOINFORMATICS CONFERENCE
, 2005
"... We introduce a new approach for predicting the secondary structure of proteins using profiles and the fuzzy k-nearest neighbor algorithm. k-nearest neighbor methods give relatively better performance than neural networks or hidden Markov models when the query protein has few homologs in the sequence database to build a sequence profile. Although traditional k-nearest neighbor algorithms are a good choice for this situation, one of the difficulties in utilizing these techniques is that all the labeled samples are given equal importance while deciding the secondary structure class ..."
Cited by 5 (0 self)
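One common remedy for the equal-importance issue raised in this snippet is to weight each neighbor's vote by its distance, as fuzzy kNN variants do. The sketch below uses the generic fuzzy-kNN weight 1/d^(2/(m-1)); the names and data are illustrative, and the weighting follows the general fuzzy kNN form rather than this paper's exact method:

```python
import math
from collections import defaultdict

def weighted_knn(train, query, k=3, m=2.0):
    """Distance-weighted (fuzzy-style) kNN vote. Each of the k
    nearest neighbors contributes 1 / d**(2/(m-1)) to its label's
    score rather than one equal vote, so close neighbors dominate.
    The exponent follows the generic fuzzy kNN form with fuzziness
    m > 1; all names and data here are illustrative.
    """
    neighbors = sorted((math.dist(p, query), label) for p, label in train)[:k]
    scores = defaultdict(float)
    for d, label in neighbors:
        # Clamp d to avoid division by zero on an exact match.
        scores[label] += 1.0 / max(d, 1e-12) ** (2.0 / (m - 1.0))
    return max(scores, key=scores.get)

# A plain majority vote over these three neighbors would say "b";
# distance weighting lets the single much-closer "a" point win.
train = [((0.0, 0.0), "a"), ((2.0, 0.0), "b"), ((2.1, 0.0), "b")]
print(weighted_knn(train, (0.1, 0.0), k=3))  # → a
```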
MKNN: Modified K-Nearest Neighbor
"... In this paper, a new classification method for enhancing the performance of k-nearest neighbor is proposed which uses robust neighbors in the training data. This new classification method is called Modified K-Nearest Neighbor (MKNN). Inspired by the traditional kNN algorithm, the main idea is ..."
Cited by 4 (1 self)
Nearest neighbor queries.
 ACM SIGMOD Record
, 1995
"... A frequently encountered type of query in Geographic Information Systems is to find the k nearest neighbor objects to a given point in space. Processing such queries requires substantially different search algorithms than those for location or range queries. In this paper we present an efficient ..."
Cited by 592 (1 self)
Distance metric learning for large margin nearest neighbor classification
 In NIPS
, 2006
"... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
Cited by 695 (14 self)
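Once a Mahalanobis metric has been learned, evaluating it is straightforward: with M = L^T L, the squared distance is ||L(x − y)||². The sketch below shows only this evaluation step, not the semidefinite program that trains the metric; the matrices are toy values, not a learned L:

```python
def mahalanobis_sq(x, y, L):
    """Squared Mahalanobis distance (x-y)^T M (x-y) with M = L^T L,
    computed as ||L(x-y)||^2. Parameterizing M through L keeps it
    positive semidefinite, which is the constraint the semidefinite
    program in metric learning enforces. L here is a toy matrix,
    not a learned metric.
    """
    diff = [a - b for a, b in zip(x, y)]
    # z = L @ diff, written out with plain lists
    z = [sum(row[j] * diff[j] for j in range(len(diff))) for row in L]
    return sum(v * v for v in z)

# With L = identity, this is plain squared Euclidean distance ...
print(mahalanobis_sq((0, 0), (3, 4), [[1, 0], [0, 1]]))  # → 25
# ... while a non-identity L stretches some directions: here the
# first coordinate counts double.
print(mahalanobis_sq((0, 0), (3, 4), [[2, 0], [0, 1]]))  # → 52
```

Running plain kNN with such a learned distance in place of Euclidean distance is what lets the metric pull same-class neighbors closer and push other classes apart.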
NEIGHBOR-SIGNIFICANCE K-NEAREST NEIGHBOR METHOD
"... In this paper, we use the Neighbor-Significance K-Nearest Neighbor (NS-KNN) algorithm to address the problem of automatic document classification. In addition, we design several different feature-selection experiment strategies in order to compare different ..."
An Optimal Algorithm for Approximate Nearest Neighbor Searching in Fixed Dimensions
 ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS
, 1994
"... Consider a set S of n data points in real d-dimensional space, R^d, where distances are measured using any Minkowski metric. In nearest neighbor searching we preprocess S into a data structure, so that given any query point q ∈ R^d, the closest point of S to q can be reported quickly. Given any query point q ∈ R^d and ε > 0, a (1 + ε)-approximate nearest neighbor of q can be computed in O(c_{d,ε} log n) time, where c_{d,ε} ≤ d⌈1 + 6d/ε⌉^d is a factor depending only on dimension and ε. In general, we show that given an integer k ≥ 1, (1 + ε)-approximations to the k nearest neighbors ..."
Cited by 984 (32 self)
Fastmap: A fast algorithm for indexing, data-mining and visualization of traditional and multimedia datasets
, 1995
"... A very promising idea for fast searching in traditional and multimedia databases is to map objects into points in k-d space, using k feature-extraction functions provided by a domain expert [Jag91]. Thus, we can subsequently use highly fine-tuned spatial access methods (SAMs) to answer several ..."
Cited by 502 (22 self)
Fast approximate nearest neighbors with automatic algorithm configuration
 In VISAPP International Conference on Computer Vision Theory and Applications
, 2009
"... nearest-neighbors search, randomized kd-trees, hierarchical k-means tree, clustering. For many computer vision problems, the most time-consuming component consists of nearest neighbor matching in high-dimensional spaces. There are no known exact algorithms for solving these high-dimensional problems ..."
Cited by 455 (2 self)