
## Collaborative-Representation-Based Nearest Neighbor Classifier for Hyperspectral Imagery

### Citations

107 | Sparse representation or collaborative representation: Which helps face recognition?
- Zhang, Yang, et al.
Citation Context ...th a sparseness constraint, and the representation is recovered by an ℓ1-norm minimization, and the final label is assigned by checking against each class with minimum reconstruction error. Reference [8] argued that it is the “collaborative” nature of the approximation instead of the “competitive” nature imposed by the sparseness constraint that actually improves classification accuracy. Collaborative re...
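The decision rule quoted here — represent the test pixel over training samples, then check each class for the minimum reconstruction error — can be sketched as follows. This is a minimal illustration, not code from any of the cited papers: it fits each class by ordinary least squares, whereas the sparse method recovers the weights with an ℓ1-constrained solver; the function and variable names are placeholders.

```python
import numpy as np

def min_residual_label(x, class_dicts):
    """Minimum-reconstruction-error rule (illustrative stand-in).

    class_dicts maps label -> (n_bands, n_samples) array whose columns
    are that class's training spectra.  Each class reconstructs x by
    ordinary least squares, and the class with the smallest residual
    wins; the cited methods use l1 (sparse) or regularized l2
    (collaborative) solvers instead of plain least squares.
    """
    best_label, best_res = None, np.inf
    for label, D in class_dicts.items():
        # Least-squares weights for this class's atoms.
        w, *_ = np.linalg.lstsq(D, x, rcond=None)
        res = np.linalg.norm(x - D @ w)
        if res < best_res:
            best_label, best_res = label, res
    return best_label
```

For a pixel lying near the subspace spanned by one class's spectra, that class produces the smallest residual and is selected.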

29 | Hyperspectral image classification using dictionary-based sparse representation
- Chen, Nasrabadi, et al.
- 2011
Citation Context ...and the weight matrix is column-wise sparse; whereas in the former, they are vectors, and the weight vector is not sparse. The joint sparse models considering neighboring pixels for classification in [11] and [12] belong to the latter, where the term “joint” has the same meaning as “collaborative” as in “collaborative sparse unmixing” in [10]. In this letter, we propose two novel representation-based N...

19 | Nearest neighbor classification of remote sensing images with the maximal margin principle
- Blanzieri, Melgani
- 2008
Citation Context ...for the k-NN classifier. In [5], cosine-based nonparametric feature extraction was developed to include a weight function in the within- and between-class scatter matrices for the k-NN classifier. In [6], a variant of the k-NN classifier based on the maximal margin principle has been discussed. In [3], local manifold learning has been combined with k-NN to improve HSI classification. Recently, sparse...

18 | Locality-preserving dimensionality reduction and classification for hyperspectral image analysis
- Li, Prasad, et al.
- 2012
Citation Context ...ion that data abide by a normal or multimodal distribution. Thus, popular choices of statistical classifiers are the maximum-likelihood estimation classifier and the Gaussian mixture model classifier [1]. However, a single Gaussian or mixture Gaussian distribution under small training sample size situations may not be true, which often happens in hyperspectral imagery (HSI). The nearest neighbor (NN)...

15 | A local mean-based nonparametric classifier
- Mitani, Hamamoto
- 2006
Citation Context ...sign the majority category label according to its k nearest training samples. The distance usually employs the standard Euclidean distance. Several extensions of this classifier have been studied. In [4], the k-NN rule was extended to a local mean-based NN (LMNN) classifier. ...
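The baseline k-NN rule and the local mean-based extension attributed to [4] in this context can be sketched roughly as below. The Euclidean metric and majority vote follow the quoted description; `k`, the row-per-sample data layout, and the function names are assumptions of this sketch, not the cited authors' code.

```python
import numpy as np
from collections import Counter

def knn_label(x, X_train, y_train, k=3):
    """Plain k-NN: majority label among the k nearest training
    samples under the standard Euclidean distance."""
    d = np.linalg.norm(X_train - x, axis=1)
    return Counter(y_train[np.argsort(d)[:k]]).most_common(1)[0][0]

def lmnn_label(x, X_train, y_train, k=3):
    """Local mean-based NN: for each class, average its k nearest
    samples to x and assign the class whose local mean is closest."""
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        Xc = X_train[y_train == label]
        d = np.linalg.norm(Xc - x, axis=1)
        # Local mean of this class's k nearest samples.
        mean = Xc[np.argsort(d)[:min(k, len(Xc))]].mean(axis=0)
        dist = np.linalg.norm(x - mean)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

Averaging the class-local neighbors smooths out single noisy training samples, which is the motivation the LMNN variant is usually given.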

11 | Nearest regularized subspace for hyperspectral classification
- Li, Tramel, et al.
- 2014
Citation Context ...ture imposed by the sparseness constraint that actually improves classification accuracy. Collaborative representation (CR)-based classification has also been successfully applied in HSI analysis. In [9], a CR-based classifier, called nearest regularized subspace, was proposed for HSI classification. Note that the collaboration mentioned here means the atoms in a dictionary collaborate together to re...

11 | Collaborative Sparse Regression for Hyperspectral Unmixing
- Iordache, Bioucas-Dias, et al.
- 2014
Citation Context ... Note that the collaboration mentioned here means the atoms in a dictionary collaborate together to represent a single pixel; it is different from the “collaboration” reinforced in sparse unmixing in [10], where all the pixels collaborate together to choose the same set of atoms in the dictionary, if possible. In the latter, the data and the corresponding weights under consideration have to be matrice...

7 | Decision fusion in kernel-induced spaces for hyperspectral image classification
- Li, Prasad, et al.
- 2014
Citation Context ...representation are estimated by an ℓ2-norm minimization-derived closed-form solution with Tikhonov regularization; in the second step, the label of the testing sample is determined by majority voting [13] of those with the k largest representation weights. As an alternative, the proposed LRNN calculates the representation of the testing sample using the local class-specific training samples, which are obt...
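The two-step rule summarized here — closed-form ℓ2 representation weights with Tikhonov regularization, then majority voting among the samples carrying the k largest weights — might look like the following sketch. The plain identity regularizer, λ, k, and the use of |w| for ranking are placeholder assumptions; the letter's actual Tikhonov matrix may be weighted differently.

```python
import numpy as np
from collections import Counter

def crnn_label(x, A, y_train, lam=1e-2, k=5):
    """Sketch of the described two-step rule.

    Step 1: collaborative-representation weights via the closed-form
    ridge (Tikhonov-regularized l2) solution
        w = (A^T A + lam * I)^{-1} A^T x,
    where column i of A is training sample i with label y_train[i].
    Step 2: majority vote among the labels of the k training samples
    with the largest |w_i|.
    """
    n = A.shape[1]
    # Closed-form solution; lam * I keeps the system well-conditioned.
    w = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ x)
    top = np.argsort(np.abs(w))[::-1][:k]
    return Counter(y_train[top]).most_common(1)[0][0]
```

Because the ℓ2 problem has a closed form, no iterative sparse solver is needed, which is the usual argument for collaborative over sparse representation in this setting.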

6 | A nonparametric feature extraction and its application to nearest neighbor classification for hyperspectral image data
- Yang, Yu, et al.
- 2010
Citation Context ...to a local mean-based NN (LMNN) classifier. In [2], the Euclidean metric in a low-dimensional space was modified to minimize the variance of given classes for the k-NN classifier. In [5], cosine-based nonparametric feature extraction was developed to include a weight function in the within- and between-class scatter matrices for the k-NN classifier. In [6], a variant of the k-NN clas...

4 | Supervised classification of remotely sensed imagery using a modified k-NN technique
- Samaniego, Bardossy, et al.
- 2008
Citation Context ... However, a single Gaussian or mixture Gaussian distribution under small training sample size situations may not be true, which often happens in hyperspectral imagery (HSI). The nearest neighbor (NN) [2], [3] classifier, one of the simplest yet effective classification methods, has been widely used in HSI analysis. This nonparametric classifier does not need any prior knowledge about the density dist...

4 | Local manifold learning based k-nearest-neighbor for hyperspectral image classification
- Ma, Crawford, et al.
- 2010
Citation Context ...ver, a single Gaussian or mixture Gaussian distribution under small training sample size situations may not be true, which often happens in hyperspectral imagery (HSI). The nearest neighbor (NN) [2], [3] classifier, one of the simplest yet effective classification methods, has been widely used in HSI analysis. This nonparametric classifier does not need any prior knowledge about the density distribut...

2 | Robust face recognition via sparse representation
- Wright, Yang, et al.
- 2009
Citation Context ...sed on the maximal margin principle has been discussed. In [3], local manifold learning has been combined with k-NN to improve HSI classification. Recently, sparse representation-based classification [7] has been proposed for robust face recognition. The basic idea is that a testing sample can be represented as a linear combination of all the training samples with a sparseness constraint, and the rep...

2 | Hyperspectral image classification by nonlocal joint collaborative representation with a locally adaptive dictionary
- Li, Zhang, et al.
- 2014
Citation Context ...eight matrix is column-wise sparse; whereas in the former, they are vectors, and the weight vector is not sparse. The joint sparse models considering neighboring pixels for classification in [11] and [12] belong to the latter, where the term “joint” has the same meaning as “collaborative” as in “collaborative sparse unmixing” in [10]. In this letter, we propose two novel representation-based NN classif...