Results 1 - 10 of 30
Hyperspectral Remote Sensing Data Analysis and Future Challenges
"... Abstract—Hyperspectral remote sen-sing ..."
(Show Context)
Fast conical hull algorithms for near-separable non-negative matrix factorization
- In Proceedings of the 30th International Conference on Machine Learning (ICML), 2013
"... The separability assumption (Donoho & Stodden, 2003; Arora et al., 2012a) turns non-negative matrix factorization (NMF) into a tractable problem. Recently, a new class of provably-correct NMF algorithms have emerged under this assumption. In this paper, we reformulate the separable NMF problem a ..."
Abstract
-
Cited by 21 (1 self)
- Add to MetaCart
(Show Context)
The separability assumption (Donoho & Stodden, 2003; Arora et al., 2012a) turns non-negative matrix factorization (NMF) into a tractable problem. Recently, a new class of provably-correct NMF algorithms has emerged under this assumption. In this paper, we reformulate the separable NMF problem as that of finding the extreme rays of the conical hull of a finite set of vectors. From this geometric perspective, we derive new separable NMF algorithms that are highly scalable and empirically noise robust, and have several other favorable properties in relation to existing methods. A parallel implementation of our algorithm demonstrates high scalability on shared- and distributed-memory machines.
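For intuition, here is a minimal Python sketch of greedy extreme-ray selection under the separability assumption, in the successive-projection style. It is not the conical-hull algorithm derived in the paper; the function name select_extreme_columns and the toy data are illustrative assumptions only.

```python
import numpy as np

def select_extreme_columns(X, r):
    # Greedily pick r columns of X that approximately generate its conical hull,
    # assuming near-separability: X ~= X[:, K] H with H nonnegative.
    R = X.astype(float).copy()                        # residual matrix
    selected = []
    for _ in range(r):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))  # column with largest residual norm
        selected.append(j)
        u = R[:, j] / (np.linalg.norm(R[:, j]) + 1e-12)
        R = R - np.outer(u, u @ R)                     # project out the chosen direction
    return selected

# Toy usage: columns 0-2 are "pure"; the rest are convex combinations of them.
rng = np.random.default_rng(0)
W = rng.random((20, 3))
H_mix = rng.random((3, 7))
H_mix /= H_mix.sum(axis=0)
X = W @ np.hstack([np.eye(3), H_mix])
print(select_extreme_columns(X, 3))                    # expected to recover indices 0, 1, 2
```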
A signal processing perspective on hyperspectral unmixing: Insights from remote sensing
- IEEE Signal Processing Magazine, 2014
"... Blind hyperspectral unmixing (HU), also known as unsuper-vised HU, is one of the most prominent research topics in sig-nal processing for hyperspectral remote sensing [1, 2]. Blind HU aims at identifying materials present in a captured scene, ..."
Abstract
-
Cited by 14 (7 self)
- Add to MetaCart
(Show Context)
Blind hyperspectral unmixing (HU), also known as unsupervised HU, is one of the most prominent research topics in signal processing for hyperspectral remote sensing [1, 2]. Blind HU aims at identifying materials present in a captured scene, ...
Sparse and unique nonnegative matrix factorization through data preprocessing
- Journal of Machine Learning Research, 2012
"... Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed, that is, there typically exist many different ..."
Abstract
-
Cited by 14 (6 self)
- Add to MetaCart
Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed, that is, there typically exist many different but equivalent factorizations. In this paper, we introduce a completely new way of obtaining more well-posed NMF problems whose solutions are sparser. Our technique is based on the preprocessing of the nonnegative input data matrix, and relies on the theory of M-matrices and the geometric interpretation of NMF. This approach provably leads to optimal and sparse solutions under the separability assumption of Donoho and Stodden (2003), and, for rank-three matrices, makes the number of exact factorizations finite. We illustrate the effectiveness of our technique on several image data sets.
Robust near-separable nonnegative matrix factorization using linear optimization
- Journal of Machine Learning Research, 2014
"... ar ..."
(Show Context)
Robustness analysis of Hottopixx, a linear programming model for factoring nonnegative matrices
- SIAM Journal on Matrix Analysis and Applications, 2013
"... ar ..."
(Show Context)
The why and how of nonnegative matrix factorization
- In Regularization, Optimization, Kernels, and Support Vector Machines, Chapman & Hall/CRC, 2014
"... ..."
(Show Context)
Greedy algorithms for pure pixels identification in hyperspectral unmixing: A multiple-measurement vector viewpoint
- In Proc. EUSIPCO’13, 2013
"... This paper studies a multiple-measurement vector (MMV)-based sparse regression approach to blind hyperspectral un-mixing. In general, sparse regression requires a dictionary. The considered approach uses the measured hyperspectral data as the dictionary, thereby intending to represent the whole meas ..."
Abstract
-
Cited by 6 (3 self)
- Add to MetaCart
(Show Context)
This paper studies a multiple-measurement vector (MMV)-based sparse regression approach to blind hyperspectral unmixing. In general, sparse regression requires a dictionary. The considered approach uses the measured hyperspectral data as the dictionary, thereby intending to represent the whole measured data using the fewest measured hyperspectral vectors. We tackle this self-dictionary MMV (SD-MMV) approach using greedy pursuit. It is shown that the resulting greedy algorithms are identical or very similar to some representative pure pixels identification algorithms, such as vertex component analysis. Hence, our study provides a new dimension in understanding and interpreting pure pixels identification methods. We also prove that in the noiseless case, the greedy SD-MMV algorithms guarantee perfect identification of pure pixels when the pure pixel assumption holds.
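A rough sketch of the self-dictionary greedy idea, in the spirit of simultaneous orthogonal matching pursuit with the data as its own dictionary, is given below. The function greedy_sd_mmv and the plain least-squares refit are assumptions for illustration, not the exact algorithms analyzed in the paper.

```python
import numpy as np

def greedy_sd_mmv(X, N):
    # Greedy self-dictionary MMV sketch: pick N columns of X so that every
    # column of X is approximately represented by the selected columns.
    # Coefficient refits use plain least squares (no nonnegativity), so this
    # is illustrative only.
    D = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)   # normalized atoms
    R = X.astype(float).copy()                                    # residual
    support = []
    for _ in range(N):
        scores = np.linalg.norm(D.T @ R, axis=1)   # correlation aggregated over all pixels
        scores[support] = -np.inf                  # never reselect an atom
        support.append(int(np.argmax(scores)))
        C, *_ = np.linalg.lstsq(X[:, support], X, rcond=None)    # refit coefficients
        R = X - X[:, support] @ C                                 # update residual
    return support
```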
Hyperspectral image unmixing via bilinear generalized approximate message passing
- Proc. SPIE, 2013
"... In hyperspectral unmixing, the objective is to decompose an electromagnetic spectral dataset measured over M spectral bands and T pixels, into N constituent material spectra (or “endmembers”) with corresponding spatial abundances. In this paper, we propose a novel approach to hyperspectral unmixing ..."
Abstract
-
Cited by 5 (3 self)
- Add to MetaCart
In hyperspectral unmixing, the objective is to decompose an electromagnetic spectral dataset measured over M spectral bands and T pixels into N constituent material spectra (or “endmembers”) with corresponding spatial abundances. In this paper, we propose a novel approach to hyperspectral unmixing (i.e., joint estimation of endmembers and abundances) based on loopy belief propagation. In particular, we employ the bilinear generalized approximate message passing algorithm (BiG-AMP), a recently proposed belief-propagation-based approach to matrix factorization, in a “turbo” framework that enables the exploitation of spectral coherence in the endmembers, as well as spatial coherence in the abundances. In conjunction, we propose an expectation-maximization (EM) technique that can be used to automatically tune the prior statistics assumed by turbo BiG-AMP. Numerical experiments on synthetic and real-world data confirm the state-of-the-art performance of our approach.
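For reference, the following sketch only sets up the bilinear observation model described above on synthetic data, with assumed dimensions and noise level; it does not implement BiG-AMP or the turbo/EM machinery.

```python
import numpy as np

# M bands, T pixels, N endmembers mirror the abstract's notation; the values
# themselves are arbitrary.
M, T, N = 100, 500, 4
rng = np.random.default_rng(0)
S = rng.random((M, N))                           # endmember spectra (nonnegative)
A = rng.dirichlet(np.ones(N), size=T).T          # abundances: nonnegative, sum to 1 per pixel
Y = S @ A + 0.01 * rng.standard_normal((M, T))   # noisy M x T hyperspectral dataset

# Any joint endmember/abundance estimator can be evaluated against (S, A) on
# such synthetic data; the paper does so with turbo BiG-AMP.
```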
Hierarchical Clustering of Hyperspectral Images Using Rank-Two Nonnegative Matrix Factorization
- IEEE Transactions on Geoscience and Remote Sensing, 2015
"... In this paper, we design a hierarchical clustering algorithm for high-resolution hyperspectral images. At the core of the algorithm, a new rank-two nonnegative matrix factorizations (NMF) algorithm is used to split the clusters, which is motivated by convex geometry concepts. The method starts with ..."
Abstract
-
Cited by 3 (2 self)
- Add to MetaCart
(Show Context)
In this paper, we design a hierarchical clustering algorithm for high-resolution hyperspectral images. At the core of the algorithm, a new rank-two nonnegative matrix factorization (NMF) algorithm is used to split the clusters, which is motivated by convex geometry concepts. The method starts with a single cluster containing all pixels, and, at each step, (i) selects a cluster in such a way that the error at the next step is minimized, and (ii) splits the selected cluster into two disjoint clusters using rank-two NMF in such a way that the clusters are well balanced and stable. The proposed method can also be used as an endmember extraction algorithm in the presence of pure pixels. The effectiveness of this approach is illustrated on several synthetic and real-world hyperspectral images, and shown to outperform standard clustering techniques such as k-means, spherical k-means and standard NMF.
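A simplified sketch of the rank-two splitting step is shown below, with scikit-learn's generic NMF solver standing in for the paper's rank-two algorithm (an assumption); the cluster-selection rule and the balancing/stability criteria are omitted.

```python
import numpy as np
from sklearn.decomposition import NMF

def split_cluster(X):
    # Split one cluster of pixels in two with a rank-two NMF.  X holds the
    # spectra of the cluster's pixels as columns (bands x pixels); each pixel
    # is assigned to the factor that dominates its abundance.
    model = NMF(n_components=2, init="nndsvda", max_iter=500)
    W = model.fit_transform(X.T)        # (pixels x 2) abundance-like weights
    return np.argmax(W, axis=1)         # 0/1 label per pixel

# Usage idea: start with one cluster holding all pixels and repeatedly apply
# split_cluster to the cluster whose split most reduces the approximation
# error, as outlined in the abstract.
```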