Results 1–10 of 76
When Does Non-Negative Matrix Factorization Give a Correct Decomposition into Parts?
Abstract
Cited by 200 (1 self)
We interpret nonnegative matrix factorization geometrically, as the problem of finding a simplicial cone which contains a cloud of data points and which is contained in the positive orthant. We show that under certain conditions, basically requiring that some of the data are spread across the faces of the positive orthant, there is a unique such simplicial cone. We give examples of synthetic image articulation databases which obey these conditions; these require separated support and factorial sampling. For such databases there is a generative model in terms of ‘parts’ and NMF correctly identifies the ‘parts’. We show that our theoretical results are predictive of the performance of published NMF code, by running the published algorithms on one of our synthetic image articulation databases.
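For reference, the factorization analyzed here is usually computed with the Lee–Seung multiplicative updates; the NumPy sketch below is that generic algorithm, not the particular published code the authors benchmark:

```python
import numpy as np

def nmf(V, r, n_iter=500, seed=0):
    """Factor a nonnegative V (m x n) as W @ H with nonnegative W (m x r)
    and H (r x n), via Lee-Seung multiplicative updates for ||V - WH||_F^2."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    eps = 1e-12                                # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update preserves H >= 0
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update preserves W >= 0
    return W, H

# a rank-2 nonnegative matrix should be reconstructed almost exactly
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]]) @ np.array([[1.0, 2.0, 0.5],
                                       [0.5, 0.1, 2.0]])
W, H = nmf(V, 2)
```

The multiplicative form of the updates is what keeps both factors nonnegative without an explicit projection step.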
Vertex component analysis: A fast algorithm to unmix hyperspectral data
 IEEE Transactions on Geoscience and Remote Sensing
, 2005
Abstract
Cited by 193 (16 self)
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: 1) the endmembers are the vertices of a simplex and 2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
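The two geometric facts the algorithm exploits suggest a much-simplified pure-pixel extractor: pick the pixel extremal along a random direction, then repeat along directions orthogonal to the endmembers already found. The sketch below (function name and toy data are mine) illustrates the idea and is not the published VCA algorithm:

```python
import numpy as np

def simple_vertex_extraction(Y, p, seed=0):
    """Toy endmember extraction in the spirit of VCA. Assumes pure pixels
    exist: each chosen pixel is extremal along a random direction projected
    orthogonally to the endmembers found so far. Y: (bands x pixels)."""
    rng = np.random.default_rng(seed)
    idx = []
    for _ in range(p):
        w = rng.standard_normal(Y.shape[0])
        if idx:
            Q, _ = np.linalg.qr(Y[:, idx])  # orthonormal basis of found endmembers
            w -= Q @ (Q.T @ w)              # keep only the orthogonal component
        idx.append(int(np.argmax(np.abs(w @ Y))))
    return Y[:, idx], idx

# 3 bands, 3 endmembers; columns 0-2 of the abundance matrix are pure pixels
M = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.3],
              [0.1, 0.2, 1.0]])
A = np.array([[1.0, 0.0, 0.0, 0.5, 0.2, 0.3],
              [0.0, 1.0, 0.0, 0.3, 0.5, 0.3],
              [0.0, 0.0, 1.0, 0.2, 0.3, 0.4]])
E, idx = simple_vertex_extraction(M @ A, 3)
```

Because a linear functional over a convex hull is maximized at a vertex, and the pure pixels are exactly the vertices here, each iteration selects a pure pixel.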
Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches
 IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens.
, 2012
Abstract
Cited by 103 (34 self)
Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels, with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, spectra measured by HSCs are mixtures of the spectra of the materials in a scene. Thus, accurate estimation requires unmixing. Pixels are assumed to be mixtures of a few materials, called endmembers. Unmixing involves estimating all or some of: the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem.
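Given known endmember signatures, the abundance-estimation step under the linear mixing model reduces to least squares constrained to the probability simplex. A minimal projected-gradient sketch of that step (a generic solver, not one of the specialized algorithms surveyed in the paper):

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {a : a >= 0, sum(a) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def fcls(y, M, n_iter=1000):
    """Abundance estimate for one pixel y under the linear model y ~ M @ a,
    with a on the probability simplex. Plain projected gradient descent."""
    a = np.full(M.shape[1], 1.0 / M.shape[1])
    step = 1.0 / np.linalg.norm(M.T @ M, 2)   # 1/L for the quadratic loss
    for _ in range(n_iter):
        a = project_simplex(a - step * (M.T @ (M @ a - y)))
    return a

# noiseless pixel: the true abundances should be recovered
M = np.array([[0.9, 0.1, 0.2],
              [0.1, 0.8, 0.3],
              [0.2, 0.1, 0.9],
              [0.4, 0.5, 0.1]])
a_true = np.array([0.2, 0.5, 0.3])
a_hat = fcls(M @ a_true, M)
```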
Joint Bayesian Endmember Extraction and Linear Unmixing for Hyperspectral Imagery
Abstract
Cited by 67 (29 self)
Abstract—This paper studies a fully Bayesian algorithm for endmember extraction and abundance estimation for hyperspectral imagery. Each pixel of the hyperspectral image is decomposed as a linear combination of pure endmember spectra following the linear mixing model. The estimation of the unknown endmember spectra is conducted in a unified manner by generating the posterior distribution of abundances and endmember parameters under a hierarchical Bayesian model. This model assumes conjugate prior distributions for these parameters, accounts for nonnegativity and full-additivity constraints, and exploits the fact that the endmember proportions lie on a lower-dimensional simplex. A Gibbs sampler is proposed to overcome the complexity of evaluating the resulting posterior distribution. This sampler generates samples distributed according to the posterior distribution and estimates the unknown parameters using these generated samples. The accuracy of the joint Bayesian estimator is illustrated by simulations conducted on synthetic and real AVIRIS images. Index Terms—Bayesian inference, endmember extraction, hyperspectral imagery, linear spectral unmixing, MCMC methods.
Does independent component analysis play a role in unmixing hyperspectral data?
 IEEE Transactions on Geoscience and Remote Sensing
, 2005
Abstract
Cited by 61 (13 self)
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); and 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. For hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
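The paper's central observation, that the sum-to-one constraint rules out independent abundance sources, is easy to verify numerically; with two endmembers the abundances are perfectly anti-correlated:

```python
import numpy as np

# With two endmembers, sum-to-one forces a2 = 1 - a1, so the abundance
# "sources" are perfectly anti-correlated -- they cannot be independent.
rng = np.random.default_rng(0)
a1 = rng.uniform(0.0, 1.0, 10_000)
a2 = 1.0 - a1
corr = np.corrcoef(a1, a2)[0, 1]   # -1.0 up to rounding
```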
Minimum Volume Simplex Analysis: A Fast Algorithm to Unmix Hyperspectral Data
Abstract
Cited by 55 (11 self)
This paper presents a new method of the minimum-volume class for hyperspectral unmixing, termed minimum volume simplex analysis (MVSA). The underlying mixing model is linear; i.e., the mixed hyperspectral vectors are modeled by a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. MVSA approaches hyperspectral unmixing by fitting a minimum-volume simplex to the hyperspectral data, constraining the abundance fractions to belong to the probability simplex. The resulting optimization problem is solved by implementing a sequence of quadratically constrained subproblems. In a final step, the hard constraint on the abundance fractions is replaced with a hinge-type loss function to account for outliers and noise. We illustrate the state-of-the-art performance of the MVSA algorithm in unmixing simulated data sets. We are mainly concerned with the realistic scenario in which the pure-pixel assumption (i.e., that there exists at least one pure pixel per endmember) is not fulfilled. Under these conditions, MVSA yields much better performance than pure-pixel-based algorithms. Index Terms — Hyperspectral unmixing, minimum volume simplex, source separation.
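The minimum-volume criterion that MVSA optimizes is the volume of the endmember simplex; for intuition, that volume can be computed from the Gram matrix of the edge vectors. This is an illustrative helper with names of my own choosing, not code from the paper:

```python
import numpy as np

def simplex_volume(M):
    """(p-1)! times the volume of the simplex whose vertices are the columns
    of M, computed in the simplex's own affine hull via the Gram matrix of
    the edge vectors."""
    D = M[:, 1:] - M[:, :1]            # edges from the first vertex
    return float(np.sqrt(np.linalg.det(D.T @ D)))

# unit right triangle (0,0), (1,0), (0,1): area 1/2, so this returns 2! * 1/2 = 1
tri = np.array([[0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
```

Minimum-volume methods shrink this quantity while keeping every data point inside the simplex.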
Semi-supervised linear spectral unmixing using a hierarchical Bayesian model for hyperspectral imagery
, 2008
Abstract
Cited by 50 (25 self)
This paper proposes a hierarchical Bayesian model that can be used for semi-supervised hyperspectral image unmixing. The model assumes that the pixel reflectances result from linear combinations of pure component spectra contaminated by additive Gaussian noise. The abundance parameters appearing in this model satisfy positivity and additivity constraints. These constraints are naturally expressed in a Bayesian context by using appropriate abundance prior distributions. The posterior distributions of the unknown model parameters are then derived. A Gibbs sampler allows one to draw samples distributed according to the posteriors of interest and to estimate the unknown abundances. An extension of the algorithm is finally studied for mixtures with an unknown number of spectral components belonging to a known library. The performance of the different unmixing strategies is evaluated via simulations conducted on synthetic and real data.
ICE: A statistical approach to identifying endmembers in hyperspectral images
 IEEE Trans. Geosci. Remote Sensing
Abstract
Cited by 46 (0 self)
Abstract—Several of the more important endmember-finding algorithms for hyperspectral data are discussed and some of their shortcomings highlighted. A new algorithm—iterated constrained endmembers (ICE)—which attempts to address these shortcomings is introduced. An example of its use is given. There is also a discussion of the advantages and disadvantages of normalizing spectra before the application of ICE or other endmember-finding algorithms. Index Terms—Convex geometry, endmember, hyperspectral, normalization, simplex.
Multispectral and hyperspectral image analysis with convex cones
 IEEE Trans. Geosci. Remote Sens
, 1999
Abstract
Cited by 43 (0 self)
Abstract—A new approach to multispectral and hyperspectral image analysis is presented. This method, called convex cone analysis (CCA), is based on the fact that some physical quantities such as radiance are nonnegative. The vectors formed by discrete radiance spectra are linear combinations of nonnegative components, and they lie inside a nonnegative, convex region. The object of CCA is to find the boundary points of this region, which can be used as endmember spectra for unmixing or as target vectors for classification. To implement this concept, we find the eigenvectors of the sample spectral correlation matrix of the image. Given the number of endmembers or classes, we select as many eigenvectors corresponding to the largest eigenvalues. These eigenvectors are used as a basis to form linear combinations that have only nonnegative elements, and thus they lie inside a convex cone. The vertices of the convex cone will be those points whose spectral vector contains as many zero elements as the number of eigenvectors minus one. Accordingly, a mixed pixel can be decomposed by identifying the vertices that were used to form its spectrum. An algorithm for finding the convex cone boundaries is presented, and applications to unsupervised unmixing and classification are demonstrated with simulated data as well as experimental data from the hyperspectral digital imagery collection experiment (HYDICE). Index Terms—Classification, convex cone analysis, hyperspectral digital imagery collection experiment (HYDICE), hyperspectral image, multispectral image, unmixing.
Nonlinear unmixing of hyperspectral images using a generalized bilinear model
 IEEE Trans. Geosci. and Remote Sensing
Abstract
Cited by 41 (21 self)
Nonlinear models have recently shown interesting properties for spectral unmixing. This paper considers a generalized bilinear model recently introduced for unmixing hyperspectral images. Different algorithms are studied to estimate the parameters of this bilinear model. The positivity and sum-to-one constraints for the abundances are ensured by the proposed algorithms. The performance of the resulting unmixing strategy is evaluated via simulations conducted on synthetic and real data. Index Terms — hyperspectral imagery, spectral unmixing, bilinear model, Bayesian inference, MCMC methods, gradient descent algorithm, least squares algorithm.
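The generalized bilinear model adds pairwise endmember-interaction terms to the linear mixture. The sketch below implements the forward model as it is usually stated (symbols are mine); the estimation algorithms are what the paper actually compares:

```python
import numpy as np

def gbm_mix(M, a, gamma):
    """Generalized bilinear model: the linear mixture M @ a plus, for each
    endmember pair i < j, an interaction term gamma_ij * a_i * a_j * (m_i * m_j)
    with 0 <= gamma_ij <= 1. Forward model only; no estimation."""
    y = M @ a
    p, k = M.shape[1], 0
    for i in range(p):
        for j in range(i + 1, p):
            y = y + gamma[k] * a[i] * a[j] * M[:, i] * M[:, j]
            k += 1
    return y

# with all gamma_ij = 0 the model reduces to the plain linear mixture
M = np.array([[0.5, 0.2],
              [0.1, 0.7],
              [0.3, 0.3]])
a = np.array([0.4, 0.6])
y_lin = gbm_mix(M, a, np.zeros(1))
y_bil = gbm_mix(M, a, np.ones(1))
```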