Results 1 – 7 of 7
Blind Separation of Quasi-Stationary Sources: Exploiting Convex Geometry in Covariance Domain, 2015
Abstract
Cited by 3 (3 self)
This paper revisits blind source separation of instantaneously mixed quasi-stationary sources (BSS-QSS), motivated by the observation that in certain applications (e.g., speech) there exist time frames during which only one source is active, or locally dominant. Combined with nonnegativity of source powers, this endows the problem with a nice convex geometry that enables elegant and efficient BSS solutions. Local dominance is tantamount to the so-called pure-pixel/separability assumption in hyperspectral unmixing/nonnegative matrix factorization, respectively. Building on this link, a very simple algorithm called the successive projection algorithm (SPA) is considered for estimating the mixing system in closed form. To complement SPA in the specific BSS-QSS context, an algebraic preprocessing procedure is proposed to suppress short-term source cross-correlation interference. The proposed procedure is simple, effective, and supported by theoretical analysis. Solutions based on volume minimization (VolMin) are also considered. By theoretical analysis, it is shown that VolMin guarantees perfect mixing-system identifiability under an assumption more relaxed than (exact) local dominance, which means wider applicability in practice. Exploiting the specific structure of BSS-QSS, a fast VolMin algorithm is proposed for the overdetermined case. Careful simulations using real speech sources showcase the simplicity, efficiency, and accuracy of the proposed algorithms.
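The SPA mentioned in this abstract is short enough to sketch. The NumPy version below is an illustrative reconstruction of the generic successive projection algorithm (function and variable names are ours, not the paper's), assuming the separability/local-dominance condition holds:

```python
import numpy as np

def spa(X, k):
    """Successive Projection Algorithm (illustrative sketch).

    Under separability, the columns of X lie in the convex hull of k
    "pure" columns; SPA recovers their indices by repeatedly taking
    the column of largest Euclidean norm and projecting the data onto
    the orthogonal complement of that column.
    """
    R = np.asarray(X, dtype=float).copy()
    picked = []
    for _ in range(k):
        j = int(np.argmax((R ** 2).sum(axis=0)))  # largest-norm column
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)                   # project it out
        picked.append(j)
    return picked
```

On a synthetic separable matrix X = WH whose first k columns are the pure (locally dominant) ones, the returned indices are exactly those columns, up to ordering.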
Nonnegative matrix factorization under heavy noise. In Proceedings of the 33rd International Conference on Machine Learning, 2016
Abstract
Cited by 1 (0 self)
The noisy nonnegative matrix factorization (NMF) problem is: given a data matrix A (d × n), find nonnegative matrices B, C (d × k and k × n, resp.) so that A = BC + N, where N is a noise matrix. Existing polynomial-time algorithms with proven error guarantees require each column N·,j to have ℓ1 norm much smaller than ‖(BC)·,j‖1, which could be very restrictive. In important applications of NMF such as topic modeling, as well as in theoretical noise models (e.g., Gaussian with high σ), almost every column of N violates this condition. We introduce the heavy-noise model, which only requires the average noise over large subsets of columns to be small. We initiate a study of noisy NMF under the heavy-noise model. We show that our noise model subsumes noise models of theoretical and practical interest (e.g., Gaussian noise of maximum possible σ). We then devise an algorithm, TSVDNMF, which under certain assumptions on B, C solves the problem under heavy noise. Our error guarantees match those of previous algorithms. Our running time of O((n + d)^2 k) is substantially better than the O(n^3 d) of the previous best. Our assumption on B is weaker than the "separability" assumption made by all previous results. We provide empirical justification for our assumptions on C. We also provide the first proof of identifiability (uniqueness of B) for noisy NMF which is not based on separability and does not use hard-to-check geometric conditions. Our algorithm outperforms earlier polynomial-time algorithms both in time and error, particularly in the presence of high noise.
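The gap between per-column and averaged noise is easy to see numerically. The following sketch (our illustration, not the paper's TSVDNMF algorithm) draws dense Gaussian noise and compares the ℓ1 norm of individual columns against the ℓ1 norm of the column average:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 200, 1000, 1.0
N = rng.normal(0.0, sigma, size=(d, n))   # dense Gaussian noise matrix

per_column = np.abs(N).sum(axis=0)        # l1 norm of each column
avg_l1 = np.abs(N.mean(axis=1)).sum()     # l1 norm of the column average

# Every individual column is noisy: its l1 norm is about
# d * sigma * sqrt(2/pi) ~ 160 here, so the classical per-column
# condition fails everywhere...
print(per_column.min())
# ...yet the average over all n columns concentrates near zero,
# which is exactly what the heavy-noise model asks for.
print(avg_l1)
```

This is why averaging over large column subsets, rather than bounding each column, is the natural relaxation for Gaussian-like noise.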
Intersecting Faces: Nonnegative Matrix Factorization With New Guarantees, Rong Ge
Abstract
Cited by 1 (0 self)
Nonnegative matrix factorization (NMF) is a natural model of admixture and is widely used in science and engineering. A plethora of algorithms have been developed to tackle NMF, but due to the non-convex nature of the problem, there is little guarantee on how well these methods work. Recently, a surge of research has focused on a very restricted class of NMFs, called separable NMF, for which provably correct algorithms have been developed. In this paper, we propose the notion of subset-separable NMF, which substantially generalizes the property of separability. We show that subset-separability is a natural necessary condition for the factorization to be unique or to have minimum volume. We develop the FaceIntersect algorithm, which provably and efficiently solves subset-separable NMF under natural conditions, and we prove that our algorithm is robust to small noise. We explore the performance of FaceIntersect on simulations and discuss settings where it empirically outperforms state-of-the-art methods. Our work is a step towards finding provably correct algorithms that solve large classes of NMF problems.
Successive Nonnegative Projection Algorithm for Robust Nonnegative Blind Source Separation
Sampling Versus Unambiguous Nondeterminism in Communication Complexity, 2014
Abstract
Cited by 1 (0 self)
In this note, we investigate the relationship between the following two communication complexity measures for a two-party function f: X × Y → {0, 1}: on one hand, sampling a uniformly random 1-entry of the matrix of f such that Alice learns the row index and Bob learns the column index, and on the other hand, unambiguous nondeterminism, which corresponds to partitioning the 1's of the matrix into combinatorial rectangles. The former complexity measure equals the log of the nonnegative rank of the matrix, and the latter equals the log of the binary rank of the matrix (which is always at least the nonnegative rank). Thus we consider the relationship between these two ranks of 0/1 matrices. We prove that if the nonnegative rank is at most 3, then the two ranks are equal. We also show a separation by exhibiting a matrix with nonnegative rank 4 and binary rank 5, as well as a family of matrices for which the binary rank is 4/3 times the nonnegative rank.
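To make "partitioning the 1's into combinatorial rectangles" concrete, here is a toy 4 × 5 example (our illustration, not the separating matrix from the note): three pairwise disjoint rectangles S_i × T_i give a 0/1 matrix of binary rank at most 3, and the real rank can never exceed the number of rectangles:

```python
import numpy as np

# Each combinatorial rectangle S_i x T_i contributes one rank-1
# 0/1 term u_i v_i^T; disjoint supports make the cover unambiguous.
u1, v1 = np.array([1, 1, 0, 0]), np.array([1, 1, 0, 0, 0])
u2, v2 = np.array([0, 0, 1, 0]), np.array([0, 0, 1, 1, 0])
u3, v3 = np.array([0, 0, 0, 1]), np.array([0, 0, 0, 0, 1])
A = np.outer(u1, v1) + np.outer(u2, v2) + np.outer(u3, v3)

# Max entry 1 certifies the rectangles partition the 1's of A,
# so the binary rank of A is at most 3; the real rank (and hence
# the nonnegative rank) is bounded by the same count.
print(A.max())
print(np.linalg.matrix_rank(A))
```

For this block-structured A, both the binary rank and the real rank equal 3; the interesting matrices in the note are exactly those where the two ranks pull apart.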
A rank estimation criterion using an NMF algorithm under an inner dimension condition, 2014
Abstract
We introduce a rank selection criterion for nonnegative factorization algorithms, for the cases where the rank of the matrix coincides with the inner dimension of the matrix. The criterion is motivated by the observation that, provided a unique factorization exists, the factorization is a solution to a fixed-point iteration formula that can be obtained by rewriting nonnegative factorization together with the singular value decomposition. We characterize the asymptotic error rate for our fixed-point formula when the nonnegative matrix is observed with noise generated according to the so-called random dot product model for graphs.