Results 1–10 of 19
Automatic relevance determination in nonnegative matrix factorization
 in SPARS, St-Malo
, 2009
Abstract

Cited by 33 (4 self)
This paper addresses the problem of estimating the latent dimensionality in nonnegative matrix factorization (NMF) via automatic relevance determination (ARD). Uncovering the latent dimensionality is necessary for striking the right balance between data fidelity and overfitting. We propose a Bayesian model for NMF and two algorithms, known as ℓ1-ARD and ℓ2-ARD, each assuming different priors on the basis and the coefficients. The proposed algorithms leverage recent algorithmic advances in NMF with the β-divergence using majorization-minimization (MM) methods. We show by using auxiliary functions that the cost function decreases monotonically to a local minimum. We demonstrate the efficacy and robustness of our algorithms by performing experiments on the swimmer dataset.
Nonnegative Matrix Factorization: A Comprehensive Review
 IEEE Trans. Knowledge and Data Eng.
, 2013
Abstract

Cited by 16 (2 self)
Nonnegative Matrix Factorization (NMF), a relatively novel paradigm for dimensionality reduction, has been in the ascendant since its inception. It incorporates the nonnegativity constraint and thus obtains a parts-based representation, correspondingly enhancing the interpretability of the problem. This survey paper mainly focuses on the theoretical research into NMF over the last 5 years, where the principles, basic models, properties, and algorithms of NMF, along with its various modifications, extensions, and generalizations, are summarized systematically. The existing NMF algorithms are divided into four categories: Basic NMF (BNMF),
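The "Basic NMF" family the survey starts from can be illustrated by the classic Lee–Seung multiplicative updates for the Frobenius cost; this is a minimal sketch, assuming squared-error loss and random initialization.

```python
import numpy as np

def nmf_multiplicative(V, K, n_iter=200, eps=1e-9, seed=0):
    """Basic NMF via Lee-Seung multiplicative updates for ||V - WH||_F^2.

    Both updates preserve nonnegativity of the factors and never
    increase the squared-error cost.
    """
    rng = np.random.default_rng(seed)
    F, N = V.shape
    W = rng.random((F, K)) + eps
    H = rng.random((K, N)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / ((W.T @ W) @ H + eps)
        W *= (V @ H.T) / (W @ (H @ H.T) + eps)
    return W, H
```

The constrained, structured, and generalized variants the survey categorizes typically modify the cost function or add penalty terms while keeping this multiplicative-update skeleton.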
Clustering by Nonnegative Matrix Factorization Using Graph Random Walk
Abstract

Cited by 7 (0 self)
Nonnegative Matrix Factorization (NMF) is a promising relaxation technique for clustering analysis. However, conventional NMF methods that directly approximate the pairwise similarities using the least-squares error often yield mediocre performance for data in curved manifolds, because they can capture only the immediate similarities between data samples. Here we propose a new NMF clustering method which replaces the approximated matrix with its smoothed version using random walk. Our method can thus accommodate farther relationships between data samples. Furthermore, we introduce a novel regularization in the proposed objective function in order to improve over spectral clustering. The new learning objective is optimized by a multiplicative Majorization-Minimization algorithm with a scalable implementation for learning the factorizing matrix. Extensive experimental results on real-world datasets show that our method has strong performance in terms of cluster purity.
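The random-walk smoothing step described here can be sketched concisely: with the row-stochastic transition matrix P = D⁻¹S, the geometric series (1 − α) Σₜ αᵗ Pᵗ sums walks of all lengths and has the closed form (1 − α)(I − αP)⁻¹, so multi-step (manifold) relationships contribute to the matrix that NMF then factorizes. The decay parameter `alpha` is an illustrative assumption, not a value from the paper.

```python
import numpy as np

def random_walk_smooth(S, alpha=0.8):
    """Replace a pairwise-similarity matrix S with its random-walk
    smoothed version, so that farther relationships between samples
    contribute, not only immediate similarities.
    """
    P = S / S.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    n = P.shape[0]
    # Closed form of (1 - alpha) * sum_t alpha^t P^t.
    return (1 - alpha) * np.linalg.inv(np.eye(n) - alpha * P)
```

The smoothed matrix remains row-stochastic and nonnegative, so it can be handed directly to an NMF-based clustering objective in place of the raw similarities.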
The why and how of nonnegative matrix factorization
 Regularization, Optimization, Kernels, and Support Vector Machines. Chapman & Hall/CRC
, 2014
Online nonnegative convolutive pattern learning for speech signals
, 2013
Abstract

Cited by 3 (1 self)
The unsupervised learning of spectro-temporal patterns within speech signals is of interest in a broad range of applications. Where patterns are nonnegative and convolutive in nature, relevant learning algorithms include convolutive nonnegative matrix factorization (CNMF) and its sparse alternative, convolutive nonnegative sparse coding (CNSC). Both algorithms, however, place unrealistic demands on computing power and memory which prohibit their application in large-scale tasks. This technical report presents a new online implementation of CNMF and CNSC which processes input data piece-by-piece and updates learned patterns gradually with accumulated statistics. The proposed approach facilitates pattern learning with huge volumes of training data that are beyond the capability of existing alternatives. We show that, with unlimited data and computing resources, the new online learning algorithm almost surely converges to a local minimum of the objective cost function. In more realistic situations, where the amount of data is large and computing power is limited, online learning tends to obtain lower empirical cost than conventional batch learning. Index Terms: Nonnegative matrix factorization, convolutive NMF, online pattern learning, sparse coding, speech processing, speech recognition.
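The accumulated-statistics scheme described above can be sketched for plain (non-convolutive) NMF: each incoming sample is encoded against the current dictionary, its sufficient statistics are added to running accumulators, and the dictionary is nudged from those accumulators. The convolutive CNMF/CNSC variant replaces `W @ h` with a sum over time-shifted bases; this sketch keeps only the non-convolutive core, and all names here are illustrative.

```python
import numpy as np

def online_nmf(stream, F, K, inner=20, eps=1e-9, seed=0):
    """Online NMF sketch: process samples piece-by-piece, accumulate
    sufficient statistics, and update the dictionary W gradually.

    `stream` yields nonnegative vectors of length F.
    """
    rng = np.random.default_rng(seed)
    W = rng.random((F, K)) + eps
    A = np.zeros((K, K))              # accumulates h h^T
    B = np.zeros((F, K))              # accumulates v h^T
    for v in stream:
        v = v.reshape(F, 1)
        h = rng.random((K, 1)) + eps
        for _ in range(inner):        # encode v against the current dictionary
            h *= (W.T @ v) / ((W.T @ W) @ h + eps)
        A += h @ h.T
        B += v @ h.T
        W *= B / (W @ A + eps)        # dictionary step from accumulated stats
    return W
```

Because only the K×K and F×K accumulators are kept, memory use is independent of the stream length, which is the property that makes the approach viable for huge training corpora.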
Two Algorithms for Orthogonal Nonnegative Matrix Factorization with Application to Clustering. arXiv:1201.0901v1
, 2011
Using Subclasses in Discriminant Nonnegative Subspace Learning for Facial Expression Recognition
Abstract

Cited by 1 (1 self)
Nonnegative Matrix Factorization (NMF) is among the most popular subspace methods, widely used in a variety of image processing problems. To achieve an efficient decomposition of the provided data into its discriminant parts, thus enhancing classification performance, we regard the data inside each class as forming clusters and use criteria inspired by Clustering-based Discriminant Analysis. The proposed method combines these discriminant criteria as constraints in the NMF decomposition cost function, in order to address the problem of finding discriminant projections that enhance class separability in the reduced-dimensional projection space. The developed algorithm has been applied to the facial expression recognition problem, and experimental results verified that it successfully identified discriminant facial parts, thus enhancing recognition performance.
Nonnegative Matrix Factorizations for Clustering: A Survey
Abstract

Cited by 1 (0 self)
Recently there has been significant development in the use of nonnegative matrix factorization (NMF) methods for various clustering tasks. NMF factorizes an input nonnegative matrix into two nonnegative matrices of lower rank. Although NMF can be used for conventional data analysis, the recent overwhelming interest in NMF is due to the newly discovered ability of NMF to solve challenging data mining and machine learning problems. In particular, NMF with the sum-of-squared-error cost function is equivalent to a relaxed K-means clustering, the most widely used unsupervised learning algorithm. In addition, NMF with the I-divergence cost function is equivalent to probabilistic latent semantic indexing, another unsupervised learning method popularly used in text analysis. Many other data mining and machine learning problems can be reformulated as an NMF problem. This chapter aims to provide a comprehensive review of nonnegative matrix factorization methods for clustering. In particular, we outline the theoretical foundations of NMF for clustering, provide an overview of different variants of NMF formulations, and examine
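The relaxed-K-means view described in this abstract translates directly into a clustering recipe: factor the data matrix X ≈ WH under the squared-error cost and read each sample's cluster off the dominant row of H, which acts as a soft cluster-indicator matrix. A minimal sketch, assuming columns of X are samples; the function name is illustrative.

```python
import numpy as np

def nmf_cluster(X, k, n_iter=300, eps=1e-9, seed=0):
    """Clustering by NMF with the squared-error cost: H plays the role
    of a relaxed cluster-indicator matrix, and a hard assignment is
    obtained from its largest entry per sample (column).
    """
    rng = np.random.default_rng(seed)
    F, N = X.shape
    W = rng.random((F, k)) + eps
    H = rng.random((k, N)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / ((W.T @ W) @ H + eps)
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
    return H.argmax(axis=0)       # hard labels from the soft indicator
```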
Tight Continuous Relaxation of the Balanced k-Cut Problem
Abstract
Spectral Clustering, as a relaxation of the normalized/ratio cut, has become one of the standard graph-based clustering methods. Existing methods for the computation of multiple clusters, corresponding to a balanced k-cut of the graph, are either based on greedy techniques or on heuristics which have a weak connection to the original motivation of minimizing the normalized cut. In this paper we propose a new tight continuous relaxation for any balanced k-cut problem and show that a related, recently proposed relaxation is in most cases loose, leading to poor performance in practice. For the optimization of our tight continuous relaxation we propose a new algorithm for the difficult sum-of-ratios minimization problem which achieves monotonic descent. Extensive comparisons show that our method outperforms all existing approaches for ratio cut and other balanced k-cut criteria.
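The spectral baseline this paper tightens can be sketched in a few lines: embed the graph's nodes with the eigenvectors of the normalized Laplacian belonging to the k smallest eigenvalues, then cluster the embedded rows (e.g. with k-means). The tight relaxation proposed in the paper replaces this eigenvector step with sum-of-ratios minimization; the sketch below only shows the standard relaxation it compares against.

```python
import numpy as np

def spectral_embedding(Wadj, k):
    """Standard spectral relaxation of the normalized cut: return one
    embedding row per node, built from the k eigenvectors of the
    normalized Laplacian with the smallest eigenvalues.
    """
    d = Wadj.sum(axis=1)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(d)) - Dinv_sqrt @ Wadj @ Dinv_sqrt  # normalized Laplacian
    vals, vecs = np.linalg.eigh(L)                     # ascending eigenvalues
    return vecs[:, :k]                                 # one row per node
```

For a graph with two weakly connected blocks, the second column of the embedding is nearly constant within each block with opposite signs, which is exactly the relaxed two-cut indicator.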