Results 11–13 of 13

### 1 Introduction – Problem Statements and Models

Abstract
Matrix factorization is an important and unifying topic in signal processing and linear algebra, which has found numerous applications in many other areas. This chapter introduces basic linear and multi-linear models for matrix and tensor factorizations and decompositions, and formulates the analysis framework for …
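One concrete instance of the linear factorization models the chapter surveys is the best rank-k approximation of a matrix, which the truncated SVD gives in closed form (Eckart–Young). A minimal sketch, not taken from the chapter itself:

```python
import numpy as np

def low_rank_approx(X, k):
    """Best rank-k approximation of X in the Frobenius norm,
    computed from the truncated singular value decomposition."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep the k leading singular triplets and recombine.
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
# A 6x5 matrix of rank at most 4.
X = rng.standard_normal((6, 4)) @ rng.standard_normal((4, 5))
X2 = low_rank_approx(X, 2)
```

Here `X2` is the closest rank-2 matrix to `X`; raising `k` to the true rank recovers `X` exactly.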

### Chapter 1 Bounded Matrix Low Rank Approximation

Abstract
Rating matrices in recommender systems are commonly bounded: every entry of the input matrix R lies in an interval [rmin, rmax] such as [1, 5]. In this chapter, we propose a new, scalable low-rank approximation algorithm for such bounded matrices, called Bounded Matrix Low Rank Approximation (BMA), that bounds every element of the approximation PQ. We also present an alternate formulation, called BALS, that bounds existing recommender-system algorithms, and discuss its convergence. Our experiments on real-world datasets illustrate that the proposed method BMA outperforms state-of-the-art recommender-system algorithms such as Stochastic Gradient Descent, Alternating Least Squares with regularization, SVD++, and Bias-SVD on real …
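To make the bounded-approximation setting concrete, the toy sketch below factorizes a rating matrix by plain gradient descent and clips the reconstruction to [rmin, rmax]. This is an illustrative baseline only, not the BMA algorithm of the chapter (which enforces the bounds on every element of PQ during the factorization itself); the function name and parameters are hypothetical.

```python
import numpy as np

def bounded_factorize(R, k, rmin, rmax, iters=500, lr=0.01, seed=0):
    """Toy baseline: fit R ~= P @ Q by gradient descent on the squared
    error, then clip the reconstruction to [rmin, rmax].  Not BMA."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    P = rng.random((m, k))
    Q = rng.random((k, n))
    for _ in range(iters):
        E = P @ Q - R          # residual of the (unclipped) model
        P -= lr * E @ Q.T      # gradient step for P
        Q -= lr * P.T @ E      # gradient step for Q
    return np.clip(P @ Q, rmin, rmax)

rng = np.random.default_rng(1)
# Synthetic rank-2 rating matrix with entries inside [1, 5].
R = rng.uniform(0.7, 1.2, (4, 2)) @ rng.uniform(1.0, 2.0, (2, 5))
Rhat = bounded_factorize(R, 2, 1.0, 5.0)
```

Clipping only after the fit is exactly the weakness BMA addresses: the unconstrained factors can drift outside the valid rating range during optimization.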

### Efficient Rank-one Residue Approximation Method for Graph Regularized Non-negative Matrix Factorization

Abstract
Nonnegative matrix factorization (NMF) aims to decompose a given data matrix X into the product of two lower-rank nonnegative factor matrices UV^T. Graph regularized NMF (GNMF) is a recently proposed NMF method that preserves the geometric structure of X during such decomposition. Although GNMF has been widely used in computer vision and data mining, its multiplicative update rule (MUR) based solver suffers from both slow convergence and non-stationarity problems. In this paper, we propose a new efficient GNMF solver called rank-one residue approximation (RRA). Different from MUR, which updates both factor matrices (U and V) as a whole in each iteration round, RRA updates each of their columns by approximating the residue matrix by their outer product. Since each column of both factor matrices is updated optimally in an analytic formulation, RRA is theoretically and empirically proven to converge rapidly to a stationary point. Moreover, since RRA needs neither extra computational cost nor parametric tuning, it enjoys a similar simplicity to MUR but performs much faster. Experimental results on real-world datasets show that RRA is much more efficient than MUR for GNMF. To confirm the stationarity of the solution obtained by RRA, we conduct clustering experiments on real-world image datasets by comparing with representative solvers such as MUR and NeNMF for GNMF. The experimental results confirm the effectiveness of RRA.
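The column-wise rank-one residue idea can be sketched for plain (unregularized) NMF: each column pair of U and V is refit in closed form against the residue left by all other columns. This is a HALS-style sketch under that simplifying assumption; the paper's RRA additionally handles the graph-regularization term of GNMF.

```python
import numpy as np

def nmf_rank_one_residue(X, k, iters=200, seed=0):
    """Plain-NMF sketch of the rank-one residue scheme: X ~= U @ V.T
    with U, V >= 0, updating one column pair per inner step."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    eps = 1e-12  # guard against a column collapsing to zero
    for _ in range(iters):
        for j in range(k):
            # Residue once column j's rank-one contribution is removed.
            R = X - U @ V.T + np.outer(U[:, j], V[:, j])
            # Closed-form nonnegative least-squares update per column.
            U[:, j] = np.maximum(R @ V[:, j], 0) / (V[:, j] @ V[:, j] + eps)
            V[:, j] = np.maximum(R.T @ U[:, j], 0) / (U[:, j] @ U[:, j] + eps)
    return U, V

rng = np.random.default_rng(2)
X = rng.random((6, 2)) @ rng.random((5, 2)).T  # exact nonnegative rank-2 data
U, V = nmf_rank_one_residue(X, 2)
```

Because each column update is an exact minimizer of the residue fit, the objective decreases monotonically, which is the mechanism behind the fast, stationary convergence the abstract claims for RRA.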