Results 1–10 of 1,228
Certifiably Optimal Low Rank Factor Analysis
, 2017
"... Factor Analysis (FA) is a technique of fundamental importance that is widely used in classical and modern multivariate statistics, psychometrics, and econometrics. In this paper, we revisit the classical rank-constrained FA problem, which seeks to approximate an observed covariance matrix (Σ) by the sum of a Positive Semidefinite (PSD) low-rank component (Θ) and a diagonal matrix (Φ) with nonnegative entries, subject to Σ − Φ being PSD. We propose a flexible family of rank-constrained, nonlinear Semidefinite Optimization based formulations for this task. We introduce a reformulation ..."
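To make the decomposition in this abstract concrete — approximating Σ by a low-rank PSD part Θ = LLᵀ plus a nonnegative diagonal Φ — here is a minimal gradient-descent sketch in NumPy. It is not the paper's certifiably optimal semidefinite-optimization method; the synthetic problem, step size, and plain squared-Frobenius objective are illustrative assumptions.

```python
import numpy as np

# Synthetic ground truth: Sigma = L0 L0^T + diag(d0), rank-2 plus diagonal.
rng = np.random.default_rng(0)
n, r = 6, 2
L0 = rng.standard_normal((n, r))
d0 = rng.uniform(0.5, 1.5, n)
Sigma = L0 @ L0.T + np.diag(d0)

# Fit Theta = L L^T (PSD, rank <= r) and a nonnegative diagonal d by
# gradient descent on the squared Frobenius residual ||Sigma - L L^T - diag(d)||^2.
L = 0.1 * rng.standard_normal((n, r))
d = np.ones(n)
lr = 2e-3
init_res = np.linalg.norm(Sigma - L @ L.T - np.diag(d))
for _ in range(5000):
    E = Sigma - L @ L.T - np.diag(d)   # current residual matrix
    L += lr * 4 * E @ L                # descent step for L (grad is -4 E L)
    d += lr * 2 * np.diag(E)           # descent step for the diagonal
    d = np.maximum(d, 0.0)             # keep Phi's entries nonnegative
final_res = np.linalg.norm(Sigma - L @ L.T - np.diag(d))
print(init_res, final_res)             # residual shrinks substantially
```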
A Nonlinear Programming Algorithm for Solving Semidefinite Programs via Low-Rank Factorization
 Mathematical Programming (Series B)
, 2001
"... In this paper, we present a nonlinear programming algorithm for solving semidefinite programs (SDPs) in standard form. The algorithm's distinguishing feature is a change of variables that replaces the symmetric, positive semidefinite variable X of the SDP with a rectangular variable R according ... Computational results on some large-scale test problems are also presented. Keywords: semidefinite programming, low-rank factorization, nonlinear programming, augmented Lagrangian, limited memory BFGS ..."
Cited by 159 (10 self)
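A minimal sketch of the change of variables this abstract describes: the PSD matrix variable X is replaced by a rectangular R with X = RRᵀ, so positive semidefiniteness holds by construction. The toy SDP below (minimize ⟨C, X⟩ subject to tr(X) = 1, X ⪰ 0, whose optimum is the smallest eigenvalue of C) and the projected-gradient loop are illustrative assumptions — the paper itself uses an augmented Lagrangian with limited-memory BFGS.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5, 2
A = rng.standard_normal((n, n))
C = (A + A.T) / 2                       # symmetric cost matrix

# Toy SDP: min <C, X> s.t. tr(X) = 1, X PSD; optimum = lambda_min(C).
# Change of variables X = R R^T: X is PSD for free, and tr(X) = 1
# becomes ||R||_F = 1, enforced here by renormalizing after each step.
R = rng.standard_normal((n, p))
R /= np.linalg.norm(R)
for _ in range(2000):
    R -= 0.05 * 2 * C @ R               # gradient step on tr(R^T C R)
    R /= np.linalg.norm(R)              # project back to tr(R R^T) = 1
obj = np.trace(R.T @ C @ R)
print(obj, np.linalg.eigvalsh(C)[0])    # the two values should match
```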
IMAGE TAG COMPLETION BY LOW-RANK FACTORIZATION WITH DUAL RECONSTRUCTION STRUCTURE PRESERVED
"... A novel tag completion algorithm is proposed in this paper, which is designed with the following features: 1) Low-rank and error sparsity: the incomplete initial tagging matrix D is decomposed into the complete tagging matrix A and a sparse error matrix E. However, instead of minimizing its nuclear norm, A is further factorized into a basis matrix U and a sparse coefficient matrix V, i.e. D = UV + E. This low-rank formulation encapsulating sparse coding enables our algorithm to recover latent structures from noisy initial data and avoid performing too much denoising; 2) Local reconstruction ..."
SOLVING A LOW-RANK FACTORIZATION MODEL FOR MATRIX COMPLETION BY A NONLINEAR SUCCESSIVE OVER-RELAXATION ALGORITHM
"... The matrix completion problem is to recover a low-rank matrix from a subset of its entries. The main solution strategy for this problem has been based on nuclear-norm minimization, which requires computing singular value decompositions – a task that is increasingly costly as matrix sizes and ranks increase. To improve the capacity of solving large-scale problems, we propose a low-rank factorization model and construct a nonlinear successive over-relaxation (SOR) algorithm that only requires solving a linear least squares problem per iteration. Convergence of this nonlinear SOR algorithm ..."
Cited by 91 (10 self)
Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization
, 2011
"... The matrix separation problem aims to separate a low-rank matrix and a sparse matrix from their sum. This problem has recently attracted considerable research attention due to its wide range of potential applications. Nuclear-norm minimization models have been proposed for matrix separation and prov ... We propose and investigate an alternative approach based on solving a nonconvex, low-rank factorization model by an augmented Lagrangian alternating direction method. Numerical studies indicate that the effectiveness of the proposed model is limited to problems where the sparse matrix does not dominate ..."
Cited by 29 (2 self)
Approximate low-rank factorization with structured factors, Computational Statistics & Data Analysis
 in IEEE 11th Int. Conf. on Computer Vision
, 2007
"... An approximate rank revealing factorization problem with structure constraints on the normalized factors is considered. Examples of structure, motivated by an application in microarray data analysis, are sparsity, nonnegativity, periodicity, and smoothness. In general, the approximate rank revealing ... Keywords: rank revealing factorization; numerical rank; low-rank approximation; maximum likelihood PCA; total least squares; errors-in-variables; microarray data ..."
Cited by 1 (0 self)
Global convergence of stochastic gradient descent for some nonconvex matrix problems
 arXiv preprint arXiv:1411.1134
, 2014
"... Stochastic gradient descent (SGD) on a low-rank factorization ..."
Cited by 5 (0 self)
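A minimal sketch of the setting in this abstract — SGD applied directly to the factors of a low-rank model, here for matrix completion over randomly sampled observed entries. The step size, problem size, and iteration count are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, r = 20, 15, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r target
obs = np.argwhere(rng.random((m, n)) < 0.8)   # indices of observed entries

U = 0.1 * rng.standard_normal((m, r))
V = 0.1 * rng.standard_normal((n, r))
lr = 0.02
for _ in range(100_000):
    i, j = obs[rng.integers(len(obs))]        # sample one observed entry
    e = U[i] @ V[j] - M[i, j]                 # residual at (i, j)
    ui = U[i].copy()                          # keep old U[i] for V's update
    U[i] -= lr * e * V[j]                     # SGD step on the factors
    V[j] -= lr * e * ui

rmse = np.sqrt(np.mean([(U[i] @ V[j] - M[i, j]) ** 2 for i, j in obs]))
print(rmse)
```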
Randomized Algorithms for Low-Rank Matrix Decomposition
, 2011
"... Low-rank matrix factorization is one of the most useful tools in scientific computing and data analysis. The goal of low-rank factorization is to decompose a matrix into a product of two smaller matrices of lower rank that approximates the original matrix well. Such a decomposition exposes the low-rank ..."
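The basic randomized scheme behind such algorithms can be sketched in a few lines: multiply A by a random test matrix to capture its range, orthonormalize, and project, giving the product of two smaller matrices the abstract describes. The oversampling amount and the exactly-rank-r input are illustrative assumptions, in the style of randomized range finders.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 100, 80, 5
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # exact rank r

# Randomized range finder: sample the range of A with a Gaussian test
# matrix (r plus a few oversampling columns), orthonormalize, project.
Omega = rng.standard_normal((n, r + 5))
Q, _ = np.linalg.qr(A @ Omega)        # orthonormal basis for range(A @ Omega)
B = Q.T @ A                           # small (r + 5) x n factor
rel_err = np.linalg.norm(A - Q @ B) / np.linalg.norm(A)
print(rel_err)                        # tiny: A ~= Q B captures the range
```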
Maximum-Margin Matrix Factorization
 Advances in Neural Information Processing Systems 17
, 2005
"... We present a novel approach to collaborative prediction, using low-norm instead of low-rank factorizations. The approach is inspired by, and has strong connections to, large-margin linear discrimination. We show how to learn low-norm factorizations by solving a semidefinite program, and discuss ..."
Cited by 264 (21 self)
Efficient SVM training using low-rank kernel representations
 Journal of Machine Learning Research
, 2001
"... SVM training is a convex optimization problem which scales with the training set size rather than the feature space dimension. While this is usually considered to be a desired quality, in large-scale problems it may cause training to be impractical. The common techniques to handle this difficulty ba ... method (IPM) in terms of storage requirements as well as computational complexity. We then suggest an efficient use of a known factorization technique to approximate a given kernel matrix by a low-rank matrix, which in turn will be used to feed the optimizer. Finally, we derive an upper bound ..."
Cited by 240 (3 self)
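One common way to realize the low-rank kernel idea is a Nyström-style landmark construction: approximate K from its columns at a few landmark points, yielding a low-rank factorization an optimizer can consume instead of the full kernel matrix. This is a stand-in sketch — the paper uses its own known factorization technique — and the data, kernel width, and landmark count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, (100, 1))                # 1-D training inputs
K = np.exp(-((X - X.T) ** 2))                  # RBF kernel matrix, gamma = 1

# Nystrom-style low-rank approximation from 20 random landmark points:
# K ~= K[:, S] pinv(K[S, S]) K[S, :], a rank <= 20 factorization that a
# kernel-machine optimizer could be fed instead of the full K.
S = rng.choice(100, 20, replace=False)
K_lr = K[:, S] @ np.linalg.pinv(K[np.ix_(S, S)]) @ K[S, :]
rel_err = np.linalg.norm(K - K_lr) / np.linalg.norm(K)
print(rel_err)                                 # small: K is numerically low-rank
```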