Results 1–10 of 40
The geometry of algorithms using hierarchical tensors, 2012
Cited by 19 (5 self)
In this paper, the differential geometry of the novel hierarchical Tucker format for tensors is derived. The set H_{T,k} of tensors with fixed tree T and hierarchical rank k is shown to be a smooth quotient manifold, namely the set of orbits of a Lie group action corresponding to the non-unique basis representation of these hierarchical tensors. Explicit characterizations of the quotient manifold, its tangent space, and the tangent space of H_{T,k} are derived, suitable for high-dimensional problems. The usefulness of a complete geometric description is demonstrated by two typical applications. First, new convergence results for the nonlinear Gauss–Seidel method on H_{T,k} are given. Notably, and in contrast to earlier works on this subject, the task of minimizing the Rayleigh quotient is also addressed. Second, evolution equations for dynamic tensor approximation are formulated in terms of an explicit projection operator onto the tangent space of H_{T,k}. In addition, a numerical comparison is made between this dynamical approach and the standard one based on truncated singular value decompositions.
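The tangent-space projection central to such dynamic low-rank approximation schemes is easiest to see in the matrix case (a tree with a single split). The sketch below is illustrative only, not code from the paper: it builds the standard orthogonal projector onto the tangent space of the fixed-rank matrix manifold at X = U S Vᵀ and checks that it is idempotent (all names and sizes are made up for the example).

```python
import numpy as np

def tangent_projection(U, V, Z):
    """Orthogonal projection of Z onto the tangent space of the fixed-rank
    matrix manifold at X = U S V^T (U, V assumed to have orthonormal columns):
        P(Z) = U U^T Z + Z V V^T - U U^T Z V V^T
    """
    UUtZ = U @ (U.T @ Z)
    return UUtZ + (Z @ V) @ V.T - (UUtZ @ V) @ V.T

rng = np.random.default_rng(0)
m, n, k = 8, 6, 2
U, _ = np.linalg.qr(rng.standard_normal((m, k)))
V, _ = np.linalg.qr(rng.standard_normal((n, k)))
Z = rng.standard_normal((m, n))

PZ = tangent_projection(U, V, Z)
# A projector is idempotent: projecting a second time changes nothing.
assert np.allclose(tangent_projection(U, V, PZ), PZ)
```

In the hierarchical format the same idea is applied recursively along the tree, one projector per node.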
Scaled Gradients on Grassmann Manifolds for Matrix Completion
Cited by 19 (0 self)
This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve canonical gradient methods, especially on ill-conditioned matrices, while maintaining established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled gradient descent procedure is also established. The proposed conjugate gradient method based on the scaled gradient outperforms several existing algorithms for matrix completion and is competitive with recently proposed methods.
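As a rough illustration of gradient descent on the Grassmannian (a plain, unscaled toy, not the paper's scaled-metric method or its completion objective), the sketch below minimizes f(U) = -½ tr(UᵀAU) over 2-dimensional subspaces of R⁴ using the horizontal gradient and a QR retraction; it converges to the dominant invariant subspace, the same fixed point subspace iteration finds.

```python
import numpy as np

# Toy Rayleigh-quotient minimization of f(U) = -0.5 * tr(U^T A U) over
# 2-dimensional subspaces of R^4; the minimizer is the dominant invariant
# subspace of A, span(e1, e2).
A = np.diag([10.0, 5.0, 1.0, 0.5])
rng = np.random.default_rng(3)
U, _ = np.linalg.qr(rng.standard_normal((4, 2)))

for _ in range(200):
    egrad = -A @ U                         # Euclidean gradient of f
    rgrad = egrad - U @ (U.T @ egrad)      # horizontal (Riemannian) gradient
    U, _ = np.linalg.qr(U - 0.05 * rgrad)  # gradient step + QR retraction

# The subspace (encoded by its orthogonal projector U U^T) is span(e1, e2).
assert np.allclose(U @ U.T, np.diag([1.0, 1.0, 0.0, 0.0]), atol=1e-6)
```

Comparing the projector U Uᵀ rather than U itself is what makes this a Grassmannian (subspace) statement: any rotation of U's columns represents the same point.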
LOW-RANK OPTIMIZATION WITH TRACE NORM PENALTY
Cited by 19 (5 self)
Abstract. The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of the duality gap numerically tractable. The search space is nonlinear but is equipped with a Riemannian structure that leads to efficient computations. We present a second-order trust-region algorithm with a guaranteed quadratic rate of convergence. Overall, the proposed optimization scheme converges superlinearly to the global solution while maintaining complexity that is linear in the number of rows and columns of the matrix. To compute a set of solutions efficiently for a grid of regularization parameters, we propose a predictor-corrector approach that outperforms the naive warm-restart approach on the fixed-rank quotient manifold. The performance of the proposed algorithm is illustrated on problems of low-rank matrix completion and multivariate linear regression.
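The factorization idea behind making the trace norm tractable can be checked numerically: the nuclear (trace) norm equals the minimum of (‖G‖_F² + ‖H‖_F²)/2 over factorizations X = GHᵀ, attained by a balanced SVD-based split. A minimal sketch of that identity (illustrative only, not the paper's fixed-rank algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 4))

# Trace (nuclear) norm directly from the singular values.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
nuclear = s.sum()

# Balanced factorization X = G H^T attaining the variational characterization
#   ||X||_* = min over X = G H^T of (||G||_F^2 + ||H||_F^2) / 2.
G = U * np.sqrt(s)      # U diag(sqrt(s))
H = Vt.T * np.sqrt(s)   # V diag(sqrt(s))

assert np.allclose(G @ H.T, X)
assert np.isclose((np.linalg.norm(G) ** 2 + np.linalg.norm(H) ** 2) / 2, nuclear)
```

Because the right-hand side is a smooth function of (G, H), restricting to a fixed rank turns the non-smooth trace norm penalty into a differentiable one on the factor space.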
Fixed-rank matrix factorizations and Riemannian low-rank optimization, 2012
Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality, 2014
Cited by 6 (4 self)
Abstract. The aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety M≤k of real m × n matrices of rank at most k. Such methods extend successfully used Riemannian optimization methods on the smooth manifold Mk of rank-k matrices to its closure by taking steps along gradient-related directions in the tangent cone, and afterwards projecting back to M≤k. Considering such a method circumvents the difficulties which arise from the non-closedness and the unbounded curvature of Mk. Pointwise convergence is obtained for real-analytic functions on the basis of a Łojasiewicz inequality for the projection of the anti-gradient to the tangent cone. If the limit point lies on the smooth part of M≤k, i.e. in Mk, this boils down to more or less known results, but with the benefit that asymptotic convergence rate estimates (for specific step sizes) can be obtained without an a priori curvature bound, simply from the fact that the limit lies on a smooth manifold. At the same time, one can give a convincing justification for assuming critical points to lie in Mk: if X is a critical point of f on M≤k, then either X has rank k, or ∇f(X) = 0. Key words. Convergence analysis, line-search methods, low-rank matrices, Riemannian optimization, steepest descent, Łojasiewicz gradient inequality, tangent cones
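A simplified NumPy sketch of this family of methods (a plain projected-gradient variant: full-space gradient step followed by metric projection onto M≤k via truncated SVD, rather than the tangent-cone step analyzed in the paper; the quadratic toy objective is illustrative):

```python
import numpy as np

def project_rank_le_k(X, k):
    """Metric projection onto M<=k: best rank-<=k approximation via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

# Toy objective f(X) = 0.5 * ||X - A||_F^2, with gradient X - A.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 5))
k = 2

X = np.zeros((6, 5))  # start at rank 0: a point of M<=k outside the smooth part Mk
for _ in range(60):
    X = project_rank_le_k(X - 0.5 * (X - A), k)  # gradient step, then project back

# The iterates converge to the best rank-k approximation of A.
assert np.allclose(X, project_rank_le_k(A, k), atol=1e-8)
```

Note that the iteration is well defined even at rank-deficient points such as the all-zeros start, which is exactly where the smooth-manifold methods that this paper extends break down.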
Two Newton methods on the manifold of fixed-rank matrices endowed with Riemannian quotient geometries
Comput. Statist., 2014
Hierarchical Tucker Tensor Optimization - Applications to Tensor Completion
Cited by 5 (1 self)
Abstract—In this work, we develop an optimization framework for problems whose solutions are well-approximated by Hierarchical Tucker (HT) tensors, an efficient structured tensor format based on recursive subspace factorizations. Using the differential geometric tools presented here, we construct standard optimization algorithms such as Steepest Descent and Conjugate Gradient for interpolating tensors in HT format. We also empirically examine the importance of one's choice of data organization in the success of tensor recovery by drawing upon insights from the matrix completion literature. Using these algorithms, we recover various seismic data sets with randomly missing sources.
An extrinsic look at the Riemannian Hessian, 2013
Cited by 5 (0 self)
Abstract. Let f be a real-valued function on a Riemannian submanifold of a Euclidean space, and let f̄ be a local extension of f. We show that the Riemannian Hessian of f can be conveniently obtained from the Euclidean gradient and Hessian of f̄ by means of two manifold-specific objects: the orthogonal projector onto the tangent space and the Weingarten map. Expressions for the Weingarten map are provided for various specific submanifolds.
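On the unit sphere this recipe is easy to check by hand: the projector is P_x = I - xxᵀ, and the Weingarten correction contributes -(xᵀ∇f̄(x))u for a tangent vector u. A small numerical sketch (assuming the extension f̄(x) = ½xᵀAx; function and variable names are illustrative):

```python
import numpy as np

def sphere_riemannian_hessian(x, u, egrad, ehess_u):
    """Riemannian Hessian-vector product on the unit sphere, assembled from
    Euclidean data: P_x ehess(x)[u] minus the Weingarten (curvature) term
    (x^T egrad(x)) u, where P_x = I - x x^T projects onto the tangent space."""
    proj = ehess_u - x * (x @ ehess_u)  # P_x applied to the Euclidean Hessian-vector product
    return proj - (x @ egrad) * u       # Weingarten correction for the sphere

# Extension f_bar(x) = 0.5 * x^T A x: Euclidean gradient A x, Hessian A.
A = np.diag([1.0, 3.0, 7.0])
x = np.array([1.0, 0.0, 0.0])  # eigenvector of A with eigenvalue 1
u = np.array([0.0, 1.0, 0.0])  # tangent at x, eigenvector of A with eigenvalue 3

H_u = sphere_riemannian_hessian(x, u, A @ x, A @ u)
# At an eigenvector, tangent eigendirections give Hessian eigenvalue lambda_j - lambda_i.
assert np.allclose(H_u, (3.0 - 1.0) * u)
```

Dropping the Weingarten term would give the projected Euclidean Hessian alone (here 3u instead of 2u), which is the common mistake this extrinsic formula corrects.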