Results 1–10 of 637
Intrinsic statistics on Riemannian manifolds: Basic tools for geometric measurements
, 1999
Cited by 198 (24 self)

Abstract:
Measurements of geometric primitives, such as rotations or rigid transformations, are often noisy and we need to use statistics either to reduce the uncertainty or to compare measurements. Unfortunately, geometric primitives often belong to manifolds and not vector spaces. We have already shown [9] that generalizing too quickly even simple statistical notions could lead to paradoxes. In this article, we develop some basic probabilistic tools to work on Riemannian manifolds: the notion of mean value, covariance matrix, normal law, Mahalanobis distance and χ² test. We also present an efficient algorithm to compute the mean value and tractable approximations of the normal and χ² laws for small variances.
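The "efficient algorithm to compute the mean value" on a Riemannian manifold is typically a fixed-point iteration built from the exponential and logarithm maps. A minimal, hypothetical sketch on the unit sphere (the paper treats general manifolds; the maps and function names below are my illustration, not the paper's code):

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: tangent vector at p pointing toward q."""
    d = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(d)
    if theta < 1e-12:
        return np.zeros_like(p)
    v = q - d * p                      # component of q orthogonal to p
    return theta * v / np.linalg.norm(v)

def sphere_exp(p, v):
    """Exp map on the unit sphere: follow the geodesic from p along v."""
    t = np.linalg.norm(v)
    if t < 1e-12:
        return p
    return np.cos(t) * p + np.sin(t) * v / t

def karcher_mean(points, iters=50):
    """Gradient iteration for the Riemannian (Karcher) mean:
    average the data in the tangent space at the current estimate,
    then map the result back onto the manifold."""
    m = points[0] / np.linalg.norm(points[0])
    for _ in range(iters):
        v = np.mean([sphere_log(m, q) for q in points], axis=0)
        if np.linalg.norm(v) < 1e-12:
            break
        m = sphere_exp(m, v)
    return m
```

For points contained in a small geodesic ball this iteration converges to the unique Riemannian center of mass; two points symmetric about a pole, for instance, average to the pole itself.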
Matrix completion from a few entries
Cited by 197 (8 self)

Abstract:
Let M be a random nα × n matrix of rank r ≪ n, and assume that a uniformly random subset E of its entries is observed. We describe an efficient algorithm that reconstructs M from |E| = O(rn) observed entries with relative root mean square error RMSE ≤ C(α) …
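The spectral core of such a reconstruction can be sketched as: rescale the zero-filled observed matrix by the sampling rate and truncate its SVD at rank r. This is only the initialization step of this line of work, not the full algorithm, and the function name is mine:

```python
import numpy as np

def spectral_complete(M_obs, mask, r):
    """Rank-r spectral reconstruction from partially observed entries.

    M_obs: matrix with unobserved entries set to 0
    mask:  boolean matrix of observed positions
    r:     target rank (assumed known here)
    """
    p = mask.mean()                               # fraction of observed entries
    U, s, Vt = np.linalg.svd(M_obs / p, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]  # best rank-r approximation
```

With full observation this recovers a rank-r matrix exactly; with partial observation it gives the coarse estimate that further (manifold) optimization then refines.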
Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method
 SIAM J. Sci. Comput
, 2001
Cited by 141 (19 self)

Abstract:
We describe new algorithms of the locally optimal block preconditioned conjugate gradient (LOBPCG) method for symmetric eigenvalue problems, based on a local optimization of a three-term recurrence, and suggest several other new methods. To be able to numerically compare different methods in this class, with different preconditioners, we propose a common system of model tests, using random preconditioners and initial guesses. As the "ideal" control algorithm, we advocate the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the nullspace of the corresponding homogeneous system of linear equations under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this "ideal" algorithm on our model test problems in terms of the speed of convergence, costs of every iteration, and memory requirements. We provide such a comparison for our LOBPCG method. Numerical results establish that our algorithm is practically as efficient as the "ideal" algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG method provides approximations to first eigenpairs of about the same quality as those by the much more expensive global optimization method on the same generalized block Krylov subspace. We propose a new version of block Davidson's method as a generalization of the LOBPCG method. Finally, direct numerical comparisons with the Jacobi-Davidson method show that our method is more robust and converges almost two times faster.
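SciPy ships an implementation of this method under the same name, `scipy.sparse.linalg.lobpcg`. Assuming SciPy is available, a minimal usage sketch with a diagonal test matrix and a Jacobi (diagonal) preconditioner:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 200
A = diags([np.arange(1.0, n + 1.0)], [0])        # eigenvalues are exactly 1..n
M = diags([1.0 / np.arange(1.0, n + 1.0)], [0])  # Jacobi preconditioner ~ A^{-1}

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 3))                  # random initial block of 3 vectors

# Smallest 3 eigenpairs; with M close to A^{-1}, convergence is fast.
vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=200)
```

With a good preconditioner the block converges to the smallest eigenvalues (here 1, 2, 3) in a handful of iterations; the block size controls how many eigenpairs are computed at once.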
Matrix Completion from Noisy Entries
Cited by 118 (6 self)

Abstract:
Given a matrix M of low rank, we consider the problem of reconstructing it from noisy observations of a small, random subset of its entries. The problem arises in a variety of applications, from collaborative filtering (the ‘Netflix problem’) to structure-from-motion and positioning. We study a low complexity algorithm introduced in [1], based on a combination of spectral techniques and manifold optimization, that we call here OPTSPACE. We prove performance guarantees that are order-optimal in a number of circumstances.
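The manifold-optimization refinement can be imitated, much more crudely, by alternating least squares over the observed entries. This is a hypothetical stand-in for OPTSPACE's actual gradient step, with names of my own:

```python
import numpy as np

def als_complete(M_obs, mask, r, iters=20, lam=1e-6):
    """Fit M ~ U V^T over observed entries by alternating ridge regressions.

    M_obs: matrix with unobserved entries set to 0
    mask:  boolean matrix of observed positions
    r:     target rank; lam: small regularizer keeping the solves well-posed
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    for _ in range(iters):
        for i in range(m):                    # update each row of U
            idx = mask[i]
            A = V[idx].T @ V[idx] + lam * np.eye(r)
            U[i] = np.linalg.solve(A, V[idx].T @ M_obs[i, idx])
        for j in range(n):                    # update each row of V
            idx = mask[:, j]
            A = U[idx].T @ U[idx] + lam * np.eye(r)
            V[j] = np.linalg.solve(A, U[idx].T @ M_obs[idx, j])
    return U @ V.T
```

Each half-step is a closed-form least-squares solve, so the observed-entry residual is non-increasing; with noisy entries the factorization averages the noise out over the low-rank structure.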
Dynamic Texture Recognition
, 2001
Cited by 115 (7 self)

Abstract:
Dynamic textures are sequences of images that exhibit some form of temporal stationarity, such as waves, steam, and foliage. We pose the problem of recognizing and classifying dynamic textures in the space of dynamical systems where each dynamic texture is uniquely represented. Since the space is nonlinear, a distance between models must be defined. We examine three different distances in the space of autoregressive models and assess their power.
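Distances between such dynamical models are commonly built from the principal (subspace) angles between model-derived subspaces, e.g. extended observability subspaces. A minimal sketch of the principal-angle computation only; the three specific distances the paper assesses are not reproduced here:

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (ascending, in radians) between col(A) and col(B)."""
    Qa, _ = np.linalg.qr(A)              # orthonormal basis for col(A)
    Qb, _ = np.linalg.qr(B)              # orthonormal basis for col(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))
```

For example, span{e1, e2} and span{e1, e3} in R^3 share the direction e1 (angle 0) and are otherwise orthogonal (angle π/2).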
Means and Averaging in the Group of Rotations
, 2002
Cited by 114 (3 self)

Abstract:
In this paper we give precise definitions of different, properly invariant notions of mean or average rotation. Each mean is associated with a metric in SO(3). The metric induced from the Frobenius inner product gives rise to a mean rotation that is given by the closest special orthogonal matrix to the usual arithmetic mean of the given rotation matrices. The mean rotation associated with the intrinsic metric on SO(3) is the Riemannian center of mass of the given rotation matrices. We show that the Riemannian mean rotation shares many common features with the geometric mean of positive numbers and the geometric mean of positive Hermitian operators. We give some examples with closed-form solutions of both notions of mean.
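The first mean mentioned, the closest special orthogonal matrix to the arithmetic mean, has a standard SVD-based construction with a determinant correction. A minimal sketch (function name mine):

```python
import numpy as np

def projection_mean(rotations):
    """Project the arithmetic mean of rotation matrices back onto SO(3):
    take the SVD of the mean and correct the sign of the determinant so the
    result is a proper rotation (det = +1), not a reflection."""
    A = np.mean(rotations, axis=0)
    U, _, Vt = np.linalg.svd(A)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt
```

Two rotations about the same axis by opposite angles, for instance, project to the identity, matching the intuition that they average out.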
Generalized Low Rank Approximations of Matrices
 MACHINE LEARNING
, 2004
Cited by 104 (6 self)

Abstract:
We consider the problem of computing low rank approximations of matrices. The novelty of our approach is that the low rank approximations are on a sequence of matrices. Unlike the …
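The abstract is truncated here, but going by the title, a generalized low rank approximation of a collection of matrices is usually computed by an alternating eigenvector iteration: shared left and right factors L and R are updated in turn from eigenvectors of accumulated cross-products. A hedged sketch under that assumption (names mine):

```python
import numpy as np

def glram(As, l, q, iters=10):
    """Alternating iteration for shared factors L (m x l) and R (n x q)
    approximating each A_i by L @ (L.T @ A_i @ R) @ R.T."""
    m, n = As[0].shape
    R = np.eye(n, q)                        # simple deterministic start
    for _ in range(iters):
        ML = sum(A @ R @ R.T @ A.T for A in As)
        _, L = np.linalg.eigh(ML)
        L = L[:, -l:]                       # top-l eigenvectors
        MR = sum(A.T @ L @ L.T @ A for A in As)
        _, R = np.linalg.eigh(MR)
        R = R[:, -q:]                       # top-q eigenvectors
    return L, R, [L.T @ A @ R for A in As]
```

With a single matrix this degenerates to an ordinary truncated SVD, so an exactly rank-1 input is reproduced exactly; the interesting case is several matrices sharing one pair of factors.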
Optimization algorithms exploiting unitary constraints
 IEEE Trans. Signal Processing
, 2002
Cited by 97 (13 self)

Abstract:
This paper presents novel algorithms that iteratively converge to a local minimum of a real-valued function subject to the constraint that the columns of the complex-valued matrix are mutually orthogonal and have unit norm. The algorithms are derived by reformulating the constrained optimization problem as an unconstrained one on a suitable manifold. This significantly reduces the dimensionality of the optimization problem. Pertinent features of the proposed framework are illustrated by using the framework to derive an algorithm for computing the eigenvector associated with either the largest or the smallest eigenvalue of a Hermitian matrix. Index Terms—Constrained optimization, eigenvalue problems, optimization on manifolds, orthogonal constraints.
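One concrete instance of this kind of scheme is gradient ascent of the Rayleigh quotient restricted to the unit sphere, the one-column special case of the orthonormal-columns constraint: project the gradient onto the tangent space, step, then renormalize back onto the sphere. An illustrative projected-gradient sketch, not the paper's algorithm:

```python
import numpy as np

def top_eigvec_sphere(A, steps=500, lr=0.1):
    """Riemannian gradient ascent of w^H A w on the unit sphere,
    for Hermitian A; converges to an eigenvector of the largest eigenvalue."""
    rng = np.random.default_rng(0)
    n = A.shape[0]
    w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(steps):
        g = A @ w
        rq = np.real(np.vdot(w, g))     # Rayleigh quotient (real for Hermitian A)
        tang = g - rq * w               # project gradient onto the tangent space
        w = w + lr * tang               # unconstrained step in the tangent direction
        w /= np.linalg.norm(w)          # retract back onto the sphere
    return w
```

For a diagonal Hermitian matrix the iterate aligns (up to a unit-modulus phase) with the coordinate axis of the largest eigenvalue, and the Rayleigh quotient converges to that eigenvalue.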
Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
 Acta Appl. Math
Cited by 96 (22 self)

Abstract:
We give simple formulas for the canonical metric, gradient, Lie …
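One of the simple formulas alluded to: representing a point on the Grassmann manifold by a matrix Y with orthonormal columns, the Riemannian gradient of a cost is the Euclidean gradient projected onto the horizontal space at Y. A minimal sketch; the example cost and names are my illustration:

```python
import numpy as np

def grassmann_grad(Y, G):
    """Riemannian gradient on the Grassmann manifold: remove from the
    Euclidean gradient G its component inside span(Y).
    Y is assumed to have orthonormal columns."""
    return G - Y @ (Y.T @ G)

# Example cost f(Y) = trace(Y^T A Y), whose Euclidean gradient is 2 A Y.
A = np.diag([5.0, 3.0, 1.0, 0.5])
Y = np.eye(4, 2)                 # span of the top-2 eigenvectors of A
g = grassmann_grad(Y, 2 * A @ Y)
```

At an invariant subspace of A the Riemannian gradient vanishes, which is exactly the first-order optimality condition for subspace problems posed on the Grassmannian.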