Results 1 – 4 of 4
Online Learning of Eigenvectors
Abstract

Cited by 3 (0 self)
Computing the leading eigenvector of a symmetric real matrix is a fundamental primitive of numerical linear algebra with numerous applications. We consider a natural online extension of the leading eigenvector problem: a sequence of matrices is presented and the goal is to predict for each matrix a unit vector, with the overall goal of competing with the leading eigenvector of the cumulative matrix. Existing regret-minimization algorithms for this problem either require computing an eigendecomposition every iteration, or suffer from a large dependency of the regret bound on the dimension. In both cases the algorithms are not practical for large-scale applications. In this paper we present new algorithms that avoid both issues. On the one hand they do not require any expensive matrix decompositions, and on the other they guarantee regret rates with at most a mild dependence on the dimension. In contrast to previous algorithms, our algorithms also admit implementations that can leverage sparsity in the data to further reduce computation. We extend our results to also handle non-symmetric matrices.
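As a rough illustration of the decomposition-free flavor described in the abstract (this is not the paper's actual algorithm), one can predict a unit vector each round by warm-starting a few power iterations on the running cumulative matrix — only matrix-vector products, no eigendecomposition. The function name and parameters below are hypothetical:

```python
import numpy as np

def online_eigenvector_stream(matrices, power_steps=3, rng=None):
    """Predict a unit vector before each symmetric matrix arrives,
    aiming to compete with the leading eigenvector of the running sum.

    Sketch only: warm-started power iteration on the cumulative matrix,
    using nothing but matrix-vector products (no eigendecomposition).
    """
    rng = np.random.default_rng(rng)
    d = matrices[0].shape[0]
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)               # random unit-norm start
    cumulative = np.zeros((d, d))
    predictions = []
    for A in matrices:
        predictions.append(v.copy())     # predict before seeing A
        cumulative += A                  # update the running sum
        for _ in range(power_steps):     # a few cheap power steps
            w = cumulative @ v
            n = np.linalg.norm(w)
            if n > 0:
                v = w / n
    return predictions
```

Because the prediction is carried over between rounds, each round costs only a handful of (possibly sparse) matrix-vector products.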
Online PCA with Spectral Bounds
Abstract

Cited by 1 (0 self)
This paper revisits the online PCA problem. Given a stream of n vectors x_t ∈ R^d (the columns of X), the algorithm must output y_t ∈ R^ℓ (the columns of Y) before receiving x_{t+1}. The goal of online PCA is to simultaneously minimize the target dimension ℓ and the error ‖X − (XY⁺)Y‖₂. We describe two simple and deterministic algorithms. The first receives a parameter ∆ and guarantees that ‖X − (XY⁺)Y‖₂ is not significantly larger than ∆; it requires a target dimension of ℓ = O(k/ε) for any k, ε such that ∆ ≥ εσ_1² + σ_{k+1}². The second receives k and ε and guarantees that ‖X − (XY⁺)Y‖₂ ≤ εσ_1² + σ_{k+1}²; it requires a target dimension of O(k log n / ε²). Different models and algorithms for online PCA have been considered in the past; this is the first that achieves a bound on the spectral norm of the residual matrix.
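The error measure in the abstract can be computed directly. This sketch assumes X is d×n with columns x_t, Y is ℓ×n with columns y_t, and Y⁺ is the Moore–Penrose pseudoinverse; `spectral_residual` is a hypothetical helper name:

```python
import numpy as np

def spectral_residual(X, Y):
    """Spectral-norm residual ||X - (X Y^+) Y||_2 from the abstract.

    X: d x n matrix whose columns are the input vectors x_t.
    Y: l x n matrix whose columns are the outputs y_t.
    """
    # X Y^+ (d x l) maps the l-dimensional summaries back toward
    # the column space of X; (X Y^+) Y is the reconstruction of X.
    reconstruction = (X @ np.linalg.pinv(Y)) @ Y
    return np.linalg.norm(X - reconstruction, 2)   # spectral norm
```

For example, if Y retains only the top right-singular direction of X, the residual collapses to the second singular value σ_2.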
Learning a set of directions
 JMLR: Workshop and Conference Proceedings, Vol. 30:1–16, 2013
Abstract
Assume our data consists of unit vectors (directions) and we are to find a small orthogonal set of “the most important directions” summarizing the data. We develop online algorithms for this type of problem. The techniques used are similar to Principal Component Analysis, which finds the most important low-rank subspace of the data. The new problem is significantly more complex, since the online algorithm maintains uncertainty over the most relevant subspace as well as directional information.
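For intuition, an offline analogue of this objective — the k most important orthogonal directions of a set of unit vectors — can be read off the top-k left singular vectors of the data matrix. This is a baseline sketch, not the paper's online algorithm, and `top_directions` is a hypothetical name:

```python
import numpy as np

def top_directions(unit_vectors, k):
    """Offline analogue: the k most important orthogonal directions
    summarizing a collection of unit vectors, via the top-k left
    singular vectors of the d x n data matrix.
    """
    X = np.column_stack(unit_vectors)               # d x n data matrix
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]                                 # orthonormal d x k basis
```

The online versions in the paper must commit to directions as data arrives, which is what makes the problem harder than this batch computation.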