Results 1 - 2 of 2
Solving Ridge Regression using Sketched Preconditioned SVRG. In ICML, 2016
Cited by 1 (0 self)
Abstract: We develop a novel preconditioning method for ridge regression, based on recent linear sketching methods. By equipping Stochastic Variance Reduced Gradient (SVRG) with this preconditioning process, we obtain a significant speedup relative to fast stochastic methods such as SVRG, SDCA, and SAG.
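As an illustration of the base method this paper accelerates, here is a minimal sketch of plain (unpreconditioned) SVRG applied to the ridge regression objective; the function name, step size, and epoch counts are illustrative assumptions, and this is not the paper's sketched-preconditioned algorithm.

```python
import numpy as np

def svrg_ridge(A, b, lam, step=0.02, epochs=60, inner=None, seed=0):
    """Plain SVRG for min_x (1/2n)||Ax - b||^2 + (lam/2)||x||^2.

    Illustrative sketch only; the paper additionally applies a
    sketching-based preconditioner, which is not shown here.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or 2 * n
    x_tilde = np.zeros(d)
    for _ in range(epochs):
        # Full gradient at the snapshot point x_tilde.
        full_grad = A.T @ (A @ x_tilde - b) / n + lam * x_tilde
        x = x_tilde.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # Per-sample gradients at the iterate and at the snapshot;
            # their difference plus full_grad is the variance-reduced gradient.
            gi_x = A[i] * (A[i] @ x - b[i]) + lam * x
            gi_s = A[i] * (A[i] @ x_tilde - b[i]) + lam * x_tilde
            x -= step * (gi_x - gi_s + full_grad)
        x_tilde = x
    return x_tilde
```

On a small well-conditioned problem this recovers the closed-form ridge solution; the speedup claimed in the abstract comes from preconditioning this inner loop, which the sketch omits.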
Faster Eigenvector Computation via Shift-and-Invert Preconditioning, 2016
Abstract: We give faster algorithms and improved sample complexities for the fundamental problem of estimating the top eigenvector. Given an explicit matrix A ∈ R^(n×d), we show how to compute an ε-approximate top eigenvector in time Õ([nnz(A) + d·sr(A)/gap²] · log(1/ε)). Here nnz(A) is the number of nonzeros in A, sr(A) is the stable rank, and gap is the relative eigengap. We also consider an online setting in which, given a stream of i.i.d. samples from a distribution D with covariance matrix Σ and a vector x0 which is an O(gap) approximate top eigenvector for Σ, we show how to refine x0 to an ε approximation using O(v(D)/(gap·ε)) samples from D. Here v(D) is a natural notion of variance. Combining our algorithm with previous work to initialize x0, we obtain improved sample complexities and runtimes under a variety of assumptions on D. We achieve our results via a robust analysis of the classic shift-and-invert preconditioning method. This technique lets us reduce eigenvector computation to approximately solving a series of linear systems with fast stochastic gradient methods.
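The shift-and-invert reduction the abstract describes can be sketched as follows: power iteration applied to (shift·I - M)^(-1) has a much larger eigenvalue ratio than power iteration on M itself, at the cost of solving one linear system per iteration. This is a minimal sketch under stated assumptions: the systems are solved exactly with a direct solver, whereas the paper solves them approximately with fast stochastic gradient methods; the function name and shift choice are illustrative.

```python
import numpy as np

def shift_invert_top_eigvec(M, shift, iters=25, seed=0):
    """Power iteration on (shift*I - M)^{-1} to estimate M's top eigenvector.

    Assumes M is symmetric and shift > lambda_1(M), so B = shift*I - M is
    positive definite. Each iteration is one linear-system solve; the paper
    replaces the exact solve below with an approximate stochastic solver.
    """
    rng = np.random.default_rng(seed)
    d = M.shape[0]
    B = shift * np.eye(d) - M
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        x = np.linalg.solve(B, x)   # the linear system the method reduces to
        x /= np.linalg.norm(x)      # renormalize, as in power iteration
    return x
```

If shift is within O(gap) of the top eigenvalue, the preconditioned eigenvalue ratio is constant, so only O(log(1/ε)) solves are needed; choosing and refining the shift robustly is the technical core of the paper, not shown here.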