Results 1 - 10 of 3,212

CONVERGENCE ANALYSIS OF GRADIENT ITERATIONS FOR THE SYMMETRIC EIGENVALUE PROBLEM

by Klaus Neymeyr, Evgueni Ovtchinnikov, Ming Zhou
"... Abstract. Gradient iterations for the Rayleigh quotient are simple and robust solvers to determine a few of the smallest eigenvalues together with the associated eigenvectors of (generalized) matrix eigenvalue problems for symmetric matrices. Sharp convergence estimates for the Ritz values and Ritz ..."
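As an illustrative aside (not the authors' preconditioned analysis), a minimal NumPy sketch of a gradient iteration for the Rayleigh quotient: each step performs a Rayleigh-Ritz extraction on the two-dimensional space spanned by the iterate and its residual. The test matrix, iteration count, and function name are assumptions made for the example.

```python
import numpy as np

def rayleigh_quotient_gradient_iteration(A, x0, steps=500):
    """Steepest descent for the Rayleigh quotient rho(x) = (x' A x)/(x' x),
    realized as a Rayleigh-Ritz step in span{x, residual}.  A toy version of
    the gradient eigensolvers the paper analyzes: single vector, no
    preconditioner, standard (not generalized) eigenvalue problem."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        rho = x @ (A @ x)
        r = A @ x - rho * x                            # residual = gradient direction of rho on the sphere
        if np.linalg.norm(r) < 1e-12:
            break
        V, _ = np.linalg.qr(np.column_stack([x, r]))   # orthonormal basis of span{x, r}
        evals, evecs = np.linalg.eigh(V.T @ A @ V)     # 2x2 Rayleigh-Ritz problem
        x = V @ evecs[:, 0]                            # smallest Ritz vector
    return x @ (A @ x), x

# Example: smallest eigenvalue of a symmetric matrix with known spectrum 1, 2, ..., 50
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.arange(1.0, 51.0)) @ Q.T
lam, v = rayleigh_quotient_gradient_iteration(A, rng.standard_normal(50))
print(lam, np.linalg.eigvalsh(A)[0])                   # both values should be close to 1.0
```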

Pegasos: Primal Estimated sub-gradient solver for SVM

by Shai Shalev-Shwartz, Yoram Singer, Nathan Srebro, Andrew Cotter
"... We describe and analyze a simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ɛ is Õ(1/ɛ), where each iteration operates on a singl ..."
Cited by 542 (20 self)
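A minimal sketch of the Pegasos-style update described above, assuming the standard primal objective (lambda/2)*||w||^2 plus average hinge loss: one random example per iteration, step size 1/(lambda*t), and the optional projection onto the ball of radius 1/sqrt(lambda). The synthetic data and parameter values are illustrative, not from the paper.

```python
import numpy as np

def pegasos(X, y, lam=0.1, iterations=100_000, seed=0):
    """Stochastic sub-gradient descent for the SVM primal objective
    (lam/2)*||w||^2 + mean hinge loss, in the spirit of Pegasos: each
    iteration touches a single randomly chosen example."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, iterations + 1):
        i = rng.integers(n)
        eta = 1.0 / (lam * t)                     # step size 1/(lambda * t)
        if y[i] * (w @ X[i]) < 1.0:               # hinge loss active: sub-gradient has a data term
            w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
        else:                                     # only the regularizer contributes
            w = (1.0 - eta * lam) * w
        # optional projection onto the ball of radius 1/sqrt(lambda)
        w *= min(1.0, 1.0 / (np.sqrt(lam) * np.linalg.norm(w) + 1e-12))
    return w

# Usage on separable synthetic data
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]))
w = pegasos(X, y)
print(np.mean(np.sign(X @ w) == y))               # training accuracy, should be close to 1.0
```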

An iterative image registration technique with an application to stereo vision

by Bruce D. Lucas, Takeo Kanade - In IJCAI81, 1981
"... Image registration finds a variety of applications in computer vision. Unfortunately, traditional image registration techniques tend to be costly. We present a new image registration technique that makes use of the spatial intensity gradient of the images to find a good match using a type of Newton- ..."
Cited by 2897 (30 self)
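To make the gradient-based, Newton-type registration idea concrete, a 1-D toy version (the paper treats images and more general transformations). The signal, the imposed shift, and the function name are invented for the example.

```python
import numpy as np

def estimate_shift(f, g, x, iterations=20):
    """Estimate a translation d such that f(x + d) matches g(x), using the
    spatial intensity gradient in a Gauss-Newton / Newton-type update."""
    d = 0.0
    for _ in range(iterations):
        f_shift = np.interp(x + d, x, f)          # f resampled at the current shift estimate
        grad = np.gradient(f_shift, x)            # spatial intensity gradient
        error = g - f_shift
        denom = np.sum(grad * grad)
        if denom < 1e-12:
            break
        d += np.sum(grad * error) / denom         # least-squares step on the linearized model
    return d

# Usage: recover a known shift of a smooth 1-D "image"
x = np.linspace(0.0, 20.0, 400)
f = np.exp(-(x - 8.0) ** 2)
g = np.exp(-(x - 7.7) ** 2)                       # g(x) = f(x + 0.3)
print(estimate_shift(f, g, x))                    # should be approximately 0.3
```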

BLOCK STOCHASTIC GRADIENT ITERATION FOR CONVEX AND NONCONVEX OPTIMIZATION

by Yangyang Xu, Wotao Yin, 2015
"... The stochastic gradient (SG) method can quickly solve a problem with a large number of components in the objective, or a stochastic optimization problem, to a moderate accuracy. The block coordinate descent/update (BCD) method, on the other hand, can quickly solve problems with multiple (blocks of ..."
Cited by 5 (0 self)
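A rough illustration of the SG/BCD combination on a least-squares toy problem: each iteration samples one data point and one coordinate block and updates only that block. The blocking, step-size schedule, and data are arbitrary choices for the sketch, not the algorithm or analysis of the paper.

```python
import numpy as np

def block_stochastic_gradient(X, y, block_size=5, iterations=100_000, step=0.05, seed=0):
    """Block stochastic gradient iteration for (1/2n)*||X w - y||^2:
    a sampled gradient is computed and applied for a single randomly
    chosen block of coordinates per iteration."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    blocks = [np.arange(s, min(s + block_size, d)) for s in range(0, d, block_size)]
    for t in range(1, iterations + 1):
        i = rng.integers(n)                        # random data point
        b = blocks[rng.integers(len(blocks))]      # random coordinate block
        residual = X[i] @ w - y[i]
        w[b] -= (step / np.sqrt(t)) * residual * X[i, b]   # diminishing step size
    return w

# Usage on synthetic data
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 20))
w_true = rng.standard_normal(20)
y = X @ w_true + 0.01 * rng.standard_normal(1000)
w = block_stochastic_gradient(X, y)
print(np.linalg.norm(w - w_true) / np.linalg.norm(w_true))   # relative error, should be small
```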

Improved FOCUSS Method With Conjugate Gradient Iterations

by Zhaoshui He, Andrzej Cichocki, Rafal Zdunek, Shengli Xie
"... Abstract—FOCal Underdetermined System Solver (FOCUSS) is a powerful tool for sparse representation and underdetermined inverse problems. In this correspondence, we strengthen the FOCUSS method with the following main contributions: 1) we give a more rigorous derivation of the FO-CUSS for the sparsit ..."
Cited by 2 (1 self)
FOCUSS for the sparsity parameter 0 < p ≤ 1 by a nonlinear transform and 2) we develop the CG-FOCUSS by incorporating the conjugate gradient (CG) method into FOCUSS, which significantly reduces the computational cost with respect to the standard FOCUSS and extends its applicability to large-scale problems. We justify the CG
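For context, a sketch of the basic FOCUSS reweighting iteration (iteratively reweighted minimum-norm solutions); the dense inner solves below are presumably the part the paper accelerates with conjugate gradient iterations, which this sketch does not implement. Problem sizes, the regularization constant, and names are illustrative assumptions.

```python
import numpy as np

def focuss(A, b, p=1.0, iterations=30, reg=1e-8):
    """Basic FOCUSS: repeatedly solve a weighted minimum-norm problem with
    weights built from the current iterate, which drives most entries of x
    toward zero (sparse solutions of the underdetermined system A x = b)."""
    m, n = A.shape
    x = A.T @ np.linalg.solve(A @ A.T, b)          # start from the minimum-norm solution
    for _ in range(iterations):
        w2 = np.abs(x) ** (2.0 - p)                # squared weights |x_i|^(2 - p)
        AW2 = A * w2                               # A @ diag(w2), via column scaling
        x = w2 * (A.T @ np.linalg.solve(AW2 @ A.T + reg * np.eye(m), b))
    return x

# Usage: recover a sparse vector from a few random measurements
rng = np.random.default_rng(0)
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
x_hat = focuss(A, b)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))   # relative error, typically small here
```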

Large steps in cloth simulation

by David Baraff, Andrew Witkin - SIGGRAPH 98 Conference Proceedings, 1998
"... The bottle-neck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particle ..."
Cited by 576 (5 self)
gradient iterations, which is typically small. The resulting simulation system is significantly faster than previous accounts of cloth simulation systems in the literature. Keywords: cloth, simulation, constraints, implicit integration, physically-based modeling.
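Since the entry turns on solving the implicit time step with a small number of (modified) conjugate gradient iterations, here is the textbook unconstrained CG solver for reference; the paper's constraint-aware modification and the cloth system matrix itself are not reproduced, and the solver name and example values are illustrative.

```python
import numpy as np

def conjugate_gradient(apply_A, b, tol=1e-8, max_iterations=None):
    """Standard conjugate gradient for a symmetric positive definite
    system A x = b, given only a routine that applies A to a vector."""
    x = np.zeros_like(b)
    r = b - apply_A(x)                 # residual
    p = r.copy()                       # search direction
    rr = r @ r
    if max_iterations is None:
        max_iterations = len(b)
    for _ in range(max_iterations):
        Ap = apply_A(p)
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

# Usage on a small SPD system (stand-in for the implicit-integration matrix)
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
A = M @ M.T + 200.0 * np.eye(200)      # well-conditioned SPD matrix
b = rng.standard_normal(200)
x = conjugate_gradient(lambda v: A @ v, b)
print(np.linalg.norm(A @ x - b))       # residual norm, should be at or below the tolerance
```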

CGIHT: Conjugate Gradient Iterative Hard Thresholding for Compressed Sensing and Matrix Completion

by Jeffrey D. Blanchard, Jared Tanner, Ke Wei, 2014
"... We introduce the Conjugate Gradient Iterative Hard Thresholding (CGIHT) family of algorithms for the efficient solution of constrained underdetermined linear systems of equations arising in compressed sensing, row sparse approximation, and matrix completion. CGIHT is designed to balance the low per ..."
Cited by 5 (3 self)
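As a simpler relative of CGIHT, a normalized iterative hard thresholding sketch: the gradient step plus hard thresholding shown here is the part CGIHT builds on, with its additional conjugate gradient refinement on the detected support omitted, as is the step-size safeguard of the full NIHT algorithm. Sizes and names are illustrative.

```python
import numpy as np

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v and zero the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def normalized_iht(A, b, k, iterations=200):
    """Normalized iterative hard thresholding for k-sparse recovery from b = A x."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iterations):
        g = A.T @ (b - A @ x)                      # negative gradient of 0.5*||b - A x||^2
        support = np.nonzero(x)[0]
        if support.size == 0:
            support = np.argpartition(np.abs(g), -k)[-k:]
        gs = np.zeros(n)
        gs[support] = g[support]                   # gradient restricted to the current support
        denom = np.linalg.norm(A @ gs) ** 2
        mu = (np.linalg.norm(gs) ** 2 / denom) if denom > 0 else 1.0
        x = hard_threshold(x + mu * g, k)          # gradient step, then keep the k largest entries
    return x

# Usage: recover an 8-sparse vector from 100 Gaussian measurements
rng = np.random.default_rng(0)
m, n, k = 100, 256, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
print(np.linalg.norm(normalized_iht(A, b, k) - x_true))   # should be near zero
```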

A scaled conjugate gradient algorithm for fast supervised learning

by Martin F. Møller - NEURAL NETWORKS, 1993
"... A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural netwo ..."
Cited by 451 (0 self)
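For context on the algorithm family, a generic Polak-Ribière nonlinear conjugate gradient minimizer with a simple backtracking line search; Møller's scaled variant, which replaces the line search with a scaling mechanism, is not reproduced here, and the test function is an arbitrary choice rather than a neural-network objective.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, iterations=2000, tol=1e-8):
    """Polak-Ribiere(+) nonlinear conjugate gradient with Armijo backtracking."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g
    for _ in range(iterations):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                   # safeguard: fall back to steepest descent
            d = -g
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:   # backtracking (Armijo) line search
            t *= 0.5
            if t < 1e-16:
                break
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # Polak-Ribiere+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function
f = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] ** 2),
                           200 * (v[1] - v[0] ** 2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))               # should approach (1, 1)
```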

Policy gradient methods for reinforcement learning with function approximation.

by Richard S. Sutton, David McAllester, Satinder Singh, Yishay Mansour - In NIPS, 1999
"... Abstract Function approximation is essential to reinforcement learning, but the standard approach of approximating a value function and determining a policy from it has so far proven theoretically intractable. In this paper we explore an alternative approach in which the policy is explicitly repres ..."
Cited by 439 (20 self)
that the gradient can be written in a form suitable for estimation from experience aided by an approximate action-value or advantage function. Using this result, we prove for the first time that a version of policy iteration with arbitrary differentiable function approximation is convergent to a locally optimal
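A minimal likelihood-ratio (REINFORCE-style) policy-gradient sketch on a toy bandit, showing an explicitly represented policy updated from sampled experience; the paper's results about compatible function approximation are not covered, and the reward values, learning rate, and baseline are invented for the example.

```python
import numpy as np

def reinforce_bandit(reward_means, episodes=5000, lr=0.1, seed=0):
    """Likelihood-ratio policy gradient on a multi-armed bandit with an
    explicit softmax policy; a running-average baseline reduces variance."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(len(reward_means))             # one preference parameter per arm
    baseline = 0.0
    probs = np.full(len(reward_means), 1.0 / len(reward_means))
    for _ in range(episodes):
        probs = np.exp(theta - theta.max())
        probs /= probs.sum()                        # softmax policy
        a = rng.choice(len(theta), p=probs)
        r = reward_means[a] + 0.1 * rng.standard_normal()    # noisy reward
        grad_log = -probs
        grad_log[a] += 1.0                          # gradient of log pi(a) w.r.t. theta
        theta += lr * (r - baseline) * grad_log     # stochastic gradient ascent on expected reward
        baseline += 0.05 * (r - baseline)           # running-average baseline
    return probs

print(reinforce_bandit(np.array([0.1, 0.9, 0.4])))  # mass should concentrate on the best arm (index 1)
```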

DISCREPANCY PRINCIPLE FOR STATISTICAL INVERSE PROBLEMS WITH APPLICATION TO CONJUGATE GRADIENT ITERATION

by G. Blanchard, P. Mathé
"... Dedicated to Ulrich Tautenhahn, a friend who passed away too early at the age of 60. Abstract. The authors discuss the use of the discrepancy principle for statistical inverse problems, when the underlying operator is of trace class. Under this assumption the discrepancy principle is well-defined, however a plain use of it may occasionally ..."
Cited by 2 (1 self)
occasionally fail and it will yield sub-optimal rates. Therefore, a modification of the discrepancy is introduced, which takes into account both of the above deficiencies. For a variety of linear regularization schemes as well as for conjugate gradient iteration it is shown to yield order optimal a priori
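To illustrate the classical deterministic discrepancy principle that the paper starts from, a Landweber iteration stopped as soon as the residual norm drops to tau times the assumed noise level delta; the statistical trace-class setting, the proposed modification, and the conjugate gradient variant studied in the paper are not reproduced, and the test problem is an invented, mildly ill-posed example.

```python
import numpy as np

def landweber_with_discrepancy(A, y_delta, delta, tau=1.1, step=None, max_iterations=100_000):
    """Landweber iteration x <- x + step * A^T (y_delta - A x), stopped by the
    classical discrepancy principle: quit once ||A x - y_delta|| <= tau * delta,
    where delta bounds the data noise."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2     # step within the convergence range of Landweber
    x = np.zeros(A.shape[1])
    for k in range(max_iterations):
        residual = y_delta - A @ x
        if np.linalg.norm(residual) <= tau * delta:
            return x, k                             # stop: residual is at the noise level
        x = x + step * A.T @ residual
    return x, max_iterations

# Usage: mildly ill-posed problem with a known noise level
rng = np.random.default_rng(0)
n = 200
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(1.0 / np.arange(1, n + 1) ** 2) @ V.T     # rapidly decaying singular values
x_true = V[:, 0] + 0.5 * V[:, 1]                          # "smooth" true solution
noise = rng.standard_normal(n)
delta = 1e-4
y_delta = A @ x_true + delta * noise / np.linalg.norm(noise)
x_hat, stopped_at = landweber_with_discrepancy(A, y_delta, delta)
print(stopped_at, np.linalg.norm(x_hat - x_true))         # stopping index and reconstruction error
```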