Results 1–10 of 3,212
CONVERGENCE ANALYSIS OF GRADIENT ITERATIONS FOR THE SYMMETRIC EIGENVALUE PROBLEM
"... Abstract. Gradient iterations for the Rayleigh quotient are simple and robust solvers to determine a few of the smallest eigenvalues together with the associated eigenvectors of (generalized) matrix eigenvalue problems for symmetric matrices. Sharp convergence estimates for the Ritz values and Ritz ..."
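The gradient iteration this abstract refers to can be sketched on a tiny dense matrix (an illustrative toy, not the paper's solvers; the matrix, starting vector, and fixed step size are made up for demonstration):

```python
def matvec(A, x):
    """Dense symmetric matrix-vector product."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def rayleigh_gradient_iteration(A, x, step=0.3, iters=1000):
    """Steepest descent on the Rayleigh quotient rho(x) = (x'Ax)/(x'x).

    The residual r = A x - rho*x is proportional to the gradient of rho on
    the unit sphere, so stepping against it drives x toward an eigenvector
    of the smallest eigenvalue (for a small enough fixed step)."""
    for _ in range(iters):
        nrm = sum(xi * xi for xi in x) ** 0.5
        x = [xi / nrm for xi in x]                    # renormalize each sweep
        Ax = matvec(A, x)
        rho = sum(ai * xi for ai, xi in zip(Ax, x))   # current Ritz value
        r = [ai - rho * xi for ai, xi in zip(Ax, x)]  # eigen-residual
        x = [xi - step * ri for xi, ri in zip(x, r)]
    return rho, x

# this A has eigenvalues 1 and 3; the Ritz value should settle near 1
rho, _ = rayleigh_gradient_iteration([[2.0, 1.0], [1.0, 2.0]], [1.0, 0.1])
```

The sharp convergence estimates the paper derives concern exactly how fast such Ritz values approach the smallest eigenvalue.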
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
"... We describe and analyze a simple and effective stochastic subgradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ɛ is Õ(1/ɛ), where each iteration operates on a singl ..."
Cited by 542 (20 self)
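The stochastic subgradient scheme the abstract describes can be sketched minimally (hinge loss with step size 1/(λt); the toy data, the absence of a bias term, and the omission of the optional projection step are simplifications, not the paper's full method):

```python
import random

def pegasos(data, labels, lam=0.1, iters=2000, seed=0):
    """Minimal Pegasos-style stochastic subgradient descent for a linear SVM.

    Each iteration samples one example and takes a subgradient step on
    lam/2 * ||w||^2 + max(0, 1 - y*<w, x>) with the schedule eta_t = 1/(lam*t)."""
    rng = random.Random(seed)
    w = [0.0] * len(data[0])
    for t in range(1, iters + 1):
        i = rng.randrange(len(data))
        x, y = data[i], labels[i]
        eta = 1.0 / (lam * t)
        margin = y * sum(wj * xj for wj, xj in zip(w, x))
        w = [(1.0 - eta * lam) * wj for wj in w]        # shrink: regularizer part
        if margin < 1.0:                                # hinge loss is active
            w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

data = [[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]]
labels = [1, 1, -1, -1]
w = pegasos(data, labels)
```

Each iteration touches a single example, which is what gives the Õ(1/ɛ) iteration bound its practical bite on large training sets.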
An iterative image registration technique with an application to stereo vision
In IJCAI, 1981
"... Image registration finds a variety of applications in computer vision. Unfortunately, traditional image registration techniques tend to be costly. We present a new image registration technique that makes use of the spatial intensity gradient of the images to find a good match using a type of Newton ..."
Cited by 2897 (30 self)
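The Newton-type use of the spatial intensity gradient can be sketched in one dimension (a toy with continuous "images" as functions, not the paper's windowed image implementation; the Gaussian signal and sample grid are made up):

```python
import math

def estimate_shift(image, template, xs, d0=0.0, iters=25, eps=1e-5):
    """1-D Lucas-Kanade-style registration: find the translation d that
    minimizes sum_x (image(x + d) - template(x))^2, using the spatial
    intensity gradient in a Gauss-Newton update."""
    d = d0
    for _ in range(iters):
        num = den = 0.0
        for x in xs:
            # central-difference spatial gradient d(image)/dx at x + d
            g = (image(x + d + eps) - image(x + d - eps)) / (2 * eps)
            err = template(x) - image(x + d)
            num += g * err
            den += g * g
        d += num / den                      # Newton-type step for the shift
    return d

image = lambda x: math.exp(-x * x)          # a smooth 1-D "image"
template = lambda x: image(x + 0.3)         # same image shifted; true d = 0.3
xs = [i * 0.1 - 1.0 for i in range(21)]     # sample positions
d = estimate_shift(image, template, xs)
```

The cost advantage over exhaustive matching comes from this closed-form per-iteration update instead of a search over candidate displacements.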
BLOCK STOCHASTIC GRADIENT ITERATION FOR CONVEX AND NONCONVEX OPTIMIZATION
2015
"... The stochastic gradient (SG) method can quickly solve a problem with a large number of components in the objective, or a stochastic optimization problem, to a moderate accuracy. The block coordinate descent/update (BCD) method, on the other hand, can quickly solve problems with multiple (blocks of ..."
Cited by 5 (0 self)
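The block-update idea can be sketched in its simplest deterministic form (cyclic coordinate descent on a convex quadratic; the paper's method is the stochastic block combination of SG and BCD, which this toy does not reproduce):

```python
def block_coordinate_descent(A, b, sweeps=50):
    """Cyclic coordinate descent on the convex quadratic
    f(x) = 0.5*x'Ax - b'x  (A symmetric positive definite).

    Each inner step minimizes f exactly over one coordinate while the
    others stay fixed; a full sweep visits every coordinate once."""
    n = len(b)
    x = [0.0] * n
    for _ in range(sweeps):
        for i in range(n):
            off = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - off) / A[i][i]   # exact one-coordinate minimizer
    return x

# the minimizer solves A x = b; here that is x = [0.2, 0.4]
x = block_coordinate_descent([[3.0, 1.0], [1.0, 2.0]], [1.0, 1.0])
```

The paper's contribution is to replace these exact per-block solves with cheap stochastic gradient steps while keeping block-wise convergence guarantees.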
Improved FOCUSS Method With Conjugate Gradient Iterations
"... Abstract—FOCal Underdetermined System Solver (FOCUSS) is a powerful tool for sparse representation and underdetermined inverse problems. In this correspondence, we strengthen the FOCUSS method with the following main contributions: 1) we give a more rigorous derivation of the FOCUSS for the sparsit ..."
Cited by 2 (1 self)
FOCUSS for the sparsity parameter 0 &lt; p ≤ 1 by a nonlinear transform and 2) we develop the CGFOCUSS by incorporating the conjugate gradient (CG) method into the FOCUSS, which significantly reduces the computational cost relative to the standard FOCUSS and extends its applicability to large-scale problems. We justify the CG
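The contribution here is plugging conjugate gradient into FOCUSS's inner linear solves; a textbook CG for a symmetric positive definite system looks like this (a generic sketch, not the CGFOCUSS algorithm itself; the 2×2 system is made up):

```python
def conjugate_gradient(A, b, tol=1e-12):
    """Textbook conjugate gradient for A x = b, A symmetric positive
    definite. Each iteration needs only one matrix-vector product, which
    is what makes CG attractive inside iterative reweighting schemes
    such as FOCUSS on large problems."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                              # residual b - A x at x = 0
    p = list(r)
    rr = sum(ri * ri for ri in r)
    for _ in range(n):                       # exact in at most n steps
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rr / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rr_new = sum(ri * ri for ri in r)
        if rr_new < tol:
            break
        p = [ri + (rr_new / rr) * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return x

# A x = b with A = [[4,1],[1,3]], b = [1,2]  ->  x = [1/11, 7/11]
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```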
Large steps in cloth simulation
In SIGGRAPH 98 Conference Proceedings, 1998
"... The bottleneck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particle ..."
Cited by 576 (5 self)
gradient iterations, which is typically small. The resulting simulation system is significantly faster than previous accounts of cloth simulation systems in the literature. Keywords—Cloth, simulation, constraints, implicit integration, physically-based modeling.
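The entry's key point, implicit integration for stability at large time steps, can be illustrated on a single 1-D spring (a toy comparison, not the paper's constrained conjugate-gradient cloth solver; the stiffness and step values are arbitrary):

```python
def explicit_step(x, v, h, ks=100.0, m=1.0):
    """Forward Euler for a 1-D spring x'' = -(ks/m)*x; blows up once the
    step h exceeds the stability limit (about 2/sqrt(ks/m) = 0.2 here)."""
    return x + h * v, v - h * (ks / m) * x

def implicit_step(x, v, h, ks=100.0, m=1.0):
    """Backward Euler: v' = v - h*(ks/m)*x' with x' = x + h*v', solved in
    closed form for v'; stable for any step size (at the cost of damping)."""
    v_new = (v - h * (ks / m) * x) / (1.0 + h * h * (ks / m))
    return x + h * v_new, v_new

h = 0.5                                   # far beyond the explicit limit
xe, ve = 1.0, 0.0
for _ in range(5):
    xe, ve = explicit_step(xe, ve, h)     # diverges rapidly
xi, vi = 1.0, 0.0
for _ in range(100):
    xi, vi = implicit_step(xi, vi, h)     # decays toward rest
```

In a real cloth system the implicit update is a large sparse linear solve rather than this scalar closed form, which is where the paper's modified conjugate gradient iterations come in.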
CGIHT: Conjugate Gradient Iterative Hard Thresholding for Compressed Sensing and Matrix Completion
2014
"... We introduce the Conjugate Gradient Iterative Hard Thresholding (CGIHT) family of algorithms for the efficient solution of constrained underdetermined linear systems of equations arising in compressed sensing, row sparse approximation, and matrix completion. CGIHT is designed to balance the low per ..."
Cited by 5 (3 self)
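The base scheme CGIHT builds on can be sketched as plain iterative hard thresholding (the CG refinement — conjugate gradient steps restricted to the current support — is omitted here, and the small matrix and measurements are fabricated for illustration):

```python
def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero the rest."""
    keep = set(sorted(range(len(x)), key=lambda i: -abs(x[i]))[:k])
    return [xi if i in keep else 0.0 for i, xi in enumerate(x)]

def iht(A, y, k, step=1.0, iters=50):
    """Plain iterative hard thresholding for y = A x with x k-sparse:
    a gradient step on ||y - A x||^2 followed by projection onto the
    set of k-sparse vectors."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
        x = hard_threshold([xj + step * gj for xj, gj in zip(x, g)], k)
    return x

A = [[1.0, 0.0, 0.0, 0.5],
     [0.0, 1.0, 0.0, 0.5],
     [0.0, 0.0, 1.0, 0.5]]
y = [0.0, 0.0, 2.0]            # = A x_true with x_true = [0, 0, 2, 0]
x = iht(A, y, k=1)
```

CGIHT's balance of "low per-iteration cost" refers to replacing the fixed gradient step above with a few CG iterations on the detected support.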
A scaled conjugate gradient algorithm for fast supervised learning
In NEURAL NETWORKS, 1993
"... A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural netwo ..."
Cited by 451 (0 self)
Policy gradient methods for reinforcement learning with function approximation.
In NIPS, 1999
"... Abstract Function approximation is essential to reinforcement learning, but the standard approach of approximating a value function and determining a policy from it has so far proven theoretically intractable. In this paper we explore an alternative approach in which the policy is explicitly repres ..."
Cited by 439 (20 self)
that the gradient can be written in a form suitable for estimation from experience, aided by an approximate action-value or advantage function. Using this result, we prove for the first time that a version of policy iteration with arbitrary differentiable function approximation is convergent to a locally optimal
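The "gradient estimated from experience" idea can be sketched on the smallest possible example, a two-armed bandit with a one-parameter policy (an illustrative REINFORCE-style toy, far simpler than the function-approximation setting the paper analyzes; rewards and learning rate are made up):

```python
import math
import random

def reinforce_bandit(rewards, iters=3000, lr=0.1, seed=0):
    """REINFORCE on a two-armed bandit with a sigmoid policy over one
    parameter: theta moves along reward * grad log pi(action), a sampled
    estimate of the policy-gradient direction."""
    rng = random.Random(seed)
    theta = 0.0                              # logit of picking arm 1
    for _ in range(iters):
        p1 = 1.0 / (1.0 + math.exp(-theta))
        a = 1 if rng.random() < p1 else 0
        r = rewards[a]
        grad_log = a - p1                    # d/dtheta of log pi(a | theta)
        theta += lr * r * grad_log
    return 1.0 / (1.0 + math.exp(-theta))    # final probability of arm 1

p1 = reinforce_bandit([0.0, 1.0])            # arm 1 pays 1, arm 0 pays 0
```

Because the policy is represented explicitly, the update never needs a value function, though the paper shows an approximate one can reduce the variance of this estimate.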
DISCREPANCY PRINCIPLE FOR STATISTICAL INVERSE PROBLEMS WITH APPLICATION TO CONJUGATE GRADIENT ITERATION
"... who passed away too early at the age of 60. Abstract. The authors discuss the use of the discrepancy principle for statistical inverse problems, when the underlying operator is of trace class. Under this assumption the discrepancy principle is welldefined, however a plain use of it may occasionally ..."
Cited by 2 (1 self)
occasionally fail and it will yield suboptimal rates. Therefore, a modification of the discrepancy is introduced, which takes into account both of the above deficiencies. For a variety of linear regularization schemes as well as for conjugate gradient iteration it is shown to yield order optimal a priori
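The plain discrepancy principle the abstract starts from can be sketched with Landweber iteration as the regularization scheme (a deterministic toy, not the statistical trace-class setting or the paper's modified rule; the forward map, data, and noise level are made up):

```python
def landweber_discrepancy(A, y, delta, tau=1.1, step=0.5, max_iters=100000):
    """Landweber iteration x += step * A^T (y - A x) for A x ~ y, stopped
    by the plain discrepancy principle: quit as soon as the residual norm
    drops to tau * delta, where delta bounds the data noise."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for k in range(1, max_iters + 1):
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        res = sum(ri * ri for ri in r) ** 0.5
        if res <= tau * delta:               # discrepancy principle fires
            return x, k
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [xj + step * gj for xj, gj in zip(x, g)]
    return x, max_iters

A = [[1.0, 0.0], [0.0, 0.1]]     # mildly ill-conditioned forward map
y = [1.02, 0.06]                 # noisy data for x_true = [1, 1]
x, k = landweber_discrepancy(A, y, delta=0.05)
```

Stopping early like this is itself the regularization: iterating past the noise level would start fitting the noise, which is exactly the failure mode the paper's modified discrepancy rule is designed to control.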