Results 1 - 10 of 2,835

An introduction to the conjugate gradient method without the agonizing pain

by Jonathan Richard Shewchuk , 1994
Abstract - Cited by 481 (3 self)
Abstract not found
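Since no abstract is indexed for this entry, here is a minimal sketch of the textbook conjugate gradient iteration for a symmetric positive definite system, the method the tutorial explains; the 2x2 test matrix, tolerance, and function name below are illustrative choices, not taken from the paper.

import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Textbook CG for a symmetric positive definite matrix A."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                      # residual
    p = r.copy()                       # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# Illustrative SPD system (not from the paper).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))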

Block preconditioning for the conjugate gradient method

by P. Concus, G. H. Golub, G. Meurant , 1982
"... Abstract. Block preconditionings for the conjugate gradient method are investigated for solving positive definite block tridiagonal systems of linear equations arising from discretization of boundary value problems for elliptic partial differential equations. The preconditionings rest on the use of ..."
Abstract - Cited by 102 (5 self)
properties of tridiagonal matrix inverses, can be computationally more efficient for the same computer storage than other preconditionings, including the popular point incomplete Cholesky factorization. Key words: conjugate gradient method, elliptic partial differential equations, incomplete factorization
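The block preconditioners studied in the paper are not reproduced here; as a rough sketch of where a preconditioner enters the conjugate gradient iteration, the following uses a simple Jacobi (diagonal) preconditioner as a stand-in, on an illustrative matrix.

import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-8, max_iter=1000):
    """Preconditioned CG; M_inv applies an approximate inverse of A."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)                       # preconditioned residual
    p = z.copy()
    rz_old = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz_old) * p
        rz_old = rz_new
    return x

# Jacobi (diagonal) preconditioner as a simple stand-in for the block
# preconditioners discussed in the paper.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
d = np.diag(A)
print(preconditioned_cg(A, b, lambda r: r / d))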

A scaled conjugate gradient algorithm for fast supervised learning

by Martin F. Møller - NEURAL NETWORKS , 1993
"... A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural network ..."
Abstract - Cited by 451 (0 self)
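A central ingredient of SCG is replacing the line search with second-order information obtained from a finite difference of gradients along the current search direction; the sketch below shows that approximation on a quadratic stand-in for a network's error function (the loss, sigma, and dimensions are illustrative assumptions).

import numpy as np

def curvature_along(grad, w, p, sigma=1e-4):
    """Finite-difference approximation of the Hessian-vector product
    E''(w) @ p, the second-order information SCG uses in place of a
    line search."""
    s = sigma / np.linalg.norm(p)
    return (grad(w + s * p) - grad(w)) / s

# Illustrative quadratic "error" E(w) = 0.5 w^T A w - b^T w with gradient
# A w - b, standing in for a neural network error function.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda w: A @ w - b

w = np.zeros(2)
p = -grad(w)                        # initial steepest-descent direction
print(curvature_along(grad, w, p))  # approximates A @ p
print(A @ p)                        # exact value, for comparison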

LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares

by Christopher C. Paige, Michael A. Saunders - ACM Trans. Math. Software , 1982
"... An iterative method is given for solving Ax = b and min ||Ax - b||_2, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical ..."
Abstract - Cited by 653 (21 self)
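SciPy ships an LSQR implementation (scipy.sparse.linalg.lsqr); a small usage sketch on a made-up sparse least-squares problem follows, assuming a reasonably recent SciPy.

import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

# Illustrative over-determined sparse least-squares problem (made up here).
rng = np.random.default_rng(0)
A = sparse_random(200, 50, density=0.05, random_state=rng, format="csr")
b = rng.standard_normal(200)

# LSQR solves min ||Ax - b||_2 via Golub-Kahan bidiagonalization.
x, istop, itn, r1norm = lsqr(A, b, atol=1e-10, btol=1e-10)[:4]
print(f"stop flag {istop} after {itn} iterations, residual {r1norm:.3e}")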

Large steps in cloth simulation

by David Baraff, Andrew Witkin - SIGGRAPH 98 Conference Proceedings , 1998
"... The bottle-neck in most cloth simulation systems is that time steps must be small to avoid numerical instability. This paper describes a cloth simulation system that can stably take large time steps. The simulation system couples a new technique for enforcing constraints on individual cloth particles ..."
Abstract - Cited by 576 (5 self)
as well. The implicit integration method generates a large, unbanded sparse linear system at each time step which is solved using a modified conjugate gradient method that simultaneously enforces particles' constraints. The constraints are always maintained exactly, independent of the number of conjugate

The geometry of algorithms with orthogonality constraints

by Alan Edelman, Tomás A. Arias, Steven T. Smith - SIAM J. MATRIX ANAL. APPL , 1998
"... In this paper we develop new Newton and conjugate gradient algorithms on the Grassmann and Stiefel manifolds. These manifolds represent the constraints that arise in such areas as the symmetric eigenvalue problem, nonlinear eigenvalue problems, electronic structures computations, and signal processing ..."
Abstract - Cited by 640 (1 self)
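The paper develops Newton and conjugate gradient methods with the proper differential geometry; as a much simpler stand-in that only illustrates working under the orthogonality constraint, the sketch below takes projected-gradient steps on the Stiefel manifold with a QR-based retraction, on an invented symmetric-eigenvalue toy problem.

import numpy as np

def qr_retraction(Y):
    """Map a matrix back onto the Stiefel manifold (Y^T Y = I) via QR."""
    Q, R = np.linalg.qr(Y)
    return Q * np.sign(np.diag(R))   # sign fix keeps Q close to Y

def stiefel_gradient_step(Y, egrad, step):
    """One projected-gradient step for min f(Y) subject to Y^T Y = I."""
    # Project the Euclidean gradient onto the tangent space at Y.
    G = egrad - Y @ (Y.T @ egrad + egrad.T @ Y) / 2
    return qr_retraction(Y - step * G)

# Illustrative problem: a two-dimensional invariant subspace of a random
# symmetric matrix, f(Y) = -trace(Y^T A Y), with Euclidean gradient -2 A Y.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
A = (A + A.T) / 2
Y = qr_retraction(rng.standard_normal((8, 2)))
for _ in range(200):
    Y = stiefel_gradient_step(Y, -2 * A @ Y, step=0.1)
print(np.round(Y.T @ Y, 6))          # stays the 2x2 identity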

ATOMIC DECOMPOSITION BY BASIS PURSUIT

by Scott Shaobing Chen , David L. Donoho , Michael A. Saunders , 1995
"... The Time-Frequency and Time-Scale communities have recently developed a large number of overcomplete waveform dictionaries -- stationary wavelets, wavelet packets, cosine packets, chirplets, and warplets, to name a few. Decomposition into overcomplete systems is not unique, and several methods for decomposition ..."
Abstract - Cited by 2728 (61 self)
successfully only because of recent advances in linear programming by interior-point methods. We obtain reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
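The interior-point and conjugate-gradient machinery mentioned in the abstract is not reproduced here; the sketch below only poses the underlying basis pursuit problem, min ||x||_1 subject to Ax = b, as a linear program and hands it to a generic LP solver, with a made-up random dictionary.

import numpy as np
from scipy.optimize import linprog

# Illustrative overcomplete dictionary and a 3-sparse signal (made up here).
rng = np.random.default_rng(0)
n, m = 20, 80
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
x_true[[5, 17, 42]] = [1.5, -2.0, 0.7]
b = A @ x_true

# Basis pursuit: min ||x||_1 s.t. Ax = b, written as an LP by splitting
# x = u - v with u, v >= 0 and minimizing sum(u) + sum(v).
c = np.ones(2 * m)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x = res.x[:m] - res.x[m:]
print("recovered support:", np.flatnonzero(np.abs(x) > 1e-6))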

Reconstruction and Representation of 3D Objects with Radial Basis Functions

by J. C. Carr, R. K. Beatson, J. B. Cherrie, T. J. Mitchell, W. R. Fright, B. C. McCallum, T. R. Evans - Computer Graphics (SIGGRAPH ’01 Conf. Proc.), pages 67–76. ACM SIGGRAPH , 2001
"... We use polyharmonic Radial Basis Functions (RBFs) to reconstruct smooth, manifold surfaces from point-cloud data and to repair incomplete meshes. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. Fast methods for fitting and evaluating RBFs allow ..."
Abstract - Cited by 505 (1 self)
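As a toy illustration of the general RBF interpolation setup (not the paper's fast fitting and evaluation methods, nor its 3D signed-distance construction), the sketch below fits the kernel phi(r) = r plus a linear polynomial to invented 2D scattered data.

import numpy as np

def fit_rbf(centers, values):
    """Fit s(x) = sum_i w_i |x - c_i| + a^T x + b to scattered data, with
    the usual polynomial side conditions appended to the linear system."""
    n, d = centers.shape
    Phi = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    P = np.hstack([centers, np.ones((n, 1))])     # linear polynomial basis
    K = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    coeffs = np.linalg.solve(K, np.concatenate([values, np.zeros(d + 1)]))
    return coeffs[:n], coeffs[n:]

def evaluate_rbf(x, centers, w, poly):
    r = np.linalg.norm(x[None, :] - centers, axis=-1)
    return r @ w + np.append(x, 1.0) @ poly

# Invented 2D scattered data (a stand-in for signed-distance samples).
rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(30, 2))
values = np.sin(centers[:, 0]) + centers[:, 1] ** 2
w, poly = fit_rbf(centers, values)
print(evaluate_rbf(centers[0], centers, w, poly), values[0])  # should match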

A Flexible Inner-Outer Preconditioned GMRES Algorithm

by Youcef Saad , 1993
"... We present a variant of the GMRES algorithm which allows changes in the preconditioning at every step. There are many possible applications of the new algorithm, some of which are briefly discussed. In particular, a result of the flexibility of the new variant is that any iterative method can be used as a preconditioner ..."
Abstract - Cited by 358 (30 self)
preconditioner. For example, the standard GMRES algorithm itself can be used as a preconditioner, as can CGNR (or CGNE), the conjugate gradient method applied to the normal equations. However, the more appealing utilization of the method is in conjunction with relaxation techniques, possibly multi-level techniques
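For reference, a compact sketch of a right-preconditioned flexible GMRES cycle follows, in which the preconditioner is allowed to change at every Arnoldi step; the alternating "preconditioner", test matrix, and restart length are illustrative assumptions, not taken from the paper.

import numpy as np

def fgmres(A, b, precondition, m=30):
    """One restart cycle of flexible GMRES; precondition(v, j) may apply a
    different preconditioner (even an inner iterative solve) at each step j."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x
    beta = np.linalg.norm(r)
    V = np.zeros((n, m + 1))          # Arnoldi basis
    Z = np.zeros((n, m))              # preconditioned directions, kept for the update
    H = np.zeros((m + 1, m))
    V[:, 0] = r / beta
    for j in range(m):
        Z[:, j] = precondition(V[:, j], j)
        w = A @ Z[:, j]
        for i in range(j + 1):        # modified Gram-Schmidt orthogonalization
            H[i, j] = w @ V[:, i]
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:       # (lucky) breakdown
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(j + 2)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[: j + 2, : j + 1], e1, rcond=None)
    return x + Z[:, : j + 1] @ y

# Illustrative system; the preconditioner alternates between the identity and
# a Jacobi sweep simply to show that it may change from step to step.
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 50))
A = G @ G.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x = fgmres(A, b, lambda v, j: v if j % 2 == 0 else v / np.diag(A))
print(np.linalg.norm(A @ x - b))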

SEMIDEFINITE PROGRAMMING RELAXATIONS FOR THE GRAPH PARTITIONING PROBLEM

by Henry Wolkowicz , Qing Zhao , 1999
"... A new semidefinite programming, SDP, relaxation for the general graph partitioning problem, GP, is derived. The relaxation arises from the dual of the (homogenized) Lagrangian dual of an appropriate quadratic representation of GP. The quadratic representation includes a representation of the 0,1 co ..."
Abstract - Cited by 31 (6 self)
-dual interior-point solution technique. A gangster operator is the key to providing an efficient representation of the constraints in the relaxation. An incomplete preconditioned conjugate gradient method is used for solving the large linear systems which arise when finding the Newton direction. Only dual