Results 1–10 of 10,764
Superlinear Convergence And Implicit Filtering
, 1999
"... . In this note we show how the implicit filtering algorithm can be coupled with the BFGS quasiNewton update to obtain a superlinearly convergent iteration if the noise in the objective function decays sufficiently rapidly as the optimal point is approached. We show how known theory for the noisefr ..."
Cited by 25 (3 self)
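As a rough illustration of the quasi-Newton machinery this entry builds on, the sketch below implements a minimal BFGS iteration with a backtracking line search. It is not the paper's implicit-filtering algorithm (which uses difference gradients on noisy functions); the function names, the Rosenbrock test problem, and all parameters are our own illustrative choices.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimal BFGS quasi-Newton iteration with an Armijo backtracking
    line search. Illustrative sketch only, using exact gradients on a
    smooth test problem (the cited paper instead couples the BFGS update
    with implicit-filtering difference gradients on noisy objectives)."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)              # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g             # quasi-Newton search direction
        t = 1.0                # backtracking (Armijo) line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:         # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Rosenbrock function, minimum at (1, 1): a standard smooth test case.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])

x_star = bfgs(f, grad, [-1.2, 1.0])
```

Near the solution the unit step `t = 1.0` is accepted and the iteration exhibits the superlinear local convergence characteristic of BFGS.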
SUPERLINEAR CONVERGENCE OF CONJUGATE GRADIENTS
, 2001
"... We give a theoretical explanation for superlinear convergence behavior observed while solving large symmetric systems of equations using the conjugate gradient method or other Krylov subspace methods. We present a new bound on the relative error after n iterations. This bound is valid in an asympto ..."
Cited by 24 (6 self)
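The superlinear behavior analyzed here can be observed numerically with a few lines of code. The sketch below is a plain conjugate gradient implementation (our own, not from the paper) applied to an SPD matrix with a few isolated large eigenvalues; the residual history typically shows convergence accelerating once the Krylov subspace has "captured" the extreme eigenvalues.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Textbook conjugate gradient for a symmetric positive definite A.
    Returns the solution estimate and the residual-norm history, so the
    (often superlinear) convergence behavior can be inspected."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    history = [np.linalg.norm(r)]
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        history.append(np.linalg.norm(r_new))
        if history[-1] < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x, history

# SPD test matrix with three isolated large eigenvalues: CG convergence
# typically accelerates after those are captured by the Krylov subspace.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
eigs = np.concatenate([np.linspace(1.0, 2.0, 47), [50.0, 80.0, 120.0]])
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(50)

x, res = conjugate_gradient(A, b)
```

Plotting `res` on a log scale shows the bend from an initial linear phase into faster decay, which is the phenomenon these bounds quantify.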
Superlinear convergence of the affine scaling algorithm
, 1993
"... In this paper we show that a variant of the longstep affine scaling algorithm (with variable stepsizes) is twostep superlinearly convergent when applied to general linear programming (LP) problems. Superlinear convergence of the sequence of dual estimates is also established. For homogeneous LP pr ..."
Superlinear Convergence of Interior-Point Algorithms for Semidefinite Programming
 Journal of Optimization Theory and Applications
, 1996
"... We prove the superlinear convergence of the primaldual infeasibleinteriorpoint pathfollowing algorithm proposed recently by Kojima, Shida and Shindoh and the present authors, under two conditions: (1) the SDP problem has a strictly complementary solution, and (2) the size of the central path nei ..."
Cited by 20 (7 self)
A superlinearly convergent trust region bundle method
, 1998
"... Abstract Bundle methods for the minimization of nonsmooth functions have been around for almost 20 years. Numerous variations have been proposed. But until very recently they all suffered from the drawback of only linear convergence. The aim of this paper is to show how exploiting an analogy with S ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
with SQP gives rise to a superlinearly convergent bundle method. Our algorithm features a trust region philosophy and is expected to converge superlinearly even for nonconvex problems. 1 Introduction Our aim will be to present a superlinearly convergent algorithm to solve the problem minx2IRn f (x); (1
Superlinear convergence of an interior-point method despite dependent constraints
 Preprint ANL/MCS-P622-1196, Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Ill.
, 1996
"... Abstract. We show that an interiorpoint method for monotone variational inequalities exhibits superlinear convergence provided that all the standard assumptions hold except for the wellknown assumption that the Jacobian of the active constraints has full rank at the solution. We show that superlin ..."
Cited by 39 (11 self)
On the occurrence of superlinear convergence of exact and inexact Krylov subspace methods
 SIAM Review
, 2003
"... Abstract. Krylov subspace methods often exhibit superlinear convergence. We present a general analytic model which describes this superlinear convergence, when it occurs. We take an invariant subspace approach, so that our results apply also to inexact methods, and to nondiagonalizable matrices. Thu ..."
Cited by 5 (0 self)
On the occurrence of superlinear convergence of exact and inexact Krylov subspace methods
 SIAM Rev
, 2005
"... We present a general analytical model which describes the superlinear convergence of Krylov subspace methods. We take an invariant subspace approach, so that our results apply also to inexact methods, and to nondiagonalizable matrices. Thus, we provide a unified treatment of the superlinear conve ..."
Cited by 29 (9 self)
Superlinear Convergence of Krylov Subspace Methods for Self-Adjoint Problems in Hilbert Space
, 2014
"... The conjugate gradient and minimum residual methods for selfadjoint problems in Hilbert space are considered. Linear and superlinear convergence results both with respect to Q and Rrates are reviewed. New results on `step Qsuperlinear and Rsuperlinear convergence for the minimum residual metho ..."
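For readers unfamiliar with the Q-rate terminology in this entry: a sequence is Q-superlinearly convergent when the ratio of successive errors tends to zero. The toy example below (ours, not from the paper) uses Newton's method on a scalar equation, whose Q-quadratic error decay is the prototype of Q-superlinear behavior.

```python
import numpy as np

# Newton's method on f(x) = x^2 - 2. The errors e_k = |x_k - sqrt(2)|
# satisfy e_{k+1} / e_k -> 0 (in fact decay Q-quadratically), which is
# the defining property of Q-superlinear convergence.
x = 2.0
errors = []
for _ in range(6):
    x = x - (x * x - 2.0) / (2.0 * x)   # Newton step: x - f(x) / f'(x)
    errors.append(abs(x - np.sqrt(2.0)))

# Successive error ratios; stop before rounding error dominates.
ratios = [errors[k + 1] / errors[k] for k in range(3)]
```

The ratios shrink rapidly toward zero, in contrast to a Q-linearly convergent sequence, where they would approach a positive constant.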
Superlinear convergence of a stabilized SQP method to a degenerate solution
 COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
, 1998
"... We describe a slight modification of the wellknown sequential quadratic programming method for nonlinear programming that attains superlinear convergence to a primaldual solution even when the Jacobian of the active constraints is rank deficient at the solution. We show that rapid convergence occu ..."
Cited by 54 (7 self)