Results 1–10 of 54
Interior methods for nonlinear optimization
 SIAM REVIEW
, 2002
Abstract

Cited by 127 (6 self)
Interior methods are an omnipresent, conspicuous feature of the constrained optimization landscape today, but it was not always so. Primarily in the form of barrier methods, interior-point techniques were popular during the 1960s for solving nonlinearly constrained problems. However, their use for linear programming was not even contemplated because of the total dominance of the simplex method. Vague but continuing anxiety about barrier methods eventually led to their abandonment in favor of newly emerging, apparently more efficient alternatives such as augmented Lagrangian and sequential quadratic programming methods. By the early 1980s, barrier methods were almost without exception regarded as a closed chapter in the history of optimization. This picture changed dramatically with Karmarkar’s widely publicized announcement in 1984 of a fast polynomial-time interior method for linear programming; in 1985, a formal connection was established between his method and classical barrier methods. Since then, interior methods have advanced so far, so fast, that their influence has transformed both the theory and practice of constrained optimization. This article provides a condensed, selective look at classical material and recent research about interior methods for nonlinearly constrained optimization.
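The classical log-barrier idea recalled in this abstract can be sketched in a few lines. The toy problem (minimize x² subject to x ≥ 1), the starting point, and the parameter schedule below are our own illustrative choices, not taken from the article:

```python
def barrier_solve(mu_min=1e-6, x=2.0):
    """Log-barrier sketch for: minimize x^2 subject to x >= 1.

    Inner problem: minimize phi(x) = x^2 - mu*log(x - 1) by damped
    Newton steps; the barrier parameter mu is then reduced toward 0,
    so the inner minimizers trace a path to the constrained solution.
    """
    mu = 1.0
    while mu >= mu_min:
        # Damped Newton on phi'(x) = 2x - mu/(x - 1) = 0.
        for _ in range(50):
            g = 2.0 * x - mu / (x - 1.0)
            h = 2.0 + mu / (x - 1.0) ** 2
            dx = -g / h
            while x + dx <= 1.0:        # stay strictly feasible
                dx *= 0.5
            x += dx
            if abs(g) < 1e-10:
                break
        mu *= 0.1                        # shrink the barrier parameter
    return x

print(barrier_solve())   # approaches the constrained minimizer x* = 1
```

The safeguarded step (halving `dx` until the iterate stays strictly inside the feasible region) is the classical device that keeps the logarithm well defined.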
On the constant positive linear dependence condition and its application to SQP methods
 SIAM Journal on Optimization
, 2000
Abstract

Cited by 50 (3 self)
Abstract. In this paper, we introduce a constant positive linear dependence condition (CPLD), which is weaker than the Mangasarian–Fromovitz constraint qualification (MFCQ) and the constant rank constraint qualification (CRCQ). We show that a limit point of a sequence of approximating Karush–Kuhn–Tucker (KKT) points is a KKT point if the CPLD holds there. We show that a KKT point satisfying the CPLD and the strong second-order sufficiency condition (SSOSC) is an isolated KKT point. We then establish convergence of a general sequential quadratic programming (SQP) method under the CPLD and the SSOSC. Finally, we apply these results to analyze the feasible SQP method proposed by Panier and Tits in 1993 for inequality constrained optimization problems. We establish its global convergence under the SSOSC and a condition slightly weaker than the Mangasarian–Fromovitz constraint qualification, and we prove superlinear convergence of a modified version of this algorithm under the SSOSC and a condition slightly weaker than the linear independence constraint qualification.
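The notion of approximate KKT points driving this result can be illustrated by computing a KKT residual. The toy inequality-constrained problem below is our own example, not taken from the paper:

```python
def kkt_residual(x, lam):
    """KKT residual for the toy problem: minimize x1 + x2
    subject to g(x) = x1^2 + x2^2 - 2 <= 0 (our own illustration).

    Returns the max violation of stationarity, primal/dual feasibility,
    and complementarity; a point is approximately KKT when this is small.
    """
    g = x[0] ** 2 + x[1] ** 2 - 2.0
    grad_f = (1.0, 1.0)
    grad_g = (2.0 * x[0], 2.0 * x[1])
    stationarity = max(abs(grad_f[i] + lam * grad_g[i]) for i in range(2))
    return max(stationarity,
               max(g, 0.0),          # primal feasibility
               max(-lam, 0.0),       # dual feasibility
               abs(lam * g))         # complementarity

# x* = (-1, -1) with multiplier 1/2 satisfies the KKT conditions exactly:
print(kkt_residual((-1.0, -1.0), 0.5))   # 0.0
print(kkt_residual((0.0, 0.0), 0.0))     # nonzero: not a KKT point
```

A sequence of points along which this residual goes to zero is exactly an "approximating KKT" sequence in the sense of the abstract; the CPLD result says its limit point is a genuine KKT point.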
Modifying SQP for degenerate problems
 Preprint ANL/MCS-P699-1097, Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Ill.
, 1997
Abstract

Cited by 38 (5 self)
Abstract. Most local convergence analyses of the sequential quadratic programming (SQP) algorithm for nonlinear programming make strong assumptions about the solution, namely, that the active constraint gradients are linearly independent and that there are no weakly active constraints. In this paper, we establish a framework for variants of SQP that retain the characteristic superlinear convergence rate even when these assumptions are relaxed, proving general convergence results and placing some recently proposed SQP variants in this framework. We discuss the reasons for which implementations of SQP often continue to exhibit good local convergence behavior even when the assumptions commonly made in the analysis are violated. Finally, we describe a new algorithm that formalizes and extends standard SQP implementation techniques, and we prove convergence results for this method also.
AMS subject classifications. 90C33, 90C30, 49M45
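The basic SQP step these analyses study is a Newton step on the KKT system. A minimal sketch for an equality-constrained toy problem (the problem data, starting point, and small dense solver are our own illustrative assumptions):

```python
def solve3(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n + 1):
                A[r][c] -= f * A[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (A[k][n] - sum(A[k][c] * x[c] for c in range(k + 1, n))) / A[k][k]
    return x

def sqp_step(x, lam):
    """One SQP (Newton-KKT) step for: minimize x1^2 + x2^2
    subject to c(x) = x1 + x2 - 1 = 0 (our own illustration).

    Solves   [H  A^T] [dx  ]   [-grad_L]
             [A   0 ] [dlam] = [-c     ]
    with H the Hessian of the Lagrangian and A the constraint Jacobian.
    """
    grad_L = [2.0 * x[0] + lam, 2.0 * x[1] + lam]
    c = x[0] + x[1] - 1.0
    K = [[2.0, 0.0, 1.0],
         [0.0, 2.0, 1.0],
         [1.0, 1.0, 0.0]]
    d = solve3(K, [-grad_L[0], -grad_L[1], -c])
    return [x[0] + d[0], x[1] + d[1]], lam + d[2]

x, lam = sqp_step([1.0, 0.0], 0.0)
print(x)   # [0.5, 0.5] -- exact in one step for this quadratic problem
```

When the active constraint gradients are linearly dependent, the KKT matrix above becomes singular; the degenerate variants this paper studies modify exactly this step.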
Stability in the presence of degeneracy and error estimation
 Math. Program
Local behavior of an iterative framework for generalized equations with nonisolated solutions
 Math. Program., Ser. A
, 2002
An interior point method for mathematical programs with complementarity constraints (MPCCs)
 SIAM Journal on Optimization
, 2003
Abstract

Cited by 24 (1 self)
Interior-point methods for nonlinear programs (NLPs) are adapted for the solution of mathematical programs with complementarity constraints (MPCCs). The constraints of the MPCC are suitably relaxed so as to guarantee a strictly feasible interior for the inequality constraints. The standard primal-dual algorithm is adapted with a modified step calculation. The algorithm is shown to be superlinearly convergent in a neighborhood of the solution set under the assumptions of MPCC-LICQ, strong stationarity, and upper-level strict complementarity. The modification can easily be accommodated within most nonlinear programming interior-point algorithms with identical local behavior. Numerical experience is also presented and is promising for the proposed method.
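The relaxation idea in this abstract can be sketched as follows. This is a rough, Scholtes-style illustration: the complementarity condition 0 ≤ x1 ⊥ x2 ≥ 0 is replaced by x1·x2 ≤ t with t driven to zero, and scipy's SLSQP solver stands in for the paper's interior-point method. The toy MPCC, the relaxation schedule, and the starting point are our own assumptions:

```python
from scipy.optimize import minimize

def solve_relaxed_mpcc():
    """Relaxation scheme for the toy MPCC (our own illustration):

        minimize (x1 - 1)^2 + (x2 - 1)^2
        s.t.     x1 >= 0, x2 >= 0, x1 * x2 = 0   (complementarity)

    Complementarity is relaxed to x1 * x2 <= t, so each relaxed NLP
    has a strictly feasible interior; t is then driven toward 0.
    """
    x = [1.0, 0.0]                       # asymmetric start avoids a saddle
    for t in [1e-1, 1e-2, 1e-3, 1e-4]:
        res = minimize(
            lambda z: (z[0] - 1.0) ** 2 + (z[1] - 1.0) ** 2,
            x,
            method="SLSQP",
            bounds=[(0.0, None), (0.0, None)],
            constraints=[{"type": "ineq",
                          "fun": lambda z, t=t: t - z[0] * z[1]}],
        )
        x = res.x                        # warm-start the next, tighter NLP
    return x

x = solve_relaxed_mpcc()
print(x)   # close to the strongly stationary solution (1, 0)
```

Warm-starting each relaxed NLP from the previous solution mirrors standard practice; without the relaxation, the original feasible set has an empty interior and interior-point methods cannot be applied directly.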
Stabilized sequential quadratic programming for optimization and a stabilized Newton-type method for variational problems without constraint qualifications
, 2007
Abstract

Cited by 24 (14 self)
The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve fast convergence despite possible degeneracy of the constraints of optimization problems, when the Lagrange multipliers associated with a solution are not unique. Superlinear convergence of sSQP had previously been established under the second-order sufficient condition for optimality (SOSC) and the Mangasarian–Fromovitz constraint qualification, or under the strong second-order sufficient condition for optimality (in that case, without constraint qualification assumptions). We prove a stronger superlinear convergence result than the above, assuming SOSC only. In addition, our analysis is carried out in the more general setting of variational problems, for which we introduce a natural extension of sSQP techniques. In the process, we also obtain a new error bound for Karush–Kuhn–Tucker systems for variational problems.
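The stabilized subproblem that distinguishes sSQP from standard SQP can be sketched on a small degenerate example. The toy problem with a duplicated constraint (hence nonunique multipliers), the stabilization choice, and the stopping rule below are our own illustrative assumptions:

```python
def gauss_solve(M, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(b)
    A = [row[:] + [b[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n + 1):
                A[r][c] -= f * A[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (A[k][n] - sum(A[k][c] * x[c] for c in range(k + 1, n))) / A[k][k]
    return x

def ssqp(x=1.0, lam=(0.0, 0.0), iters=20):
    """Stabilized SQP sketch for a degenerate toy problem (our illustration):

        minimize x^2   subject to   c(x) = (x, x) = 0,

    where the duplicated constraint makes the multipliers nonunique and
    the standard Newton-KKT matrix singular. sSQP instead solves

        [H   A^T] [dx  ]   [-grad_L]
        [A  -s*I] [dlam] = [-c     ]

    with stabilization s = norm of the current KKT residual.
    """
    l1, l2 = lam
    for _ in range(iters):
        grad_L = 2.0 * x + l1 + l2
        c1, c2 = x, x
        s = (grad_L ** 2 + c1 ** 2 + c2 ** 2) ** 0.5
        if s < 1e-12:
            break
        K = [[2.0, 1.0, 1.0],
             [1.0, -s, 0.0],
             [1.0, 0.0, -s]]
        d = gauss_solve(K, [-grad_L, -c1, -c2])
        x, l1, l2 = x + d[0], l1 + d[1], l2 + d[2]
    return x, (l1, l2)

x, lam = ssqp()
print(x, lam)   # x -> 0 with lam1 + lam2 -> 0
```

The -s·I block makes the system nonsingular even though the constraint gradients are linearly dependent; as the residual s shrinks, the step approaches the ordinary SQP step.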
An algorithm for degenerate nonlinear programming with rapid local convergence
 SIAM J. Optim
, 2005
Abstract

Cited by 22 (0 self)
Abstract. The paper describes and analyzes an algorithmic framework for solving nonlinear programming problems in which strict complementarity conditions and constraint qualifications are not necessarily satisfied at a solution. The framework is constructed from three main algorithmic ingredients. The first is any conventional method for nonlinear programming that produces estimates of the Lagrange multipliers at each iteration; the second is a technique for estimating the set of active constraint indices; the third is a stabilized Lagrange–Newton algorithm with rapid local convergence properties. Results concerning rapid local convergence and global convergence of the proposed framework are proved. The approach improves on existing approaches in that less restrictive assumptions are needed for convergence and/or the computational workload at each iteration is lower.
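The second ingredient, estimating the set of active constraint indices, can be sketched with a residual-based threshold test. The rule below is a common identification technique and our own illustration; it is not claimed to be the paper's exact test:

```python
def estimate_active_set(x, lam, g, grad_f, grad_g):
    """Active-set identification sketch (our own illustration).

    A constraint g_i(x) <= 0 is declared active when g_i(x) >= -sqrt(rho),
    where rho is the current KKT residual: since sqrt(rho) shrinks more
    slowly than the distance to the solution, the estimate eventually
    matches the true active set near a solution.
    """
    n, m = len(x), len(g)
    stationarity = max(
        abs(grad_f[j] + sum(lam[i] * grad_g[i][j] for i in range(m)))
        for j in range(n))
    comp = max(abs(lam[i] * g[i]) for i in range(m))
    rho = max(stationarity, comp, max(max(gi, 0.0) for gi in g))
    tol = rho ** 0.5
    return {i for i in range(m) if g[i] >= -tol}

# Toy problem: minimize (x1-2)^2 + (x2-2)^2  s.t.  x1 <= 1, x2 <= 3.
# Near the solution (1, 2), only the first constraint is active.
x, lam = (0.99, 2.01), (2.0, 0.0)
g = (x[0] - 1.0, x[1] - 3.0)
grad_f = (2.0 * (x[0] - 2.0), 2.0 * (x[1] - 2.0))
grad_g = ((1.0, 0.0), (0.0, 1.0))
print(estimate_active_set(x, lam, g, grad_f, grad_g))   # {0}
```

Once the active set is (correctly) estimated, the framework treats the active inequalities as equalities and hands the reduced problem to the stabilized Lagrange–Newton method.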
On attraction of Newton-type iterates to multipliers violating second-order sufficiency conditions
, 2009
Abstract

Cited by 20 (15 self)
Assuming that the primal part of the sequence generated by a Newton-type (e.g., SQP) method applied to an equality-constrained problem converges to a solution where the constraints are degenerate, we investigate whether the dual part of the sequence is attracted to those Lagrange multipliers that satisfy the second-order sufficient condition (SOSC) for optimality, or to those multipliers that violate it. This question is relevant for at least two reasons: one is the speed of convergence of standard methods; the other is the applicability of some recently proposed approaches for handling degenerate constraints. We show that for the class of damped Newton methods, convergence of the dual sequence to multipliers satisfying SOSC is unlikely to occur. We support our findings by numerical experiments. We also suggest a simple auxiliary procedure for computing multiplier estimates, which does not have this