Results 1–8 of 8
Tilt stability in nonlinear programming under the Mangasarian–Fromovitz constraint qualification. Kybernetika, 2013. Cited by 4 (3 self).
Local convergence of the method of multipliers for variational and optimization problems under the sole noncriticality assumption. August 2013. Available at http://pages.cs.wisc.edu/~solodov/solodov.html. Cited by 3 (2 self).

We present a local convergence analysis of the method of multipliers for equality-constrained variational problems (in the special case of optimization, also called the augmented Lagrangian method) under the sole assumption that the dual starting point is close to a noncritical Lagrange multiplier (which is weaker than second-order sufficiency). Local superlinear convergence is established under appropriate control of the penalty parameter values. For optimization problems, we additionally demonstrate local linear convergence for sufficiently large fixed penalty parameters. Both exact and inexact versions of the method are considered. Contributions with respect to previous state-of-the-art analyses for equality-constrained problems consist in the extension to the variational setting, in using the weaker noncriticality assumption instead of the usual second-order sufficient optimality condition, and in relaxing the smoothness requirements on the problem data. In the context of optimization problems, this gives the first local convergence results for the augmented Lagrangian method under assumptions that do not include any constraint qualifications and are weaker than the second-order sufficient optimality condition. We also show that the analysis under the noncriticality assumption cannot be extended to the case with inequality constraints, unless the strict complementarity condition is added (this, however, still gives a new result).
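The abstract concerns the classical multiplier (augmented Lagrangian) iteration: minimize the augmented Lagrangian in the primal variables, then update the multiplier by the constraint residual. A minimal sketch of that iteration on a hypothetical quadratic toy problem (not from the paper; the inner minimization here has a closed form because the objective is quadratic and the constraint is linear):

```python
import numpy as np

# Hypothetical toy problem for illustration:
#   minimize f(x) = ||x||^2  subject to  c(x) = x1 + x2 - 1 = 0
# Known solution: x* = (0.5, 0.5), Lagrange multiplier lam* = -1.
A = np.array([[1.0, 1.0]])   # constraint Jacobian, c(x) = A x - b
b = np.array([1.0])

def method_of_multipliers(lam0=0.0, rho=10.0, iters=10):
    """Classical method of multipliers: minimize the augmented Lagrangian
    L_rho(x, lam) = f(x) + lam^T c(x) + (rho/2) ||c(x)||^2 in x,
    then update lam <- lam + rho * c(x)."""
    lam = np.array([lam0])
    for _ in range(iters):
        # Inner step in closed form: grad_x L_rho = 2x + A^T lam + rho A^T (Ax - b) = 0
        H = 2.0 * np.eye(2) + rho * A.T @ A
        x = np.linalg.solve(H, A.T @ (rho * b - lam))
        lam = lam + rho * (A @ x - b)   # multiplier update
    return x, lam

x, lam = method_of_multipliers()
print(x, lam)   # x approaches (0.5, 0.5); lam approaches -1
```

On this toy problem the dual error contracts by the factor 1/(1 + rho) per iteration, illustrating the linear rate for a fixed penalty parameter mentioned in the abstract.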
Local convergence of the method of multipliers for variational and optimization problems under the noncriticality assumption. 2013. Supported by PRONEX–Optimization and by FAPERJ.
CONVERGENCE CONDITIONS FOR NEWTON-TYPE METHODS APPLIED TO COMPLEMENTARITY SYSTEMS WITH NONISOLATED SOLUTIONS, 2015.

Abstract. We consider a class of Newton-type methods for constrained systems of equations that involve complementarity conditions. In particular, at issue are the constrained Levenberg–Marquardt method and the recently introduced Linear-Programming-Newton method, designed for the difficult case when solutions need not be isolated, and the equation mapping need not be differentiable at the solutions. We show that the only structural assumption needed for rapid local convergence of those algorithms is the piecewise error bound, i.e., a local error bound holding for the branches of the solution set resulting from partitions of the biactive complementarity indices. The latter error bound is implied by various piecewise constraint qualifications, including some relatively weak ones. As further applications of our results, we consider Karush–Kuhn–Tucker systems arising from optimization or variational problems, and from generalized Nash equilibrium problems. In the first case, we show convergence under the assumption that the dual part of the solution is a noncritical Lagrange multiplier, and in the second case convergence follows under a certain relaxed constant rank condition. In both cases, this improves on results previously available for the corresponding problems.
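The Levenberg–Marquardt variant relevant here chooses the regularization proportional to the squared residual, so that the method remains fast near nonisolated solutions as long as an error bound holds. A simplified sketch (without the constraint set, on a hypothetical system whose solutions form a line, so every solution is nonisolated and the Jacobian is singular on the solution set):

```python
import numpy as np

# Hypothetical system with a NONISOLATED solution set:
#   F(x) = (x1^2, x1*x2) = 0, solutions = the line {x1 = 0}.
# Near (0, 1) a local error bound dist(x, X*) = O(||F(x)||) holds
# even though the Jacobian is singular at every solution.
def F(x):
    return np.array([x[0] ** 2, x[0] * x[1]])

def J(x):
    return np.array([[2.0 * x[0], 0.0],
                     [x[1], x[0]]])

def levenberg_marquardt(x, iters=15):
    """Levenberg-Marquardt with residual-based regularization mu = ||F(x)||^2,
    the proximity-controlled choice studied in this line of work."""
    for _ in range(iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < 1e-12:
            break
        mu = float(Fx @ Fx)               # regularization from the residual
        Jx = J(x)
        d = np.linalg.solve(Jx.T @ Jx + mu * np.eye(2), -Jx.T @ Fx)
        x = x + d
    return x

x = levenberg_marquardt(np.array([0.5, 1.0]))
print(x)   # first component is driven to ~0: the iterates land on the solution set
```

Note the iterates converge to *some* point of the solution set (the limit depends on the starting point), which is exactly the behavior error-bound-based theory describes.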
SOME COMPOSITE-STEP CONSTRAINED OPTIMIZATION METHODS INTERPRETED VIA THE PERTURBED SEQUENTIAL QUADRATIC PROGRAMMING FRAMEWORK, 2013.

We consider the inexact restoration and the composite-step sequential quadratic programming (SQP) methods, and relate them to the so-called perturbed SQP framework. In particular, iterations of the methods in question are interpreted as certain structured perturbations of the basic SQP iterations. This gives a different insight into the local behaviour of those algorithms, as well as improved or different local convergence and rate-of-convergence results.

Key words: sequential quadratic programming; inexact restoration; perturbed SQP; composite-step SQP; superlinear convergence.
Invited “Discussion Paper” for TOP: CRITICAL LAGRANGE MULTIPLIERS: WHAT WE CURRENTLY KNOW ABOUT THEM, HOW THEY SPOIL OUR LIFE, AND WHAT WE CAN DO ABOUT IT, 2014.

We discuss a certain special subset of Lagrange multipliers, called critical, which usually exist when the multipliers associated to a given solution are not unique. This kind of multiplier appears to be important for a number of reasons, some understood better, some (currently) not fully. What is clear is that Newton and Newton-related methods have an amazingly strong tendency to generate sequences whose dual components converge to critical multipliers. This is quite striking because, typically, the set of critical multipliers is “thin” (the set of noncritical ones is relatively open and dense, meaning that its closure is the whole set). Apart from the mathematical curiosity of understanding the phenomenon for something as classical as the Newton method, the attraction to critical multipliers is relevant computationally. This is because convergence to such multipliers is the reason for slow convergence of the Newton method in degenerate cases, as convergence to noncritical limits (if it were to happen) would have given the superlinear rate. Moreover, the attraction phenomenon shows up not only for the basic Newton method, but also for other related techniques (for example, quasi-Newton methods and the linearly constrained augmented Lagrangian method). In spite of clear computational ...
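The attraction phenomenon is easy to reproduce numerically. A sketch on a hypothetical degenerate toy problem (in the spirit of the examples this literature uses, not taken from the paper): applying plain Newton to the KKT system, the dual iterates head straight for the critical multiplier, and the primal rate degrades to linear:

```python
import numpy as np

# Degenerate toy problem (hypothetical): minimize x^2 subject to x^2 = 0.
# At x* = 0 EVERY lam satisfies the KKT conditions; lam = -1 is the unique
# CRITICAL multiplier (the Lagrangian Hessian 2 + 2*lam vanishes there).
def newton_kkt(x, lam, iters=25):
    """Plain Newton method on the KKT system
       F(x, lam) = (2x(1 + lam), x^2) = 0."""
    for _ in range(iters):
        Fv = np.array([2.0 * x * (1.0 + lam), x ** 2])
        Jv = np.array([[2.0 * (1.0 + lam), 2.0 * x],
                       [2.0 * x, 0.0]])
        dx, dlam = np.linalg.solve(Jv, -Fv)
        x, lam = x + dx, lam + dlam
    return x, lam

x, lam = newton_kkt(1.0, 0.0)
print(x, lam)
# The dual iterates are attracted to the critical multiplier lam = -1,
# and the primal sequence converges only LINEARLY (x_{k+1} = x_k / 2)
# rather than at the superlinear rate a noncritical limit would give.
```

On this example the step equations give exactly x_{k+1} = x_k/2 and 1 + lam_{k+1} = (1 + lam_k)/2, so the slow linear rate and the convergence to the critical multiplier can be read off directly.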
COMBINING STABILIZED SQP WITH THE AUGMENTED LAGRANGIAN ALGORITHM, 2014.

For an optimization problem with general equality and inequality constraints, we propose an algorithm which uses subproblems of the stabilized SQP (sSQP) type for approximately solving subproblems of the augmented Lagrangian method. The motivation is to take advantage of the well-known robust behavior of the augmented Lagrangian algorithm, including on problems with degenerate constraints, and at the same time to try to reduce the overall algorithm locally to sSQP (which gives a fast local convergence rate under weak assumptions). Specifically, the algorithm first verifies whether the primal-dual sSQP step (with unit stepsize) makes good progress towards decreasing the violation of the optimality conditions for the original problem, and if so, makes this step. Otherwise, the primal part of the sSQP direction is used for a linesearch that decreases the augmented Lagrangian, keeping the multiplier estimate fixed for the time being. The overall algorithm has reasonable global convergence guarantees, and inherits the strong convergence rate properties of sSQP under the same weak assumptions. Numerical results on degenerate problems and comparisons with some alternatives are reported.

Key words: stabilized sequential quadratic programming; augmented Lagrangian; superlinear convergence; global convergence.
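For the equality-constrained case, the sSQP step the abstract builds on solves a KKT system with an extra dual regularization block, which keeps the system nonsingular even when multipliers are nonunique. A minimal sketch on a hypothetical toy problem (equality constraints only; the full algorithm's linesearch safeguard and inequality handling are omitted):

```python
import numpy as np

# Illustrative toy problem (hypothetical):
#   minimize ||x||^2  subject to  x1 + x2 - 1 = 0,
# with solution x* = (0.5, 0.5) and multiplier lam* = -1.
A = np.array([[1.0, 1.0]])   # constraint Jacobian

def ssqp_step(x, lam, sigma):
    """One stabilized SQP step: solve the regularized KKT system
         H d + A^T lam_new           = -grad f(x)
         A d - sigma (lam_new - lam) = -c(x)
    for the direction d and the new multiplier, where sigma > 0 is the
    stabilization parameter."""
    H = 2.0 * np.eye(2)                       # Lagrangian Hessian (here: of f)
    g = 2.0 * x                               # grad f(x)
    c = np.array([x[0] + x[1] - 1.0])
    K = np.block([[H, A.T],
                  [A, -sigma * np.eye(1)]])
    rhs = np.concatenate([-g, -c - sigma * lam])
    sol = np.linalg.solve(K, rhs)
    return x + sol[:2], sol[2:]               # (x_new, lam_new)

x, lam = np.array([2.0, -1.0]), np.array([0.0])
for _ in range(8):
    x, lam = ssqp_step(x, lam, sigma=0.1)
print(x, lam)   # converges to (0.5, 0.5) and lam = -1
```

On this toy problem the dual error contracts by the factor sigma/(1 + sigma) per step, so even a fixed moderate sigma converges quickly; the actual method drives sigma to zero with the optimality residual to obtain the superlinear rate.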
STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING: A SURVEY, 2013.

We review the motivation for, the current state of the art in convergence results for, and some open questions concerning the stabilized version of the sequential quadratic programming algorithm for constrained optimization. We also discuss the tools required for its local convergence analysis, globalization challenges, and extensions of the method to more general variational problems.