Results 1 - 8 of 8
REGULARIZED SEQUENTIAL QUADRATIC PROGRAMMING METHODS
, 2011
Abstract

Cited by 15 (3 self)
We present the formulation and analysis of a new sequential quadratic programming (SQP) method for general nonlinearly constrained optimization. The method pairs a primal-dual generalized augmented Lagrangian merit function with a flexible line search to obtain a sequence of improving estimates of the solution. This function is a primal-dual variant of the augmented Lagrangian proposed by Hestenes and Powell in the early 1970s. A crucial feature of the method is that the QP subproblems are convex, but formed from the exact second derivatives of the original problem. This is in contrast to methods that use a less accurate quasi-Newton approximation. Additional benefits of this approach include the following: (i) each QP subproblem is regularized; (ii) the QP subproblem always has a known feasible point; and (iii) a projected gradient method may be used to identify the QP active set when far from the solution.
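For context, the classical Hestenes-Powell augmented Lagrangian that this merit function generalizes is simple to state. The following is a minimal illustrative sketch on a hypothetical toy problem; it is not the paper's primal-dual variant:

```python
import numpy as np

def augmented_lagrangian(f, c, x, y, mu):
    """Classical Hestenes-Powell augmented Lagrangian
    L_A(x; y, mu) = f(x) - y^T c(x) + (1/(2*mu)) * ||c(x)||^2
    for an equality-constrained problem: min f(x) s.t. c(x) = 0."""
    cx = c(x)
    return f(x) - y @ cx + (0.5 / mu) * (cx @ cx)

# Hypothetical toy problem: minimize x0^2 + x1^2 subject to x0 + x1 - 2 = 0.
f = lambda x: x @ x
c = lambda x: np.array([x[0] + x[1] - 2.0])

x_star = np.array([1.0, 1.0])   # solution
y_star = np.array([2.0])        # exact multiplier (grad f = J^T y at x*)

# At a feasible point the multiplier and penalty terms vanish,
# so the merit value equals f(x*).
val = augmented_lagrangian(f, c, x_star, y_star, mu=0.1)
```

At an infeasible point both the linear multiplier term and the quadratic penalty contribute, which is what drives iterates toward feasibility.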
A GLOBALLY CONVERGENT STABILIZED SQP METHOD
, 2013
Abstract

Cited by 9 (1 self)
Sequential quadratic programming (SQP) methods are a popular class of methods for nonlinearly constrained optimization. They are particularly effective for solving a sequence of related problems, such as those arising in mixed-integer nonlinear programming and the optimization of functions subject to differential equation constraints. Recently, there has been considerable interest in the formulation of stabilized SQP methods, which are specifically designed to handle degenerate optimization problems. Existing stabilized SQP methods are essentially local, in the sense that both the formulation and analysis focus on the properties of the methods in a neighborhood of a solution. A new SQP method is proposed that has favorable global convergence properties yet, under suitable assumptions, is equivalent to a variant of the conventional stabilized SQP method in the neighborhood of a solution. The method combines a primal-dual generalized augmented Lagrangian function with a flexible line search to obtain a sequence …
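As background for this class of methods, a single plain (unstabilized, equality-constrained) SQP step amounts to solving one KKT linear system. This toy sketch illustrates that basic building block, not the stabilized method proposed in the paper:

```python
import numpy as np

# Hypothetical toy problem:
# minimize f(x) = x0^2 + x1^2  subject to  c(x) = x0 + x1 - 2 = 0.
def grad_f(x): return 2.0 * x
def c(x): return np.array([x[0] + x[1] - 2.0])
H = 2.0 * np.eye(2)             # exact Hessian of the Lagrangian (constant here)
J = np.array([[1.0, 1.0]])      # constraint Jacobian

def sqp_step(x):
    """One plain SQP step: solve the QP optimality (KKT) system
       [H  J^T] [d]   [-grad_f(x)]
       [J   0 ] [w] = [-c(x)]
    and recover the new multiplier estimate y = -w."""
    n, m = H.shape[0], J.shape[0]
    K = np.block([[H, J.T], [J, np.zeros((m, m))]])
    rhs = np.concatenate([-grad_f(x), -c(x)])
    sol = np.linalg.solve(K, rhs)
    d, w = sol[:n], sol[n:]
    return x + d, -w

x_new, y_new = sqp_step(np.array([0.0, 0.0]))
```

Because the toy problem is an equality-constrained quadratic program, one step lands exactly on the solution with the exact multiplier; stabilized variants modify the (2,2) block of this system to cope with degeneracy.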
A SECOND DERIVATIVE SQP METHOD: THEORETICAL ISSUES
, 2008
Abstract

Cited by 2 (2 self)
Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be computationally nonviable. This paper presents a second-derivative SQP method based on quadratic subproblems that are either convex, and thus may be solved efficiently, or need not be solved globally. Additionally, an explicit descent constraint is imposed on certain QP subproblems, which “guides” the iterates through areas in which nonconvexity is a concern. Global convergence of the resulting algorithm is established.
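One standard device for obtaining a convex QP subproblem from an indefinite exact Hessian is a symmetric eigenvalue modification. The following sketch shows this generic convexification idea; it is not necessarily the specific strategy used in the paper:

```python
import numpy as np

def convexify(H, delta=1e-6):
    """Return a positive-definite modification of the symmetric matrix H
    by shifting every eigenvalue below `delta` up to `delta`, so that a
    QP subproblem built from the result is strictly convex."""
    w, V = np.linalg.eigh(H)
    w_mod = np.maximum(w, delta)
    return V @ np.diag(w_mod) @ V.T

# Hypothetical indefinite exact Hessian with eigenvalues {2, -1}.
H = np.array([[2.0,  0.0],
              [0.0, -1.0]])
H_c = convexify(H)

eigs = np.linalg.eigvalsh(H_c)  # all eigenvalues now >= delta
```

The trade-off, which motivates methods like the one above, is that such modifications discard curvature information from the exact Hessian, whereas solving the unmodified nonconvex QP globally may be intractable.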
A SECOND DERIVATIVE SQP METHOD WITH IMPOSED DESCENT
, 2008
Abstract

Cited by 2 (0 self)
Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be computationally nonviable. This paper presents a second-derivative Sℓ1QP method based on quadratic subproblems that are either convex, and thus may be solved efficiently, or need not be solved globally. Additionally, an explicit descent constraint is imposed on certain QP subproblems, which “guides” the iterates through areas in which nonconvexity is a concern. Global convergence of the resulting algorithm is established.
A REGULARIZED SQP METHOD WITH CONVERGENCE TO SECOND-ORDER OPTIMAL POINTS
, 2013
Abstract

Cited by 1 (0 self)
Regularized and stabilized sequential quadratic programming methods are two classes of sequential quadratic programming (SQP) methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that provides a strong connection between augmented Lagrangian methods and stabilized SQP methods. The method is formulated as a regularized SQP method with an implicit safeguarding strategy based on minimizing a bound-constrained primal-dual augmented Lagrangian. Each iteration involves the solution of a regularized quadratic program (QP) that is equivalent to a strictly convex bound-constrained QP based on minimizing a quadratic model of the augmented Lagrangian. The solution of the QP subproblem defines a descent direction for a flexible line search that provides a sufficient decrease in a primal-dual augmented Lagrangian merit function. Under certain conditions, the method is guaranteed to converge to a point satisfying the first-order Karush-Kuhn-Tucker (KKT) conditions. In this paper, the regularized SQP method is extended to allow convergence to points satisfying certain second-order KKT conditions.
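The "sufficient decrease in a merit function" requirement is typically enforced with an Armijo-style backtracking line search. A generic sketch of that standard ingredient (not the paper's flexible line search) is:

```python
def armijo_backtrack(merit, grad_dot_d, x, d, alpha0=1.0, eta=1e-4, shrink=0.5):
    """Backtrack along direction d until the Armijo sufficient-decrease
    condition  merit(x + a*d) <= merit(x) + eta * a * grad_dot_d  holds.
    `grad_dot_d` is the directional derivative of the merit function
    along d at x; it must be negative (descent direction) to guarantee
    termination."""
    m0 = merit(x)
    a = alpha0
    while merit([xi + a * di for xi, di in zip(x, d)]) > m0 + eta * a * grad_dot_d:
        a *= shrink
    return a

# Illustrative 1-D merit function phi(x) = x^2 at x = 1.
merit = lambda x: x[0] ** 2
a1 = armijo_backtrack(merit, grad_dot_d=-2.0, x=[1.0], d=[-1.0])  # full step suffices
a2 = armijo_backtrack(merit, grad_dot_d=-8.0, x=[1.0], d=[-4.0])  # must backtrack
```

"Flexible" line searches in this literature relax the acceptance test further, for example by allowing decrease to be measured with more than one penalty parameter, but the backtracking skeleton is the same.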
A primal-dual augmented Lagrangian (Comput Optim Appl)
Abstract
Nonlinearly constrained optimization problems can be solved by minimizing a sequence of simpler unconstrained or linearly constrained subproblems. In this paper, we consider the formulation of subproblems in which the objective function is a generalization of the Hestenes-Powell augmented Lagrangian function. The main feature of the generalized function is that it is minimized with respect to both the primal and the dual variables simultaneously. The benefits of this approach include: (i) the ability to control the quality of the dual variables during the solution of the subproblem; (ii) the availability of improved dual estimates on early termination of the subproblem; and (iii) the ability to regularize the subproblem by imposing explicit bounds on the dual variables. We propose two primal-dual variants of conventional primal methods: a primal-dual bound-constrained Lagrangian (pdBCL) method and a primal-dual ℓ1 linearly constrained Lagrangian (pdℓ1-LCL) method. Finally, a new sequential quadratic programming (pdSQP) method is proposed that uses the primal-dual augmented Lagrangian as a merit function.
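One standard form of a primal-dual augmented Lagrangian of this type augments the Hestenes-Powell function with a proximal term in the dual variables, so that the function can be minimized jointly in (x, y). The sketch below uses that common form on a hypothetical toy problem; the exact function in the paper may carry additional scaling parameters:

```python
import numpy as np

def primal_dual_al(f, c, x, y, y_e, mu):
    """A common primal-dual augmented Lagrangian (hedged sketch):
    M(x, y) = f(x) - c(x)^T y_e + 1/(2*mu) ||c(x)||^2
                   + 1/(2*mu) ||c(x) + mu*(y - y_e)||^2,
    where y_e is a multiplier estimate. At a feasible x with y = y_e
    the last two terms vanish and M reduces to f(x)."""
    cx = c(x)
    r = cx + mu * (y - y_e)
    return f(x) - cx @ y_e + (0.5 / mu) * (cx @ cx) + (0.5 / mu) * (r @ r)

# Hypothetical toy problem: minimize x0^2 + x1^2 s.t. x0 + x1 - 2 = 0.
f = lambda x: x @ x
c = lambda x: np.array([x[0] + x[1] - 2.0])
x_star = np.array([1.0, 1.0])
y_star = np.array([2.0])

val = primal_dual_al(f, c, x_star, y_star, y_star, mu=0.1)
```

Moving y away from y_e at a feasible point increases the last term, which is how the function "controls the quality of the dual variables" during the subproblem solve.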
On the Implementation of an Interior-Point Algorithm for Nonlinear Optimization with Inexact Step Computations
, 2012
Abstract
This paper describes a practical implementation of a line-search interior-point algorithm for large-scale nonlinear optimization. It is based on the algorithm proposed by Curtis, Schenk, and Wächter [SIAM J. Sci. Comput., 32 (2010), pp. 3447-3475], a method that possesses global convergence guarantees to first-order stationary points with the novel feature that inexact search direction calculations are allowed in order to save computational expense during each iteration. The implementation follows the proposed algorithm, except that additional functionality is included to avoid the explicit computation of a normal step during every iteration. It also contains further enhancements that have not been studied along with the previous theoretical analysis. The implementation has been included in the IPOPT software package paired with an iterative linear system solver and preconditioner provided in the PARDISO software. Numerical results on a large nonlinear optimization test set and two PDE-constrained optimization problems with control and state constraints are presented to illustrate that the implementation is robust and efficient for large-scale applications.
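At the core of any interior-point method of this kind is a log-barrier subproblem whose minimizers trace out a central path as the barrier parameter shrinks. A minimal generic sketch of the barrier function (not the specific inexact-step machinery described here) is:

```python
import math

def barrier(f, ineq, x, mu):
    """Log-barrier function phi(x) = f(x) - mu * sum_i log(g_i(x)) for
    the problem min f(x) s.t. g_i(x) >= 0. Defined only on the strict
    interior; returns +inf outside it, so a line search rejects such
    points automatically."""
    gs = [g(x) for g in ineq]
    if any(gi <= 0 for gi in gs):
        return math.inf
    return f(x) - mu * sum(math.log(gi) for gi in gs)

# Illustrative 1-D problem: min x subject to x >= 0.
# The barrier minimizer is x = mu, approaching the true solution 0 as mu -> 0.
f = lambda x: x
ineq = [lambda x: x]
phi = lambda x, mu: barrier(f, ineq, x, mu)
```

Interior-point implementations minimize a sequence of such barrier problems approximately, and the cited algorithm's contribution is allowing the inner linear systems to be solved inexactly while retaining convergence guarantees.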
unknown title
Abstract
Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be computationally nonviable. This paper presents a second derivative SQP method based on quadratic subproblems that are either convex, and thus may be solved efficiently, or need not be solved globally. Additionally, an explicit descent constraint is imposed on certain QP subproblems, which “guides” the iterates through areas in which nonconvexity is a concern. Global convergence of the resulting algorithm is established.