Results 1–10 of 73
An Interior-Point Algorithm for Nonconvex Nonlinear Programming
 COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
, 1997
Abstract

Cited by 196 (14 self)
The paper describes an interior-point algorithm for nonconvex nonlinear programming which is a direct extension of interior-point methods for linear and quadratic programming. Major modifications include a merit function and an altered search direction to ensure that a descent direction for the merit function is obtained. Preliminary numerical testing indicates that the method is robust. Further, numerical comparisons with MINOS and LANCELOT show that the method is efficient, and has the promise of greatly reducing solution times on at least some classes of models.
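The merit-function idea this abstract alludes to can be illustrated with a minimal log-barrier sketch. All names, the signature, and the weight mu below are assumptions for illustration, not the paper's actual formulation:

```python
import math

def barrier_merit(x, f, g, mu=0.1):
    """Illustrative log-barrier merit function: the objective f(x) plus a
    barrier term on the inequality constraints g_i(x) >= 0. An interior-point
    line search would require descent on a function of this general kind."""
    gx = g(x)
    if any(gi <= 0 for gi in gx):
        return math.inf  # the barrier is defined only in the strict interior
    return f(x) - mu * sum(math.log(gi) for gi in gx)

# Tiny usage example: merit value of f(x) = x^2 subject to x >= 1 at x = 2.
f = lambda x: x ** 2
g = lambda x: [x - 1.0]
print(barrier_merit(2.0, f, g))  # 4 - 0.1*log(1) = 4.0
```

The barrier term blows up as an iterate approaches the constraint boundary, which is what keeps interior-point iterates strictly feasible.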
Interior methods for nonlinear optimization
 SIAM REVIEW
, 2002
Abstract

Cited by 120 (5 self)
Interior methods are an omnipresent, conspicuous feature of the constrained optimization landscape today, but it was not always so. Primarily in the form of barrier methods, interior-point techniques were popular during the 1960s for solving nonlinearly constrained problems. However, their use for linear programming was not even contemplated because of the total dominance of the simplex method. Vague but continuing anxiety about barrier methods eventually led to their abandonment in favor of newly emerging, apparently more efficient alternatives such as augmented Lagrangian and sequential quadratic programming methods. By the early 1980s, barrier methods were almost without exception regarded as a closed chapter in the history of optimization. This picture changed dramatically with Karmarkar’s widely publicized announcement in 1984 of a fast polynomial-time interior method for linear programming; in 1985, a formal connection was established between his method and classical barrier methods. Since then, interior methods have advanced so far, so fast, that their influence has transformed both the theory and practice of constrained optimization. This article provides a condensed, selective look at classical material and recent research about interior methods for nonlinearly constrained optimization.
An interior point algorithm for large-scale nonlinear . . .
, 2002
Abstract

Cited by 62 (3 self)
Nonlinear programming (NLP) has become an essential tool in process engineering, leading to profit gains through improved plant designs and better control strategies. The rapid advance in computer technology enables engineers to consider increasingly complex systems, where existing optimization codes reach their practical limits. The objective of this dissertation is the design, analysis, implementation, and evaluation of a new NLP algorithm that is able to overcome the current bottlenecks, particularly in the area of process engineering. The proposed algorithm follows an interior point approach, thereby avoiding the combinatorial complexity of identifying the active constraints. Emphasis is laid on flexibility in the computation of search directions, which allows the tailoring of the method to individual applications and is mandatory for the solution of very large problems. In a full-space version the method can be used as a general-purpose NLP solver, for example in modeling environments such as Ampl. The reduced-space version, based on coordinate decomposition, makes it possible to tailor linear algebra
A Computationally Efficient Feasible Sequential Quadratic Programming Algorithm
 SIAM Journal on Optimization
, 2001
Abstract

Cited by 55 (0 self)
A sequential quadratic programming (SQP) algorithm generating feasible iterates is described and analyzed. What distinguishes this algorithm from previous feasible SQP algorithms proposed by various authors is a reduction in the amount of computation required to generate a new iterate, while the proposed scheme still enjoys the same global and fast local convergence properties. A preliminary implementation has been tested and some promising numerical results are reported. Key words: sequential quadratic programming, SQP, feasible iterates, feasible SQP, FSQP. AMS subject classifications: 49M37, 65K05, 65K10, 90C30, 90C53.
A Globally Convergent Primal-Dual Interior-Point Filter Method for Nonlinear Programming
, 2002
Abstract

Cited by 51 (4 self)
In this paper, the filter technique of Fletcher and Leyffer (1997) is used to globalize the primal-dual interior-point algorithm for nonlinear programming, avoiding the use of merit functions and the updating of penalty parameters. The new algorithm decomposes the primal-dual step obtained from the perturbed first-order necessary conditions into a normal and a tangential step, whose sizes are controlled by a trust-region type parameter. Each entry in the filter is a pair of coordinates: one resulting from feasibility and centrality, and associated with the normal step; the other resulting from optimality (complementarity and duality), and related to the tangential step. Global convergence to first-order critical points is proved for the new primal-dual interior-point filter algorithm.
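The two-coordinate filter acceptance test described in this abstract can be sketched as follows. The dominance margin gamma and all names are illustrative, not the paper's actual rule:

```python
def acceptable_to_filter(point, filter_entries, gamma=1e-5):
    """Sketch of a Fletcher-Leyffer-style filter test: a trial pair
    (h, phi) -- h measuring feasibility/centrality, phi measuring
    optimality, mirroring the two coordinates in the abstract -- is
    acceptable if no stored entry dominates it, up to a small margin."""
    h, phi = point
    for (h_k, phi_k) in filter_entries:
        # dominated: no sufficient improvement in either coordinate
        if h >= (1 - gamma) * h_k and phi >= phi_k - gamma * h_k:
            return False
    return True

# Usage: a filter with two entries (infeasibility, objective measure).
filt = [(1.0, 5.0), (0.5, 7.0)]
print(acceptable_to_filter((0.2, 6.0), filt))  # improves feasibility enough
print(acceptable_to_filter((1.0, 5.0), filt))  # dominated by an entry
```

The point of the filter is exactly what the abstract claims: a trial step is judged on two separate coordinates rather than on a single merit function with a penalty parameter.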
Failure of global convergence for a class of interior point methods for nonlinear programming
 Math. Program
A class of globally convergent optimization methods based on conservative convex separable approximations
 SIAM Journal on Optimization
Abstract

Cited by 39 (2 self)
This paper deals with a certain class of optimization methods, based on conservative convex separable approximations (CCSA), for solving inequality-constrained nonlinear programming problems. Each generated iteration point is a feasible solution with lower objective value than the previous one, and it is proved that the sequence of iteration points converges toward the set of Karush–Kuhn–Tucker points. A major advantage of CCSA methods is that they can be applied to problems with a very large number of variables (say 10^4–10^5) even if the Hessian matrices of the objective and constraint functions are dense.
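The "conservative" property that gives CCSA methods their guaranteed feasible descent can be illustrated with a tiny check. The function names and the example model are placeholders, not the paper's notation:

```python
def ccsa_accept(f, approx, x_trial):
    """If the convex approximation is conservative at the trial point
    (approx >= f there) and agrees with f at the current iterate, then a
    minimizer of the approximation yields true descent:
        f(x_trial) <= approx(x_trial) <= approx(x) = f(x).
    Returns whether the trial point may be accepted; otherwise a CCSA
    method tightens the approximation and re-solves the subproblem."""
    return approx(x_trial) >= f(x_trial)

# Example: f(x) = x^4 with an illustrative convex model around x = 1
# that matches f(1) = 1 and curves upward steeply.
f = lambda x: x ** 4
approx = lambda x: 1.0 + 4.0 * (x - 1.0) + 10.0 * (x - 1.0) ** 2
print(ccsa_accept(f, approx, 0.8))  # model is conservative near x = 1
print(ccsa_accept(f, approx, 2.0))  # model underestimates f far away
```

The inner accept/tighten loop is what makes every CCSA iterate feasible with a lower objective value, as the abstract states.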
A Primal-Dual Interior-Point Method for Nonlinear Programming with Strong Global and Local Convergence Properties
 SIAM Journal on Optimization
, 2002
Abstract

Cited by 37 (5 self)
An exact-penalty-function-based scheme, inspired by an old idea due to Mayne and Polak (Math. Prog., vol. 11, 1976, pp. 67-80), is proposed for extending to general smooth constrained optimization problems any given feasible interior-point method for inequality-constrained problems. It is shown that the primal-dual interior-point framework allows for a simpler penalty parameter update rule than that discussed and analyzed by the originators of the scheme in the context of first-order methods of feasible direction. Strong global and local convergence results are proved under mild assumptions. In particular, (i) the proposed algorithm does not suffer from a common pitfall recently pointed out by Wächter and Biegler; and (ii) the positive definiteness assumption on the Hessian estimate, made in the original version of the algorithm, is relaxed, allowing for the use of exact Hessian information, resulting in local quadratic convergence. Promising numerical results are reported.
Advances in Simultaneous Strategies for Dynamic Process Optimization
 Chemical Engineering Science
, 2001
Abstract

Cited by 33 (7 self)
Introduction. Over the past decade, applications in dynamic simulation have increased significantly in the process industries. These are driven by strong competitive markets faced by operating companies along with tighter specifications on process performance and regulatory limits. Moreover, the development of powerful commercial modeling tools for dynamic simulation, such as ASPEN Custom Modeler and gPROMS, has led to their introduction in industry alongside their widely used steady-state counterparts. Dynamic optimization is the natural extension of these dynamic simulation tools because it automates many of the decisions required for engineering studies. Applications of dynamic simulation can be classified into off-line and on-line tasks. Off-line tasks include: design to avoid undesirable transients for chemical process