Results 1–10 of 13
A Computationally Efficient Feasible Sequential Quadratic Programming Algorithm
 SIAM Journal on Optimization
, 2001
"... . A sequential quadratic programming (SQP) algorithm generating feasible iterates is described and analyzed. What distinguishes this algorithm from previous feasible SQP algorithms proposed by various authors is a reduction in the amount of computation required to generate a new iterate while the pr ..."
Abstract

Cited by 56 (0 self)
A sequential quadratic programming (SQP) algorithm generating feasible iterates is described and analyzed. What distinguishes this algorithm from previous feasible SQP algorithms proposed by various authors is a reduction in the amount of computation required to generate a new iterate, while the proposed scheme still enjoys the same global and fast local convergence properties. A preliminary implementation has been tested and some promising numerical results are reported.
Key words. sequential quadratic programming, SQP, feasible iterates, feasible SQP, FSQP
AMS subject classifications. 49M37, 65K05, 65K10, 90C30, 90C53
PII. S1052623498344562
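The FSQP machinery itself is not shown in the abstract, but the basic SQP step it builds on is easy to illustrate. Below is a minimal sketch of a plain Newton-KKT/SQP iteration on a hypothetical equality-constrained toy problem; it is not the paper's feasible-iterate algorithm, and the problem and all names are illustrative.

```python
import numpy as np

def sqp_equality_toy(x0, iters=5):
    """Plain SQP (Newton-KKT) iteration for the toy problem
        minimize x1^2 + x2^2  subject to  x1 + x2 - 1 = 0.
    Feasible-SQP variants additionally tilt/bend the step so that
    every iterate stays feasible; that refinement is omitted here."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = 2.0 * x                        # objective gradient
        H = 2.0 * np.eye(2)                # Hessian of the Lagrangian
        A = np.array([[1.0, 1.0]])         # constraint Jacobian
        c = np.array([x[0] + x[1] - 1.0])  # constraint residual
        # QP subproblem's KKT system: [H A^T; A 0] [d; lam] = [-g; -c]
        K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
        d_lam = np.linalg.solve(K, np.concatenate([-g, -c]))
        x = x + d_lam[:2]                  # take the full SQP step
    return x
```

Because the toy objective is quadratic and the constraint linear, a single step already lands on the minimizer (0.5, 0.5); on genuinely nonlinear problems the loop matters.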
An SQP Algorithm For Finely Discretized Continuous Minimax Problems And Other Minimax Problems With Many Objective Functions
, 1996
"... . A common strategy for achieving global convergence in the solution of semiinfinite programming (SIP) problems, and in particular of continuous minimax problems, is to (approximately) solve a sequence of discretized problems, with a progressively finer discretization meshes. Finely discretized min ..."
Abstract

Cited by 20 (2 self)
A common strategy for achieving global convergence in the solution of semi-infinite programming (SIP) problems, and in particular of continuous minimax problems, is to (approximately) solve a sequence of discretized problems with progressively finer discretization meshes. Finely discretized minimax and SIP problems, as well as other problems with many more objectives/constraints than variables, call for algorithms in which successive search directions are computed based on a small but significant subset of the objectives/constraints, with ensuing reduced computing cost per iteration and decreased risk of numerical difficulties. In this paper, an SQP-type algorithm is proposed that incorporates this idea in the particular case of minimax problems. The general case will be considered in a separate paper. The quadratic programming subproblem that yields the search direction involves only a small subset of the objective functions. This subset is updated at each iteration in such a wa...
Feasible Sequential Quadratic Programming For Finely Discretized Problems From SIP
, 1998
"... A Sequential Quadratic Programming algorithm designed to efficiently solve nonlinear optimization problems with many inequality constraints, e.g. problems arising from finely discretized SemiInfinite Programming, is described and analyzed. The key features of the algorithm are (i) that only a few o ..."
Abstract

Cited by 8 (1 self)
A Sequential Quadratic Programming algorithm designed to efficiently solve nonlinear optimization problems with many inequality constraints, e.g. problems arising from finely discretized Semi-Infinite Programming, is described and analyzed. The key features of the algorithm are (i) that only a few of the constraints are used in the QP subproblems at each iteration, and (ii) that every iterate satisfies all constraints.
1 INTRODUCTION
Consider the Semi-Infinite Programming (SIP) problem

minimize f(x) subject to Φ(x) ≤ 0, (SI)

where f : ℝⁿ → ℝ is continuously differentiable, and Φ : ℝⁿ → ℝ is defined by Φ(x) := sup_{λ∈[0,1]} φ(x, λ), with φ : ℝⁿ × [0,1] → ℝ continuously differentiable in the first argument. For an excellent survey of the theory behind the problem (SI), in addition to some algorithms and applications, see [9] as well as the other papers in the present volume. Many globally convergent algorithms designed to solve (SI)...
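The "few constraints per iteration" idea can be sketched with a simple ε-active working-set rule on a discretized Φ. This is only an illustration of constraint selection, not the update rule analyzed in the paper; the constraint function φ below is hypothetical.

```python
import numpy as np

def eps_active_set(phi, x, grid, eps=1e-3):
    """For the discretized SIP constraint
        max_{lam in grid} phi(x, lam) <= 0,
    return indices of the eps-active grid constraints, i.e. those
    within eps of the maximum.  Only these few constraints would be
    passed to the QP subproblem at the current iterate x."""
    vals = np.array([phi(x, lam) for lam in grid])
    return np.flatnonzero(vals >= vals.max() - eps)

# Hypothetical constraint function phi(x, lam) = lam*x - lam**2,
# discretized on 101 grid points in [0, 1].
grid = np.linspace(0.0, 1.0, 101)
idx = eps_active_set(lambda x, lam: lam * x - lam**2, 0.5, grid)
```

At x = 0.5 the maximizing parameter is lam = 0.25, and only a handful of nearby grid constraints are selected out of 101, which is the source of the reduced per-iteration cost.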
Algebraic Approach to Robust Controller Design: A Geometric Interpretation
 Proceedings of the American Control Conference
, 1998
"... The problem of robust controller design is addressed for a singleinput singleoutput plant with a single uncertain parameter. Given one controller that stabilizes the nominal plant, the YoulaKucera parametrization of all stabilizing controllers and quadratic forms over HermiteFujiwara matrices ar ..."
Abstract

Cited by 8 (7 self)
The problem of robust controller design is addressed for a single-input single-output plant with a single uncertain parameter. Given one controller that stabilizes the nominal plant, the Youla-Kucera parametrization of all stabilizing controllers and quadratic forms over Hermite-Fujiwara matrices are used to provide clear and simple geometric answers to the following questions: Can the plant be robustly stabilized by a nominally stabilizing controller? How can this robust controller be designed? Thanks to recent results on bilinear matrix inequalities, this geometric interpretation allows one to state the equivalence between robust controller design and a concave minimization problem.
1 Introduction
Since the pioneering work of Kharitonov, significant results have been achieved through the polynomial approach to linear systems robustness. In his monograph [1], Barmish presents a clear and comprehensive survey of existing techniques. Given a nominally stable polynomial with a single un...
A Unifying Polyhedral Approximation Framework for Convex Optimization
"... We propose a unifying framework for polyhedral approximation in convex optimization. It subsumes classical methods, such as cutting plane and simplicial decomposition, but also includes new methods, and new versions/extensions of old methods, such as a simplicial decomposition method for nondifferen ..."
Abstract

Cited by 7 (2 self)
We propose a unifying framework for polyhedral approximation in convex optimization. It subsumes classical methods, such as cutting plane and simplicial decomposition, but also includes new methods and new versions/extensions of old methods, such as a simplicial decomposition method for nondifferentiable optimization and a new piecewise linear approximation method for convex single commodity network flow problems. Our framework is based on an extended form of monotropic programming, a broadly applicable model, which includes as special cases Fenchel duality and Rockafellar’s monotropic programming, and is characterized by an elegant and symmetric duality theory. Our algorithm flexibly combines outer and inner linearization of the cost function. The linearization is progressively refined by using primal and dual differentiation, and the roles of outer and inner linearization are reversed in a mathematically equivalent dual algorithm. We provide convergence results and error bounds for the general case where outer and inner linearization are combined in the same algorithm.
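The outer-linearization ingredient can be illustrated by Kelley's classical cutting-plane method, one of the polyhedral approximation schemes the framework subsumes. This is a minimal one-dimensional sketch under simplifying assumptions (box domain, exact derivatives); it is not the extended monotropic-programming algorithm of the paper.

```python
def kelley_1d(f, fprime, a, b, iters=30):
    """Kelley's cutting-plane method on [a, b]: maintain an outer (lower)
    piecewise-linear model built from gradient cuts, and step to the
    model's minimizer over the interval at each iteration."""
    cuts = []                                  # (xi, f(xi), f'(xi))
    x = a
    for _ in range(iters):
        cuts.append((x, f(x), fprime(x)))
        model = lambda y: max(fi + gi * (y - xi) for xi, fi, gi in cuts)
        # A piecewise-linear model attains its minimum at an endpoint
        # or where two cuts intersect, so it suffices to test those.
        candidates = [a, b]
        for i in range(len(cuts)):
            for j in range(i + 1, len(cuts)):
                xi, fi, gi = cuts[i]
                xj, fj, gj = cuts[j]
                if gi != gj:
                    y = ((fj - gj * xj) - (fi - gi * xi)) / (gi - gj)
                    if a <= y <= b:
                        candidates.append(y)
        x = min(candidates, key=model)
    return x

# Toy convex objective f(x) = (x - 0.3)**2 on [-1, 1]; minimizer at 0.3.
xstar = kelley_1d(lambda x: (x - 0.3)**2, lambda x: 2.0 * (x - 0.3), -1.0, 1.0)
```

Each new cut refines the outer model exactly in the sense the abstract describes; inner linearization (simplicial decomposition) refines an approximation from the other side.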
Dynamic Bundle Methods: Application to Combinatorial Optimization
Mathematical Programming
, 2005
"... Lagrangian relaxation is a popular technique to solve difficult optimization problems. However, the applicability of this technique depends on having a relatively low number of hard constraints to dualize. When there are exponentially many hard constraints, it is preferable to relax them dynamical ..."
Abstract

Cited by 4 (2 self)
Lagrangian relaxation is a popular technique to solve difficult optimization problems. However, the applicability of this technique depends on having a relatively low number of hard constraints to dualize. When there are exponentially many hard constraints, it is preferable to relax them dynamically, according to some rule depending on which multipliers are active. For instance, only the most violated constraints at a given iteration could be dualized. From the dual point of view, this approach yields multipliers with varying dimensions and a dual objective function that changes along the iterations. We discuss how to apply a bundle methodology to solve this kind of dual problem. We analyze the resulting dynamic bundle method, giving a positive answer for its convergence properties, including finite termination and a primal result for polyhedral problems. We also report preliminary numerical experience on Linear Ordering and Traveling Salesman Problems.
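The dynamic dualization idea (create a multiplier only once its constraint has been violated) can be sketched on a toy problem. The rule below, dualizing every currently violated constraint and taking plain projected subgradient steps, is a crude stand-in for the paper's bundle method; the problem and parameters are illustrative.

```python
def dynamic_dual_toy(a_list, step=0.3, iters=200):
    """Subgradient ascent on the Lagrangian dual of
        minimize x**2  subject to  x >= a_i for all i,
    where a constraint is dualized (gets a multiplier) only after it
    has been observed violated; multipliers of satisfied constraints
    are driven back to zero by the projection onto lam >= 0."""
    lam = {}                                   # multipliers, created lazily
    x = 0.0
    for _ in range(iters):
        # Inner problem min_x x^2 + sum_i lam_i*(a_i - x) solves to:
        x = sum(lam.values()) / 2.0
        for i, a in enumerate(a_list):
            if a - x > 0.0 and i not in lam:   # newly violated constraint
                lam[i] = 0.0
        for i in list(lam):                    # projected subgradient step
            lam[i] = max(0.0, lam[i] + step * (a_list[i] - x))
    return x

x_opt = dynamic_dual_toy([0.2, 0.5, 1.0])      # optimum is x = max(a_i) = 1
```

As in the abstract, the set of multipliers (and hence the dual dimension) changes along the iterations, while the mass eventually concentrates on the binding constraint.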
On an Approach to Optimization Problems with a Probabilistic Cost and/or Constraints
, 1998
"... We present a new approach to a class of probability constrained optimization problems that arise in the context of optimal engineering design. These problems are characterized by the fact that the probability of failure of one or several components either must be minimized or must not exceed a preas ..."
Abstract

Cited by 2 (0 self)
We present a new approach to a class of probability-constrained optimization problems that arise in the context of optimal engineering design. These problems are characterized by the fact that the probability of failure of one or several components either must be minimized or must not exceed a preassigned threshold. Our approach is iterative: it consists of replacing the original optimal design problem, in which either the cost function or a constraint is expressed in terms of a probability of failure, by a constrained minimax problem. Once the minimax problem is solved, the actual probability of failure is computed. Depending on the outcome of this computation, we provide heuristic rules for modifying the minimax problem and repeating this process a couple of times. An important feature of our new approach is that it decouples optimization and probability of failure calculations. This decoupling allows independent selection of methods for the solution of the optimization and the rel...
A SMOOTHING SCHEME FOR OPTIMIZATION PROBLEMS WITH MAX-MIN CONSTRAINTS
"... (Communicated by Adil Bagirov) Abstract. In this paper, we apply a smoothing approach to a minimization problem with a maxmin constraint (i.e., a minmaxmin problem). More specifically, we first rewrite the minmaxmin problem as an optimization problem with several minconstraints and then appro ..."
Abstract
(Communicated by Adil Bagirov) Abstract. In this paper, we apply a smoothing approach to a minimization problem with a max-min constraint (i.e., a min-max-min problem). More specifically, we first rewrite the min-max-min problem as an optimization problem with several min-constraints and then approximate each min-constraint function by a smooth function. As a result, the original min-max-min optimization problem can be solved by solving a sequence of smooth optimization problems. We investigate the relationship between the global optimal value and optimal solutions of the original min-max-min optimization problem and those of the approximate smooth problem. Under some conditions, we show that the limit points of the first-order (second-order) stationary points of the smooth optimization problems are first-order (second-order) stationary points of the original min-max-min optimization problem. 1. Introduction. Consider
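A standard choice for smoothing a min-constraint is the log-sum-exp "soft-min"; the paper's specific smoothing function may differ, so treat this as one illustrative instance.

```python
import math

def softmin(vals, mu):
    """Log-sum-exp smoothing of min(vals):
        softmin_mu(g) = -mu * log(sum_i exp(-g_i / mu)).
    It satisfies  min(g) - mu*log(n) <= softmin_mu(g) <= min(g),
    so the approximation error vanishes as mu -> 0."""
    m = min(vals)                          # shift for numerical stability
    s = sum(math.exp(-(v - m) / mu) for v in vals)
    return m - mu * math.log(s)

approx = softmin([1.0, 2.0, 5.0], mu=0.01)   # within 0.011 of min = 1.0
```

Replacing each min-constraint function by such a smooth surrogate for a decreasing sequence of mu values yields a sequence of smooth problems analogous to the one the abstract describes.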
FEASIBLE SEQUENTIAL QUADRATIC PROGRAMMING FOR FINELY DISCRETIZED PROBLEMS FROM SIP
"... A Sequential Quadratic Programming algorithm designed to eciently solve nonlinear optimization problems with many inequality constraints, e.g. problems arising from nely discretized SemiInnite Programming, is described and analyzed. The key features of the algorithm are (i) that only a few of the c ..."
Abstract
 Add to MetaCart
A Sequential Quadratic Programming algorithm designed to efficiently solve nonlinear optimization problems with many inequality constraints, e.g. problems arising from finely discretized Semi-Infinite Programming, is described and analyzed. The key features of the algorithm are (i) that only a few of the constraints are used in the QP subproblems at each iteration, and (ii) that every iterate satisfies all constraints.