Results 1-10 of 38
A Computationally Efficient Feasible Sequential Quadratic Programming Algorithm
SIAM Journal on Optimization, 2001
Cited by 56 (0 self)
Abstract
A sequential quadratic programming (SQP) algorithm generating feasible iterates is described and analyzed. What distinguishes this algorithm from previous feasible SQP algorithms proposed by various authors is a reduction in the amount of computation required to generate a new iterate, while the proposed scheme still enjoys the same global and fast local convergence properties. A preliminary implementation has been tested and some promising numerical results are reported. Key words: sequential quadratic programming, SQP, feasible iterates, feasible SQP, FSQP. AMS subject classifications: 49M37, 65K05, 65K10, 90C30, 90C53. PII: S1052623498344562.
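The feasible-iterates idea in this abstract admits a compact sketch. The code below is illustrative only (all names and numbers are made up, and the paper's actual FSQP step is far more elaborate): a trial step is shortened until it both stays feasible and decreases the objective, so every accepted iterate is feasible.

```python
# Hypothetical sketch of a feasibility-preserving backtracking step of the
# kind feasible-SQP methods rely on (not the paper's algorithm).
def feasible_backtracking(f, constraints, x, d, t=1.0, shrink=0.5, max_iter=50):
    """Return a step length t such that x + t*d stays feasible and reduces f.

    `constraints` is a list of functions g with the convention g(x) <= 0.
    """
    fx = f(x)
    for _ in range(max_iter):
        trial = [xi + t * di for xi, di in zip(x, d)]
        if all(g(trial) <= 0 for g in constraints) and f(trial) < fx:
            return t, trial
        t *= shrink
    return 0.0, x  # no acceptable feasible step found

# Toy problem: minimize f(x) = (x0 - 2)^2 subject to x0 <= 1, from x0 = 0.
f = lambda x: (x[0] - 2.0) ** 2
g = lambda x: x[0] - 1.0          # feasible region: x0 <= 1
t, x_new = feasible_backtracking(f, [g], [0.0], [2.0])
```

In a real feasible-SQP method the direction `d` would come from a QP subproblem; the point of the feasibility-preserving acceptance test is that the run can be stopped at any iterate and still return a usable feasible point.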
Scatter search and local NLP solvers: A multistart framework for global optimization
INFORMS Journal on Computing
doi:10.1287/ijoc.1060.0175
Evolutionary Search of Approximated N-Dimensional Landscapes
International Journal of Knowledge-based Intelligent Engineering Systems, 2000
Cited by 25 (2 self)
Abstract
Finding the global optimum on a large, multimodal, complex, and discontinuous (or non-differentiable) landscape is usually very hard, even using the evolutionary approach. However, some of these complex landscapes can be approximated and smoothed without changing the nature of the problem, i.e., without modifying the global optimum and its location. The approximated and smoothed landscape is often much easier to search than the original one. In this paper, we propose a new algorithm using landscape approximation and hybrid evolutionary and local search. We also list several algorithm design principles. Following the basic algorithm, an example algorithm is given from our previous work on the combination of landscape approximation and local search (LALS). Furthermore, we develop a novel evolutionary algorithm with n-dimensional approximation (EANA), which shares the same rules as the basic algorithm but remedies some of the drawbacks found in LALS. Comparisons with evo...
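The approximate-then-search idea described in this abstract can be sketched in one dimension (a toy illustration with made-up functions, not the paper's LALS/EANA code): a discontinuous landscape is replaced by a moving-average approximation that preserves the location of the global optimum, and a simple local descent is run on the smoothed surface.

```python
# Illustrative sketch only: landscape approximation followed by local search.
def rugged(x):
    # smooth bowl with a global minimum near x = 3, plus discontinuous
    # step "noise" from the floor operation
    return (x - 3.0) ** 2 + 0.5 * (int(x * 10) % 2)

def smoothed(x, h=0.3, n=31):
    # crude landscape approximation: average of samples in [x - h, x + h]
    pts = [x - h + 2 * h * i / (n - 1) for i in range(n)]
    return sum(rugged(p) for p in pts) / n

def local_descent(f, x, step=0.1, tol=1e-3):
    # coordinate descent with step halving; terminates when step < tol
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step /= 2.0
    return x

x_star = local_descent(smoothed, 0.0)   # ends near the true optimum x = 3
```

On the raw `rugged` function the same descent can stall on a step edge; on the averaged surface the quadratic trend dominates the residual ripple, which is the effect the abstract describes.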
Local Optima Smoothing for Global Optimization
OPTIMIZATION METHODS AND SOFTWARE, 2005
Cited by 16 (4 self)
Abstract
It is widely believed that in order to solve large scale global optimization problems an appropriate mixture of local approximation and global exploration is necessary. Local approximation, if first order information on the objective function is available, is efficiently performed by means of local optimization methods. Unfortunately, global exploration, in absence of some kind of global information on the problem, is a "blind" procedure, aimed at placing observations as evenly as possible in the search domain. Often this procedure reduces to uniform random sampling (like in Multistart algorithms, or in techniques based on clustering). In this paper we propose a new framework for global exploration which tries to guide random exploration towards the region of attraction of low-level local optima. The main idea originated by the use of smoothing techniques (based on gaussian convolutions): the possibility of applying a smoothing transformation not to the objective function but to the result of local searches seems to have never been explored yet. Although an exact smoothing of the results of local searches is impossible to implement, in this paper we propose a computational approximation scheme which has proven to be very efficient and (maybe more important) extremely robust in solving large scale global optimization problems with huge numbers of local optima.
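The central idea here, smoothing the result of local searches rather than the objective itself, can be sketched as follows (toy functions and parameters, not the paper's scheme): define L(x) as the value a local search reaches from start point x, and approximate the Gaussian convolution of L by averaging over perturbed start points.

```python
import random

# Illustrative sketch: Monte Carlo approximation of a Gaussian smoothing
# of the local-search transform L(x), on a made-up two-basin objective.
def f(x):
    # a poor local minimum (value 1) near x = -1 and the global one near x = 2
    return min((x + 1.0) ** 2 + 1.0, (x - 2.0) ** 2)

def local_search(x, step=0.05, iters=200):
    # fixed-step descent; returns the objective value it settles at
    for _ in range(iters):
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
    return f(x)

def smoothed_L(x, sigma=1.0, n=200, seed=0):
    # average of L over Gaussian-perturbed start points: the piecewise
    # constant transform L becomes a smooth surface sloping toward the
    # region of attraction of the better minimum
    rng = random.Random(seed)
    return sum(local_search(x + rng.gauss(0.0, sigma)) for _ in range(n)) / n

a = smoothed_L(-1.0)   # high: most perturbed starts fall in the poor basin
b = smoothed_L(2.0)    # low: most perturbed starts fall in the global basin
```

The raw transform L is flat inside each region of attraction, so it gives no search direction; its smoothed version does, which is what the proposed framework exploits to steer exploration.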
Global Optimization for the Biaffine Matrix Inequality Problem
1995
Cited by 16 (0 self)
Abstract
It has recently been shown that an extremely wide array of robust controller design problems may be reduced to the problem of finding a feasible point under a Biaffine Matrix Inequality (BMI) constraint. The BMI feasibility problem is the bilinear version of the Linear (Affine) Matrix Inequality (LMI) feasibility problem, and may also be viewed as a bilinear extension to the Semidefinite Programming (SDP) problem. The BMI problem may be approached as a biconvex global optimization problem of minimizing the maximum eigenvalue of a biaffine combination of symmetric matrices. This paper presents a branch and bound global optimization algorithm for the BMI. A simple numerical example is included. The robust control problem, i.e., the synthesis of a controller for a dynamic physical system which guarantees stability and performance in the face of significant modelling error and worst-case disturbance inputs, is frequently encountered in a variety of complex engineering applications, including the design of aircraft, satellites, chemical plants, and other precision positioning and tracking systems.
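The objective mentioned in this abstract, the maximum eigenvalue of a biaffine combination of symmetric matrices, is easy to evaluate directly. The sketch below does so for 2x2 matrices with one x and one y variable (toy data, not the paper's example), using the closed-form largest eigenvalue of a symmetric 2x2 matrix.

```python
import math

# Illustrative evaluation of the BMI objective
#   F(x, y) = F00 + x*F10 + y*F01 + x*y*F11
# for symmetric 2x2 matrices; all matrices below are made up.
def max_eig_2x2(M):
    # largest eigenvalue of symmetric [[a, b], [b, d]]:
    # (a + d)/2 + sqrt(((a - d)/2)^2 + b^2)
    (a, b), (_, d) = M
    return (a + d) / 2.0 + math.hypot((a - d) / 2.0, b)

def add(M, N, s=1.0):
    # M + s*N for 2x2 matrices stored as nested lists
    return [[M[i][j] + s * N[i][j] for j in range(2)] for i in range(2)]

def bmi_objective(x, y, F00, F10, F01, F11):
    F = add(add(add(F00, F10, x), F01, y), F11, x * y)
    return max_eig_2x2(F)

I2 = [[1.0, 0.0], [0.0, 1.0]]
A = [[0.0, 1.0], [1.0, 0.0]]
negI = [[-1.0, 0.0], [0.0, -1.0]]
val = bmi_objective(1.0, 2.0, negI, A, A, I2)   # F = I + 3A, eigenvalues 1 +/- 3
```

For fixed y the map x -> max eigenvalue is convex (it is the maximum eigenvalue of an affine matrix function), and symmetrically for fixed x; this biconvex structure is what a branch and bound scheme for the BMI can bound against.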
Computational Experience With The Molecular Distance Geometry Problem
Cited by 15 (14 self)
Abstract
In this work we consider the molecular distance geometry problem, which can be defined as the determination of the three-dimensional structure of a molecule based on distances between some pairs of atoms. We address the problem as a nonconvex least-squares problem. We apply three global optimization algorithms (spatial Branch-and-Bound, Variable Neighbourhood Search, Multi Level Single Linkage) to two sets of instances, one taken from the literature and the other new. Keywords: molecular conformation, distance geometry, global optimization, spatial Branch-and-Bound, Variable Neighbourhood Search, Multi Level Single Linkage.
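A common least-squares formulation of this problem (one standard variant; the paper may use a different penalty) scores a candidate conformation by the squared violation of each known distance. A minimal sketch with toy coordinates:

```python
# Nonconvex least-squares error for the molecular distance geometry problem:
#   err = sum over known pairs (i, j) of (||x_i - x_j||^2 - d_ij^2)^2
# The coordinates and distances below are toy values, not a real molecule.
def dgp_error(coords, dists):
    """coords: list of (x, y, z) atom positions; dists: dict {(i, j): d_ij}."""
    err = 0.0
    for (i, j), d in dists.items():
        sq = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
        err += (sq - d * d) ** 2
    return err

# A unit right triangle in the z = 0 plane reproduces its own distances,
# so its error is (numerically) zero.
coords = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
dists = {(0, 1): 1.0, (0, 2): 1.0, (1, 2): 2.0 ** 0.5}
err = dgp_error(coords, dists)
```

The global minimizers of this error (value zero when the distance data is exact) are the valid conformations, unique up to rotation, translation, and reflection; the many spurious local minima are why global methods such as those compared in the paper are needed.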
Parallel Approaches to Stochastic Global Optimization
In Parallel Computing: From Theory to Sound Practice, W. Joosen and E. Milgrom, Eds., IOS, 1992
Cited by 14 (5 self)
Abstract
In this paper we review parallel implementations of some stochastic global optimization methods on MIMD computers. Moreover, we present a new parallel version of an Evolutionary Algorithm for global optimization, where the inherent parallelism can be scaled to obtain a reasonable processor utilization. For this algorithm, convergence to the global optimum with probability one can be assured. Test results concerning speed-up and reliability are given.

1 Introduction
Many real-world problems in engineering and economics can be formulated as optimization problems, in which the objective function is multimodal, i.e. the problem possesses many local minima. Compared to the number of methods designed to determine a local minimum, there are only a few methods which attempt to find the global minimum (see [52] for a survey). Although there are some special cases where the global optimum can be found (see [26]), the general case is unsolvable. This paper will be restricted to the more gener...
Comparison of Deterministic and Stochastic Approaches to Global Optimization
Cited by 11 (3 self)
Abstract
In this paper we compare two different approaches to nonconvex global optimization. The first is a deterministic spatial Branch-and-Bound algorithm (sBB), whereas the second is a quasi Monte Carlo (QMC) variant of a stochastic multi level single linkage (MLSL) algorithm. Both algorithms apply to problems in a very general form and are not dependent on problem structure. The test suite we chose is fairly extensive in scope, in that it includes constrained and unconstrained problems, and continuous and mixed-integer problems. The conclusion of the tests is that in general the QMC variant of the MLSL algorithm is more efficient, although in some instances the Branch-and-Bound algorithm is capable of locating the global optimum of hard problems in just one iteration.
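The MLSL side of this comparison rests on a simple pruning rule that a toy sketch can capture (illustrative only; real MLSL shrinks the critical radius as sampling proceeds, and the paper's variant uses quasi Monte Carlo rather than pseudorandom samples): start a local search only from sample points that have no better sample within a critical radius.

```python
import random

# Hypothetical one-dimensional multi level single linkage (MLSL) sketch
# on a made-up two-minimum objective.
def f(x):
    return (x ** 2 - 1.0) ** 2       # minima at x = -1 and x = +1

def local_min(x, step=0.01, iters=500):
    # fixed-step descent; returns the minimizer rounded to one decimal
    for _ in range(iters):
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
    return round(x, 1)

def mlsl_starts(samples, r=0.5):
    # MLSL pruning rule: keep a sample only if no better sample lies
    # within the critical radius r
    return [x for x in samples
            if not any(abs(y - x) < r and f(y) < f(x) for y in samples)]

rng = random.Random(1)
samples = [rng.uniform(-2.0, 2.0) for _ in range(40)]
starts = mlsl_starts(samples)                      # far fewer than 40 points
minima = sorted(set(local_min(x) for x in starts))  # both basins still found
```

The pruning is what makes MLSL cheaper than plain Multistart: redundant local searches into an already-sampled basin are skipped, while (with high probability) at least one start survives per basin.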
A trust-region algorithm for global optimization
Comput. Optim. Appl., 2004
Cited by 9 (0 self)
Abstract
We consider the global minimization of a bound-constrained function with a so-called funnel structure. We develop a two-phase procedure that uses sampling, local optimization, and Gaussian smoothing to construct a smooth model of the underlying funnel. The procedure is embedded in a trust-region framework that avoids the pitfalls of a fixed sampling radius. We present a numerical comparison to three popular methods and show that the new algorithm is robust and uses up to 20 times fewer local minimization steps.
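The model-the-funnel step in this abstract can be illustrated with a deliberately simple stand-in (a one-dimensional toy, with an assumed quadratic model and made-up sampling; the paper's procedure is considerably richer): sample a rippled funnel inside the current trust region, fit a smooth quadratic model by least squares, and minimize the model to get the next candidate.

```python
import math

# Illustrative sketch: fit a smooth model c0 + c1*x + c2*x^2 to samples of
# a funnel-shaped function, then minimize the model in closed form.
def funnel(x):
    return x * x + 0.3 * math.sin(20.0 * x)   # quadratic funnel plus ripples

def fit_quadratic(xs, ys):
    # least-squares fit of [1, x, x^2] via the 3x3 normal equations
    s = [sum(x ** k for x in xs) for k in range(5)]
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]
    # Gauss-Jordan elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                m = A[r][col] / A[col][col]
                A[r] = [a - m * b for a, b in zip(A[r], A[col])]
    return [A[i][3] / A[i][i] for i in range(3)]

xs = [-2.0 + 0.25 * i for i in range(17)]          # samples in the trust region
c0, c1, c2 = fit_quadratic(xs, [funnel(x) for x in xs])
model_min = -c1 / (2.0 * c2)                       # minimizer of the smooth model
```

The fitted model ignores the high-frequency ripples and places its minimizer near the bottom of the funnel, which is the kind of candidate point a trust-region framework can then test against the true function.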