Results 1–10 of 32
A Locally-Biased Form of the DIRECT Algorithm
Journal of Global Optimization, 2001
Cited by 40 (4 self)
Abstract
In this paper we propose a form of the DIRECT algorithm that is strongly biased toward local search. This form should do well for small problems with a single global minimizer and only a few local minimizers. We motivate our formulation with some results on how the original formulation of the DIRECT algorithm clusters its search near a global minimizer. We report on the performance of our algorithm on a suite of test problems and observe that the algorithm performs particularly well when termination is based on a budget of function evaluations.
Key words: DIRECT, local clustering, local bias
1. Introduction. The DIRECT (DIviding RECTangles) algorithm [13, 14] is a pattern search method (in the sense of [17]) that balances local and global search in an attempt to efficiently find a global optimizer. Other deterministic sampling methods, such as implicit filtering [9, 15], MDS [6], Hooke-Jeeves [10], or Nelder-Mead [16], drive an approximate gradient to zero and are not designed for g...
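The dividing-rectangles idea behind DIRECT can be illustrated with a deliberately simplified one-dimensional sketch. This toy version fixes a single Lipschitz-style constant K instead of performing the algorithm's actual potentially-optimal selection over all constants at once; the function and parameter names are illustrative, not from the paper.

```python
def direct_1d(f, lo, hi, iters=50, K=2.0):
    """Toy 1-D DIRECT-style search: repeatedly pick the interval whose
    center value minus K * half-width (a Lipschitz lower bound) is
    smallest, trisect it, and sample the two new centers.  The real
    algorithm considers all K simultaneously via a convex-hull selection."""
    intervals = [(lo, hi, f((lo + hi) / 2.0))]   # (left, right, f(center))
    for _ in range(iters):
        best = min(intervals, key=lambda t: t[2] - K * (t[1] - t[0]) / 2.0)
        intervals.remove(best)
        a, b, fc = best
        w = (b - a) / 3.0
        for j in range(3):
            l = a + j * w
            c = l + w / 2.0
            # the middle third keeps the already-sampled center value
            intervals.append((l, l + w, fc if j == 1 else f(c)))
    return min(t[2] for t in intervals)
```

On a smooth one-minimum function such as (x - 0.7)^2 over [0, 1], the lower-bound selection quickly concentrates trisections around the minimizer, which is the clustering behavior the paper analyzes.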
Dynamic Data Structures for a Direct Search Algorithm
Computational Optimization and Applications, 2002
Cited by 31 (14 self)
Abstract
The DIRECT (DIviding RECTangles) algorithm of Jones, Perttunen, and Stuckman (Journal of Optimization Theory and Applications, vol. 79, no. 1, pp. 157–181, 1993), a variant of Lipschitzian methods for bound constrained global optimization, has proved effective even in higher dimensions. However, the performance of a DIRECT implementation in real applications depends on the characteristics of the objective function, the problem dimension, and the desired solution accuracy. Implementations with static data structures often fail in practice, since it is difficult to predict memory resource requirements in advance. This is especially critical in multidisciplinary engineering design applications, where the DIRECT optimization is just one small component of a much larger computation, and any component failure aborts the entire design process. To make the DIRECT global optimization algorithm efficient and robust on large-scale, multidisciplinary engineering problems, a set of dynamic data structures is proposed here to balance the memory requirements with execution time, while simultaneously adapting to arbitrary problem size. The focus of this paper is on design issues of the dynamic data structures, and related memory management strategies. Numerical computing techniques and modifications of Jones' original DIRECT algorithm in terms of stopping rules and box selection rules are also explored. Performance studies are done for synthetic test problems with multiple local optima. Results for application to a site-specific system simulator for wireless communications systems (S⁴W) are also presented to demonstrate the effectiveness of the proposed dynamic data structures for an implementation of DIRECT.
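As a rough illustration of the kind of organization the abstract argues for, DIRECT's boxes can be kept in per-diameter "columns" that appear and disappear as the search creates and exhausts size classes, so memory tracks the live boxes rather than a static worst-case bound. The class and method names below are illustrative, not the paper's actual data structures.

```python
import heapq
from collections import defaultdict

class BoxColumns:
    """Sketch of a dynamic column structure for DIRECT: boxes grouped by
    diameter class, each class a min-heap on the sampled function value,
    with classes allocated and freed on demand."""

    def __init__(self):
        self.columns = defaultdict(list)   # diameter -> heap of (f, box_id)

    def insert(self, diameter, fval, box_id):
        heapq.heappush(self.columns[diameter], (fval, box_id))

    def best_per_diameter(self):
        """Lowest f value in every diameter class -- the candidates from
        which DIRECT's potentially-optimal boxes are chosen."""
        return {d: h[0] for d, h in self.columns.items()}

    def pop(self, diameter):
        """Remove the best box of one class, releasing the class (and its
        memory) when it empties."""
        f, b = heapq.heappop(self.columns[diameter])
        if not self.columns[diameter]:
            del self.columns[diameter]
        return f, b
```

The point of the dynamic layout is exactly what the abstract emphasizes: the structure grows and shrinks with the actual box population instead of requiring the memory footprint to be predicted in advance.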
Convergence analysis of the DIRECT algorithm
North Carolina State University, Center for …, 2004
Cited by 19 (1 self)
Abstract. The DIRECT algorithm is a deterministic sampling method for bound constrained Lipschitz continuous optimization. We prove a subsequential convergence result for the DIRECT algorithm that quantifies some of the convergence observations in the literature. Our results apply to several variations on the original method, including one that will handle general constraints. We use techniques from nonsmooth analysis, and our framework is based on recent results for the MADS sampling algorithms.
Using DIRECT to solve an aircraft routing problem
Computational Optimization and Applications, 2002
Cited by 18 (3 self)
Abstract
In this paper we discuss a global optimization problem arising in the calculation of aircraft flight paths. Since gradient information for this problem may not be readily available, a direct-search algorithm (DIRECT), proposed by Jones et al. [11], appears to be a promising solution technique. We describe some numerical experience in which DIRECT is used in several different ways to solve a sample problem.
A multi-points criterion for deterministic parallel global optimization based on Gaussian processes
Journal of Global Optimization, in revision, 2009
Cited by 14 (1 self)
Abstract
The optimization of expensive-to-evaluate functions generally relies on metamodel-based exploration strategies. Many deterministic global optimization algorithms used in the field of computer experiments are based on Kriging (Gaussian process regression). Starting with a spatial predictor including a measure of uncertainty, they proceed by iteratively choosing the point maximizing a criterion which is a compromise between predicted performance and uncertainty. Distributing the evaluation of such numerically expensive objective functions on many processors is an appealing idea. Here we investigate a multi-points optimization criterion, the multi-points expected improvement (qEI), aimed at choosing several points at the same time. An analytical expression of the qEI is given when q = 2, and a consistent statistical estimate is given for the general case. We then propose two classes of heuristic strategies meant to approximately optimize the qEI, and apply them to Gaussian processes and to the classical Branin-Hoo test function. It is finally demonstrated within the covered example that the latter strategies perform as well as the best Latin Hypercubes and Uniform Designs ever found by simulation (2000 designs drawn at random for every q ∈ [1, 10]).
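For concreteness, the single-point expected improvement has a well-known closed form, and the multi-points version can be estimated by simulation, much as the abstract's "consistent statistical estimate" route suggests. The sketch below handles q = 2 under an assumed joint-Gaussian posterior with given means, standard deviations, and correlation; all names and parameters are illustrative, not the paper's.

```python
import math
import random

def ei(mu, sigma, fmin):
    """Closed-form expected improvement of one candidate whose posterior
    is N(mu, sigma^2), improvement measured against the current best fmin."""
    if sigma <= 0.0:
        return max(fmin - mu, 0.0)
    z = (fmin - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (fmin - mu) * cdf + sigma * pdf

def q_ei_mc(mus, sigmas, rho, fmin, n=20000, seed=0):
    """Monte Carlo estimate of the 2-point expected improvement
    E[max(fmin - min(Y1, Y2), 0)] for jointly Gaussian (Y1, Y2) with
    correlation rho, sampled via a Cholesky-style construction."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u1, u2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        y1 = mus[0] + sigmas[0] * u1
        y2 = mus[1] + sigmas[1] * (rho * u1 + math.sqrt(1.0 - rho * rho) * u2)
        total += max(fmin - min(y1, y2), 0.0)
    return total / n
```

Evaluating two points jointly can only help: the simulated qEI of a pair is at least the closed-form EI of either point alone, which is the intuition behind spending several parallel evaluations per iteration.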
Polynomial response surface approximations for the multidisciplinary design optimization of a high speed civil transport
Technical Reports of NCSTRL at Virginia Tech CS (TR-01-03), 2001
Cited by 12 (3 self)
Abstract. Surrogate functions have become an important tool in multidisciplinary design optimization to deal with noisy functions, high computational cost, and the practical difficulty of integrating legacy disciplinary computer codes. A combination of mathematical, statistical, and engineering techniques, well known in other contexts, has made polynomial surrogate functions viable for MDO. Despite the obvious limitations imposed by sparse high fidelity data in high dimensions and the locality of low order polynomial approximations, the success of the panoply of techniques based on polynomial response surface approximations for MDO shows that the implementation details are more important than the underlying approximation method (polynomial, spline, DACE, kernel regression, etc.). This paper selectively surveys some of the ancillary techniques (statistics, global search, parallel computing, variable complexity modeling) that augment the construction and use of polynomial surrogates.
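The core construction the survey builds on is ordinary least-squares fitting of a low-order polynomial to sampled responses. A toy one-variable quadratic fit via the normal equations is sketched below; real MDO codes work in many variables with regression packages, and the function name here is illustrative.

```python
def fit_quadratic(xs, ys):
    """Least-squares quadratic response surface y ~ c0 + c1*x + c2*x^2,
    solved via the 3x3 normal equations with Gaussian elimination and
    partial pivoting."""
    # Normal equations A c = b for the monomial basis [1, x, x^2].
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # Forward elimination with partial pivoting.
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, 3):
            m = A[r][k] / A[k][k]
            for c in range(k, 3):
                A[r][c] -= m * A[k][c]
            b[r] -= m * b[k]
    # Back substitution.
    coef = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):
        coef[k] = (b[k] - sum(A[k][j] * coef[j] for j in range(k + 1, 3))) / A[k][k]
    return coef
```

Once fitted, the cheap polynomial stands in for the expensive simulation during global search, which is precisely the role surrogates play in the survey's MDO setting.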
Dynamic system analysis and initial particles position in particle swarm optimization
In: IEEE Swarm Intelligence Symposium 2006, Indianapolis, 12–14, 2006
Cited by 12 (0 self)
Abstract
Abstract — This paper focuses on a solution technique for global optimization problems, where the objective function value is possibly computed by the numerical solution of a PDE system. The nature of these optimization problems is that of a ‘black-box’ type, where expensive simulations provide information to the optimizer, and each function evaluation could require several CPU-hours. The paper considers the evolutionary Particle Swarm Optimization (PSO) algorithm for the minimization of a nonlinear function in the global optimization frameworks described. We reformulate the standard iteration of PSO [10], [3] as a linear dynamic system, which is then investigated in order to provide indications for the assessment of the initial particles' position. We carry out our analysis on a generalized PSO iteration, which includes the standard one proposed in the literature; therefore, our results apply to standard PSO too, without any modification. In our scheme the path of any particle is possibly affected by the trajectories of all the other particles in the swarm. Our preliminary numerical experience, over a set of 35 standard test problems from the literature, confirms the theoretical analysis.
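The standard PSO iteration the paper recasts as a linear dynamic system can be sketched minimally as follows. The inertia and acceleration coefficients below are common textbook defaults, not the paper's generalized scheme, and all names are illustrative.

```python
import random

def pso(f, dim, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal standard PSO: velocity = inertia term + cognitive pull
    toward the personal best + social pull toward the global best."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    pf = [f(x) for x in X]                     # personal best values
    g = min(range(n_particles), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]                     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                # clamp positions to the box constraints
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
    return G, gf
```

Freezing the random coefficients turns each particle's update into a linear recurrence in (position, velocity), which is the dynamic-system view the paper uses to reason about where the initial particles should be placed.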
Global search based on efficient diagonal partitions and a set of Lipschitz constants
SIAM Journal on Optimization