Results 1–10 of 19
Derivative-free optimization: A review of algorithms and comparison of software implementations
Bayesian Guided Pattern Search for Robust Local Optimization
, 2008
Abstract

Cited by 16 (6 self)
Optimization for complex systems in engineering often involves the use of expensive computer simulation. By combining statistical emulation using treed Gaussian processes with pattern search optimization, we are able to perform robust local optimization more efficiently and effectively than using either method alone. Our approach is based on the augmentation of local search patterns with location sets generated through improvement prediction over the input space. We further develop a computational framework for asynchronous parallel implementation of the optimization algorithm. We demonstrate our methods on two standard test problems and our motivating example of calibrating a circuit device simulator. KEY WORDS: robust local optimization; improvement statistics; response surface methodology; treed Gaussian processes.
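The improvement prediction described in this abstract is commonly computed via the expected-improvement (EI) statistic; a minimal sketch for a minimization problem, assuming a surrogate already supplies a predictive mean `mu` and standard deviation `sigma` at a candidate point (the treed-Gaussian-process machinery of the paper itself is not reproduced here):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """EI(x) = (f_best - mu) * Phi(z) + sigma * phi(z), z = (f_best - mu) / sigma,
    for minimization; Phi/phi are the standard normal CDF/PDF."""
    if sigma <= 0.0:
        # Degenerate prediction: improvement is deterministic.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf
```

Candidate locations with the largest EI would then be appended to the pattern-search poll set, balancing predicted improvement against predictive uncertainty.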
An adaptive radial basis algorithm (ARBF) for expensive black-box global optimization
 Journal of Global Optimization, DOI 10.1007/s10898-007-9256-8, ISSN 0925-5001 (Print)
, 2007
Abstract

Cited by 9 (0 self)
Response surface methods based on kriging and radial basis function (RBF) interpolation have been successfully applied to solve expensive, i.e. computationally costly, global black-box nonconvex optimization problems. In this paper we describe extensions of these methods to handle linear, nonlinear, and integer constraints. In particular, algorithms for standard RBF and the new adaptive RBF (ARBF) are described. Note, however, that while the objective function may be expensive, we assume that any nonlinear constraints are either inexpensive or are incorporated into the objective function via penalty terms. Test results are presented on standard test problems, both nonconvex problems with linear and nonlinear constraints, and mixed-integer nonlinear problems (MINLP). Solvers in the TOMLAB Optimization Environment (http://tomopt.com/tomlab/) have been compared, specifically the three deterministic derivative-free solvers rbfSolve, ARBFMIP and EGO with three derivative-based mixed-integer nonlinear solvers, OQNLP, MINLPBB and MISQP, as well as the GENO solver implementing a stochastic genetic algorithm. Results show that the deterministic derivative-free methods compare well with the derivative-based ones, but the stochastic genetic algorithm solver is several orders of magnitude too slow for practical use. When the objective function for the test problems is costly to evaluate, the performance of the ARBF algorithm proves to be superior.
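As a reminder of the basic building block behind such solvers, an RBF surrogate interpolates the sampled objective values by solving a linear system in the kernel matrix; a minimal sketch with a Gaussian kernel (the TOMLAB solvers use more elaborate bases with polynomial tails, which are omitted here, and the shape parameter `gamma` is an illustrative assumption):

```python
import numpy as np

def rbf_fit(X, y, gamma=1.0):
    """Solve Phi w = y for the interpolation weights, where
    Phi[i, j] = exp(-gamma * ||x_i - x_j||^2) is the Gaussian kernel matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    Phi = np.exp(-gamma * d2)
    return np.linalg.solve(Phi, y)

def rbf_eval(X, w, x, gamma=1.0):
    """Evaluate the surrogate at a new point x given centers X and weights w."""
    d2 = ((X - x) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2) @ w
```

The surrogate reproduces the sampled values exactly at the data sites, which is what makes it usable as a cheap stand-in for the expensive objective inside the global search.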
A kriging-based method for the solution of mixed-integer nonlinear programs containing black-box functions
, 2009
Mixture surrogate models based on Dempster-Shafer theory for global optimization problems
 JOURNAL OF GLOBAL OPTIMIZATION
, 2011
A Scalarizing One-Stage Algorithm for Efficient Multi-Objective Optimization
Abstract

Cited by 2 (0 self)
A novel kriging-assisted algorithm is proposed for computationally expensive multi-objective optimization problems, such as those which arise in electromagnetic design. The algorithm combines the multiple objectives into a single objective, which it then optimizes using a one-stage method from single-objective optimization. Inequality and equality constraint handling techniques are included, as is a method for dealing with failed iterations. Its efficiency is demonstrated on the constrained optimization of a pair of Helmholtz coils. Index Terms—Kriging, optimization methods.
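The abstract does not say which scalarization is used; one common choice that can reach Pareto-optimal points for a given weight vector is the augmented weighted Chebyshev function, sketched below (the weights, ideal point, and augmentation constant `rho` are all illustrative assumptions, not taken from the paper):

```python
def chebyshev_scalarize(fs, weights, ideal, rho=1e-4):
    """Augmented weighted Chebyshev scalarization (minimization):
    max_i w_i * (f_i - z_i) plus a small linear augmentation term
    that breaks ties between weakly Pareto-optimal points."""
    terms = [w * (f - z) for f, w, z in zip(fs, weights, ideal)]
    return max(terms) + rho * sum(terms)
```

The resulting single-objective function can then be handed to any expensive-optimization machinery such as a kriging-assisted one-stage method.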
A Recursive Local Polynomial Approximation Method using Dirichlet Clouds and Radial Basis Functions
Abstract

Cited by 1 (1 self)
We present a recursive function approximation technique that does not require storage of the arriving data stream. Our work is motivated by algorithms in stochastic optimization that require approximating functions in a recursive setting, such as a stochastic approximation algorithm. The unique collection of features in this technique is essential for nonlinear modeling of large data sets, where storing the data becomes prohibitively expensive, and in circumstances where our knowledge about a given query point increases as new information arrives. The algorithm presented here provides locally adaptive parametric models (such as linear models). The local models are updated using recursive least squares, and only the statistical representatives of the local approximations are stored. The resulting scheme is fast and memory efficient without compromising accuracy in comparison to standard and some advanced techniques used for functional data analysis in the literature. We motivate the algorithm using synthetic data and illustrate it on several real data sets.
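The recursive least-squares update at the heart of such a scheme keeps only a coefficient vector and an inverse-covariance matrix per local model, never the raw data; a minimal sketch (the forgetting factor `lam` and the large diagonal initialization of `P` are illustrative assumptions, not details from the paper):

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive least-squares step for the linear model y ~ x @ theta.
    Uses the Sherman-Morrison identity to update the inverse covariance P
    in O(d^2), so no past observations need to be stored."""
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector
    theta = theta + k * (y - x @ theta)  # correct coefficients by the residual
    P = (P - np.outer(k, Px)) / lam      # downdate inverse covariance
    return theta, P
```

Feeding the update a stream of points from an exactly linear function recovers the underlying coefficients, which is the sense in which only a "statistical representative" of the data survives.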
Optimizing Radial Basis Functions by D.C. Programming and its use in Direct Search for Global Derivative-Free Optimization
, 2011
Abstract

Cited by 1 (0 self)
In this paper we address the global optimization of functions subject to bound and linear constraints without using derivatives of the objective function. We investigate the use of derivative-free models based on radial basis functions (RBFs) in the search step of direct-search methods of directional type. We also study the application of algorithms based on difference-of-convex (d.c.) programming to solve the resulting subproblems, which consist of the minimization of the RBF models subject to simple bounds on the variables. Extensive numerical results are reported on a test set of bound and linearly constrained problems.
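A directional direct-search method of the kind referred to above polls the objective along coordinate directions and shrinks the step when no poll point improves; a minimal bound-constrained compass-search sketch (the RBF search step and the d.c. subproblem solver from the paper are omitted, and the step-halving schedule is an illustrative choice):

```python
def compass_search(f, x, step=1.0, tol=1e-6, lb=None, ub=None):
    """Poll the 2n coordinate directions +/- step * e_i; accept any improving
    point, and halve the step size whenever a full poll fails."""
    n = len(x)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(n):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                if lb is not None and y[i] < lb[i]:
                    continue  # poll point violates lower bound
                if ub is not None and y[i] > ub[i]:
                    continue  # poll point violates upper bound
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step /= 2.0  # unsuccessful poll: refine the mesh
    return x, fx
```

In the framework of the paper, a surrogate-driven search step would be attempted before each poll; convergence theory rests on the poll step alone, so the sketch above is the fallback that guarantees progress.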