Results 1-10 of 25
Derivative-free optimization: A review of algorithms and comparison of software implementations
Comparison of Derivative-Free Optimization Methods for Groundwater Supply and Hydraulic Capture Community Problems
Advances in Water Resources, 2008
Abstract

Cited by 18 (7 self)
Management decisions involving groundwater supply and remediation often rely on optimization techniques to determine an effective strategy. We introduce several derivative-free sampling methods for solving constrained optimization problems that have not yet been considered in this field, and we include a genetic algorithm for completeness. Two well-documented community problems are used for illustration purposes: a groundwater supply problem and a hydraulic capture problem. The community problems were found to be challenging applications due to the objective functions being nonsmooth, nonlinear, and having many local minima. Because the results were found to be sensitive to initial iterates for some methods, guidance is provided in selecting initial iterates for these problems that improve the likelihood of achieving significant reductions in the objective function to be minimized. In addition, we suggest some potentially fruitful areas for future research.
Coevolution of Fitness Predictors
IEEE Transactions on Evolutionary Computation, 2008
Abstract

Cited by 17 (10 self)
Abstract—We present an algorithm that coevolves fitness predictors, optimized for the solution population, which reduce fitness evaluation cost and frequency, while maintaining evolutionary progress. Fitness predictors differ from fitness models in that they may or may not represent the objective fitness, opening opportunities to adapt selection pressures and diversify solutions. The use of coevolution addresses three fundamental challenges faced in past fitness approximation research: 1) the model learning investment; 2) the level of approximation of the model; and 3) the loss of accuracy. We discuss applications of this approach and demonstrate its impact on the symbolic regression problem. We show that coevolved predictors scale favorably with problem complexity on a series of randomly generated test problems. Finally, we present additional empirical results that demonstrate that fitness prediction can also reduce solution bloat and find solutions more reliably. Index Terms—Bloat Reduction, coevolution, fitness modeling, symbolic regression.
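The predictor idea above can be sketched concretely: a fitness predictor can be as simple as a small subset of the full test-case set, coevolved so that it ranks candidate solutions much as the exact fitness would. The sketch below is illustrative only; the function names, the subset representation, and the pairwise rank-agreement quality measure are our assumptions, not the paper's implementation:

```python
def exact_fitness(candidate, test_cases):
    # Full (expensive) evaluation: mean squared error over all test cases.
    return sum((candidate(x) - y) ** 2 for x, y in test_cases) / len(test_cases)

def predicted_fitness(candidate, predictor, test_cases):
    # Here a "fitness predictor" is just a small list of test-case indices;
    # evaluating on it is far cheaper than the full sweep.
    subset = [test_cases[i] for i in predictor]
    return sum((candidate(x) - y) ** 2 for x, y in subset) / len(subset)

def predictor_quality(predictor, candidates, test_cases):
    # A predictor is good if it orders candidates the way exact fitness does,
    # measured crudely as agreement on pairwise comparisons.
    agree = total = 0
    for i in range(len(candidates)):
        for j in range(i + 1, len(candidates)):
            exact = (exact_fitness(candidates[i], test_cases)
                     < exact_fitness(candidates[j], test_cases))
            pred = (predicted_fitness(candidates[i], predictor, test_cases)
                    < predicted_fitness(candidates[j], predictor, test_cases))
            agree += (exact == pred)
            total += 1
    return agree / total
```

In the coevolutionary loop, solutions would be selected using `predicted_fitness` while predictors are selected using a quality measure like `predictor_quality`, so both populations drive each other.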
The Correlated Knowledge Gradient for Simulation Optimization of Continuous Parameters Using Gaussian Process Regression
Abstract

Cited by 12 (6 self)
We extend the concept of the correlated knowledge-gradient policy for ranking and selection of a finite set of alternatives to the case of continuous decision variables. We propose an approximate knowledge gradient for problems with continuous decision variables in the context of a Gaussian process regression model in a Bayesian setting, along with an algorithm to maximize the approximate knowledge gradient. In the problem class considered, we use the knowledge gradient for continuous parameters to sequentially choose where to sample an expensive noisy function in order to find the maximum quickly. We show that the knowledge gradient for continuous decisions is a generalization of the efficient global optimization algorithm proposed by Jones, Schonlau, and Welch.
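For context, the efficient global optimization (EGO) algorithm that the knowledge gradient generalizes chooses sample points by maximizing expected improvement under a Gaussian process posterior. Below is a minimal sketch of that criterion for a maximization problem; the function name and interface are illustrative, and this is not the paper's knowledge-gradient policy itself:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a Gaussian posterior N(mu, sigma^2)
    over the best observed value f_best (maximization convention)."""
    if sigma <= 0.0:
        # Degenerate posterior: improvement is deterministic.
        return max(mu - f_best, 0.0)
    z = (mu - f_best) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    return (mu - f_best) * cdf + sigma * pdf
```

EGO evaluates this quantity from the GP posterior at each candidate point and samples where it is largest; the knowledge gradient replaces it with the expected increase in the value of the final decision.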
An adaptive radial basis algorithm (ARBF) for expensive black-box global optimization
Journal of Global Optimization, DOI 10.1007/s10898-007-9256-8, ISSN 0925-5001 (print), 2007
Abstract

Cited by 9 (0 self)
Response surface methods based on kriging and radial basis function (RBF) interpolation have been successfully applied to solve expensive, i.e. computationally costly, global black-box nonconvex optimization problems. In this paper we describe extensions of these methods to handle linear, nonlinear, and integer constraints. In particular, algorithms for standard RBF and the new adaptive RBF (ARBF) are described. Note, however, that while the objective function may be expensive, we assume that any nonlinear constraints are either inexpensive or are incorporated into the objective function via penalty terms. Test results are presented on standard test problems, both nonconvex problems with linear and nonlinear constraints, and mixed-integer nonlinear problems (MINLP). Solvers in the TOMLAB Optimization Environment (http://tomopt.com/tomlab/) have been compared, specifically the three deterministic derivative-free solvers rbfSolve, ARBFMIP and EGO with three derivative-based mixed-integer nonlinear solvers, OQNLP, MINLPBB and MISQP, as well as the GENO solver implementing a stochastic genetic algorithm. Results show that the deterministic derivative-free methods compare well with the derivative-based ones, but the stochastic genetic algorithm solver is several orders of magnitude too slow for practical use. When the objective function for the test problems is costly to evaluate, the performance of the ARBF algorithm proves to be superior.
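The RBF interpolation underlying such response surface methods can be sketched in a few lines. The example below fits a cubic RBF interpolant to 1-D data by solving the interpolation system directly with Gaussian elimination; it omits the polynomial tail that production solvers such as those in TOMLAB include (so the system is only guaranteed invertible for some point sets), and all names are illustrative:

```python
def rbf_fit(points, values):
    """Fit a cubic RBF interpolant s(x) = sum_i w_i * |x - x_i|^3
    through 1-D data (no polynomial tail, for brevity)."""
    n = len(points)
    # Interpolation matrix A[i][j] = phi(|x_i - x_j|) with phi(r) = r^3.
    a = [[abs(points[i] - points[j]) ** 3 for j in range(n)] for i in range(n)]
    b = list(values)
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= m * a[col][c]
            b[r] -= m * b[col]
    # Back substitution.
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(a[r][c] * w[c] for c in range(r + 1, n))
        w[r] = s / a[r][r]
    return w

def rbf_eval(points, w, x):
    # Evaluate the fitted surrogate at a new point x.
    return sum(wi * abs(x - xi) ** 3 for wi, xi in zip(w, points))
```

The resulting cheap surrogate `rbf_eval` is what the expensive black-box objective is replaced with when deciding where to sample next.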
Using Radial Basis Function Neural Networks to Calibrate Water Quality Model
International Journal of Intelligent Systems and Technologies, 2008
Abstract

Cited by 3 (0 self)
Modern management of water distribution systems (WDS) needs water quality models that can accurately predict the dynamics of water quality variations within the distribution system environment. Before water quality models can be applied to solve system problems, they must be calibrated. Although previous researchers used genetic algorithm (GA) solvers to calibrate the relevant parameters, the long computation times make that approach difficult to apply to medium- or large-scale real systems. In this paper a new method is designed that combines a macro model and a detailed model to optimize the water quality parameters. This combined algorithm uses radial basis function (RBF) metamodeling as a surrogate in the optimization, reducing the number of time-consuming water quality simulations and enabling rapid calibration of the pipe-wall reaction coefficients of the chlorine model of a large-scale WDS. Two case studies show the method to be efficient and promising, and it deserves wider application in the future. Keywords—Metamodeling, model calibration, radial basis function, water distribution system, water quality model.
Mixture surrogate models based on Dempster-Shafer theory for global optimization problems
Journal of Global Optimization, 2011
Multifidelity Methods for Multidisciplinary System Design
2012
Abstract

Cited by 1 (0 self)
Optimization of multidisciplinary systems is critical as slight performance improvements can provide significant benefits over the system’s life. However, optimization of multidisciplinary systems is often plagued by computationally expensive simulations and the need to iteratively solve a complex coupling relationship between subsystems. These challenges are typically severe enough as to prohibit formal system optimization. A solution is to use multifidelity optimization, where other lower-fidelity simulations may be used to approximate the behavior of the higher-fidelity simulation. Low-fidelity simulations are common in practice,
Optimizing Radial Basis Functions by D.C. Programming and its use in Direct Search for Global Derivative-Free Optimization
2011
Abstract

Cited by 1 (0 self)
In this paper we address the global optimization of functions subject to bound and linear constraints without using derivatives of the objective function. We investigate the use of derivative-free models based on radial basis functions (RBFs) in the search step of direct-search methods of directional type. We also study the application of algorithms based on difference of convex (d.c.) functions programming to solve the resulting subproblems which consist of the minimization of the RBF models subject to simple bounds on the variables. Extensive numerical results are reported with a test set of bound and linearly constrained problems.
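The search-step idea described above can be sketched as a directional direct-search method in which surrogate-proposed candidates are tried before the coordinate poll. This is a minimal illustration under our own simplifications (coordinate poll directions, step halving, and a generic candidate-generator callback standing in for the minimized RBF model), not the authors' algorithm:

```python
def direct_search(f, x0, surrogate_candidates=None, step=1.0, tol=1e-6, max_iter=200):
    """Directional direct-search minimization of f: R^n -> R.
    Before each poll, an optional search step tries points proposed by a
    surrogate (any callable returning candidate points); if one improves
    on the incumbent it is accepted and the poll is skipped."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        # Search step: surrogate-suggested candidates first.
        if surrogate_candidates is not None:
            for cand in surrogate_candidates(x, step):
                fc = f(cand)
                if fc < fx:
                    x, fx, improved = list(cand), fc, True
                    break
        if not improved:
            # Poll step: positive and negative coordinate directions.
            for i in range(n):
                for sign in (+1.0, -1.0):
                    y = list(x)
                    y[i] += sign * step
                    fy = f(y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
                        break
                if improved:
                    break
        if not improved:
            step *= 0.5  # unsuccessful iteration: contract the step size
    return x, fx
```

In the paper's setting, `surrogate_candidates` would return the minimizer of the RBF model over the current bounds, obtained by the d.c. programming subproblem solver; here it is left as a hypothetical callback.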