Results 1–10 of 19
Mesh adaptive direct search algorithms for constrained optimization
SIAM J. Optim., 2004
"... This paper introduces the Mesh Adaptive Direct Search (MADS) class of algorithms for nonlinear optimization. MADS extends the Generalized Pattern Search (GPS) class by allowing local exploration, called polling, in a dense set of directions in the space of optimization variables. This means that u ..."
Abstract

Cited by 146 (15 self)
 Add to MetaCart
This paper introduces the Mesh Adaptive Direct Search (MADS) class of algorithms for nonlinear optimization. MADS extends the Generalized Pattern Search (GPS) class by allowing local exploration, called polling, in a dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity at infeasible points and treating the problem as unconstrained. The main GPS convergence result identifies limit points where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many constraints for GPS. The MADS class of algorithms extends this result; the set of refining directions may even be dense in R^n, although we give an example where it is not. We present an implementable instance of MADS, and we illustrate and compare it with GPS on some test problems. We also illustrate the limitations of our results with examples.
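The extreme barrier approach described in this abstract is simple to state in code. The sketch below is a toy GPS-style coordinate poll with the barrier, not MADS itself (it has no adaptive mesh and polls only the 2n coordinate directions); all names and parameters are illustrative.

```python
import math

def pattern_search_extreme_barrier(f, x0, feasible, step=1.0, min_step=1e-6):
    """Toy pattern search using the extreme barrier: infeasible trial
    points receive the objective value +inf, so the constrained problem
    is treated as if it were unconstrained."""
    def barrier(x):
        return f(x) if feasible(x) else math.inf

    x, fx = list(x0), barrier(list(x0))
    n = len(x)
    while step > min_step:
        improved = False
        for i in range(n):            # poll the +/- coordinate directions
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = barrier(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5               # refine the mesh after a failed poll
    return x, fx
```

Because only the 2n coordinate directions are polled, such a search can stall on a constraint boundary that is not axis-aligned; generating an asymptotically dense set of poll directions is exactly what distinguishes MADS from GPS on this point.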
Convergence of mesh adaptive direct search to second-order stationary points
tel-00639257, version 1, 8 Nov 2011
, 2006
"... Abstract. A previous analysis of secondorder behavior of generalized pattern search algorithms for unconstrained and linearly constrained minimization is extended to the more general class of mesh adaptive direct search (MADS) algorithms for general constrained optimization. Because of the ability ..."
Abstract

Cited by 20 (2 self)
 Add to MetaCart
(Show Context)
A previous analysis of second-order behavior of generalized pattern search algorithms for unconstrained and linearly constrained minimization is extended to the more general class of mesh adaptive direct search (MADS) algorithms for general constrained optimization. Because of the ability of MADS to generate an asymptotically dense set of search directions, we are able to establish reasonable conditions under which a subsequence of MADS iterates converges to a limit point satisfying second-order necessary or sufficient optimality conditions for general set-constrained optimization problems.
Optimistic Optimization of a Deterministic Function without the Knowledge of its Smoothness
"... We consider a global optimization problem of a deterministic functionf in a semimetric space, given a finite budget ofnevaluations. The functionf is assumed to be locally smooth (around one of its global maxima) with respect to a semimetric ℓ. We describe two algorithms based on optimistic explorat ..."
Abstract

Cited by 20 (4 self)
 Add to MetaCart
(Show Context)
We consider a global optimization problem for a deterministic function f in a semimetric space, given a finite budget of n evaluations. The function f is assumed to be locally smooth (around one of its global maxima) with respect to a semimetric ℓ. We describe two algorithms based on optimistic exploration that use a hierarchical partitioning of the space at all scales. A first contribution is an algorithm, DOO, that requires the knowledge of ℓ. We report a finite-sample performance bound in terms of a measure of the quantity of near-optimal states. We then define a second algorithm, SOO, which does not require the knowledge of the semimetric ℓ under which f is smooth, and whose performance is almost as good as that of DOO optimally fitted.
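The optimistic exploration principle behind DOO fits in a few lines. The following 1-D sketch is an illustration, not the paper's algorithm: every interval is scored by the upper bound f(center) + L * half-width that an L-Lipschitz assumption permits, and the most optimistic interval is trisected. (SOO's point, not sketched here, is to do without knowledge of L.)

```python
import heapq

def doo_maximize(f, lo, hi, lipschitz, n_evals=200):
    """1-D DOO-style optimistic search: repeatedly trisect the interval
    with the largest optimistic upper bound f(center) + L * half_width."""
    def cell(a, b):
        c = 0.5 * (a + b)
        fc = f(c)
        ub = fc + lipschitz * 0.5 * (b - a)   # optimistic bound on the cell
        return (-ub, a, b, c, fc)             # negate ub: heapq is a min-heap

    heap = [cell(lo, hi)]
    best_x, best_f = heap[0][3], heap[0][4]
    evals = 1
    while evals + 3 <= n_evals:
        _, a, b, c, fc = heapq.heappop(heap)  # most optimistic cell
        w = (b - a) / 3.0
        for child in (cell(a, a + w), cell(a + w, a + 2 * w), cell(a + 2 * w, b)):
            heapq.heappush(heap, child)
            evals += 1
            if child[4] > best_f:
                best_x, best_f = child[3], child[4]
    return best_x, best_f
```

With a valid Lipschitz constant, the cell containing a global maximizer always has a nonnegative optimistic bound relative to the optimum, so expansion concentrates around it.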
Comparison of Derivative-Free Optimization Methods for Groundwater Supply and Hydraulic Capture Community Problems
Advances in Water Resources, 2008
"... Management decisions involving groundwater supply and remediation often rely on optimization techniques to determine an effective strategy. We introduce several derivativefree sampling methods for solving constrained optimization problems that have not yet been considered in this field, and we incl ..."
Abstract

Cited by 18 (7 self)
 Add to MetaCart
Management decisions involving groundwater supply and remediation often rely on optimization techniques to determine an effective strategy. We introduce several derivative-free sampling methods for solving constrained optimization problems that have not yet been considered in this field, and we include a genetic algorithm for completeness. Two well-documented community problems are used for illustration purposes: a groundwater supply problem and a hydraulic capture problem. The community problems were found to be challenging applications because the objective functions are nonsmooth, nonlinear, and have many local minima. Because the results were found to be sensitive to initial iterates for some methods, guidance is provided in selecting initial iterates for these problems that improves the likelihood of achieving significant reductions in the objective function to be minimized. In addition, we suggest some potentially fruitful areas for future research.
Developing Portfolios of Water Supply Transfers
, 2005
"... Developing Portfolios of Water Supply Transfers Most cities rely on firm water supply capacity to meet demand, but increasing scarcity and supply costs are encouraging greater use of temporary transfers (e.g., spot leases, options). This raises questions regarding how best to coordinate the use of t ..."
Abstract

Cited by 14 (4 self)
 Add to MetaCart
(Show Context)
Most cities rely on firm water supply capacity to meet demand, but increasing scarcity and supply costs are encouraging greater use of temporary transfers (e.g., spot leases, options). This raises questions regarding how best to coordinate the use of these transfers in meeting cost and reliability objectives. This work combines a hydrologic-water-market simulation with an optimization approach to identify portfolios of permanent rights, options, and leases that minimize the expected cost of meeting a city's annual demand with a specified reliability. Spot market prices are linked to hydrologic conditions and described by monthly lease price distributions, which are used to price options via a risk-neutral approach. Monthly choices regarding when and how much water to acquire through temporary transfers are made on the basis of anticipatory decision rules related to the ratio of expected supply to expected demand. The simulation is linked with an algorithm that uses an implicit filtering search method designed for solution surfaces that exhibit high-frequency, low-amplitude noise. This simulation-optimization approach is applied to a region that currently supports an active water market, with ...
A Note on the Nonnegativity of Continuous-time ARMA and GARCH Processes.
, 2006
"... Abstract. A general approach for modeling the volatility process in continuoustime is based on the convolution of a kernel with a nondecreasing Lévy process, which is nonnegative if the kernel is nonnegative. Within the framework of Continuoustime AutoRegressive MovingAverage (CARMA) process ..."
Abstract

Cited by 5 (0 self)
 Add to MetaCart
A general approach for modeling the volatility process in continuous time is based on the convolution of a kernel with a non-decreasing Lévy process, which is nonnegative if the kernel is nonnegative. Within the framework of Continuous-time AutoRegressive Moving-Average (CARMA) processes, we derive a necessary condition for the kernel to be nonnegative, and propose a numerical method for checking the nonnegativity of a kernel function. These results can be lifted to solve a similar problem arising in another approach to modeling volatility, the COntinuous-time Generalized AutoRegressive Conditional Heteroscedastic (COGARCH) processes.
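The nonnegativity question in this abstract can be made concrete. Below, a naive grid check (merely illustrative; the paper proposes an actual numerical method) is applied to the kernel of a CARMA(2,1)-style process with autoregressive roots -1 and -2 and moving-average polynomial b(z) = b0 + z, written in its sum-of-exponentials form; the specific numbers are chosen for the example.

```python
import math

def kernel_nonnegative_on_grid(kernel, t_max, n=2000, tol=1e-12):
    """Crude check: sample the kernel on a grid over [0, t_max] and
    report whether every sampled value is (numerically) nonnegative."""
    return all(kernel(k * t_max / n) >= -tol for k in range(n + 1))

def carma21_kernel(b0):
    """Kernel g(t) = (b0 - 1) e^{-t} + (2 - b0) e^{-2t}: the partial-
    fraction form for AR roots -1, -2 and MA polynomial b(z) = b0 + z."""
    return lambda t: (b0 - 1.0) * math.exp(-t) + (2.0 - b0) * math.exp(-2.0 * t)
```

For b0 = 1.5 both exponential weights are positive, so the kernel is nonnegative; for b0 = 0.5 the kernel turns negative beyond t = ln 3, which the grid check detects.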
Efficient multistart strategies for local search algorithms
, 2009
"... Local search algorithms applied to optimization problems often suffer from getting trapped in a local optimum. The common solution for this deficiency is to restart the algorithm when no progress is observed. Alternatively, one can start multiple instances of a local search algorithm, and allocate c ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
(Show Context)
Local search algorithms applied to optimization problems often suffer from getting trapped in a local optimum. The common remedy for this deficiency is to restart the algorithm when no progress is observed. Alternatively, one can start multiple instances of a local search algorithm and allocate computational resources (in particular, processing time) to the instances depending on their behavior. Hence, a multistart strategy has to decide (dynamically) when to allocate additional resources to a particular instance and when to start new instances. In this paper we propose multistart strategies motivated by works on multi-armed bandit problems and on Lipschitz optimization with an unknown constant. The strategies continuously estimate the potential performance of each algorithm instance by supposing a convergence rate of the local search algorithm up to an unknown constant, and in every phase allocate resources to those instances that could converge to the optimum for a particular range of the constant. Asymptotic bounds are given on the performance of the strategies. In particular, we prove that at most a quadratic increase in the number of times the target function is evaluated is needed to achieve the performance of a local search algorithm started from the attraction region of the optimum. Experiments are provided using SPSA (Simultaneous Perturbation Stochastic Approximation) and k-means as local search algorithms, and the results indicate that the proposed strategies work well in practice and, in all cases studied, need only logarithmically more evaluations of the target function, as opposed to the theoretically suggested quadratic increase.
EQUALITY CONSTRAINTS, RIEMANNIAN MANIFOLDS AND DIRECT SEARCH METHODS
, 2006
"... We present a general procedure for handling equality constraints in optimization problems that is of particular use in direct search methods. The central idea is to treat the equality constraints as implicitly defining a Riemannian manifold. Then the function and inequality constraints can be pulled ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
(Show Context)
We present a general procedure for handling equality constraints in optimization problems that is of particular use in direct search methods. The central idea is to treat the equality constraints as implicitly defining a Riemannian manifold. The objective function and inequality constraints can then be pulled back to the tangent spaces of this manifold, and one can deal with the resulting inequality-constrained optimization problem using any method of one's choosing. An advantage of this procedure is the implicit reduction in dimensionality of the original problem to that of the manifold. Additionally, under some restrictions, convergence results for the method used to solve the inequality-constrained optimization problem carry over directly to our procedure.
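The pull-back idea is easiest to see on one concrete manifold. The sketch below is an illustration, not the paper's general procedure: a coordinate poll on the unit sphere {x : ||x|| = 1} in R^3, where each poll direction is projected onto the tangent space at the current point and the trial point is retracted to the sphere by normalization.

```python
import math

def sphere_coordinate_poll(f, x0, step=0.5, min_step=1e-6):
    """Coordinate poll on the unit sphere ||x|| = 1, as a concrete
    instance of the tangent-space pull-back idea: project each poll
    direction onto the tangent space at x, then retract to the sphere."""
    def normalize(v):
        nrm = math.sqrt(sum(c * c for c in v))
        return [c / nrm for c in v]

    x = normalize(x0)
    fx = f(x)
    n = len(x)
    while step > min_step:
        improved = False
        for i in range(n):
            for s in (step, -step):
                d = [0.0] * n
                d[i] = s
                dot = sum(di * xi for di, xi in zip(d, x))
                # Tangent-space projection d - (d . x) x, then retraction.
                y = normalize([xi + di - dot * xi for xi, di in zip(x, d)])
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx
```

Minimizing f(x) = x3 on the sphere drives the iterates to the south pole (0, 0, -1); the three-dimensional ambient problem is effectively explored through two-dimensional tangent spaces, the dimensionality reduction the abstract mentions.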
Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm
, 2007
"... This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSDMADS ..."
Abstract

Cited by 4 (3 self)
 Add to MetaCart
This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS) is an asynchronous parallel algorithm in which the processes solve problems over subsets of variables. The convergence analysis, based on the Clarke calculus, is essentially the same as for the MADS algorithm. A practical implementation is described, and some numerical results on problems with up to 500 variables illustrate the advantages and limitations of PSD-MADS.
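The subset-of-variables decomposition can be mimicked sequentially. The toy sketch below is illustrative only: PSD-MADS runs MADS instances asynchronously on separate processes, whereas here the variable blocks are simply polled one after another, with the step size halved after a full unsuccessful cycle.

```python
def block_decomposition_search(f, x0, block_size=2, step=1.0, min_step=1e-4):
    """Sequential caricature of space decomposition: coordinate polls
    restricted to fixed blocks of variables, cycled until no block
    improves, at which point the common step size is refined."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    blocks = [list(range(i, min(i + block_size, n))) for i in range(0, n, block_size)]
    while step > min_step:
        improved = False
        for block in blocks:          # in PSD-MADS, one subproblem per process
            for i in block:
                for s in (step, -step):
                    y = list(x)
                    y[i] += s
                    fy = f(y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx
```

Each block search only sees a low-dimensional subproblem, which is the mechanism that lets the parallel version scale to hundreds of variables.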
Towards a new DIRECT algorithm: a two-points based sampling method
, 2005
"... Abstract. The DIRECT algorithm was motivated by a modification to Lipschitzian optimization. The algorithm begins its search by sampling the objective function at the midpoint of an interval, where this function attains its lowest value, and then divides this interval by trisecting it. One of its ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
(Show Context)
The DIRECT algorithm was motivated by a modification to Lipschitzian optimization. The algorithm begins its search by sampling the objective function at the midpoint of an interval, where this function attains its lowest value, and then divides this interval by trisecting it. One of its weaknesses is that if a global minimum lies on the boundary, which the midpoint samples can never reach, the convergence of the algorithm will be unnecessarily slow. We present a one-dimensional variant of the DIRECT algorithm based on another strategy for subdividing the search domain. It consists of decreasing the number of intervals and increasing the number of sampling points by interchanging the roles of dividing and sampling at some steps of the DIRECT algorithm, thus overcoming this disadvantage. 1. Introduction. The original DIRECT ("DIviding RECTangles") algorithm, developed by Donald R. Jones et al. [10], was essentially designed for finding the global minimum of a multivariate function. The algorithm was motivated to overcome some of the problems ...
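The trisection mechanics this abstract modifies can be shown in a miniature 1-D DIRECT-style routine. The sketch simplifies the selection rule drastically (real DIRECT selects all "potentially optimal" intervals via a lower-bound convex hull): here only the widest interval and the interval with the best midpoint are trisected each iteration, with the middle child inheriting the already-sampled midpoint. It shares the weakness the abstract discusses: only midpoints are ever sampled, so a boundary minimum is approached slowly.

```python
def direct_1d(f, lo, hi, n_iters=40):
    """Miniature 1-D DIRECT-style search: sample interval midpoints and
    repeatedly trisect (a) the widest interval, for global exploration,
    and (b) the interval with the best midpoint value, for local
    refinement. Real DIRECT's potentially-optimal selection is subtler."""
    def mid(a, b):
        return 0.5 * (a + b)

    intervals = [(lo, hi, f(mid(lo, hi)))]    # (left, right, f(midpoint))
    for _ in range(n_iters):
        widest = max(range(len(intervals)), key=lambda i: intervals[i][1] - intervals[i][0])
        best = min(range(len(intervals)), key=lambda i: intervals[i][2])
        for idx in sorted({widest, best}, reverse=True):
            a, b, fc = intervals.pop(idx)
            w = (b - a) / 3.0
            intervals.append((a, a + w, f(mid(a, a + w))))
            intervals.append((a + w, a + 2.0 * w, fc))      # reuse midpoint sample
            intervals.append((a + 2.0 * w, b, f(mid(a + 2.0 * w, b))))
    a, b, fc = min(intervals, key=lambda t: t[2])
    return mid(a, b), fc
```

The best midpoint value is monotone non-increasing across iterations, since every interval keeps its sample when trisected.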