Results 1–10 of 20
Derivative-free optimization: A review of algorithms and comparison of software implementations
"... ..."
Analysis of Direct Searches for Discontinuous Functions
, 2010
"... It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional directsearch methods at a limit point of certain subsequences of unsuccessful iterates, if the function being minimized is Lipschitz continuous near the limit point. In ..."
Abstract

Cited by 23 (4 self)
It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional direct-search methods at a limit point of certain subsequences of unsuccessful iterates, if the function being minimized is Lipschitz continuous near the limit point. In this paper we generalize this result for discontinuous functions using Rockafellar generalized directional derivatives (upper subderivatives). We show that Rockafellar derivatives are also nonnegative along the limit directions of those subsequences of unsuccessful iterates when the function values converge to the function value at the limit point. This result is obtained assuming that the function is directionally Lipschitz with respect to the limit direction. It is also possible under appropriate conditions to establish more insightful results by showing that the sequence of points generated by these methods eventually approaches the limit point along the locally best branch or step function (when the number of steps is equal to two). The results of this paper are presented for constrained optimization and illustrated numerically.
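As a concrete illustration of the directional direct-search framework discussed in this abstract, the following is a minimal sketch of coordinate polling with step halving after unsuccessful iterations (a generic textbook variant, not the specific constrained method analyzed in the paper):

```python
import numpy as np

def direct_search(f, x0, step=1.0, tol=1e-8, max_iter=10_000):
    """Minimal directional direct search: poll along +/- coordinate
    directions; shrink the step only after an unsuccessful iteration."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    dirs = np.vstack([np.eye(n), -np.eye(n)])  # positive spanning set
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        for d in dirs:
            trial = x + step * d
            if f(trial) < fx:          # successful poll step
                x, fx = trial, f(trial)
                break
        else:
            step *= 0.5                # unsuccessful iteration: refine mesh
    return x, fx
```

The unsuccessful iterations (where all polls fail and the step shrinks) are exactly the subsequences along which the derivative results above apply.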
Smoothing and Worst-Case Complexity for Direct-Search Methods in Nonsmooth Optimization
, 2012
"... In the context of the derivativefree optimization of a smooth objective function, it has been shown that the worst case complexity of directsearch methods is of the same order as the one of steepest descent for derivativebased optimization, more precisely that the number of iterations needed to r ..."
Abstract

Cited by 8 (2 self)
In the context of the derivative-free optimization of a smooth objective function, it has been shown that the worst-case complexity of direct-search methods is of the same order as that of steepest descent for derivative-based optimization; more precisely, the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is proportional to the inverse of the threshold squared. Motivated by the lack of such a result in the nonsmooth case, we propose, analyze, and test a class of smoothing direct-search methods for the unconstrained optimization of nonsmooth functions. Given a parameterized family of smoothing functions for the nonsmooth objective function dependent on a smoothing parameter, this class of methods consists of applying a direct-search algorithm for a fixed value of the smoothing parameter until the step size is relatively small, after which the smoothing parameter is reduced and the process is repeated. One can show that the worst-case complexity (or cost) of this procedure is roughly one order of magnitude worse than that of direct search or steepest descent on smooth functions. The class of smoothing direct-search methods is also shown to enjoy asymptotic global convergence properties. Some preliminary numerical experiments indicate that this approach leads to better values of the objective function, pushing in some cases the optimization further, apparently without an additional cost in the number of function evaluations.
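The outer smoothing loop described above (direct search at a fixed smoothing parameter until the step size is small, then reduce the parameter and repeat) can be sketched as follows; the smoothing family sqrt(x_i^2 + mu^2) for the l1 norm is a hypothetical example choice, not taken from the paper:

```python
import numpy as np

def coordinate_search(f, x, step, step_tol):
    """Inner directional direct search (coordinate polling)."""
    n = x.size
    dirs = np.vstack([np.eye(n), -np.eye(n)])
    fx = f(x)
    while step >= step_tol:
        for d in dirs:
            if f(x + step * d) < fx:   # successful poll
                x = x + step * d
                fx = f(x)
                break
        else:
            step *= 0.5                # unsuccessful: refine step
    return x

def smoothing_direct_search(f_smooth, x0, mu=1.0, mu_min=1e-6):
    """Outer loop: direct search on f_smooth(., mu), then shrink mu."""
    x = np.asarray(x0, dtype=float)
    while mu > mu_min:
        # stop the inner search when the step is small relative to mu
        x = coordinate_search(lambda v: f_smooth(v, mu), x, step=1.0,
                              step_tol=0.1 * mu)
        mu *= 0.1
    return x
```

Usage: for the nonsmooth objective |x1| + |x2|, a smoothed version is `f_smooth = lambda v, mu: np.sum(np.sqrt(v**2 + mu**2))`, and the loop drives the iterates toward the nonsmooth minimizer at the origin.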
DIRECT SEARCH ALGORITHMS OVER RIEMANNIAN MANIFOLDS
, 2006
"... We generalize the NelderMead simplex and LTMADS algorithms and, the frame based methods for function minimization to Riemannian manifolds. Examples are given for functions defined on the special orthogonal Lie group SO(n) and the Grassmann manifold G(n, k). Our main examples are applying the genera ..."
Abstract

Cited by 8 (3 self)
We generalize the Nelder-Mead simplex and LTMADS algorithms, and the frame-based methods for function minimization, to Riemannian manifolds. Examples are given for functions defined on the special orthogonal Lie group SO(n) and the Grassmann manifold G(n, k). Our main examples apply the generalized LTMADS algorithm to equality constrained optimization problems and to the Whitney embedding problem for dimensionality reduction of data. A convergence analysis of the frame-based method is also given.
Radial basis function algorithms for large-scale nonlinearly constrained black-box optimization
 Presented at the 20th International Symposium on Mathematical Programming (ISMP)
"... Abstract. This paper presents a new algorithm for derivativefree optimization of expensive blackbox objective functions subject to expensive blackbox inequality constraints. The proposed algorithm, called ConstrLMSRBF, uses radial basis function (RBF) surrogate models and is an extension of the L ..."
Abstract

Cited by 6 (1 self)
Abstract. This paper presents a new algorithm for derivative-free optimization of expensive black-box objective functions subject to expensive black-box inequality constraints. The proposed algorithm, called ConstrLMSRBF, uses radial basis function (RBF) surrogate models and is an extension of the Local Metric Stochastic RBF (LMSRBF) algorithm by Regis and Shoemaker (2007a) that can handle black-box inequality constraints. Previous algorithms for the optimization of expensive functions using surrogate models have mostly dealt with bound constrained problems where only the objective function is expensive, and so the surrogate models are used to approximate the objective function only. In contrast, ConstrLMSRBF builds RBF surrogate models for the objective function and also for all the constraint functions in each iteration, and uses these RBF models to guide the selection of the next point where the objective and constraint functions will be evaluated. Computational results indicate that ConstrLMSRBF is better than alternative methods on 9 out of 14 test problems and on the MOPTA08 problem from the automotive industry (Jones 2008). The MOPTA08 problem has 124 decision variables and 68 inequality constraints and is considered a large-scale problem in the area of expensive black-box optimization. The alternative methods include a Mesh Adaptive Direct Search (MADS) algorithm (Abramson and Audet 2006, Audet and Dennis 2006) that uses a kriging-based surrogate model, the Multistart LMSRBF algorithm by Regis and Shoemaker
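The core surrogate-guided selection idea (fit interpolants to the objective and each constraint from already-evaluated points, then choose the candidate with the best predicted objective among those predicted feasible) can be sketched with a plain Gaussian RBF interpolant; this is an illustrative simplification, not the ConstrLMSRBF algorithm itself:

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Gaussian RBF interpolant weights through points X with values y."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-eps * d2)
    return np.linalg.solve(K, y)

def rbf_eval(X, w, x, eps=1.0):
    """Evaluate the interpolant (centers X, weights w) at a point x."""
    d2 = ((X - x) ** 2).sum(-1)
    return float(np.exp(-eps * d2) @ w)

def pick_next(X, f_vals, g_vals, candidates):
    """Among candidate points, return the one with the best surrogate
    objective value subject to the surrogate constraint g(x) <= 0."""
    wf = rbf_fit(X, f_vals)
    wg = rbf_fit(X, g_vals)
    feasible = [c for c in candidates if rbf_eval(X, wg, c) <= 0.0]
    pool = feasible if feasible else list(candidates)  # fall back if none
    return min(pool, key=lambda c: rbf_eval(X, wf, c))
```

In the full method the chosen point is then evaluated with the expensive objective and constraints and added to `X`; here a single selection step is shown.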
Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm
, 2007
"... This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSDMADS ..."
Abstract

Cited by 4 (3 self)
This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS) is an asynchronous parallel algorithm in which the processes solve problems over subsets of variables. The convergence analysis based on the Clarke calculus is essentially the same as for the MADS algorithm. A practical implementation is described and some numerical results on problems with up to 500 variables illustrate advantages and limitations of PSD-MADS.
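A sequential stand-in for the space-decomposition idea (each subproblem polls only its own block of variables, the rest held fixed) might look like this; the real PSD-MADS runs the subproblems asynchronously in parallel, which this sketch does not model:

```python
import numpy as np

def coordinate_poll(f, x, idx, step):
    """Poll +/- step along the coordinates in idx, others held fixed;
    return the best trial found (or x itself if none improves)."""
    fx, best = f(x), x
    for i in idx:
        for s in (step, -step):
            trial = x.copy()
            trial[i] += s
            if f(trial) < fx:
                fx, best = f(trial), trial
    return best

def psd_sketch(f, x0, blocks, step=1.0, tol=1e-6):
    """Cycle over variable blocks (a sequential stand-in for the
    asynchronous parallel subproblems); shrink the step when a full
    cycle over all blocks yields no improvement."""
    x = np.asarray(x0, dtype=float)
    while step > tol:
        improved = False
        for idx in blocks:
            nxt = coordinate_poll(f, x, idx, step)
            if f(nxt) < f(x):
                x, improved = nxt, True
        if not improved:
            step *= 0.5
    return x
```

The decomposition into blocks is what lets each subproblem stay low-dimensional even when the overall problem has hundreds of variables.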
Spent Potliner Treatment Process Optimization Using a MADS Algorithm
, 2005
"... Les textes publiés dans la série des rapports de recherche HEC n’engagent que la responsabilité de leurs auteurs. La publication de ces rapports de recherche bénéficie d’une subvention du Fonds québécois de la recherche sur la nature et les technologies. ..."
Abstract

Cited by 3 (2 self)
The texts published in the HEC research reports series are the sole responsibility of their authors. The publication of these research reports is supported by a grant from the Fonds québécois de la recherche sur la nature et les technologies.
LM-CMA: an Alternative to L-BFGS for Large Scale Black-box Optimization
"... The limited memory BFGS method (LBFGS) of Liu and Nocedal (1989) is often considered to be the method of choice for continuous optimization when first and/or second order information is available. However, the use of LBFGS can be complicated in a blackbox scenario where gradient information i ..."
Abstract

Cited by 1 (0 self)
The limited-memory BFGS method (L-BFGS) of Liu and Nocedal (1989) is often considered to be the method of choice for continuous optimization when first- and/or second-order information is available. However, the use of L-BFGS can be complicated in a black-box scenario where gradient information is not available and therefore must be numerically estimated. The accuracy of this estimation, obtained by finite difference methods, is often problem-dependent and may lead to premature convergence of the algorithm. In this paper, we demonstrate an alternative to L-BFGS, the limited memory Covariance Matrix Adaptation Evolution Strategy (LM-CMA) proposed by Loshchilov (2014). The LM-CMA is a stochastic derivative-free algorithm for numerical optimization of nonlinear, nonconvex optimization problems. Inspired by L-BFGS, the LM-CMA samples candidate solutions according to a covariance matrix reproduced from m direction vectors selected during the optimization process. The decomposition of the covariance matrix into Cholesky factors reduces the memory complexity to O(mn), where n is the number of decision variables. The time complexity of sampling one candidate solution is also O(mn), but scales as only about 25 scalar-vector multiplications in practice. The algorithm has an important property of invariance with respect to strictly increasing transformations of the objective function: such transformations do not compromise its ability to approach the optimum. The LM-CMA outperforms the original CMA-ES and its large-scale versions on nonseparable ill-conditioned problems, with a factor increasing with problem dimension. Invariance properties of the algorithm do not prevent it from demonstrating performance comparable to L-BFGS on nontrivial large-scale smooth and nonsmooth optimization problems.
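The limited-memory sampling idea (reproduce the action of a covariance factor on a Gaussian vector via m stored direction vectors, at O(mn) cost instead of O(n^2)) can be illustrated as below; the rank-one coefficients here are hypothetical and do not follow the actual LM-CMA update rule:

```python
import numpy as np

def sample_limited_memory(mean, sigma, P, coeffs, rng):
    """Draw one candidate in O(m*n): start from z ~ N(0, I) and apply
    one rank-one transformation per stored direction vector p_j.
    Illustrative only; the coefficients c_j are assumed given, not
    produced by the LM-CMA adaptation rule."""
    n = mean.size
    z = rng.standard_normal(n)
    for p, c in zip(P, coeffs):       # m rank-one updates, O(n) each
        z = z + c * (p @ z) * p       # z <- (I + c * p p^T) z
    return mean + sigma * z
```

With a single stored direction p = e1 and c = 1, the factor (I + p p^T) doubles the first coordinate, so the sampled distribution has variance 4 along e1 and variance 1 elsewhere: anisotropy from purely rank-one, limited-memory storage.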
Nonasymptotic densities for shape reconstruction. Abstract and Applied Analysis
, 2014
"... In this work we study the problem of reconstructing shapes from simple nonasymptotic densities measured only along shape boundaries. The particular density we study is also known as the integral area invariant and corresponds to the area of a disk centered on the boundary that is also inside the sh ..."
Abstract

Cited by 1 (1 self)
In this work we study the problem of reconstructing shapes from simple nonasymptotic densities measured only along shape boundaries. The particular density we study is also known as the integral area invariant and corresponds to the area of a disk centered on the boundary that is also inside the shape. It is easy to show uniqueness when these densities are known for all radii in a neighborhood of r = 0, but much less straightforward when we assume we know it for (almost) only one r > 0. We present variations of uniqueness results for reconstruction of polygons and (a dense set of) smooth curves under certain regularity conditions.
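The integral area invariant itself is straightforward to estimate numerically; the sketch below grids the disk's bounding square and counts cells inside both the disk and the shape (a simple illustration, assuming a vectorized indicator function for the shape):

```python
import numpy as np

def integral_area_invariant(inside, p, r, grid=400):
    """Estimate the area of the disk of radius r centered at boundary
    point p that lies inside the shape. `inside(X, Y)` is a vectorized
    indicator of the shape; the estimate sums grid cells that fall in
    both the disk and the shape."""
    xs = np.linspace(p[0] - r, p[0] + r, grid)
    ys = np.linspace(p[1] - r, p[1] + r, grid)
    X, Y = np.meshgrid(xs, ys)
    in_disk = (X - p[0]) ** 2 + (Y - p[1]) ** 2 <= r ** 2
    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    return np.count_nonzero(in_disk & inside(X, Y)) * cell
```

For a locally straight boundary (e.g. the half-plane y <= 0 with p at the origin), exactly half the disk lies inside the shape, so the invariant is pi * r^2 / 2, a useful sanity check for the estimator.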
DOI 10.1007/s10589-015-9753-5 Mesh adaptive direct search with second directional derivative-based Hessian update
, 2014
"... Abstract The subject of this paper is inequality constrained blackbox optimization with mesh adaptive direct search (MADS). The MADS search step can include additional strategies for accelerating the convergence and improving the accuracy of the solution. The strategy proposed in this paper involv ..."
Abstract
Abstract The subject of this paper is inequality constrained black-box optimization with mesh adaptive direct search (MADS). The MADS search step can include additional strategies for accelerating the convergence and improving the accuracy of the solution. The strategy proposed in this paper involves building a quadratic model of the function and linear models of the constraints. The quadratic model is built by means of a second directional derivative-based Hessian update. The linear terms are obtained by linear regression. The resulting quadratic programming (QP) problem is solved with a dedicated solver and the original functions are evaluated at the QP solution. The proposed search strategy is computationally less expensive than the quadratically constrained QP strategy in the state-of-the-art MADS implementation (NOMAD). The proposed MADS variant (QPMADS) and NOMAD are compared on four sets of test problems. QPMADS outperforms NOMAD on all four of them for all but the smallest computational budgets.
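The second directional derivative estimates underlying such a Hessian update can be illustrated with central second differences; the off-diagonal recovery via d = e_i + e_j uses the standard identity d^T H d = H_ii + 2 H_ij + H_jj, shown here as a sketch rather than the paper's actual update scheme:

```python
import numpy as np

def directional_curvature(f, x, d, h=1e-4):
    """Second directional derivative d^T H d, estimated from three
    function values along direction d (central second difference)."""
    return (f(x + h * d) - 2.0 * f(x) + f(x - h * d)) / h ** 2

def estimate_hessian(f, x, h=1e-4):
    """Assemble a full Hessian estimate from directional curvatures
    along e_i and e_i + e_j; exact for quadratics up to roundoff."""
    n = x.size
    H = np.zeros((n, n))
    I = np.eye(n)
    for i in range(n):
        H[i, i] = directional_curvature(f, x, I[i], h)
    for i in range(n):
        for j in range(i + 1, n):
            c = directional_curvature(f, x, I[i] + I[j], h)
            H[i, j] = H[j, i] = 0.5 * (c - H[i, i] - H[j, j])
    return H
```

Since each curvature needs only function values, estimates of this kind fit naturally into a derivative-free search step that already evaluates the objective at nearby mesh points.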