Results 1 - 10 of 20
Derivative-free optimization: A review of algorithms and comparison of software implementations
"... ..."
Analysis of Direct Searches for Discontinuous Functions
, 2010
"... It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional direct-search methods at a limit point of certain subsequences of unsuccessful iterates, if the function being minimized is Lipschitz continuous near the limit point. In ..."
Cited by 23 (4 self)
Abstract:
It is known that the Clarke generalized directional derivative is nonnegative along the limit directions generated by directional direct-search methods at a limit point of certain subsequences of unsuccessful iterates, if the function being minimized is Lipschitz continuous near the limit point. In this paper we generalize this result for discontinuous functions using Rockafellar generalized directional derivatives (upper subderivatives). We show that Rockafellar derivatives are also nonnegative along the limit directions of those subsequences of unsuccessful iterates when the function values converge to the function value at the limit point. This result is obtained assuming that the function is directionally Lipschitz with respect to the limit direction. It is also possible under appropriate conditions to establish more insightful results by showing that the sequence of points generated by these methods eventually approaches the limit point along the locally best branch or step function (when the number of steps is equal to two). The results of this paper are presented for constrained optimization and illustrated numerically.
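For context, a minimal statement of the Lipschitz-case result that this paper generalizes, written with the standard Clarke generalized directional derivative (the Rockafellar upper subderivative used for the discontinuous case is not reproduced here):

```latex
% Clarke generalized directional derivative of f at the limit point x_* along d
f^{\circ}(x_*; d) \;=\; \limsup_{y \to x_*,\; t \downarrow 0} \frac{f(y + t\,d) - f(y)}{t}.
% Known result for f Lipschitz continuous near x_*: along every refining direction d
% associated with a convergent subsequence of unsuccessful iterates,
f^{\circ}(x_*; d) \;\ge\; 0.
```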
Smoothing and Worst-Case Complexity for Direct-Search Methods in Nonsmooth Optimization
, 2012
"... In the context of the derivative-free optimization of a smooth objective function, it has been shown that the worst case complexity of direct-search methods is of the same order as the one of steepest descent for derivative-based optimization, more precisely that the number of iterations needed to r ..."
Cited by 8 (2 self)
Abstract:
In the context of the derivative-free optimization of a smooth objective function, it has been shown that the worst-case complexity of direct-search methods is of the same order as that of steepest descent for derivative-based optimization; more precisely, the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is proportional to the inverse of the threshold squared. Motivated by the lack of such a result in the nonsmooth case, we propose, analyze, and test a class of smoothing direct-search methods for the unconstrained optimization of nonsmooth functions. Given a parameterized family of smoothing functions for the nonsmooth objective function dependent on a smoothing parameter, this class of methods consists of applying a direct-search algorithm for a fixed value of the smoothing parameter until the step size is relatively small, after which the smoothing parameter is reduced and the process is repeated. One can show that the worst-case complexity (or cost) of this procedure is roughly one order of magnitude worse than that of direct search or steepest descent on smooth functions. The class of smoothing direct-search methods is also shown to enjoy asymptotic global convergence properties. Some preliminary numerical experiments indicate that this approach leads to better values of the objective function, in some cases pushing the optimization further, apparently without additional cost in the number of function evaluations.
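A minimal sketch of the outer loop just described: run a direct search on the smoothed function for a fixed smoothing parameter until the step size is small relative to that parameter, then shrink the parameter and repeat. The compass-search inner solver, the parameter schedule, and the sqrt smoothing of |x| are illustrative choices for this sketch, not the paper's setup:

```python
import numpy as np

def coordinate_direct_search(f, x, step, min_step):
    """Elementary compass search: poll +/- each coordinate direction,
    double the step on success, halve it on failure."""
    n = len(x)
    fx = f(x)
    while step > min_step:
        improved = False
        for i in range(n):
            for sgn in (+1.0, -1.0):
                y = x.copy()
                y[i] += sgn * step
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        step = step * 2.0 if improved else step / 2.0
    return x, fx

def smoothing_direct_search(f_smooth, x0, mus=(1.0, 1e-1, 1e-2, 1e-3)):
    """Outer loop of a smoothing direct-search method (sketch): for each value
    of the smoothing parameter mu, run direct search on the smoothed function
    until the step size is small relative to mu, then reduce mu."""
    x = np.asarray(x0, dtype=float)
    for mu in mus:
        x, _ = coordinate_direct_search(lambda z: f_smooth(z, mu),
                                        x, step=1.0, min_step=0.1 * mu)
    return x

# Example: smooth the nonsmooth objective sum(|x_i|) with sqrt(x_i^2 + mu^2).
if __name__ == "__main__":
    f_smooth = lambda x, mu: np.sum(np.sqrt(x**2 + mu**2))
    print(smoothing_direct_search(f_smooth, x0=[2.0, -3.0]))
```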
DIRECT SEARCH ALGORITHMS OVER RIEMANNIAN MANIFOLDS
, 2006
"... We generalize the Nelder-Mead simplex and LTMADS algorithms and, the frame based methods for function minimization to Riemannian manifolds. Examples are given for functions defined on the special orthogonal Lie group SO(n) and the Grassmann manifold G(n, k). Our main examples are applying the genera ..."
Cited by 8 (3 self)
Abstract:
We generalize the Nelder-Mead simplex and LTMADS algorithms, as well as frame-based methods, for function minimization to Riemannian manifolds. Examples are given for functions defined on the special orthogonal Lie group SO(n) and the Grassmann manifold G(n, k). Our main examples apply the generalized LTMADS algorithm to equality-constrained optimization problems and to the Whitney embedding problem for dimensionality reduction of data. A convergence analysis of the frame-based method is also given.
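A small sketch of the underlying idea for SO(n): poll directions live in the tangent space (skew-symmetric matrices) and trial points are obtained through the matrix exponential, so every iterate stays on the manifold by construction. The routine below is an illustrative compass-style search under those assumptions, not the paper's generalized LTMADS:

```python
import numpy as np
from scipy.linalg import expm

def poll_directions_so_n(n):
    """Tangent-space 'coordinate' directions for SO(n): the elementary
    skew-symmetric matrices E_ij - E_ji and their negatives."""
    dirs = []
    for i in range(n):
        for j in range(i + 1, n):
            D = np.zeros((n, n))
            D[i, j], D[j, i] = 1.0, -1.0
            dirs.append(D)
            dirs.append(-D)
    return dirs

def direct_search_on_SO_n(f, Q0, step=0.5, min_step=1e-4):
    """Poll-based direct search on SO(n) (sketch): move along geodesics
    Q * expm(step * D), double the step on success, halve it on failure."""
    Q, fQ = Q0, f(Q0)
    dirs = poll_directions_so_n(Q0.shape[0])
    while step > min_step:
        for D in dirs:
            trial = Q @ expm(step * D)   # remains orthogonal with det = +1
            ft = f(trial)
            if ft < fQ:
                Q, fQ = trial, ft
                step *= 2.0
                break
        else:
            step *= 0.5
    return Q, fQ
```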
Radial basis function algorithms for large-scale nonlinearly constrained black-box optimization
Presented at the 20th International Symposium on Mathematical Programming (ISMP)
"... Abstract. This paper presents a new algorithm for derivative-free optimization of expensive black-box objective functions subject to expensive black-box inequality constraints. The proposed algorithm, called ConstrLMSRBF, uses radial basis function (RBF) surrogate models and is an extension of the L ..."
Cited by 6 (1 self)
Abstract:
This paper presents a new algorithm for derivative-free optimization of expensive black-box objective functions subject to expensive black-box inequality constraints. The proposed algorithm, called ConstrLMSRBF, uses radial basis function (RBF) surrogate models and is an extension of the Local Metric Stochastic RBF (LMSRBF) algorithm by Regis and Shoemaker (2007a) that can handle black-box inequality constraints. Previous algorithms for the optimization of expensive functions using surrogate models have mostly dealt with bound-constrained problems where only the objective function is expensive, and so the surrogate models are used to approximate only the objective function. In contrast, ConstrLMSRBF builds RBF surrogate models for the objective function and also for all the constraint functions in each iteration, and uses these RBF models to guide the selection of the next point where the objective and constraint functions will be evaluated. Computational results indicate that ConstrLMSRBF is better than alternative methods on 9 out of 14 test problems and on the MOPTA08 problem from the automotive industry (Jones 2008). The MOPTA08 problem has 124 decision variables and 68 inequality constraints and is considered a large-scale problem in the area of expensive black-box optimization. The alternative methods include a Mesh Adaptive Direct Search (MADS) algorithm (Abramson and Audet 2006, Audet and Dennis 2006) that uses a kriging-based surrogate model, the Multistart LMSRBF algorithm by Regis and Shoemaker
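A minimal sketch of the surrogate-guided selection idea described above: fit RBF models to the objective and to every constraint from the points evaluated so far, then pick the candidate the surrogates predict to be feasible with the best predicted objective. The candidate-generation scheme, parameter names, and fall-back rule are assumptions of this sketch, not the published ConstrLMSRBF rules:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def select_next_point(X, fvals, gvals, lb, ub, n_cand=1000, rng=None):
    """One surrogate-assisted step (sketch). X: evaluated points (P, N);
    fvals: objective values (P,); gvals: constraint values (P, m) with
    feasibility meaning g_j(x) <= 0; lb, ub: bound vectors (N,)."""
    rng = np.random.default_rng(rng)
    f_surr = RBFInterpolator(X, fvals)
    g_surr = [RBFInterpolator(X, gvals[:, j]) for j in range(gvals.shape[1])]

    # Center the candidates on the best feasible evaluated point
    # (assumes at least one evaluated point is feasible).
    feasible = np.all(gvals <= 0, axis=1)
    center = X[np.argmin(np.where(feasible, fvals, np.inf))]

    cand = center + 0.1 * (ub - lb) * rng.standard_normal((n_cand, X.shape[1]))
    cand = np.clip(cand, lb, ub)

    # Keep candidates predicted feasible by the constraint surrogates.
    pred_feas = np.all(np.column_stack([g(cand) for g in g_surr]) <= 0, axis=1)
    if not pred_feas.any():          # fall back to all candidates if none pass
        pred_feas[:] = True
    cand = cand[pred_feas]
    return cand[np.argmin(f_surr(cand))]
```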
Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm
, 2007
"... This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search for constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS ..."
Cited by 4 (3 self)
Abstract:
This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search to constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS) is an asynchronous parallel algorithm in which the processes solve problems over subsets of variables. The convergence analysis based on the Clarke calculus is essentially the same as for the MADS algorithm. A practical implementation is described, and some numerical results on problems with up to 500 variables illustrate the advantages and limitations of PSD-MADS.
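To make the decomposition idea concrete, here is a simplified sequential sketch in which each subproblem optimizes a small random subset of variables with the rest frozen; the actual PSD-MADS runs such subproblems asynchronously in parallel with MADS as the subsolver, which this sketch does not reproduce:

```python
import numpy as np

def compass_search_subset(f, x, idx, step, min_step):
    """Compass search restricted to the coordinates listed in idx
    (all other coordinates stay fixed)."""
    fx = f(x)
    while step > min_step:
        improved = False
        for i in idx:
            for sgn in (+1.0, -1.0):
                y = x.copy()
                y[i] += sgn * step
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        step = step * 2.0 if improved else step / 2.0
    return x, fx

def space_decomposition_search(f, x0, subset_size=5, n_rounds=50, rng=None):
    """Sequential sketch of the space-decomposition idea: each round picks a
    random subset of variables and optimizes only those, keeping the rest
    frozen at their current values."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_rounds):
        idx = rng.choice(len(x), size=min(subset_size, len(x)), replace=False)
        x, _ = compass_search_subset(f, x, idx, step=1.0, min_step=1e-3)
    return x, f(x)
```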
Spent Potliner Treatment Process Optimization Using a MADS Algorithm
, 2005
"... Les textes publiés dans la série des rapports de recherche HEC n’engagent que la responsabilité de leurs auteurs. La publication de ces rapports de recherche bénéficie d’une subvention du Fonds québécois de la recherche sur la nature et les technologies. ..."
Cited by 3 (2 self)
The texts published in the HEC research report series are the sole responsibility of their authors. The publication of these research reports is supported by a grant from the Fonds québécois de la recherche sur la nature et les technologies.
LM-CMA: an Alternative to L-BFGS for Large Scale Black-box Optimization
"... The limited memory BFGS method (L-BFGS) of Liu and Nocedal (1989) is often con-sidered to be the method of choice for continuous optimization when first- and/or second- order information is available. However, the use of L-BFGS can be compli-cated in a black-box scenario where gradient information i ..."
Cited by 1 (0 self)
Abstract:
The limited memory BFGS method (L-BFGS) of Liu and Nocedal (1989) is often considered to be the method of choice for continuous optimization when first- and/or second-order information is available. However, the use of L-BFGS can be complicated in a black-box scenario where gradient information is not available and therefore must be numerically estimated. The accuracy of this estimation, obtained by finite difference methods, is often problem-dependent, which may lead to premature convergence of the algorithm. In this paper, we demonstrate an alternative to L-BFGS, the limited memory Covariance Matrix Adaptation Evolution Strategy (LM-CMA) proposed by Loshchilov (2014). LM-CMA is a stochastic derivative-free algorithm for the numerical optimization of nonlinear, nonconvex problems. Inspired by L-BFGS, LM-CMA samples candidate solutions according to a covariance matrix reproduced from m direction vectors selected during the optimization process. The decomposition of the covariance matrix into Cholesky factors allows the memory complexity to be reduced to O(mn), where n is the number of decision variables. The time complexity of sampling one candidate solution is also O(mn), but in practice scales as only about 25 scalar-vector multiplications. The algorithm has the important property of invariance w.r.t. strictly increasing transformations of the objective function: such transformations do not compromise its ability to approach the optimum. LM-CMA outperforms the original CMA-ES and its large-scale versions on nonseparable ill-conditioned problems by a factor that increases with problem dimension. The invariance properties of the algorithm do not prevent it from demonstrating performance comparable to L-BFGS on nontrivial large-scale smooth and nonsmooth optimization problems.
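As a rough illustration of how m direction vectors can stand in for a full covariance matrix at O(mn) cost, here is a generic low-rank-plus-identity Gaussian sampler; it is not the paper's Cholesky-factor reconstruction, and all names and parameters below are assumptions of this sketch:

```python
import numpy as np

def sample_low_rank_gaussian(mean, sigma, V, c, a=1.0, rng=None):
    """Draw one candidate from N(mean, sigma^2 * C) where the covariance is
    represented implicitly as C ~= a*I + sum_i c_i v_i v_i^T by the m direction
    vectors stored as rows of V, so the n x n matrix is never formed."""
    rng = np.random.default_rng(rng)
    n = mean.shape[0]
    z = np.sqrt(a) * rng.standard_normal(n)          # isotropic part
    eps = rng.standard_normal(V.shape[0])            # one scalar per direction
    step = z + (np.sqrt(c) * eps) @ V                # add the rank-m part
    return mean + sigma * step

# Usage sketch: m = 4 stored directions in dimension n = 1000.
if __name__ == "__main__":
    n, m = 1000, 4
    rng = np.random.default_rng(0)
    V = rng.standard_normal((m, n))
    V /= np.linalg.norm(V, axis=1, keepdims=True)    # unit direction vectors
    x = sample_low_rank_gaussian(np.zeros(n), sigma=0.3, V=V, c=np.full(m, 0.5))
    print(x.shape)
```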
Nonasymptotic densities for shape reconstruction. Abstract and Applied Analysis
, 2014
"... In this work we study the problem of reconstructing shapes from simple nonasymptotic densities measured only along shape bound-aries. The particular density we study is also known as the integral area invariant and corresponds to the area of a disk centered on the boundary that is also inside the sh ..."
Cited by 1 (1 self)
Abstract:
In this work we study the problem of reconstructing shapes from simple nonasymptotic densities measured only along shape boundaries. The particular density we study is also known as the integral area invariant and corresponds to the area of a disk centered on the boundary that is also inside the shape. It is easy to show uniqueness when these densities are known for all radii in a neighborhood of r = 0, but much less straightforward when we assume we know it for (almost) only one r > 0. We present variations of uniqueness results for reconstruction of polygons and (a dense set of) smooth curves under certain regularity conditions.
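To make the quantity concrete, here is a small Monte Carlo sketch of the integral area invariant for a shape given by an indicator function; the `inside` predicate and the sampling scheme are assumptions of this sketch, not the paper's construction:

```python
import numpy as np

def integral_area_invariant(inside, p, r, n_samples=20000, rng=None):
    """Estimate the area of the disk of radius r centered at boundary point p
    that lies inside the shape. `inside` takes an (k, 2) array of points and
    returns a boolean mask."""
    rng = np.random.default_rng(rng)
    # Sample uniformly in the disk of radius r around p.
    theta = rng.uniform(0.0, 2.0 * np.pi, n_samples)
    rad = r * np.sqrt(rng.uniform(0.0, 1.0, n_samples))
    pts = p + np.column_stack((rad * np.cos(theta), rad * np.sin(theta)))
    frac = np.mean(inside(pts))
    return frac * np.pi * r**2

# Example: unit square, boundary point on an edge -> invariant ~ pi r^2 / 2.
if __name__ == "__main__":
    square = lambda q: (q[:, 0] >= 0) & (q[:, 0] <= 1) & (q[:, 1] >= 0) & (q[:, 1] <= 1)
    print(integral_area_invariant(square, p=np.array([0.5, 0.0]), r=0.1))
```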
Mesh adaptive direct search with second directional derivative-based Hessian update
DOI 10.1007/s10589-015-9753-5
, 2014
"... Abstract The subject of this paper is inequality constrained black-box optimization with mesh adaptive direct search (MADS). The MADS search step can include addi-tional strategies for accelerating the convergence and improving the accuracy of the solution. The strategy proposed in this paper involv ..."
Abstract:
The subject of this paper is inequality constrained black-box optimization with mesh adaptive direct search (MADS). The MADS search step can include additional strategies for accelerating the convergence and improving the accuracy of the solution. The strategy proposed in this paper involves building a quadratic model of the function and linear models of the constraints. The quadratic model is built by means of a second directional derivative-based Hessian update. The linear terms are obtained by linear regression. The resulting quadratic programming (QP) problem is solved with a dedicated solver and the original functions are evaluated at the QP solution. The proposed search strategy is computationally less expensive than the quadratically constrained QP strategy in the state-of-the-art MADS implementation (NOMAD). The proposed MADS variant (QPMADS) and NOMAD are compared on four sets of test problems. QPMADS outperforms NOMAD on all four of them for all but the smallest computational budgets.
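A minimal sketch of a second directional derivative-based Hessian update in the spirit described above: estimate the curvature along a poll direction from three function values and apply a rank-one correction so the model Hessian reproduces it. The particular correction formula is an illustrative choice, not necessarily the exact QPMADS update:

```python
import numpy as np

def directional_curvature(f, x, d):
    """Second directional derivative estimate along d from three function
    values (central difference); in a MADS-like setting f(x +/- d) would
    typically be poll points that are already evaluated."""
    h = np.linalg.norm(d)
    return (f(x + d) - 2.0 * f(x) + f(x - d)) / h**2

def hessian_update(H, f, x, d):
    """Rank-one correction so the model Hessian's curvature along the unit
    direction u = d/|d| matches the measured second directional derivative."""
    u = d / np.linalg.norm(d)
    s = directional_curvature(f, x, d)
    return H + (s - u @ H @ u) * np.outer(u, u)

# Usage sketch on f(x) = x0^2 + 10*x1^2, starting from H = I.
if __name__ == "__main__":
    f = lambda x: x[0]**2 + 10.0 * x[1]**2
    H = np.eye(2)
    x = np.array([1.0, 1.0])
    for d in (np.array([0.5, 0.0]), np.array([0.0, 0.5])):
        H = hessian_update(H, f, x, d)
    print(H)   # approaches diag(2, 20), the true Hessian
```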