Optimization of the nested Monte-Carlo algorithm on the traveling salesman problem with time windows
 In Applications of Evolutionary Computation
, 2011
Cited by 8 (2 self)
Abstract. The traveling salesman problem with time windows is known to be a very difficult benchmark for optimization algorithms. In this paper, we are interested in the minimization of the travel cost. To solve this problem, we propose to use the nested Monte-Carlo algorithm combined with a Self-Adaptation Evolution Strategy. We compare the efficiency of several fitness functions. We show that with our technique we can reach state-of-the-art solutions for many problems in a short time.
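As a sketch of the nested Monte-Carlo idea on a plain TSP (time windows omitted for brevity; all function and variable names here are illustrative, not the paper's): a level-0 search completes the partial tour at random, and a level-n search tries each candidate move with a level-(n-1) search and commits the move whose sub-search scored best.

```python
import random

def tour_length(cities, tour):
    """Total cycle length of a complete tour over Euclidean city coordinates."""
    return sum(((cities[tour[i]][0] - cities[tour[(i + 1) % len(tour)]][0]) ** 2 +
                (cities[tour[i]][1] - cities[tour[(i + 1) % len(tour)]][1]) ** 2) ** 0.5
               for i in range(len(tour)))

def playout(cities, partial):
    """Level 0: complete the partial tour with uniformly random moves."""
    remaining = [c for c in range(len(cities)) if c not in partial]
    random.shuffle(remaining)
    return partial + remaining

def nested(cities, partial, level):
    """Level n: try each candidate city with a level n-1 search, keep the best."""
    if len(partial) == len(cities):
        return list(partial)
    if level == 0:
        return playout(cities, partial)
    best = None
    while len(partial) < len(cities):
        candidates = [c for c in range(len(cities)) if c not in partial]
        move_results = [(c, nested(cities, partial + [c], level - 1)) for c in candidates]
        move, result = min(move_results, key=lambda mr: tour_length(cities, mr[1]))
        if best is None or tour_length(cities, result) < tour_length(cities, best):
            best = result
        partial = partial + [move]  # commit the move whose sub-search scored best
    return best

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(8)]
tour = nested(cities, [0], level=2)
print(sorted(tour) == list(range(8)))  # a valid permutation visiting every city
```

Higher levels trade exponentially more playouts for better tours; the combination with a Self-Adaptation Evolution Strategy described in the abstract is beyond this sketch.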
On the parallel speedup of Estimation of Multivariate Normal Algorithm and Evolution Strategies
 EVONUM (EVOSTAR WORKSHOP)
, 2009
Cited by 6 (4 self)
Motivated by parallel optimization, we experiment with EDA-like adaptation rules in the case of large λ. The rule we use, essentially based on the estimation of multivariate normal algorithm, is (i) compliant with all families of distributions for which a density estimation algorithm exists, (ii) simple, (iii) parameter-free, and (iv) better than current rules in this large-λ framework. The speedup as a function of λ is consistent with theoretical bounds.
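A minimal EMNA-style iteration of the kind described (a hedged sketch, not the paper's exact rule): sample λ offspring from a multivariate normal, keep the best µ, and refit the mean and covariance to them by maximum likelihood. Plain EMNA is known to shrink variance aggressively, so the sketch only demonstrates that selection moves the mean toward the optimum of a sphere objective.

```python
import numpy as np

def emna_step(mean, cov, objective, lam, mu, rng):
    """Sample lam points, keep the best mu, refit a Gaussian to them."""
    pop = rng.multivariate_normal(mean, cov, size=lam)
    order = np.argsort([objective(x) for x in pop])
    elite = pop[order[:mu]]
    return elite.mean(axis=0), np.cov(elite, rowvar=False)

rng = np.random.default_rng(0)
sphere = lambda x: float(np.dot(x, x))
start = np.ones(3) * 5.0
mean, cov = start.copy(), np.eye(3)
for _ in range(60):
    mean, cov = emna_step(mean, cov, sphere, lam=200, mu=50, rng=rng)
print(sphere(mean) < sphere(start))  # selection moved the mean toward the optimum
```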
On the huge benefit of quasi-random mutations for multimodal optimization with application to grid-based tuning of neurocontrollers
 ESANN (2009)
, 2009
On the Design of Constraint Covariance Matrix Self-Adaptation Evolution Strategies Including a Cardinality Constraint
 IEEE Transactions on Evolutionary Computation
Cited by 1 (0 self)
Abstract. This paper describes the algorithm engineering of a covariance matrix self-adaptation Evolution Strategy (ES) for solving a mixed linear/nonlinear constrained optimization problem arising in portfolio optimization. While the feasible solution space is defined by the (probabilistic) simplex, the nonlinearity comes in via a cardinality constraint bounding the number of violated linear inequalities. This gives rise to a non-convex optimization problem. The design is based on the CMSA-ES and relies on three specific techniques to fulfill the different constraints. The resulting algorithm is then thoroughly tested on a data set derived from time series data of the Dow Jones Index.
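One standard way to keep offspring inside a simplex-shaped feasible region is Euclidean projection onto the probability simplex (the paper relies on its own three constraint-handling techniques; this is only an illustrative repair operator, following the sort-based algorithm of Duchi et al., 2008).

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {w : w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]                 # sort coordinates in decreasing order
    css = np.cumsum(u)
    # largest index rho with u[rho] + (1 - css[rho]) / (rho + 1) > 0
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)  # shift that renormalizes the sum to 1
    return np.maximum(v - theta, 0.0)

w = project_simplex(np.array([0.8, 0.6, -0.3]))
print(np.isclose(w.sum(), 1.0), (w >= 0).all())
```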
Derivative-Free Optimization
Abstract. In many engineering applications it is common to find optimization problems where the cost function and/or constraints require complex simulations. Though it is often, but not always, theoretically possible in these cases to extract derivative information efficiently, the associated implementation procedures are typically non-trivial and time-consuming (e.g., adjoint-based methodologies). Derivative-free (non-invasive, black-box) optimization has lately received considerable attention within the optimization community, including the establishment of solid mathematical foundations for many of the methods considered in practice. In this chapter we describe some of the most prominent derivative-free optimization techniques. We concentrate first on local optimization, such as pattern search techniques and other methods based on interpolation/approximation. Then we survey a number of global search methodologies, and finally give guidelines on constraint handling approaches.
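A minimal compass search, one of the local pattern search techniques surveyed here: poll the 2n axis directions, move on the first improvement, and halve the step when the poll fails (names and constants are illustrative choices).

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10_000):
    """Derivative-free minimization of f by polling the +/- coordinate directions."""
    x = list(x)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for s in (+step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:          # accept the first improving poll point
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # refine the mesh when the poll fails
    return x, fx

quad = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
xbest, fbest = compass_search(quad, [5.0, 5.0])
print(fbest < 1e-8)
```

Convergence theory for such methods (under smoothness, to stationary points) is part of the "solid mathematical foundations" the abstract mentions.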
Comparison-Based Complexity of Multiobjective Optimization
, 2013
Several comparison-based complexity results have been published recently, including for multiobjective optimization. However, these results are, in the multiobjective case, quite pessimistic, due to the huge family of fitness functions considered. Combining assumptions on fitness functions with traditional comparison-based assumptions, we get more realistic bounds emphasizing the importance of reducing the number of conflicting objectives for reducing the runtime of multiobjective optimization. The approach can in particular predict lower bounds on the computation time, depending on the type of convergence requested: pointwise, or to the whole Pareto set. Also, a new (as yet untested) algorithm is proposed for approximating the whole Pareto set. Accepted at GECCO 2011.
Adaptive Noisy Optimization
 Author manuscript, published in "EvoStar 2010 (2010)"
, 2010
Abstract. In this paper, adaptive noisy optimization on variants of the noisy sphere model is considered, i.e. optimization in which the same algorithm is able to adapt to several frameworks, including some for which no bound has ever been derived. Incidentally, bounds derived by [16] for noise quickly decreasing to zero around the optimum are extended to the more general case of a positively lower-bounded noise, thanks to a careful use of Bernstein bounds (using empirical estimates of the variance) instead of Chernoff-like variants.
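The empirical Bernstein bound in question (Audibert, Munos and Szepesvári, 2007) replaces the true variance in Bernstein's inequality by its empirical estimate: for n i.i.d. samples in [0, c], with probability at least 1 − δ, the deviation of the empirical mean is at most sqrt(2·var̂·ln(3/δ)/n) + 3c·ln(3/δ)/n. A sketch of why this beats Chernoff–Hoeffding when the observed noise is small:

```python
import math

def empirical_bernstein_radius(samples, c, delta):
    """Half-width of the empirical-Bernstein confidence interval around the mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n   # empirical variance
    log_term = math.log(3.0 / delta)
    return math.sqrt(2.0 * var * log_term / n) + 3.0 * c * log_term / n

# For nearly noise-free samples the variance term vanishes and the radius
# scales like 1/n rather than the Hoeffding-style 1/sqrt(n):
tight = empirical_bernstein_radius([0.5] * 1000, c=1.0, delta=0.05)
loose = empirical_bernstein_radius([0.0, 1.0] * 500, c=1.0, delta=0.05)
print(tight < loose)
```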
Log(λ) Modifications for Optimal Parallelism
 Author manuscript, published in "Parallel Problem Solving From Nature (2010)"
, 2010
Abstract. It is usually considered that evolutionary algorithms are highly parallel. In fact, the theoretical speedups for parallel optimization are far better than empirical results; this suggests that evolutionary algorithms, for large numbers of processors, are not so efficient. In this paper, we show that in many cases automatic parallelization provably provides better results than the standard parallelization consisting of simply increasing the population size λ. A corollary of these results is that logarithmic bounds on the speedup (as a function of the number of computing units) are tight within constant factors. Importantly, we propose a simple modification, termed log(λ)-correction, which strongly improves several important algorithms when λ is large.
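A toy 1-D experiment illustrating why the speedup in λ is only logarithmic (this is not the paper's setting or its log(λ)-correction, just the underlying order-statistics effect): per generation, a scale-invariant (1, λ)-ES gains the minimum of λ perturbed distances, and the expected log-gain of that minimum grows roughly like log(λ).

```python
import math, random

def progress_rate(lam, trials=2000, rng=random.Random(0)):
    """Average per-generation log-distance gain of a scale-invariant (1, lam)-ES
    minimizing f(x) = |x| with offspring x + sigma*N(0,1), sigma = |x|."""
    total = 0.0
    for _ in range(trials):
        x = 1.0  # by scale invariance, the current distance is normalized to 1
        best = min(abs(x + x * rng.gauss(0.0, 1.0)) for _ in range(lam))
        total += -math.log(best)
    return total / trials

rates = [progress_rate(lam) for lam in (2, 8, 32, 128)]
print(rates == sorted(rates))    # more offspring always help on average...
print(rates[3] < 4 * rates[1])   # ...but 16x more offspring give well under 4x the rate
```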
Log-linear Convergence of the Scale-invariant (µ/µw, λ)-ES and Optimal µ for Intermediate Recombination for Large Population Sizes
 Author manuscript, published in "Parallel Problem Solving From Nature (PPSN2010) (2010) xxxxxxx"
, 2010
Abstract. Evolution Strategies (ESs) are population-based methods well suited for parallelization. In this paper, we study the convergence of the (µ/µw, λ)-ES, an ES with weighted recombination, and derive its optimal convergence rate and optimal µ, especially for large population sizes. First, we theoretically prove the log-linear convergence of the algorithm using a scale-invariant adaptation rule for the step-size when minimizing spherical objective functions, and identify its convergence rate as the expectation of an underlying random variable. Then, using Monte-Carlo computations of the convergence rate in the case of equal weights, we derive optimal values for µ that we compare with previously proposed rules. Our numerical computations also show a dependency of the optimal convergence rate on ln(λ), in agreement with previous theoretical results.
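A toy scale-invariant (µ/µ, λ)-ES with equal weights on the sphere function, mirroring the setting analyzed here (the constant c and the σ rule are illustrative choices, not the paper's): keeping σ proportional to the distance to the optimum makes log ||x|| decrease linearly over the generations.

```python
import numpy as np

def es_run(dim=10, lam=40, mu=10, c=0.6, iters=150, seed=1):
    """Distance-to-optimum history of a scale-invariant (mu/mu, lam)-ES on the sphere."""
    rng = np.random.default_rng(seed)
    x = np.ones(dim)
    history = []
    for _ in range(iters):
        sigma = c * np.linalg.norm(x) / dim       # scale-invariant step-size rule
        pop = x + sigma * rng.standard_normal((lam, dim))
        order = np.argsort(np.sum(pop ** 2, axis=1))
        x = pop[order[:mu]].mean(axis=0)          # equal-weight recombination
        history.append(np.linalg.norm(x))
    return history

h = es_run()
print(h[-1] < 0.1 * h[0])  # distance shrinks geometrically, i.e. log-linearly
```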