
## Natural Evolution Strategies



Citations: 40 (22 self)

### Citations

5028 | Optimization by simulated annealing
- Kirkpatrick, Gelatt, et al.
- 1983
Citation Context: ...A variety of algorithms has been developed within this framework, including methods such as Simulated Annealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] a...
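Of the methods the excerpt enumerates, simulated annealing [5] is the simplest to sketch. A minimal continuous-domain version for illustration only; the function names, step size, and geometric cooling schedule are assumptions, not the cited paper's choices:

```python
import math
import random

def simulated_annealing(f, x0, steps=20000, temp0=1.0, cooling=0.999, seed=0):
    """Minimize f by accepting worse moves with probability exp(-delta/T),
    where the temperature T decays geometrically toward greedy search."""
    rng = random.Random(seed)
    x, fx, T = x0, f(x0), temp0
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.1)          # local perturbation
        fx_new = f(x_new)
        delta = fx_new - fx
        if delta < 0 or rng.random() < math.exp(-delta / T):
            x, fx = x_new, fx_new                # accept the move
        T *= cooling                             # cool down
    return x, fx

# Minimizing f(x) = (x - 2)^2 starting far from the optimum:
x_best, f_best = simulated_annealing(lambda x: (x - 2) ** 2, -5.0)
```

Early on, the high temperature lets the search escape poor regions; as T decays the acceptance rule degenerates into plain hill climbing.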

2461 |
A simpler method for function minimization
- Nelder, Mead
- 1965
Citation Context: ...The problem of black-box optimization has spawned a wide variety of approaches. A first class of methods was inspired by classic optimization methods, including simplex methods such as Nelder-Mead (Nelder and Mead, 1965), as well as members of the quasi-Newton family of algorithms. Simulated annealing (Kirkpatrick et al., 1983), a popular method introduced in 1983, was inspired by thermodynamics, and is in fact an a...

2144 |
On information and sufficiency
- Kullback, Leibler
- 1951
Citation Context: ...‘natural’ measure of distance D(θ′||θ) between probability distributions π(z|θ) and π(z|θ′). One such natural distance measure between two probability distributions is the Kullback-Leibler divergence (Kullback and Leibler, 1951). The natural gradient can then be formalized as the solution to the constrained optimization problem max_δθ J(θ + δθ) ≈ J(θ) + δθᵀ∇θJ, s.t. D(θ + δθ||θ) = ε, (5) where J(θ) is the expected fitne...
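The excerpt formalizes the natural gradient as the vanilla gradient preconditioned by the inverse Fisher information matrix F. A minimal sketch for a one-dimensional Gaussian search distribution N(μ, σ²), whose Fisher matrix is diag(1/σ², 2/σ²); all names and the step size here are illustrative assumptions, not the cited work's algorithm:

```python
import numpy as np

def natural_gradient_step(f, mu, sigma, n_samples=1000, eta=0.1, seed=0):
    """One natural-gradient ascent step on the expected fitness of a 1-D
    Gaussian search distribution: estimate the vanilla gradient by Monte
    Carlo, then solve F * delta = grad to precondition it."""
    rng = np.random.default_rng(seed)
    z = rng.normal(mu, sigma, n_samples)              # draw search points
    fit = f(z)                                        # evaluate fitness
    # log-likelihood derivatives of N(mu, sigma^2) w.r.t. (mu, sigma)
    d_mu = (z - mu) / sigma**2
    d_sigma = ((z - mu) ** 2 - sigma**2) / sigma**3
    grad = np.array([np.mean(fit * d_mu), np.mean(fit * d_sigma)])
    # Fisher information of a 1-D Gaussian: diag(1/sigma^2, 2/sigma^2)
    F = np.diag([1.0 / sigma**2, 2.0 / sigma**2])
    delta = eta * np.linalg.solve(F, grad)            # natural-gradient step
    return mu + delta[0], sigma + delta[1]

# Maximizing f(z) = -(z - 3)^2: mu should move toward the optimum at 3.
mu1, sigma1 = natural_gradient_step(lambda z: -(z - 3) ** 2, 0.0, 1.0)
```

Note the preconditioning makes the step size independent of the current scale σ, which is exactly the isotropy property the surrounding citations emphasize.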

914 | A direct adaptive method for faster backpropagation learning: The RPROP algorithm
- Riedmiller, Braun
- 1993
Citation Context: ...earning rate in order to more quickly converge to it. In order to produce variations η′ which can be judged using the above-mentioned U-test, we propose a procedure similar in spirit to Rprop-updates (Riedmiller and Braun, 1993; Igel and Hüsken, 2003), where the learning rates are either increased or decreased by a multiplicative constant whenever there is evidence that such a change will lead to better samples. More concr...
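The Rprop-style scheme the excerpt describes fits in a few lines. The factor values, bounds, and names below are illustrative assumptions, not the constants from Riedmiller and Braun (1993) or the NES papers:

```python
def adapt_rate(eta, improved, up=1.2, down=0.5, eta_min=1e-6, eta_max=1.0):
    """Rprop-style learning-rate update: grow the rate by a multiplicative
    constant while changes keep producing better samples, shrink it when
    they do not, and clamp the result to a sane range."""
    eta = eta * up if improved else eta * down
    return min(max(eta, eta_min), eta_max)

# e.g. adapt_rate(0.1, True) grows the rate to 0.12,
#      adapt_rate(0.1, False) shrinks it to 0.05.
```

Only the sign of the evidence matters, not its magnitude, which is what makes this family of updates robust to noisy fitness evaluations.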

811 |
Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Frommann-Holzboog
- Rechenberg
- 1994
Citation Context: ...nnealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) instantiation [11] are of great interest to us. Evolution Strategies, so named because of their inspiration from natural Darwinian evolution, ge...

773 |
Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces
- Storn, Price
- 1997
Citation Context: ...methods, such as those inspired by evolution, have been developed from the early 1950s on. These include the broad class of genetic algorithms (Holland, 1975; Goldberg, 1989), differential evolution (Storn and Price, 1997), estimation of distribution algorithms (Larrañaga, 2002; Pelikan et al., 2000; Bosman and Thierens, 2000; Bosman et al., 2007; Pelikan et al., 2006), particle swarm optimization (Kennedy and Eberha...

528 | Completely derandomized self-adaptation in evolutionary strategies
- Hansen, Ostermeier
Citation Context: ...Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) instantiation [11] are of great interest to us. Evolution Strategies, so named because of their inspiration from natural Darwinian evolution, generally produce consecutive generations of samples. During each generation...

440 | Simple statistical gradient-following algorithms for connectionist reinforcement learning
- Williams
- 1992
Citation Context: ...itute a well-principled approach to real-valued black box function optimization with a relatively clean derivation from first principles. Its theoretical relationship to the field of Policy Gradients [13, 14], and in particular Natural Actor-Critic [15], should be clear to any reader familiar with both fields. The experiments however show that, on most benchmarks, NES is still roughly 5 times slower in pe...

429 | Natural gradient works efficiently in learning
- Amari
- 1998
Citation Context: ...s a (1, λ)-Evolution Strategy with 1 candidate solution per generation and λ samples or ‘children’), adapts both a mutation matrix and the parent individual using a natural gradient based update step [12]. Every generation, a gradient towards better expected fitness is estimated using a Monte Carlo approximation. This gradient is then used to update both the parent individual’s parameters and the muta...

428 |
Swarm Intelligence
- Kennedy, Eberhart
- 2001
Citation Context: ...s has been developed within this framework, including methods such as Simulated Annealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) instantiation [11] are of great interest to us. Evolutio...

361 | Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation - Larrañaga, Lozano - 2001 |

335 | A survey of optimization by building and using probabilistic models
- Pelikan, Goldberg, et al.
Citation Context: ...t performance (with the help of well-chosen heuristic methods). The core idea, similar to the framework of estimation of distribution algorithms (EDAs) (Mühlenbein and Paass, 1996; Larrañaga, 2002; Pelikan et al., 2000) and many evolution strategies approaches (e.g., Ostermeier et al. 1994), is to maintain and iteratively update a search distribution from which search points are drawn and subsequently evaluated. Ho...

301 | From recombination of genes to the estimation of distributions i. binary parameters - Mühlenbein, Paaß - 1996 |

261 |
Evolution Strategies – A Comprehensive Introduction
- Beyer, Schwefel
- 2002
Citation Context: ...nnealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) instantiation [11] are of great interest to us. Evolution Strategies, so named because of their inspiration from natural Darwinian evolution, ge...

232 |
Theory of Evolution Strategies
- Beyer
- 2001
Citation Context: ...nnealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) instantiation [11] are of great interest to us. Evolution Strategies, so named because of their inspiration from natural Darwinian evolution, ge...

229 |
The Cross-Entropy Method: A Unified Approach to
- Rubinstein, Kroese
- 2004
Citation Context: ...(Larrañaga, 2002; Pelikan et al., 2000; Bosman and Thierens, 2000; Bosman et al., 2007; Pelikan et al., 2006), particle swarm optimization (Kennedy and Eberhart, 2001), and the cross-entropy method (Rubinstein and Kroese, 2004). Evolution strategies (ES), introduced by Ingo Rechenberg and Hans-Paul Schwefel in the 1960s and 1970s (Rechenberg and Eigen, 1973; Schwefel, 1977), were designed to cope with high-dimensional cont...

219 |
Numerische Optimierung von Computer-Modellen mittels der Evolutionsstrategie (“Numeric Optimization of Computer Models by Means of an Evolution Strategy”)
- Schwefel
- 1977
Citation Context: ...2001), and the cross-entropy method (Rubinstein and Kroese, 2004). Evolution strategies (ES), introduced by Ingo Rechenberg and Hans-Paul Schwefel in the 1960s and 1970s (Rechenberg and Eigen, 1973; Schwefel, 1977), were designed to cope with high-dimensional continuous-valued domains and have remained an active field of research for more than four decades (Beyer and Schwefel, 2002). ESs involve evaluating the...

192 | Real-parameter black-box optimization benchmarking 2009: Noisy functions definitions
- Hansen, Finck, et al.
Citation Context: ...noise-free functions (12 unimodal, 12 multimodal; Hansen et al., 2010a) and 30 noisy functions (Hansen et al., 2010b). In order to make our results fully comparable, we also use the identical setup (Hansen and Auger, 2010), which transforms the pure benchmark functions to make the parameters non-separable (for some) and avoid trivial optima at the origin. The framework permits restarts until the budget of function eva...

159 | Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization
- Suganthan, Hansen, et al.
- 2005
Citation Context: ...enchmark functions with dimensionality 15, averaged over 10 runs. VI. EXPERIMENTS To test the performance of the algorithm, we chose a standard set of unimodal and multimodal benchmark functions from [13] and [11] that are often used in the literature. Good fitness functions should be easy to interpret, but do scale up with n. They must be highly nonlinear, non-separable, largely resistant to hill-cli...

145 | Policy gradient reinforcement learning for fast quadrupedal locomotion
- Kohl, Stone
- 2004
Citation Context: ...to illustrate the importance and prevalence of this general problem setup, one could point to a diverse set of tasks such as the classic nozzle shape design problem [2], developing an Aibo robot gait [3] or non-Markovian neurocontrol [4]. Now, since exhaustively searching the entire space of solution parameters is considered infeasible, and since we do not assume a precise model of our fitness functi...

125 | Solving non-markovian control tasks with neuroevolution
- Gomez, Miikkulainen
- 1999
Citation Context: ...revalence of this general problem setup, one could point to a diverse set of tasks such as the classic nozzle shape design problem [2], developing an Aibo robot gait [3] or non-Markovian neurocontrol [4]. Now, since exhaustively searching the entire space of solution parameters is considered infeasible, and since we do not assume a precise model of our fitness function, we are forced to settle for tr...

116 | Policy gradient methods for robotics
- Peters, Schaal
- 2006
Citation Context: ...itute a well-principled approach to real-valued black box function optimization with a relatively clean derivation from first principles. Its theoretical relationship to the field of Policy Gradients [13, 14], and in particular Natural Actor-Critic [15], should be clear to any reader familiar with both fields. The experiments however show that, on most benchmarks, NES is still roughly 5 times slower in pe...

89 | Natural actor-critic
- Peters, Vijayakumar, et al.
- 2005
Citation Context: ...lack box function optimization with a relatively clean derivation from first principles. Its theoretical relationship to the field of Policy Gradients [13, 14], and in particular Natural Actor-Critic [15], should be clear to any reader familiar with both fields. The experiments however show that, on most benchmarks, NES is still roughly 5 times slower in performance than CMA. This contrasts with the r...

82 |
Evolving neural network controllers for unstable systems
- Wieland
- 1991
Citation Context: ...no velocity information, which makes this task partially observable. It provides a perfect testbed for algorithms focusing on learning fine control with memory in continuous state and action spaces (Wieland, 1991). The controller is represented by a simple recurrent neural network, with three inputs (position x and the two poles’ angles β1 and β2), and a variable number n of tanh units in the output layer, w...

78 | Empirical evaluation of the improved Rprop learning algorithms. Neurocomputing 50 - Igel, Hüsken - 2003 |

70 | Evolutionary tuning of multiple SVM parameters. Neurocomputing 64
- Friedrichs, Igel
- 2005
Citation Context: ...viduals for the next generation. The culminating algorithm, the covariance matrix adaptation evolution strategy (CMA-ES; Hansen and Ostermeier, 2001), has proven successful in numerous studies (e.g., Friedrichs and Igel, 2005; Muller et al., 2002; Shepherd et al., 2006). While evolution strategies have shown to be effective at black-box optimization, analyzing the actual dynamics of the procedure turns out to be difficult...

54 | Learning probability distributions in continuous evolutionary algorithms–a comparative review
- Kern, Müller, et al.
Citation Context: ...rforming a ‘natural’ change of coordinate system at each step. Furthermore, we uncover a close relationship between the resulting updates of the search distribution and those of the well-known CMA-ES [4, 6] algorithm, providing a retroactive theoretical justification for some of its heuristics. The resulting algorithm, exponential NES (xNES), is simpler and significantly more stable, even with greatly r...

54 |
Convergence results for (1,λ)-SA-ES using the theory of φ-irreducible Markov chains
- Auger
Citation Context: ...on, analyzing the actual dynamics of the procedure turns out to be difficult, the considerable efforts of various researchers notwithstanding (Beyer, 2001; Jägersküpper, 2007; Jebalia et al., 2010; Auger, 2005; Schaul, 2012f). 1.2 The NES Family Natural Evolution Strategies (NES) are a family of evolution strategies which iteratively update a search distribution by using an estimated gradient on its distri...

54 | Step-size Adaptation Based on Nonlocal Use of Selection Information
- Ostermeier, Gawelczyk, et al.
- 1994
Citation Context: ...re idea, similar to the framework of estimation of distribution algorithms (EDAs) (Mühlenbein and Paass, 1996; Larrañaga, 2002; Pelikan et al., 2000) and many evolution strategies approaches (e.g., Ostermeier et al. 1994), is to maintain and iteratively update a search distribution from which search points are drawn and subsequently evaluated. However, NES updates the search distribution in the direction of higher ex...

50 | Accelerated neural evolution through cooperatively coevolved synapses
- Gomez, Miikkulainen
- 2008
Citation Context: ...is (sufficiently close to) separable, and it is unnecessary to use the full covariance matrix. For reference we also plot the corresponding results of the previously best performing algorithm CoSyNE (Gomez et al., 2008). SNES is well-suited for neuroevolution problems because they tend to be high-dimensional, multi-modal, but with highly redundant global optima (there is not a unique set of weights that defines the...

48 | A method for handling uncertainty in evolutionary optimization with an application to feedback control of combustion - Hansen, Niederberger, et al. - 2009 |

38 | Expanding from discrete to continuous estimation of distribution algorithms: The idea
- Bosman, Thierens
- 2000
Citation Context: ...e the broad class of genetic algorithms (Holland, 1975; Goldberg, 1989), differential evolution (Storn and Price, 1997), estimation of distribution algorithms (Larrañaga, 2002; Pelikan et al., 2000; Bosman and Thierens, 2000; Bosman et al., 2007; Pelikan et al., 2006), particle swarm optimization (Kennedy and Eberhart, 2001), and the cross-entropy method (Rubinstein and Kroese, 2004). Evolution strategies (ES), introduce...

36 |
Scalable Optimization via Probabilistic Modeling: From Algorithms to Applications
- Pelikan, Sastry, et al.
- 2006
Citation Context: ..., 1975; Goldberg, 1989), differential evolution (Storn and Price, 1997), estimation of distribution algorithms (Larrañaga, 2002; Pelikan et al., 2000; Bosman and Thierens, 2000; Bosman et al., 2007; Pelikan et al., 2006), particle swarm optimization (Kennedy and Eberhart, 2001), and the cross-entropy method (Rubinstein and Kroese, 2004). Evolution strategies (ES), introduced by Ingo Rechenberg and Hans-Paul Schwefel...

35 | Benchmarking a bi-population CMA-ES on the BBOB-2009 function testbed - Hansen |

30 | A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity
- Ros, Hansen
- 2008
Citation Context: ...eters in the covariance matrix is reduced from d(d+1)/2 ∈ O(d²) to d ∈ O(d), which allows us to increase the learning rate ησ by a factor of d/3 ∈ O(d), a choice which has proven robust in practice (Ros and Hansen, 2008). The algorithm variants that we will be evaluating below are xNES (Algorithm 5), “xNES-as”, that is xNES using adaptation sampling (Section 3.2), and the separable SNES (Algorithm 6). A Python imple...

27 |
Why natural gradient?
- Amari, Douglas
- 1998
Citation Context: ...it towards better expected fitness. A well-known advantage of natural gradient methods over ‘vanilla’ gradient ascent is isotropic convergence on fitness landscapes with highly correlated coordinates [1]. Although relying exclusively on function value evaluations, the resulting optimization behavior closely resembles second order optimization techniques. This avoids drawbacks of regular gradients whi...

26 |
Improving evolution strategies through active covariance matrix adaptation
- Jastrebski, Arnold
- 2006
Citation Context: ...n CMA-ES. For the utility function we copied the weighting scheme of CMA-ES, but we normalized the values such that they sum to zero, which is the simplest form of implementing a fitness baseline. In [5] a similar approach has been proposed for CMA-ES. The remaining parameters have been determined via an empirical investigation, aiming for robust performance. They are used throughout this paper. A Py...

25 | Machine Learning of Motor Skills for Robotics
- Peters
- 2007
Citation Context: ...the Fisher information matrix of the given parametric family of search distributions. The solution to the constrained optimization problem in Equation (5) can be found using a Lagrangian multiplier (Peters, 2007), yielding the necessary condition Fδθ = β∇θJ, for some constant β > 0. The direction of the natural gradient ∇̃θJ is given by δθ thus defined. If F is invertible, the natural gradient amounts to ∇̃...

23 | Analysis of a simple evolutionary algorithm for minimization in euclidean spaces - Jägersküpper |

22 | Stochastic Search using the Natural Gradient
- Sun, Wierstra, et al.
- 2009
Citation Context: ...abilities of its updates, most probably resulting from unreliable estimates of the natural gradient, particularly in medium to high dimensional problems. Recent work on the Exact NES (eNES) algorithm [12, 11] has improved the robustness and reduced the computational complexity of NES. This was achieved by analytically computing the exact Fisher information matrix, which is needed for the natural gradient,...

21 | Optimization based on bacterial chemotaxis - Müller, Marchetto, et al. |

20 | Gradient-based Adaptation of General Gaussian Kernels
- Glasmachers, Igel
- 2005
Citation Context: ...of the eigenvalue, possibly resulting in undesired oscillations. An elegant way to fix these problems is to represent the covariance matrix using the exponential map for symmetric matrices (see e.g. [3] for a related approach). Let Sd := {M ∈ R^(d×d) | Mᵀ = M} and Pd := {M ∈ Sd | vᵀMv > 0 for all v ∈ R^d \ {0}} denote the vector space of symmetric and the (cone) manifold of symmetric positive d...
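The exponential map quoted above guarantees that any additive update in the space of symmetric matrices Sd lands back inside the cone Pd, since the matrix exponential of a symmetric matrix is always symmetric positive definite. A small numeric sketch of that property; the helper name is hypothetical:

```python
import numpy as np

def expm_sym(M):
    """Matrix exponential of a symmetric matrix via its eigendecomposition:
    exp(M) = V diag(exp(w)) V^T, which is symmetric with strictly positive
    eigenvalues exp(w), hence positive definite."""
    w, V = np.linalg.eigh(M)
    return (V * np.exp(w)) @ V.T

# A symmetric matrix that is NOT positive definite (it has a negative
# eigenvalue)...
M = np.array([[0.0, 0.5],
              [0.5, -1.0]])
Sigma = expm_sym(M)   # ...still maps to a valid covariance matrix
```

This is why unconstrained gradient steps on M never produce an invalid covariance, whereas direct additive steps on Σ can leave the positive-definite cone.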

18 | Efficient Natural Evolution Strategies
- Sun, Wierstra, et al.
- 2009
Citation Context: ...abilities of its updates, most probably resulting from unreliable estimates of the natural gradient, particularly in medium to high dimensional problems. Recent work on the Exact NES (eNES) algorithm [12, 11] has improved the robustness and reduced the computational complexity of NES. This was achieved by analytically computing the exact Fisher information matrix, which is needed for the natural gradient,...

16 |
Two-phase nozzle and hollow core jet experiments. Proceedings of the 11th symposium on engineering aspects of magnetohydrodynamics
- Klockgether, Schwefel
- 1970
Citation Context: ...on optimization problems. In order to illustrate the importance and prevalence of this general problem setup, one could point to a diverse set of tasks such as the classic nozzle shape design problem [2], developing an Aibo robot gait [3] or non-Markovian neurocontrol [4]. Now, since exhaustively searching the entire space of solution parameters is considered infeasible, and since we do not assume a ...

16 |
Bidirectional relation between CMA evolution strategies and natural evolution strategies
- Akimoto, Nagata, et al.
- 2010
Citation Context: ...formulated in global linear coordinates. The connection of these updates can be shown either by applying the xNES update directly to the natural coordinates without the exponential parameterization (Akimoto et al., 2010), or by approximating the exponential map by its first order Taylor expansion. Akimoto et al. (2010) established the same connection directly in coordinates based on the Cholesky decomposition of Σ, ...

15 | Similarities and differences between policy gradient methods and evolution strategies
- Heidrich-Meisner, Igel
Citation Context: ...clear to any reader familiar with both fields. In recent work carried out independently from ours, the similarities between Policy Gradient methods and Evolution Strategies have also been pointed out [17], which suggests there might be fruitful future interaction between the two fields. The experiments however show that, on most unimodal benchmarks, NES is still roughly 2 to 5 times slower in performa...

15 | Selection and Reinforcement Learning for Combinatorial Optimization - Berny |

14 |
Log-linear convergence and divergence of the scale-invariant (1+1)-ES in noisy environments. Algorithmica
- Jebalia, Auger, et al.
Citation Context: ...t black-box optimization, analyzing the actual dynamics of the procedure turns out to be difficult, the considerable efforts of various researchers notwithstanding (Beyer, 2001; Jägersküpper, 2007; Jebalia et al., 2010; Auger, 2005; Schaul, 2012f). 1.2 The NES Family Natural Evolution Strategies (NES) are a family of evolution strategies which iteratively update a search distribution by using an estimated gradient ...

13 | High dimensions and heavy tails for natural evolution strategies
- Schaul, Glasmachers, et al.
Citation Context: ...imate it from samples. 1.3 Paper Outline This paper builds upon and extends our previous work on Natural Evolution Strategies (Wierstra et al., 2008; Sun et al., 2009a,b; Glasmachers et al., 2010a,b; Schaul et al., 2011), and is structured as follows: Section 2 presents the general idea of search gradients as described in Wierstra et al. (2008), explaining stochastic search using parameterized distributions while do...

12 | Benchmarking a bi-population CMA-ES on the BBOB-2009 noisy testbed - Hansen |

10 |
Theoretical framework for comparing several stochastic optimization approaches
- Spall, Hill, et al.
- 2006
Citation Context: ...that is rich in deceptive local optima, the Rastrigin benchmark. 1 Introduction Real-valued ‘black box’ function optimization is one of the major branches of modern applied machine learning research [1]. It concerns itself with optimizing the continuous parameters of some unknown objective function, also called a fitness function. The exact structure of the objective function is assumed to be unknow...

10 |
Stochastic optimization and the simultaneous perturbation method
- Spall
- 1999
Citation Context: ...A variety of algorithms has been developed within this framework, including methods such as Simulated Annealing [5], Simultaneous Perturbation Stochastic Optimization [6], simple Hill Climbing, Particle Swarm Optimization [7] and the class of Evolutionary Algorithms, of which Evolution Strategies (ES) [8, 9, 10] and in particular its Covariance Matrix Adaption (CMA) i...

7 | Three dimensional evolutionary aerodynamic design optimization with CMA-ES - Hasenjäger, Sendhoff, et al. - 2005 |

6 | A Natural Evolution Strategy for Multi-Objective Optimization - Glasmachers, Schaul, et al. |

6 |
Modeling morphology evolution and mechanical behavior during thermo-mechanical processing of semi-crystalline polymers
- Shepherd, McDowell, et al.
- 2006
Citation Context: ...algorithm, the covariance matrix adaptation evolution strategy (CMA-ES; Hansen and Ostermeier, 2001), has proven successful in numerous studies (e.g., Friedrichs and Igel, 2005; Muller et al., 2002; Shepherd et al., 2006). While evolution strategies have shown to be effective at black-box optimization, analyzing the actual dynamics of the procedure turns out to be difficult, the considerable efforts of various resear...

4 | The second harmonic generation case-study as a gateway for es to quantum control problems - Shir, Bäck - 2007 |

3 | Identification of the isotherm function in chromatography using cma-es
- Jebalia, Auger, et al.
- 2007
Citation Context: ...ess evaluations at certain points in parameter space. Problems that fall within this category are numerous, ranging from applications in health and science (Winter et al., 2005; Shir and Bäck, 2007; Jebalia et al., 2007) to aeronautic design (Hasenjäger et al., 2005; Klockgether and Schwefel, 1970) and control (Hansen et al., 2009). Numerous algorithms in this vein have been developed and applied in the past fifty ...

3 | Natural evolution strategies converge on sphere functions - Schaul - 2012 |

2 | Benchmarking Separable Natural Evolution Strategies on the Noiseless and Noisy Black-box Optimization Testbeds - Schaul - 2012 |

2 | Comparing natural evolution strategies to bipop-cma-es on noiseless and noisy black-box optimization testbeds - Schaul - 2012 |

1 |
Sur la représentation géométrique des systèmes matériels non holonomes
- Cartan
- 1928
Citation Context: ...tion geometric perspective, viewing Pd as the Riemannian parameter manifold equipped with the Fisher information metric. The invariance property is a direct consequence of the Cartan-Hadamard theorem [2]. However, the exponential parameterization considerably complicates the computation of the Fisher information matrix F, which now involves partial derivatives of the matrix exponential (3). This can ...

1 |
Adapted maximum-likelihood Gaussian models for numerical optimization with continuous EDAs
- Bosman, Grahl, et al.
- 2007
Citation Context: ...c algorithms (Holland, 1975; Goldberg, 1989), differential evolution (Storn and Price, 1997), estimation of distribution algorithms (Larrañaga, 2002; Pelikan et al., 2000; Bosman and Thierens, 2000; Bosman et al., 2007; Pelikan et al., 2006), particle swarm optimization (Kennedy and Eberhart, 2001), and the cross-entropy method (Rubinstein and Kroese, 2004). Evolution strategies (ES), introduced by Ingo Rechenberg ...

1 | Benchmarking exponential natural evolution strategies on the noiseless and noisy black-box optimization testbeds - Schaul |

1 | Benchmarking natural evolution strategies with adaptation sampling on the noiseless and noisy black-box optimization testbeds - Schaul |

1 | Investigating the impact of adaptation sampling in natural evolution strategies on black-box optimization testbeds - Schaul |