Results 1 – 10 of 19
Methods for the Analysis of Evolutionary Algorithms on Pseudo-Boolean Functions
, 2000
"... Many experiments have shown that evolutionary algorithms are useful randomized search heuristics for optimization problems. In order to learn more about the reasons for their e#ciency and in order to obtain proven results on evolutionary algorithms it is necessary to develop a theory of evolutionary ..."
Abstract

Cited by 46 (0 self)
 Add to MetaCart
Many experiments have shown that evolutionary algorithms are useful randomized search heuristics for optimization problems. In order to learn more about the reasons for their efficiency, and in order to obtain proven results on evolutionary algorithms, it is necessary to develop a theory of evolutionary algorithms. Such a theory is still in its infancy. A major part of a theory is the analysis of different variants of evolutionary algorithms on selected functions. Several results of this kind have been obtained in recent years. Here important analytical tools are presented, discussed, and applied to well-chosen example functions.
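The example functions in this line of work are pseudo-Boolean functions such as OneMax. As a minimal sketch (the mutation rate 1/n and the step budget are illustrative choices, not taken from the paper), the (1+1) EA that these analyses typically target can be written as:

```python
import random

def one_max(x):
    # OneMax: number of ones in the bit string (maximized by the all-ones string).
    return sum(x)

def one_plus_one_ea(f, n, max_steps=100_000, seed=0):
    # (1+1) EA: keep a single parent, flip each bit independently with
    # probability 1/n, and accept the offspring if it is at least as good.
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for step in range(1, max_steps + 1):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if f(y) >= f(x):
            x = y
        if f(x) == n:          # global optimum of OneMax reached
            return step
    return max_steps

steps = one_plus_one_ea(one_max, 20)
```

For OneMax the expected optimization time of this algorithm is known to be Theta(n log n), which such sketches make easy to verify empirically.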
On the Choice of the Offspring Population Size in Evolutionary Algorithms
, 2004
"... Evolutionary algorithms (EAs) generally come with a large number of parameters that have to be set before the algorithm can be used. Finding appropriate settings is a difficult task. The influence of these parameters on the efficiency of the search performed by an evolutionary algorithm can be very ..."
Abstract

Cited by 44 (5 self)
 Add to MetaCart
Evolutionary algorithms (EAs) generally come with a large number of parameters that have to be set before the algorithm can be used. Finding appropriate settings is a difficult task. The influence of these parameters on the efficiency of the search performed by an evolutionary algorithm can be very high, but there is still a lack of theoretically justified guidelines to help the practitioner find good values for these parameters. One such parameter is the offspring population size. Using a simplified but still realistic evolutionary algorithm, a thorough analysis of the effects of the offspring population size is presented. The result is a much better understanding of the role of the offspring population size in an EA, together with a simple way to dynamically adapt this parameter when necessary.
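The offspring population size is the λ in the (1+λ) EA. A hedged sketch of that algorithm (the choice λ = 8, mutation rate 1/n, and OneMax fitness are illustrative, not the paper's exact setup):

```python
import random

def one_plus_lambda_ea(f, n, lam, max_gens=10_000, seed=1):
    # (1+lambda) EA: create lam offspring per generation by standard bit
    # mutation and keep the best of parent and offspring (elitism).
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for gen in range(1, max_gens + 1):
        best = x
        for _ in range(lam):
            y = [b ^ (rng.random() < 1.0 / n) for b in x]
            if f(y) >= f(best):
                best = y
        x = best
        if f(x) == n:
            return gen * lam       # count fitness evaluations, not generations
    return max_gens * lam

evals = one_plus_lambda_ea(sum, 20, lam=8)
```

Counting fitness evaluations rather than generations is the usual cost measure here: a large λ reduces generations but may waste evaluations, which is exactly the trade-off such an analysis quantifies.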
How To Analyse Evolutionary Algorithms
, 2002
"... Many variants of evolutionary algorithms have been designed and applied. The ..."
Abstract

Cited by 31 (1 self)
 Add to MetaCart
Many variants of evolutionary algorithms have been designed and applied. The experimental knowledge is immense.
The Cooperative Coevolutionary (1+1) EA
, 2003
"... Coevolutionary algorithms are a variant of evolutionary algorithms which are aimed for the solution of more complex tasks than traditional evolutionary algorithms. One example is a general cooperative coevolutionary framework for function optimization. A thorough and rigorous introductory research i ..."
Abstract

Cited by 12 (2 self)
 Add to MetaCart
Coevolutionary algorithms are a variant of evolutionary algorithms aimed at solving more complex tasks than traditional evolutionary algorithms. One example is a general cooperative coevolutionary framework for function optimization. A thorough and rigorous introductory study of the optimization potential of cooperative coevolution is presented. Using the cooperative coevolutionary framework as a starting point, the CC (1+1) EA is defined and investigated. The main interest is in the analysis of the expected optimization time. The research concentrates on separability, since this is a key property of objective functions. It is shown that separability alone is not sufficient to yield any advantage of the CC (1+1) EA over its traditional, non-coevolutionary counterpart. Such an advantage is demonstrated to have one basis in the increased explorative possibilities of the cooperative coevolutionary framework.
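The cooperative coevolutionary idea is to split the search space into components and optimize one component at a time in the context of the others. A sketch under simplifying assumptions (equally sized components, round-robin activation, per-component mutation rate equal to one over the component length, and the separable function OneMax; none of these details are claimed to match the paper exactly):

```python
import random

def cc_one_plus_one_ea(f, n, k, max_steps=100_000, seed=2):
    # CC (1+1) EA sketch: split the n-bit string into k equally sized
    # components; in each step mutate only the active component (each of its
    # bits flips with probability 1/component-length), evaluate the full
    # string, and accept if not worse. Components are activated round-robin.
    rng = random.Random(seed)
    size = n // k                      # assume k divides n
    x = [rng.randint(0, 1) for _ in range(n)]
    for step in range(1, max_steps + 1):
        c = (step - 1) % k             # active component
        y = list(x)
        for i in range(c * size, (c + 1) * size):
            if rng.random() < 1.0 / size:
                y[i] ^= 1
        if f(y) >= f(x):
            x = y
        if f(x) == n:
            return step
    return max_steps

steps = cc_one_plus_one_ea(sum, 24, k=4)
```

Note how the higher per-bit mutation rate inside the active component (1/size instead of 1/n) is where the increased explorative possibilities come from.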
On the analysis of a dynamic evolutionary algorithm
 Journal of Discrete Algorithms
, 2001
"... Evolutionary algorithms are applied as problemindependent optimization algorithms. They are quite efficient in many situations. However, it is difficult to analyze even the behavior of simple variants of evolutionary algorithms like the socalled (1 + 1) EA on rather simple functions. Nevertheless ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
Evolutionary algorithms are applied as problem-independent optimization algorithms. They are quite efficient in many situations. However, it is difficult to analyze even the behavior of simple variants of evolutionary algorithms, like the so-called (1+1) EA, on rather simple functions. Nevertheless, only the analysis of the expected run time and the success probability within a given number of steps can guide the choice of the free parameters of the algorithms. Here static (1+1) EAs with a fixed mutation probability are compared with dynamic (1+1) EAs that use a simple schedule for the variation of the mutation probability. The dynamic variant is first analyzed on functions typically chosen as example functions for evolutionary algorithms. Then functions are presented where each static (1+1) EA has exponential run time while the dynamic variant has polynomial run time; for other functions it is shown that the dynamic (1+1) EA has exponential run time while the static variant, with a good choice of the mutation probability, has polynomial run time with large probability.
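One simple deterministic schedule of this kind doubles the mutation probability every step and resets it once it becomes too large. A sketch assuming that particular schedule (the reset threshold 1/2 and the OneMax fitness are illustrative choices):

```python
import random

def dynamic_one_plus_one_ea(f, n, max_steps=200_000, seed=3):
    # Dynamic (1+1) EA sketch: the mutation probability p cycles through
    # 1/n, 2/n, 4/n, ... and is reset to 1/n once it would exceed 1/2,
    # so every phase tries a whole range of mutation strengths.
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    p = 1.0 / n
    for step in range(1, max_steps + 1):
        y = [b ^ (rng.random() < p) for b in x]
        if f(y) >= f(x):
            x = y
        p *= 2
        if p > 0.5:
            p = 1.0 / n
        if f(x) == n:
            return step
    return max_steps

steps = dynamic_one_plus_one_ea(sum, 20)
```

The appeal of such a schedule is that no single mutation probability has to be guessed in advance; its price, as the abstract notes, is that on some functions cycling through bad probabilities costs exponential time.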
On the Design and Analysis of Evolutionary Algorithms
, 2000
"... Evolutionary algorithms are problemindependent randomized search heuristics. It is discussed when it is useful to work with such algorithms and it is argued why these search heuristics should be analyzed just as all other deterministic and randomized algorithms. Such an approach is started by ..."
Abstract

Cited by 5 (0 self)
 Add to MetaCart
Evolutionary algorithms are problem-independent randomized search heuristics. It is discussed when it is useful to work with such algorithms, and it is argued why these search heuristics should be analyzed just as all other deterministic and randomized algorithms. Such an approach is started by analyzing a simple evolutionary algorithm on linear functions, quadratic functions, and unimodal functions, and by studying its behavior on plateaus of constant fitness. Furthermore, it is investigated what can be gained and lost by a dynamic variant of this algorithm. Finally, it is proved that crossover can decrease the run time of evolutionary algorithms significantly.
How to analyze evolutionary algorithms
 Theoretical Computer Science
"... Many variants of evolutionary algorithms have been designed and applied. The experimental knowledge is immense. The rigorous analysis of evolutionary algorithms is difficult, but such a theory can help to understand, design, and teach evolutionary algorithms. In this survey, first the history of att ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
Many variants of evolutionary algorithms have been designed and applied. The experimental knowledge is immense. The rigorous analysis of evolutionary algorithms is difficult, but such a theory can help to understand, design, and teach evolutionary algorithms. In this survey, first the history of attempts to analyse evolutionary algorithms is described and then new methods for continuous as well as discrete search spaces are presented and discussed.
On the analysis of the (1+1) memetic algorithm
 In Proc. of GECCO ’06
, 2006
"... Memetic algorithms are evolutionary algorithms incorporating local search to increase exploitation. This hybridization has been fruitful in countless applications. However, theory on memetic algorithms is still in its infancy. Here, we introduce a simple memetic algorithm, the (1+1) Memetic Algorith ..."
Abstract

Cited by 5 (3 self)
 Add to MetaCart
Memetic algorithms are evolutionary algorithms incorporating local search to increase exploitation. This hybridization has been fruitful in countless applications. However, theory on memetic algorithms is still in its infancy. Here, we introduce a simple memetic algorithm, the (1+1) Memetic Algorithm ((1+1) MA), working with a population size of 1 and no crossover. We compare it with the well-known (1+1) EA and randomized local search and show that these algorithms can outperform each other drastically. On problems such as long path problems it is essential to limit the duration of local search. We investigate the (1+1) MA with a fixed maximal local search duration and define a class of fitness functions where a small variation of the local search duration has a large impact on the performance of the (1+1) MA. All results are proved rigorously without assumptions.
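The structure described (mutation followed by a bounded local search on the offspring, elitist survival) can be sketched as follows. The first-improvement single-bit local search and the particular budget are illustrative assumptions, not necessarily the paper's exact operators:

```python
import random

def local_search(f, x, budget, rng):
    # First-improvement local search: try random single-bit flips for at most
    # `budget` iterations, accepting strict improvements only.
    x = list(x)
    for _ in range(budget):
        i = rng.randrange(len(x))
        y = list(x)
        y[i] ^= 1
        if f(y) > f(x):
            x = y
    return x

def one_plus_one_ma(f, n, ls_budget, max_gens=10_000, seed=4):
    # (1+1) MA sketch: standard bit mutation, then a local search with a
    # fixed maximal duration refines the offspring; the better of parent
    # and refined offspring survives.
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for gen in range(1, max_gens + 1):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        y = local_search(f, y, ls_budget, rng)
        if f(y) >= f(x):
            x = y
        if f(x) == n:
            return gen
    return max_gens

gens = one_plus_one_ma(sum, 20, ls_budget=10)
```

The `ls_budget` parameter is exactly the "fixed maximal local search duration" whose sensitivity the abstract discusses.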
Theoretical Analysis of Mutation-Adaptive Evolutionary Algorithms
 Evolutionary Computation
, 2001
"... Adaptive evolutionary algorithms require a more sophisticated modeling than their staticparameter counterparts. Taking into account the current population is not enough when implementing parameteradaptation rules based on success rates (evolution strategies) or on premature convergence (genetic al ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
(Show Context)
Adaptive evolutionary algorithms require more sophisticated modeling than their static-parameter counterparts. Taking into account the current population is not enough when implementing parameter-adaptation rules based on success rates (evolution strategies) or on premature convergence (genetic algorithms). Instead of Markov chains, we use random systems with complete connections, accounting for the complete, rather than recent, history of the algorithm's evolution. Under the new paradigm, we analyze the convergence of several mutation-adaptive algorithms: a binary genetic algorithm, the 1/5 success rule evolution strategy, and a continuous, respectively a dynamic, (1+1) evolutionary algorithm.
Keywords: genetic algorithms, evolution strategies, convergence in probability, Markov chain, dependence with complete connections, ergodic behavior for a stochastic process.
1 Markov Chain and Convergence of the Evolutionary Algorithm
Let S = {s_1, ..., s_n} be a finite state space. We call a random process {X_t}_{t>=0} moving from one state to another a Markov chain (MC) if for all t, i, j, h, k
Prob{X_{t+1} = s_k | X_t = s_i, X_{t-1} = s_j, ..., X_0 = s_h} = Prob{X_{t+1} = s_k | X_t = s_i}.
We say that an evolutionary algorithm (EA) is convergent if the probability of containing the global optimum inside the current generation tends to one as the generation index tends to infinity. A population containing the global optimum (also referred to as the best individual/chromosome and abbreviated best-chrom) will be called optimal. A comprehensive survey of finite MC results for EAs was given in Rudolph (1998). In short, the algorithm's convergence can be related to the asymptotic behavior of a finite homogeneous MC by the following condensed procedure. 1. De...
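The convergence notion above is easy to illustrate by simulation. In the following sketch the three-state chain and its transition probabilities are invented for illustration; the only property that matters is that the "optimal population" state is absorbing and reachable from every state, which is what elitist selection with a reachable optimum gives you:

```python
import random

def simulate_chain(P, start, steps, rng):
    # Simulate one trajectory of a finite homogeneous Markov chain given
    # its transition matrix P (each row sums to 1); return the final state.
    s = start
    for _ in range(steps):
        r, acc = rng.random(), 0.0
        for j, p in enumerate(P[s]):
            acc += p
            if r < acc:
                s = j
                break
    return s

# Toy chain for an elitist EA: state 2 (population contains the optimum)
# is absorbing and reachable, so Prob{optimal} -> 1 as t -> infinity.
P = [[0.8, 0.15, 0.05],
     [0.0, 0.9,  0.1],
     [0.0, 0.0,  1.0]]

rng = random.Random(5)
hits = sum(simulate_chain(P, 0, 200, rng) == 2 for _ in range(1000))
print(hits / 1000)   # empirical Prob{optimal after 200 steps}, close to 1
```

The point the paper makes is that for history-dependent parameter adaptation this Markov-chain model no longer suffices, which motivates random systems with complete connections.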