Results 1–10 of 175
Removing The Genetics from The Standard Genetic Algorithm
 In Proceedings of ICML ’95, 1995
"... We present an abstraction of the genetic algorithm (GA), termed populationbased incremental learning (PBIL), that explicitly maintains the statistics contained in a GA’s population, but which abstracts away the crossover operator and redefines the role of the population. This results in PBIL being ..."
Abstract

Cited by 212 (13 self)
We present an abstraction of the genetic algorithm (GA), termed population-based incremental learning (PBIL), that explicitly maintains the statistics contained in a GA’s population, but which abstracts away the crossover operator and redefines the role of the population. This results in PBIL being simpler, both computationally and theoretically, than the GA. Empirical results reported elsewhere show that PBIL is faster and more effective than the GA on a large set of commonly used benchmark problems. Here we present results on a problem custom-designed to benefit both from the GA’s crossover operator and from its use of a population. The results show that PBIL performs as well as, or better than, GAs carefully tuned to do well on this problem. This suggests that even on problems custom-designed for GAs, much of the power of the GA may derive from the statistics maintained implicitly in its population, and not from the population itself nor from the crossover operator.
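As a rough illustration of the idea described above (a minimal sketch, not Baluja and Caruana's exact algorithm), PBIL replaces the population with a vector of per-bit sampling probabilities; the `fitness` callable and the OneMax stand-in objective below are assumptions for the example:

```python
import random

def pbil(fitness, n_bits, pop_size=20, lr=0.1, generations=100):
    """Minimal PBIL sketch: evolve a probability vector instead of a population."""
    p = [0.5] * n_bits  # one sampling probability per bit
    best = None
    for _ in range(generations):
        # Sample a transient population from the probability vector.
        pop = [[1 if random.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        winner = max(pop, key=fitness)
        if best is None or fitness(winner) > fitness(best):
            best = winner
        # Shift each probability toward the best sample; no crossover is used.
        p = [pi + lr * (b - pi) for pi, b in zip(p, winner)]
    return best

# Usage: OneMax (count of ones) as a stand-in objective.
result = pbil(sum, n_bits=30)
```

The population is discarded each generation; only the statistics (the probability vector) persist, which is the abstraction the paper studies.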
Evolving cellular automata to perform computations: Mechanisms and impediments
 Physica D, 1994
"... We present results from experiments in which a genetic algorithm (GA) was used to evolve cellular automata (CAs) to perform a particular computational task—onedimensional density classification. We look in detail at the evolutionary mechanisms producing the GA’s behavior on this task and the impedi ..."
Abstract

Cited by 139 (18 self)
We present results from experiments in which a genetic algorithm (GA) was used to evolve cellular automata (CAs) to perform a particular computational task—one-dimensional density classification. We look in detail at the evolutionary mechanisms producing the GA’s behavior on this task and the impediments faced by the GA. In particular, we identify four “epochs of innovation” in which new CA strategies for solving the problem are discovered by the GA, describe how these strategies are implemented in CA rule tables, and identify the GA mechanisms underlying their discovery. The epochs are characterized by a breaking of the task’s symmetries on the part of the GA. The symmetry breaking results in a short-term fitness gain but ultimately prevents the discovery of the most highly fit strategies. We discuss the extent to which symmetry breaking and other impediments are general phenomena in any GA search.
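To make the setting concrete, here is a minimal radius-1 binary CA simulator with a majority-vote rule table. This is a simplified stand-in: the paper evolves radius-3 rules, and the local majority rule is known to be an imperfect density classifier; it is shown here only to illustrate what a rule table and its iteration look like.

```python
def run_ca(rule, state, steps):
    """Iterate a radius-1 binary CA with periodic boundaries.
    `rule` maps each 3-cell neighborhood (a tuple) to a new cell value."""
    n = len(state)
    for _ in range(steps):
        state = [rule[(state[(i - 1) % n], state[i], state[(i + 1) % n])]
                 for i in range(n)]
    return state

# Majority-vote rule table: 8 entries, one per 3-cell neighborhood.
majority = {(a, b, c): int(a + b + c >= 2)
            for a in (0, 1) for b in (0, 1) for c in (0, 1)}

# A lattice whose initial density is below 1/2 should relax toward all zeros.
final = run_ca(majority, [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0], steps=20)
```

An evolved radius-3 rule table has 2^7 = 128 entries rather than 8, and the GA searches over that much larger space of tables.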
An Indexed Bibliography of Genetic Algorithms in Power Engineering, 1995
"... s: Jan. 1992  Dec. 1994 ffl CTI: Current Technology Index Jan./Feb. 1993  Jan./Feb. 1994 ffl DAI: Dissertation Abstracts International: Vol. 53 No. 1  Vol. 55 No. 4 (1994) ffl EEA: Electrical & Electronics Abstracts: Jan. 1991  Dec. 1994 ffl P: Index to Scientific & Technical Proceed ..."
Abstract

Cited by 90 (10 self)
s: Jan. 1992 – Dec. 1994 • CTI: Current Technology Index: Jan./Feb. 1993 – Jan./Feb. 1994 • DAI: Dissertation Abstracts International: Vol. 53 No. 1 – Vol. 55 No. 4 (1994) • EEA: Electrical & Electronics Abstracts: Jan. 1991 – Dec. 1994 • P: Index to Scientific & Technical Proceedings: Jan. 1986 – Feb. 1995 (except Nov. 1994) • EI A: The Engineering Index Annual: 1987 – 1992 • EI M: The Engineering Index Monthly: Jan. 1993 – Dec. 1994. The following GA researchers have already kindly supplied their complete autobibliographies and/or proofread references to their papers: Dan Adler, Patrick Argos, Jarmo T. Alander, James E. Baker, Wolfgang Banzhaf, Ralf Bruns, I. L. Bukatova, Thomas Bäck, Yuval Davidor, Dipankar Dasgupta, Marco Dorigo, Bogdan Filipic, Terence C. Fogarty, David B. Fogel, Toshio Fukuda, Hugo de Garis, Robert C. Glen, David E. Goldberg, Martina Gorges-Schleuter, Jeffrey Horn, Aristides T. Hatjimihail, Mark J. Jakiela, Richard S. Judson, Akihiko Konaga...
The distributed genetic algorithm revisited
 Proceedings of the Sixth International Conference on Genetic Algorithms, 1995
"... This paper extends previous work done by Tanese on the distributed genetic algorithm (DGA). Tanese found that the DGA outperformed the canonical serial genetic algorithm (CGA) on a class of di cult, randomlygenerated Walsh polynomials. This left open the question of whether the DGA would have simila ..."
Abstract

Cited by 82 (0 self)
This paper extends previous work done by Tanese on the distributed genetic algorithm (DGA). Tanese found that the DGA outperformed the canonical serial genetic algorithm (CGA) on a class of difficult, randomly generated Walsh polynomials. This left open the question of whether the DGA would have similar success on functions that were more amenable to optimization by the CGA. In this work, experiments were done to compare the DGA's performance on the Royal Road class of fitness functions to that of the CGA. Besides achieving superlinear speedup on KSR parallel computers, the DGA again outperformed the CGA on the functions R3 and R4 with regard to the metrics of best fitness, average fitness, and number of times the optimum was reached. Its performance on R1 and R2 was comparable to that of the CGA. The effect of varying the DGA's migration parameters was also investigated. The results of the experiments are presented and discussed, and suggestions for future research are made.
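The island structure with migration can be sketched as follows. This is an illustrative toy, not the DGA studied in the paper: it uses truncation selection and bit-flip mutation only (no crossover), a ring migration topology, and OneMax as a stand-in fitness function.

```python
import random

def island_ga(fitness, n_bits, n_islands=4, island_size=10,
              generations=50, migrate_every=5):
    """Toy island-model GA: independent subpopulations with ring migration."""
    islands = [[[random.randint(0, 1) for _ in range(n_bits)]
                for _ in range(island_size)] for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        for isl in islands:
            # Truncation selection: keep the top half, refill with mutants.
            isl.sort(key=fitness, reverse=True)
            children = []
            for parent in isl[:island_size // 2]:
                child = parent[:]
                child[random.randrange(n_bits)] ^= 1  # flip one random bit
                children.append(child)
            isl[island_size // 2:] = children
        if gen % migrate_every == 0:
            # Ring migration: each island's best replaces the next island's worst.
            bests = [max(isl, key=fitness) for isl in islands]
            for i, isl in enumerate(islands):
                isl[-1] = bests[(i - 1) % n_islands][:]
    return max((max(isl, key=fitness) for isl in islands), key=fitness)

best = island_ga(sum, n_bits=20)
```

The migration interval and the number of migrants are the "migration parameters" whose effect the paper investigates.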
Statistical dynamics of the Royal Road genetic algorithm
 Theoretical Computer Science, 1999
"... Metastability is a common phenomenon. Many evolutionary processes, both natural and artificial, alternate between periods of stasis and brief periods of rapid change in their behavior. In this paper an analytical model for the dynamics of a mutationonly genetic algorithm (GA) is introduced that iden ..."
Abstract

Cited by 73 (5 self)
Metastability is a common phenomenon. Many evolutionary processes, both natural and artificial, alternate between periods of stasis and brief periods of rapid change in their behavior. In this paper an analytical model for the dynamics of a mutation-only genetic algorithm (GA) is introduced that identifies a new and general mechanism causing metastability in evolutionary dynamics. The GA’s population dynamics is described in terms of flows in the space of fitness distributions. The trajectories through fitness distribution space are derived in closed form in the limit of infinite populations. We then show how finite populations induce metastability, even in regions where fitness does not exhibit a local optimum. In particular, the model predicts the occurrence of “fitness epochs”—periods of stasis in population fitness distributions—at finite population size and identifies the locations of these fitness epochs with the flow’s hyperbolic fixed points. This enables exact predictions of the metastable fitness distributions during the fitness epochs, as well as giving insight into the nature of the periods of stasis and the innovations between them. All these results are obtained as closed-form expressions in terms of the GA’s parameters.
On the analysis of evolutionary algorithms: A proof that crossover really can help
 Proceedings of the 7th Annual European Symposium on Algorithms (ESA ’99), 1999
"... ..."
Genetic Algorithm Difficulty and the Modality of Fitness Landscapes
 Foundations of Genetic Algorithms 3, 1994
"... We assume that the modality (i.e., number of local optima) of a fitness landscape is related to the difficulty of finding the best point on that landscape by evolutionary computation (e.g., hillclimbers and genetic algorithms (GAs)). We first examine the limits of modality by constructing a unimodal ..."
Abstract

Cited by 65 (2 self)
We assume that the modality (i.e., number of local optima) of a fitness landscape is related to the difficulty of finding the best point on that landscape by evolutionary computation (e.g., hill-climbers and genetic algorithms (GAs)). We first examine the limits of modality by constructing a unimodal function and a maximally multimodal function. At such extremes our intuition breaks down. A fitness landscape consisting entirely of a single hill leading to the global optimum proves to be hard for hill-climbers but apparently easy for GAs. A provably maximally multimodal function, in which half the points in the search space are local optima, can be easy for both hill-climbers and GAs. Exploring the more realistic intermediate range between the extremes of modality, we construct local optima with varying degrees of "attraction" to our evolutionary algorithms. Most work on optima and their basins of attraction has focused on hills and hill-climbers, while some research has explored attraction...
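The modality notion used above can be made concrete with a brute-force enumerator over Hamming-1 neighborhoods (a sketch that is only feasible for small bit lengths; `fitness` is assumed to be any callable on bit tuples):

```python
from itertools import product

def local_optima(fitness, n_bits):
    """Enumerate points whose fitness is >= that of every Hamming-1 neighbor."""
    optima = []
    for point in product((0, 1), repeat=n_bits):
        f = fitness(point)
        neighbors = (point[:i] + (1 - point[i],) + point[i + 1:]
                     for i in range(n_bits))
        if all(f >= fitness(nbr) for nbr in neighbors):
            optima.append(point)
    return optima

# OneMax is unimodal: its only local optimum is the all-ones string.
assert local_optima(sum, 4) == [(1, 1, 1, 1)]
```

A function's modality is then simply `len(local_optima(f, n))`, and the abstract's extremes correspond to this count being 1 versus half the search space.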
An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics, 1995
"... This report is a repository for the results obtained from a large scale empirical comparison of seven iterative and evolutionbased optimization heuristics. Twentyseven static optimization problems, spanning six sets of problem classes which are commonly explored in genetic algorithm literature, ..."
Abstract

Cited by 57 (8 self)
This report is a repository for the results obtained from a large-scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in the genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, bin-packing, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. The algorithms tested and the encodings of the problems are described in detail for reproducibility.
How Mutation and Selection Solve Long Path Problems in Polynomial Expected Time, 1996
"... It is shown by means of Markov chain analysis that unimodal binary long path problems can be solved by mutation and elitist selection in a polynomially bounded number of trials on average. 1 Unimodality of Binary Functions The notion of unimodal functions usually appears in the theory of optimizati ..."
Abstract

Cited by 56 (2 self)
It is shown by means of Markov chain analysis that unimodal binary long path problems can be solved by mutation and elitist selection in a polynomially bounded number of trials on average. 1. Unimodality of Binary Functions. The notion of unimodal functions usually appears in the theory of optimization in $\mathbb{R}^1$. Elster et al. (1977), pp. 228–230, provide a precise definition that is specialized to functions in $\mathbb{R}^1$, whereas the definition in Bronstein and Semendjajew (1988), p. 137, for functions in $\mathbb{R}^\ell$ with $\ell \ge 1$ presupposes differentiability. Here, the following definition for functions over $\mathbb{B}^\ell$ will be used: Definition 1. Let $f$ be a real-valued function with domain $\mathbb{B}^\ell$ where $\mathbb{B} = \{0, 1\}$. A point $x^* \in \mathbb{B}^\ell$ is called a local solution of $f$ if $f(x^*) \ge f(x)$ for all $x \in \{y \in \mathbb{B}^\ell : \|y - x^*\|_1 = 1\}$ (1), where $\|x\|_1 = \sum_{i=1}^{\ell} |x_i|$ is the Hamming norm. If the inequality in (1) is strict, then $x^*$ is termed a strictly local solution. The value $f(x^*)$ at a...
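A mutation-plus-elitist-selection scheme of the kind analyzed here is often written as a (1+1) EA. The sketch below is illustrative only, not the paper's Markov chain model, and OneMax stands in for a long path function, whose construction the excerpt does not give:

```python
import random

def one_plus_one_ea(fitness, n_bits, max_evals=10_000):
    """(1+1) EA sketch: flip each bit with probability 1/n, and keep the
    offspring only if it is at least as good (elitist selection)."""
    x = [random.randint(0, 1) for _ in range(n_bits)]
    fx = fitness(x)
    for _ in range(max_evals):
        # Standard bit-flip mutation; bool XOR int yields int in Python.
        y = [bit ^ (random.random() < 1.0 / n_bits) for bit in x]
        fy = fitness(y)
        if fy >= fx:  # elitist: never accept a strictly worse point
            x, fx = y, fy
    return x, fx

best, value = one_plus_one_ea(sum, n_bits=20)
```

On a unimodal function, every non-optimal point has a strictly better Hamming-1 neighbor (per Definition 1), which is what makes the number-of-trials analysis tractable.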