Results 1 - 10 of 282
Predictive Models for the Breeder Genetic Algorithm -- I. Continuous Parameter Optimization
- EVOLUTIONARY COMPUTATION
, 1993
"... In this paper a new genetic algorithm called the Breeder Genetic Algorithm (BGA) is introduced. The BGA is based on artificial selection similar to that used by human breeders. A predictive model for the BGA is presented which is derived from quantitative genetics. The model is used to predict t ..."
Abstract
-
Cited by 395 (25 self)
- Add to MetaCart
In this paper a new genetic algorithm called the Breeder Genetic Algorithm (BGA) is introduced. The BGA is based on artificial selection similar to that used by human breeders. A predictive model for the BGA is presented which is derived from quantitative genetics. The model is used to predict the behavior of the BGA for simple test functions. Different mutation schemes are compared by computing the expected progress to the solution. The numerical performance of the BGA is demonstrated on a test suite of multimodal functions. The number of function evaluations needed to locate the optimum scales only as n ln(n) where n is the number of parameters. Results up to n = 1000 are reported.
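For intuition only, here is a minimal, generic sketch of one breeder-style generation: truncation selection of the best fraction of the population (mimicking artificial selection by a breeder), intermediate recombination of random breeder pairs, and small mutations. This is not the paper's BGA, whose mutation scheme and predictive model are more specific; every function and parameter name below is made up for illustration.

```python
import numpy as np

def sphere(x):
    """Simple test objective (minimization)."""
    return np.sum(x ** 2)

def bga_style_step(pop, fitness, truncation=0.3, mut_prob=0.1, mut_scale=0.1, rng=None):
    """One generation of a breeder-style GA sketch: truncation selection of the
    best fraction, intermediate recombination of random parent pairs, and small
    per-coordinate mutations."""
    rng = np.random.default_rng() if rng is None else rng
    n_pop, n_dim = pop.shape
    # Truncation selection: only the best `truncation` fraction may breed.
    order = np.argsort(fitness)
    breeders = pop[order[: max(2, int(truncation * n_pop))]]
    children = np.empty_like(pop)
    for i in range(n_pop):
        a, b = breeders[rng.choice(len(breeders), size=2, replace=False)]
        w = rng.uniform(0.0, 1.0, size=n_dim)
        child = w * a + (1.0 - w) * b            # intermediate recombination
        mask = rng.random(n_dim) < mut_prob      # mutate a few coordinates
        child = child + mask * rng.normal(0.0, mut_scale, size=n_dim)
        children[i] = child
    return children

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(50, 10))
    for _ in range(200):
        fit = np.array([sphere(ind) for ind in pop])
        pop = bga_style_step(pop, fit, rng=rng)
    print("best value:", min(sphere(ind) for ind in pop))
```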
Evolutionary Programming Made Faster
- IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 1999
"... Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, however, on some function optimization problems. In this paper, a "fast EP" (FEP) is proposed which uses a Cauchy ins ..."
Abstract
-
Cited by 328 (38 self)
- Add to MetaCart
Evolutionary programming (EP) has been applied with success to many numerical and combinatorial optimization problems in recent years. EP has rather slow convergence rates, however, on some function optimization problems. In this paper, a "fast EP" (FEP) is proposed which uses a Cauchy instead of Gaussian mutation as the primary search operator. The relationship between FEP and classical EP (CEP) is similar to that between fast simulated annealing and the classical version. Both analytical and empirical studies have been carried out to evaluate the performance of FEP and CEP for different function optimization problems. This paper shows that FEP is very good at search in a large neighborhood while CEP is better at search in a small local neighborhood. For a suite of 23 benchmark problems, FEP performs much better than CEP for multimodal functions with many local minima while being comparable to CEP in performance for unimodal and multimodal functions with only a few local minima. This paper also shows the relationship between the search step size and the probability of finding a global optimum and thus explains why FEP performs better than CEP on some functions but not on others. In addition, the importance of the neighborhood size and its relationship to the probability of finding a near-optimum is investigated. Based on these analyses, an improved FEP (IFEP) is proposed and tested empirically. This technique mixes different search operators (mutations). The experimental results show that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested.
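A rough sketch of the core FEP-versus-CEP contrast, assuming only what the abstract states: the same EP-style loop, with the mutation switched between a Gaussian and a heavy-tailed Cauchy perturbation. Real EP self-adapts per-coordinate step sizes; this toy version uses a fixed step, and all names and parameter values are illustrative.

```python
import numpy as np

def rastrigin(x):
    """Multimodal test function with many local minima (global minimum 0 at the origin)."""
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def ep_run(mutation, n_dim=10, pop_size=30, generations=500, step=0.3, seed=0):
    """Bare-bones (mu + mu) evolutionary-programming loop with a pluggable mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.12, 5.12, size=(pop_size, n_dim))
    for _ in range(generations):
        offspring = pop + step * mutation(rng, pop.shape)
        merged = np.vstack([pop, offspring])
        fit = np.apply_along_axis(rastrigin, 1, merged)
        pop = merged[np.argsort(fit)[:pop_size]]   # keep the best half
    return rastrigin(pop[0])

gaussian = lambda rng, shape: rng.normal(size=shape)          # CEP-style perturbation
cauchy = lambda rng, shape: rng.standard_cauchy(size=shape)   # FEP-style heavy-tailed perturbation

print("Gaussian mutation best:", ep_run(gaussian))
print("Cauchy mutation best:  ", ep_run(cauchy))
```

The only difference between the two runs is the tail weight of the mutation distribution, which is the neighborhood-size effect the abstract analyzes.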
Evolutionary computation: Comments on the history and current state
- IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
, 1997
"... Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950’s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and ..."
Abstract
-
Cited by 273 (0 self)
- Add to MetaCart
Evolutionary computation has started to receive significant attention during the last decade, although the origins can be traced back to the late 1950’s. This article surveys the history as well as the current state of this rapidly growing field. We describe the purpose, the general structure, and the working principles of different approaches, including genetic algorithms (GA) [with links to genetic programming (GP) and classifier systems (CS)], evolution strategies (ES), and evolutionary programming (EP) by analysis and comparison of their most important constituents (i.e., representations, variation operators, reproduction, and selection mechanism). Finally, we give a brief overview on the manifold of application domains, although this necessarily must remain incomplete.
A Taxonomy of Global Optimization Methods Based on Response Surfaces
- Journal of Global Optimization
, 2001
"... Abstract. This paper presents a taxonomy of existing approaches for using response surfaces for global optimization. Each method is illustrated with a simple numerical example that brings out its advantages and disadvantages. The central theme is that methods that seem quite reasonable often have no ..."
Abstract
-
Cited by 230 (1 self)
- Add to MetaCart
This paper presents a taxonomy of existing approaches for using response surfaces for global optimization. Each method is illustrated with a simple numerical example that brings out its advantages and disadvantages. The central theme is that methods that seem quite reasonable often have non-obvious failure modes. Understanding these failure modes is essential for the development of practical algorithms that fulfill the intuitive promise of the response surface approach.
Key words: global optimization, response surface, kriging, splines
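To make concrete what "using response surfaces for global optimization" means at its simplest, here is the naive two-stage pattern such taxonomies start from: fit a cheap surface to the evaluated points, then optimize the surface to choose where to evaluate next. The paper's point is precisely that naive schemes like this have non-obvious failure modes (a quadratic fit, as below, ignores multimodality entirely); the code is a toy illustration, not any of the paper's methods.

```python
import numpy as np

# Toy objective, standing in for an expensive black box.
f = lambda x: np.sin(3 * x) + 0.5 * x ** 2

# A handful of points already evaluated.
X = np.array([-2.0, -1.0, 0.5, 1.5, 2.5])
y = f(X)

# Fit the simplest response surface: a quadratic in x by least squares.
surrogate = np.poly1d(np.polyfit(X, y, deg=2))

# "Optimize the surface": take the grid minimizer of the fitted quadratic.
candidates = np.linspace(-3, 3, 601)
x_next = candidates[np.argmin(surrogate(candidates))]
print("surrogate suggests evaluating at", x_next, "true value there:", f(x_next))
```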
Tackling real-coded genetic algorithms: operators and tools for the behavioural analysis
- Artificial Intelligence Review
, 1998
"... Abstract. Genetic algorithms play a significant role, as search techniques for handling com-plex spaces, in many fields such as artificial intelligence, engineering, robotic, etc. Genetic algorithms are based on the underlying genetic process in biological organisms and on the natural evolution prin ..."
Abstract
-
Cited by 189 (27 self)
- Add to MetaCart
Genetic algorithms play a significant role, as search techniques for handling complex spaces, in many fields such as artificial intelligence, engineering, robotics, etc. Genetic algorithms are based on the underlying genetic process in biological organisms and on the natural evolution principles of populations. These algorithms process a population of chromosomes, which represent search space solutions, with three operations: selection, crossover and mutation. In its initial formulation, the search space solutions are coded using the binary alphabet. However, the good properties associated with these algorithms do not stem from the use of this alphabet; other coding types have been considered for the representation issue, such as real coding, which would seem particularly natural when tackling optimization problems of parameters with variables in continuous domains. In this paper we review the features of real-coded genetic algorithms. Different models of genetic operators and some mechanisms available for studying the behaviour of this type of genetic algorithm are reviewed and compared.
Key words: genetic algorithms, real coding, continuous search spaces
Abbreviations: GAs – genetic algorithms; BCGA – binary-coded genetic algorithm; RCGA – real-coded genetic algorithm
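One of the real-coded crossover operators typically covered in reviews of this kind is BLX-alpha (blend crossover): each offspring gene is drawn uniformly from an interval that extends the parents' range by a fraction alpha on each side. The sketch below is a generic textbook version, not tied to this paper's exact specification.

```python
import numpy as np

def blx_alpha(parent1, parent2, alpha=0.5, rng=None):
    """BLX-alpha (blend) crossover for real-coded chromosomes: each offspring
    gene is sampled uniformly from [min - alpha*span, max + alpha*span], where
    span is the distance between the two parent genes."""
    rng = np.random.default_rng() if rng is None else rng
    lo = np.minimum(parent1, parent2)
    hi = np.maximum(parent1, parent2)
    span = hi - lo
    return rng.uniform(lo - alpha * span, hi + alpha * span)

rng = np.random.default_rng(1)
p1 = np.array([0.2, 1.5, -0.7])
p2 = np.array([0.8, 1.0, 0.3])
print(blx_alpha(p1, p2, rng=rng))
```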
Algorithms for the Satisfiability (SAT) Problem: A Survey
- DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1996
"... . The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computeraided manufacturing, machine vision, database, robotics, integrated circuit design, compute ..."
Abstract
-
Cited by 144 (3 self)
- Add to MetaCart
(Show Context)
The satisfiability (SAT) problem is a core problem in mathematical logic and computing theory. In practice, SAT is fundamental in solving many problems in automated reasoning, computer-aided design, computer-aided manufacturing, machine vision, databases, robotics, integrated circuit design, computer architecture design, and computer network design. Traditional methods treat SAT as a discrete, constrained decision problem. In recent years, many optimization methods, parallel algorithms, and practical techniques have been developed for solving SAT. In this survey, we present a general framework (an algorithm space) that integrates existing SAT algorithms into a unified perspective. We describe sequential and parallel SAT algorithms including variable splitting, resolution, local search, global optimization, mathematical programming, and practical SAT algorithms. We give a performance evaluation of some existing SAT algorithms. Finally, we provide a set of practical applications of the SAT...
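As an illustration of the local-search family mentioned in the abstract, here is a minimal GSAT-style sketch: start from a random assignment and greedily flip the variable that satisfies the most clauses, with random restarts. The clause encoding (signed integers for literals) and parameter choices are illustrative, not taken from the survey.

```python
import random

def satisfied_count(clauses, assignment):
    """Number of clauses with at least one true literal."""
    return sum(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

def gsat(clauses, n_vars, max_flips=1000, max_tries=10, seed=0):
    """GSAT-style local search over truth assignments."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        assignment = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if satisfied_count(clauses, assignment) == len(clauses):
                return assignment
            best_var, best_score = None, -1
            for v in range(1, n_vars + 1):
                assignment[v] = not assignment[v]       # try flipping v
                score = satisfied_count(clauses, assignment)
                assignment[v] = not assignment[v]       # undo the flip
                if score > best_score:
                    best_var, best_score = v, score
            assignment[best_var] = not assignment[best_var]
    return None

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3), variables 1..3.
clauses = [[1, -2], [2, 3], [-1, -3]]
print(gsat(clauses, n_vars=3))
```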
A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning
, 2009
"... We present a tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions. Bayesian optimization employs the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function. This permits a utility-based se ..."
Abstract
-
Cited by 83 (11 self)
- Add to MetaCart
We present a tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions. Bayesian optimization employs the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function. This permits a utility-based selection of the next observation to make on the objective function, which must take into account both exploration (sampling from areas of high uncertainty) and exploitation (sampling areas likely to offer improvement over the current best observation). We also present two detailed extensions of Bayesian optimization, with experiments (active user modelling with preferences, and hierarchical reinforcement learning) and a discussion of the pros and cons of Bayesian optimization based on our experiences.
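A compact sketch of the loop the abstract describes, under common simplifying assumptions: a one-dimensional problem, a Gaussian-process prior with a fixed squared-exponential kernel, and expected improvement as the utility for choosing the next observation. The kernel parameters and test function are invented for illustration.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(A, B, length=0.5):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xq, noise=1e-6):
    """Gaussian-process posterior mean and std at query points Xq."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)      # prior variance is 1 for this kernel
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, f_best):
    """EI for maximization: trades off a high posterior mean against high uncertainty."""
    z = (mu - f_best) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return (mu - f_best) * Phi + sigma * phi

objective = lambda x: -(x - 0.6) ** 2 + 0.1 * np.sin(12 * x)   # the "expensive" function

X = np.array([0.1, 0.5, 0.9])          # points evaluated so far
y = objective(X)
grid = np.linspace(0.0, 1.0, 200)       # candidate next observations

for _ in range(5):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))

print("best observed x:", X[np.argmax(y)], "value:", y.max())
```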
A Radial Basis Function Method for Global Optimization
- JOURNAL OF GLOBAL OPTIMIZATION
, 1999
"... We introduce a method that aims to find the global minimum of a continuous nonconvex function on a compact subset of R^d. It is assumed that function evaluations are expensive and that no additional information is available. Radial basis function interpolation is used to define a utility function. T ..."
Abstract
-
Cited by 75 (1 self)
- Add to MetaCart
(Show Context)
We introduce a method that aims to find the global minimum of a continuous nonconvex function on a compact subset of R^d. It is assumed that function evaluations are expensive and that no additional information is available. Radial basis function interpolation is used to define a utility function. The maximizer of this function is the next point where the objective function is evaluated. We show that, for most types of radial basis functions that are considered in this paper, convergence can be achieved without further assumptions on the objective function. Besides, it turns out that our method is closely related to a statistical global optimization method, the P-algorithm. A general framework for both methods is presented. Finally, a few numerical examples show that on the set of Dixon-Szego test functions our method yields favourable results in comparison to other global optimization methods.
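A much-simplified illustration of the surrogate step: interpolate the sampled values with radial basis functions and evaluate next where the interpolant looks best. The paper's actual method selects the next point by minimizing a utility (bumpiness) measure and analyzes convergence; this sketch (multiquadric basis, naive selection rule, made-up constants) only shows the interpolation idea.

```python
import numpy as np

def multiquadric(r, c=0.3):
    """Multiquadric radial basis function."""
    return np.sqrt(r ** 2 + c ** 2)

def rbf_fit(X, y):
    """Solve for weights so the RBF surrogate interpolates f at the sample points X."""
    r = np.abs(X[:, None] - X[None, :])
    return np.linalg.solve(multiquadric(r), y)

def rbf_eval(weights, X, xq):
    """Evaluate the RBF interpolant at query points xq."""
    r = np.abs(xq[:, None] - X[None, :])
    return multiquadric(r) @ weights

# Expensive black-box objective, evaluated at a few points.
f = lambda x: np.sin(3 * x) + 0.3 * x
X = np.array([-2.0, -0.5, 0.8, 2.0])
y = f(X)

grid = np.linspace(-2.5, 2.5, 500)
for _ in range(4):
    w = rbf_fit(X, y)
    pred = rbf_eval(w, X, grid)
    # Naive rule: evaluate where the interpolant is lowest, skipping points
    # that are (numerically) already sampled.
    fresh = np.all(np.abs(grid[:, None] - X[None, :]) > 1e-3, axis=1)
    x_next = grid[fresh][np.argmin(pred[fresh])]
    X, y = np.append(X, x_next), np.append(y, f(x_next))

print("best point found:", X[np.argmin(y)], "value:", y.min())
```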
On the computation of all global minimizers through particle swarm optimization
- IEEE Transactions on Evolutionary Computation
, 2004
"... Abstract—This paper presents approaches for effectively computing all global minimizers of an objective function. The approaches include transformations of the objective function through the recently proposed deflection and stretching techniques, as well as a repulsion source at each detected minimi ..."
Abstract
-
Cited by 74 (18 self)
- Add to MetaCart
(Show Context)
This paper presents approaches for effectively computing all global minimizers of an objective function. The approaches include transformations of the objective function through the recently proposed deflection and stretching techniques, as well as a repulsion source at each detected minimizer. The aforementioned techniques are incorporated in the context of the particle swarm optimization (PSO) method, resulting in an efficient algorithm which has the ability to avoid previously detected solutions and, thus, detect all global minimizers of a function. Experimental results on benchmark problems originating from the fields of global optimization, dynamical systems, and game theory are reported, and conclusions are derived.
Index Terms: Deflection technique, detecting all minimizers, dynamical systems, Nash equilibria, particle swarm optimization (PSO), periodic orbits, stretching technique.
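The repulsion idea from the abstract can be mimicked with plain global-best PSO plus a crude distance-based penalty around minimizers already found, so that repeated runs land on different global minimizers. The paper's deflection and stretching transformations are more principled than this penalty, so treat the code below purely as a toy sketch with invented parameters.

```python
import numpy as np

def penalized(f, found, radius=0.5, strength=100.0):
    """Wrap objective f with a repulsion penalty around already-detected minimizers."""
    def g(x):
        val = f(x)
        for m in found:
            d = np.linalg.norm(x - m)
            if d < radius:
                val += strength * (radius - d)   # push the swarm away from m
        return val
    return g

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

# Objective with two symmetric global minima near x = (+1, 0) and x = (-1, 0).
f = lambda x: (x[0] ** 2 - 1.0) ** 2 + x[1] ** 2
bounds = (np.array([-2.0, -2.0]), np.array([2.0, 2.0]))

found = []
for _ in range(2):
    found.append(pso(penalized(f, found), bounds))
print("detected minimizers:", found)
```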