Results 1–10 of 26
Using previous models to bias structural learning in the hierarchical BOA
, 2008
Abstract

Cited by 20 (11 self)
Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling explicit probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum or at least an accurate approximation of it, any EDA also provides a sequence of probabilistic models, which in most cases hold a great deal of information about the problem. Although using problem-specific knowledge has been shown to significantly improve the performance of EDAs and other evolutionary algorithms, this readily available source of problem-specific information has been practically ignored by the EDA community. This paper takes the first step toward using the probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous hBOA runs on similar problems. We show that the proposed methods lead to substantial speedups and argue that they should work well in other applications that require solving a large number of problems with similar structure.
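The build-model / sample-model loop that all EDAs share can be made concrete with a minimal sketch. The example below is a univariate EDA (UMDA-style) on the OneMax problem, not hBOA itself: hBOA replaces the independent bit marginals shown here with a learned Bayesian network with local structures, but the overall loop of sampling the model, selecting promising solutions, and re-estimating the model is the same. All names and parameter values are illustrative.

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, n_gens=30, seed=1):
    """Minimal univariate EDA (UMDA-style) maximizing OneMax (sum of bits)."""
    rng = random.Random(seed)
    # Start from a uniform model: each bit is 1 with probability 0.5.
    probs = [0.5] * n_bits
    best = None
    for _ in range(n_gens):
        # Sample a population from the current probabilistic model.
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        # Select the most promising candidates (highest OneMax fitness).
        pop.sort(key=sum, reverse=True)
        selected = pop[:n_select]
        if best is None or sum(selected[0]) > sum(best):
            best = selected[0]
        # Re-estimate the model marginals from the selected individuals.
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
    return best

print(sum(umda_onemax()))
```

The sequence of `probs` vectors produced across generations is exactly the "sequence of probabilistic models" the abstract refers to; the paper's proposal is to mine such models from earlier runs to bias model building in later ones.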
An introduction and survey of estimation of distribution algorithms
 Swarm and Evolutionary Computation
, 2011
Learning Computer Programs with the Bayesian Optimization Algorithm. Forthcoming
, 2004
Abstract

Cited by 15 (5 self)
We describe an extension of the Bayesian Optimization Algorithm (BOA), a probabilistic model-building genetic algorithm, to the domain of program tree evolution. The new system, BOA programming (BOAP), improves significantly on previous probabilistic model-building genetic programming (PMBGP) systems in terms of the articulacy and open-ended flexibility of the models learned, and hence control over the distribution of instances generated. Innovations include a novel tree representation and a generalized program evaluation scheme.
Robust and Scalable Black-Box Optimization, Hierarchy and Ising Spin Glasses
, 2003
Abstract

Cited by 8 (1 self)
One of the most important challenges in computational optimization is the design of advanced black-box optimization techniques that enable automated, robust, and scalable solutions to challenging optimization problems. This paper describes an advanced black-box optimizer, the hierarchical Bayesian optimization algorithm (hBOA), which combines techniques from genetic and evolutionary computation, machine learning, and statistics to create a widely applicable tool for solving real-world optimization problems. The paper motivates hBOA, describes its basic procedure, and provides an in-depth empirical analysis of hBOA on the class of random 2D and 3D Ising spin glass problems. The results on Ising spin glasses indicate that even without much problem-specific knowledge, hBOA can provide competitive or better results than techniques specialized for the particular problem or class of problems. Furthermore, hBOA can solve a large class of nearly decomposable and hierarchical problems for which no other scalable solution exists.
Limits of scalability of multiobjective estimation of distribution algorithms
 Proceedings of the Congress on Evolutionary Computation
, 2005
Abstract

Cited by 6 (1 self)
The paper analyzes the scalability of multiobjective estimation of distribution algorithms (MOEDAs), particularly the multiobjective extended compact genetic algorithm (meCGA), on a class of boundedly-difficult, additively-separable multiobjective optimization problems. The paper demonstrates that even if the linkage is correctly identified, massive multimodality of the search problems can easily overwhelm the nicher and lead to exponential scale-up. The exponential growth in the number of Pareto-optimal solutions introduces a fundamental limit on the scalability of MOEDAs with respect to the number of competing substructures between the multiple objectives. Facet-wise models are subsequently used to predict the limit on the growth rate of the number of differing substructures between the two objectives that must be respected to prevent the niching method from being overwhelmed and to retain polynomial scalability of MOEDAs.
Novamente: An Integrative Architecture for General Intelligence
 In AAAI Technical Report FS-04-01, Menlo Park
, 2004
Abstract

Cited by 5 (0 self)
The Novamente AI Engine is briefly reviewed. The overall architecture is unique, drawing on system-theoretic ideas regarding complex mental dynamics and associated emergent patterns. We describe how these are facilitated by a novel knowledge representation which allows diverse cognitive processes to interact effectively. We then elaborate the two primary cognitive algorithms used to construct these processes: probabilistic term logic (PTL), and the Bayesian Optimization Algorithm (BOA). PTL is a highly flexible inference framework, applicable to domains involving uncertain, dynamic data and autonomous agents in complex environments. BOA is a population-based optimization algorithm which can incorporate prior knowledge. While originally designed to operate on bit strings, our extended version also learns programs and predicates with variable length and tree-like structure, used to represent actions, perceptions, and internal state. We detail some of the specific dynamics and structures we expect to emerge through the interaction of the cognitive processes, outline our approach to training the system through experiential interactive learning, and conclude with a description of some recent results obtained with our partial implementation, including practical work in bioinformatics, natural language processing, and knowledge discovery.
Decomposable problems, niching, and scalability of multiobjective estimation of distribution algorithms
, 2005
Abstract

Cited by 4 (1 self)
The paper analyzes the scalability of multiobjective estimation of distribution algorithms (MOEDAs) on a class of boundedly-difficult, additively-separable multiobjective optimization problems. The paper illustrates that even if the linkage is correctly identified, massive multimodality of the search problems can easily overwhelm the nicher and lead to exponential scale-up. Facet-wise models are subsequently used to propose a growth rate of the number of differing substructures between the two objectives that prevents the niching method from being overwhelmed and leads to polynomial scalability of MOEDAs.
Order or not: Does parallelization of model building in hBOA affect its scalability?
, 2006
Abstract

Cited by 2 (0 self)
It has been shown that model building in the hierarchical Bayesian optimization algorithm (hBOA) can be efficiently parallelized by randomly generating an ancestral ordering of the nodes of the network prior to learning the network structure and allowing only dependencies consistent with the generated ordering. However, it has not been thoroughly shown that this approach to restricting probabilistic models does not affect scalability of hBOA on important classes of problems. This paper demonstrates that although the use of a random ancestral ordering restricts the structure of considered models to allow efficient parallelization of model building, its effects on hBOA performance and scalability are negligible.
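The ordering restriction described in the abstract can be illustrated with a short sketch (illustrative only, not the authors' implementation): given a random ancestral ordering of the variables, a candidate edge parent → child is admissible only when the parent precedes the child. This guarantees acyclicity by construction, so each node's parent set can be learned independently and the work distributed across processors.

```python
import random

def allowed_edges(n_vars, seed=0):
    """Return the set of directed edges consistent with one random
    ancestral ordering of n_vars variables.

    An edge (parent, child) is admissible only when the parent precedes
    the child in the ordering, which rules out cycles by construction
    and makes the evaluation of each node's candidate parents independent.
    """
    order = list(range(n_vars))
    random.Random(seed).shuffle(order)
    position = {v: i for i, v in enumerate(order)}
    return {(p, c) for p in range(n_vars) for c in range(n_vars)
            if position[p] < position[c]}

edges = allowed_edges(4)
# Exactly one direction of each variable pair survives the restriction:
print(len(edges))  # n*(n-1)/2 = 6 for 4 variables
```

Because every pair of variables keeps exactly one admissible direction, the restriction halves the space of candidate edges; the paper's contribution is evidence that this restriction does not noticeably hurt hBOA's performance or scalability.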
Learn from the Past: Improving Model-Directed ... Distance-based Bias
, 2012
Abstract

Cited by 1 (0 self)
For many optimization problems it is possible to define a problem-specific distance metric over decision variables that correlates with the strength of interactions between the variables. Examples of such problems include additively decomposable functions, facility location problems, and atomic cluster optimization. However, using such a metric to enhance the efficiency of optimization techniques is often not straightforward. This paper describes a framework that allows optimization practitioners to improve the efficiency of model-directed optimization techniques by combining such a distance metric with information mined from previous optimization runs on similar problems. The framework is demonstrated and empirically evaluated in the context of the hierarchical Bayesian optimization algorithm (hBOA). Experimental results provide strong empirical evidence that the proposed approach yields significant speedups and that it can be effectively combined with other efficiency enhancements. The paper also demonstrates how straightforward it is to adapt the proposed framework to other model-directed optimization techniques by presenting several examples.
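The general idea of a distance-based bias can be sketched as follows. This is an assumption-laden illustration, not the paper's actual bias: the exact form used in the paper differs, and the function name, the exponential shape, and the `kappa` parameter are all hypothetical. The point is only that pairs of variables that are close under the problem-specific metric receive a prior near 1 when candidate dependencies are scored, while distant pairs are penalized, biasing model building toward short-range interactions.

```python
import math

def edge_prior(distance, kappa=1.0):
    """Illustrative distance-based prior on a candidate dependency.

    Hypothetical form: close variable pairs (small distance under the
    problem-specific metric) get a prior near 1; distant pairs decay
    exponentially, discouraging long-range edges during model building.
    """
    return math.exp(-kappa * distance)

# A nearby pair is favored over a distant one when scoring candidate edges:
print(edge_prior(1.0) > edge_prior(5.0))
```

In a scoring-based structure learner, such a prior would typically be multiplied into (or its log added to) the score of each candidate edge, so the data must provide correspondingly stronger evidence before a long-range dependency is accepted.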