Results 1 - 10 of 4,122
A Fast and Elitist Multi-Objective Genetic Algorithm: NSGA-II
, 2000
"... Multi-objective evolutionary algorithms which use nondominated sorting and sharing have been mainly criticized for their (i) O(MN^3) computational complexity (where M is the number of objectives and N is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing param ..."
Cited by 1815 (60 self)
"... -II is observed. Because of NSGA-II's low computational requirements, elitist approach, parameterless niching approach, and simple constraint-handling strategy, NSGA-II should find increasing applications in the coming years."
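The nondominated sorting this entry refers to can be illustrated with a short sketch. This is a minimal version of my own for a minimization problem, not the authors' reference implementation; all function and variable names are mine:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(pop):
    """Return fronts (lists of indices into pop) in order of dominance rank."""
    n = len(pop)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(pop[i], pop[j]):
                dominated_by[i].append(j)
            elif dominates(pop[j], pop[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)             # rank-0 (nondominated) front
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1           # peel off the current front
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]
```

The bookkeeping (domination counts plus dominated-by lists) is what lets the fronts be peeled off without re-comparing every pair at every rank.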
A Fast Elitist Non-Dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II
, 2000
"... Multi-objective evolutionary algorithms which use nondominated sorting and sharing have been mainly criticized for their (i) O(MN^3) computational complexity (where M is the number of objectives and N is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing ..."
Cited by 662 (15 self)
"... to find much better spread of solutions in all problems compared to PAES, another elitist multi-objective EA which pays special attention towards creating a diverse Pareto-optimal front. Because of NSGA-II's low computational requirements, elitist approach, and parameterless sharing approach ..."
Greedy Function Approximation: A Gradient Boosting Machine
 Annals of Statistics
, 2000
"... Function approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient-descent "boosting" paradigm is developed for additi ..."
Cited by 1000 (13 self)
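The stagewise view in this abstract can be made concrete with a tiny least-squares example: each stage fits a one-split stump to the current residuals, which for squared loss are exactly the negative gradient in function space. A minimal sketch of my own (not Friedman's notation or implementation):

```python
def fit_stump(x, r):
    """Best single-threshold split of 1-D inputs x, least squares on targets r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((ri - (lm if xi <= t else rm)) ** 2 for xi, ri in zip(x, r))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Stagewise additive model: each stage fits the residuals,
    i.e. the negative gradient of the squared loss at the current fit."""
    f0 = sum(y) / len(y)                    # constant initial model
    stages = []
    for _ in range(n_rounds):
        pred = [f0 + lr * sum(s(xi) for s in stages) for xi in x]
        resid = [yi - pi for yi, pi in zip(y, pred)]   # -dL/df for L2 loss
        stages.append(fit_stump(x, resid))
    return lambda xi: f0 + lr * sum(s(xi) for s in stages)
```

With a different loss one would replace the residual line by that loss's negative gradient, which is the generalization the paper develops.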
Loopy Belief Propagation for Approximate Inference: An Empirical Study
 In Proceedings of Uncertainty in AI
, 1999
"... Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performanc ..."
Cited by 676 (15 self)
"... Again we found that loopy belief propagation always converged, with the average number of iterations equal to 8.65. The protocol for the ALARM network experiments differed from the previous two in that the structure and parameters were fixed; only the observed evidence differed between experimental ..."
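The iterative message passing being studied can be sketched for a small pairwise binary network. This is a generic sum-product loop of my own (synchronous updates, no damping), not the paper's experimental protocol; all names are mine:

```python
from math import prod

def loopy_bp(n, edges, psi, max_iters=100, tol=1e-6):
    """Sum-product loopy BP on a pairwise binary MRF.
    edges: list of (i, j) pairs; psi[(i, j)][xi][xj] is the pairwise potential.
    Returns (per-node marginals, number of iterations run)."""
    nbrs = {k: [] for k in range(n)}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    # msg[(i, j)][xj]: current (normalized) message from node i to neighbor j
    msg = {(i, j): [1.0, 1.0] for a, b in edges for i, j in ((a, b), (b, a))}

    def pot(i, j, xi, xj):
        return psi[(i, j)][xi][xj] if (i, j) in psi else psi[(j, i)][xj][xi]

    it = 0
    for it in range(1, max_iters + 1):
        new = {}
        for i, j in msg:
            vals = [sum(pot(i, j, xi, xj) *
                        prod(msg[(k, i)][xi] for k in nbrs[i] if k != j)
                        for xi in (0, 1))
                    for xj in (0, 1)]
            z = sum(vals)
            new[(i, j)] = [v / z for v in vals]
        delta = max(abs(new[e][x] - msg[e][x]) for e in msg for x in (0, 1))
        msg = new
        if delta < tol:
            break                              # messages converged
    marginals = []
    for i in range(n):
        belief = [prod(msg[(k, i)][x] for k in nbrs[i]) for x in (0, 1)]
        z = sum(belief)
        marginals.append([v / z for v in belief])
    return marginals, it
```

On a tree this loop reproduces Pearl's exact algorithm; on a graph with loops the fixed point, when reached, is only an approximation to the true marginals, which is the behavior the paper measures.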
Adjustable Robust Solutions of Uncertain Linear Programs
, 2004
"... We consider linear programs with uncertain parameters, lying in some prescribed uncertainty set, where part of the variables must be determined before the realization of the uncertain parameters ("non-adjustable variables"), while the other part are variables that can be chosen after the realization ..."
Cited by 370 (12 self)
Approximating the Nondominated Front Using the Pareto Archived Evolution Strategy
 Evolutionary Computation
, 2000
"... We introduce a simple evolution scheme for multiobjective optimization problems, called the Pareto Archived Evolution Strategy (PAES). We argue that PAES may represent the simplest possible nontrivial algorithm capable of generating diverse solutions in the Pareto optimal set. The algorithm, in its ..."
Cited by 321 (19 self)
"... simplest form, is a (1 + 1) evolution strategy employing local search but using a reference archive of previously found solutions in order to identify the approximate dominance ranking of the current and candidate solution vectors. (1 + 1)-PAES is intended to be a baseline approach against which more ..."
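The archive-based acceptance described here can be sketched as follows. This simplified version of mine keeps only the dominance test and a size cap; it omits the adaptive-grid crowding that PAES actually uses to break ties and prune a full archive:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def archive_accept(archive, cand, max_size=10):
    """PAES-style archive update (simplified): reject a dominated candidate,
    drop archive members the candidate dominates, add it if room remains.
    Returns (new_archive, accepted)."""
    if any(dominates(a, cand) for a in archive):
        return list(archive), False            # candidate is dominated
    pruned = [a for a in archive if not dominates(cand, a)]
    if len(pruned) < max_size:
        pruned.append(cand)
        return pruned, True
    return pruned, False                       # archive full; real PAES uses grid crowding here
```

In the (1 + 1) loop, the archive of mutually nondominated solutions stands in for a population: comparing the current and mutated solutions against it provides the approximate dominance ranking the abstract mentions.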
Parameterless Optimization with the Extended Compact Genetic Algorithm and Iterated Local Search
 In GECCO-2004: Proceedings of the Genetic and Evolutionary Computation Conference, Part I, volume 3102 of Lecture Notes in Computer Science
, 2004
"... This paper presents a parameterless optimization framework that uses the extended compact genetic algorithm (ECGA) and iterated local search (ILS), but is not restricted to these algorithms. The presented optimization algorithm (ILS+ECGA) comes as an extension of the parameterless geneti ..."
Cited by 2 (2 self)
The Phonology of Dutch
"... The phonology of most languages has until now been available only in a fragmented way, through unpublished theses, or articles scattered in more or less accessible journals. Each volume in this series will offer an extensive treatment of the phonology of one language within a modern theoretical pers ..."
Cited by 262 (6 self)
The Parameter-Less Genetic Algorithm: Rational and Automated Parameter Selection for Simplified Genetic Algorithm Operation
, 2000
"... Genetic algorithms (GAs) have been used to solve difficult optimization problems in a number of fields. One of the advantages of these algorithms is that they operate well even in domains where little is known, thus giving the GA the flavor of a general-purpose problem solver. However, in order ..."
Cited by 19 (2 self)
"... to set these parameters. Instead, the parameters are set automatically by the algorithm itself. The validity of the approach is illustrated with artificial problems often used to test GA techniques, and also with a simplified version of a network expansion problem."
Multiobjective Optimization and Multiple Constraint Handling with Evolutionary Algorithms, Part I: A Unified Formulation
 IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
, 1998
"... In optimization, multiple objectives and constraints cannot be handled independently of the underlying optimizer. Requirements such as continuity and differentiability of the cost surface add yet another conflicting element to the decision process. While "better" solutions should be rate ..."
Cited by 232 (13 self)
"... , most notably the concerted handling of multiple candidate solutions. However, EAs are essentially unconstrained search techniques which require the assignment of a scalar measure of quality, or fitness, to such candidate solutions. After reviewing current evolutionary approaches to multiobjective ..."