Results 1–10 of 12
Approximating the least hypervolume contributor: NP-hard in general, but fast in practice
, 2008
An efficient algorithm for computing hypervolume contributions
 Evolutionary Computation
Abstract

Cited by 11 (6 self)
The hypervolume indicator serves as a sorting criterion in many recent multiobjective evolutionary algorithms (MOEAs). Typical algorithms remove the solution with the smallest loss with respect to the dominated hypervolume from the population. We present a new algorithm which determines, for a population of size n with d objectives, a solution with minimal hypervolume contribution in time O(n^{d/2} log n) for d > 2. This improves all previously published algorithms by a factor of n for all d > 3 and by a factor of √n for d = 3. We also analyze hypervolume-indicator-based optimization algorithms which remove λ > 1 solutions from a population of size n = µ + λ. We show that there are populations such that the hypervolume contribution of iteratively chosen λ solutions is much larger than the hypervolume contribution of an optimal set of λ solutions. Selecting the optimal set of λ solutions implies calculating (n choose µ) conventional hypervolume contributions, which is considered to be computationally too expensive. We present the first hypervolume algorithm which calculates directly the contribution of every set of λ solutions. This gives an additive term of (n choose µ) in the runtime of the calculation instead of a multiplicative factor of (n choose µ).
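The least-contributor computation described in the abstract above can be illustrated with a naive brute-force sketch for d = 2. This is roughly O(n² log n), not the paper's O(n^{d/2} log n) algorithm; function names and the reference point are illustrative, and both objectives are assumed to be minimized:

```python
def hypervolume_2d(points, ref):
    """Area dominated by `points` relative to reference point `ref`
    (both objectives minimized; `ref` weakly dominated by all points)."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(set(points)):
        if y < prev_y:                      # point adds a new horizontal band
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def least_contributor(points, ref):
    """Index of the point whose removal loses the least hypervolume."""
    total = hypervolume_2d(points, ref)
    losses = [total - hypervolume_2d(points[:i] + points[i + 1:], ref)
              for i in range(len(points))]
    return min(range(len(points)), key=losses.__getitem__)
```

For example, with the front {(1, 3), (2, 2), (3, 1)} and reference point (4, 4), the dominated area is 6; a point that is itself dominated contributes nothing and is returned first.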
On sequential online archiving of objective vectors
 In Evolutionary Multi-Criterion Optimization (EMO 2011)
, 2011
Abstract

Cited by 9 (3 self)
Abstract. In this paper, we examine the problem of maintaining an approximation of the set of non-dominated points visited during a multiobjective optimization, a problem commonly known as archiving. Most of the currently available archiving algorithms are reviewed, and what is known about their convergence and approximation properties is summarized. The main scenario considered is the restricted case where the archive must be updated online as points are generated one by one, and at most a fixed number of points are to be stored in the archive at any one time. In this scenario, the monotonicity of an archiving algorithm is proposed as a weaker, but more practical, property than negative efficiency preservation. This paper shows that hypervolume-based archivers and a recently proposed multi-level grid archiver have this property. On the other hand, the archiving methods used by SPEA2 and NSGA-II do not, and they may deteriorate with time. The monotonicity property has meaning on any input sequence of points. We also classify archivers according to limit properties, i.e. convergence and approximation properties of the archiver in the limit of infinite (input) samples from a finite space with strictly positive generation probabilities for all points. This paper establishes a number of research questions, and provides the initial framework and analysis for answering them.
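The online archiving scenario described above can be sketched as a bounded archive that rejects dominated points and, when full, evicts the least hypervolume contributor. This is an illustrative 2-objective (minimization) sketch, not the paper's formal framework; names and the reference point are assumptions:

```python
def dominates(a, b):
    """True if a weakly dominates b and differs from it (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def hv2d(points, ref):
    """2-D hypervolume (area) of `points` w.r.t. `ref`, minimization."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(points):
        if y < prev_y:
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def archive_update(archive, p, capacity, ref):
    """One online step: try to add point p to the bounded archive."""
    if any(dominates(a, p) for a in archive):
        return archive                          # p is dominated: reject it
    archive = [a for a in archive if not dominates(p, a)] + [p]
    if len(archive) > capacity:                 # evict least HV contributor
        total = hv2d(archive, ref)
        losses = [total - hv2d(archive[:i] + archive[i + 1:], ref)
                  for i in range(len(archive))]
        del archive[min(range(len(archive)), key=losses.__getitem__)]
    return archive
```

Feeding the stream (3, 3), (1, 4), (4, 1), (2, 2) through a capacity-3 archive leaves {(1, 4), (4, 1), (2, 2)}, since (2, 2) dominates and replaces (3, 3).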
Succinct Sampling from Discrete Distributions
Abstract

Cited by 3 (1 self)
We revisit the classic problem of sampling from a discrete distribution: given n non-negative w-bit integers x_1, ..., x_n, the task is to build a data structure that allows sampling i with probability proportional to x_i. The classic solution is Walker's alias method, which takes, when implemented on a Word RAM, O(n) preprocessing time, O(1) expected query time for one sample, and n(w + 2 lg n + o(1)) bits of space. Using the terminology of succinct data structures, this solution has redundancy 2n lg n + o(n) bits, i.e., it uses 2n lg n + o(n) bits in addition to the information-theoretic minimum required for storing the input. In this paper, we study whether this space usage can be improved. In the systematic case, in which the input is read-only, we present a novel data structure using r + O(w) redundant ...
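Walker's alias method mentioned above can be sketched as follows (O(n) preprocessing, O(1) expected work per sample); the compact bit-level representation studied in the paper is not reproduced, and names are illustrative:

```python
import random

def build_alias(weights):
    """Preprocess non-negative weights into (prob, alias) tables in O(n)."""
    n, total = len(weights), sum(weights)
    prob = [w * n / total for w in weights]     # scaled so the mean is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                            # column s tops up from l
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def sample(prob, alias, rng=random):
    """Draw one index in O(1) expected time."""
    i = rng.randrange(len(prob))                # pick a column uniformly
    return i if rng.random() < prob[i] else alias[i]
```

Each of the n columns holds at most two outcomes, so a sample needs one uniform column pick plus one biased coin flip; the marginal probability of index i works out to exactly x_i divided by the total weight.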
Scaling up indicator-based MOEAs by approximating the least hypervolume contributor: A preliminary study
 In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO): Workshop on Theoretical Aspects of Evolutionary Multiobjective Optimization
, 2010
JK.: "Hypervolume-based Multi-Objective Local
 Search", Neural Computing and Applications, Springer London
, 2012
Abstract

Cited by 2 (2 self)
Abstract This paper presents a multiobjective local search where the selection is realized according to the hypervolume contribution of solutions. The proposed HBMOLS algorithm is inspired by the IBEA algorithm, an indicator-based multiobjective evolutionary algorithm proposed by Zitzler and Künzli in 2004, where the optimization goal is defined in terms of a binary indicator that defines the selection operator. In this paper, we use the indicator-optimization principle and apply it to an iterated local search algorithm, using the hypervolume contribution indicator as the selection mechanism. The methodology proposed here has been designed to be easily adaptable and as parameter-independent as possible. We carry out a range of experiments on the multiobjective flow shop problem and the multiobjective quadratic assignment problem, using the hypervolume contribution selection as well as two different binary indicators initially proposed in the IBEA algorithm. Experimental results indicate that the HBMOLS algorithm is highly effective in comparison with the algorithms based on binary indicators.
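The binary-indicator selection that HBMOLS inherits from IBEA can be sketched with the additive ε-indicator. This is only an illustration of indicator-based fitness assignment; the scaling constant κ and all names are chosen for the example, not taken from the paper:

```python
import math

def eps_indicator(a, b):
    """Additive binary epsilon-indicator (minimization): the smallest shift
    by which a must be translated to weakly dominate b."""
    return max(ai - bi for ai, bi in zip(a, b))

def ibea_fitness(pop, kappa=0.05):
    """IBEA-style fitness: penalize x by how strongly others beat it."""
    return [sum(-math.exp(-eps_indicator(pop[j], pop[i]) / kappa)
                for j in range(len(pop)) if j != i)
            for i in range(len(pop))]

def worst_index(pop, kappa=0.05):
    """Index of the solution a selection step would discard first."""
    fit = ibea_fitness(pop, kappa)
    return min(range(len(pop)), key=fit.__getitem__)
```

A dominated solution is beaten by a large margin by its dominators, so its fitness is strongly negative and it is discarded first.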
Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization
 J GLOB OPTIM
, 2014
Illustration of Fairness in Evolutionary MultiObjective Optimization
Abstract

Cited by 1 (0 self)
It is widely assumed that evolutionary algorithms for multiobjective optimization problems should use certain mechanisms to achieve a good spread over the Pareto front. In this paper, we examine such mechanisms from a theoretical point of view and analyze simple algorithms incorporating the concept of fairness. This mechanism tries to balance the number of offspring of all individuals in the current population. We rigorously analyze the runtime behavior of different fairness mechanisms and present showcase examples to point out situations where the right mechanism can speed up the optimization process significantly. We also indicate drawbacks of the use of fairness by presenting instances where the optimization process is slowed down drastically.
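The fairness mechanism described above, balancing the number of offspring of all individuals, can be illustrated with a toy sketch (purely illustrative; not one of the paper's analyzed algorithms):

```python
def fair_parent(offspring_count):
    """Index of an individual with the fewest offspring so far."""
    return min(range(len(offspring_count)), key=offspring_count.__getitem__)

def run_fair_rounds(n_individuals, rounds):
    """Let the fairness mechanism pick the parent for `rounds` offspring."""
    counts = [0] * n_individuals
    for _ in range(rounds):
        counts[fair_parent(counts)] += 1    # balanced, round-robin in effect
    return counts
```

Always selecting a least-used parent keeps the offspring counts within one of each other, which is the balancing property the abstract refers to.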
Rate in Oil & Gas Equipment
, 2012
Abstract
Scale deposition can damage equipment in the oil & gas production industry. Hence, the reliable and accurate prediction of the scale deposition rate is critical for production availability. In this study, we consider the problem of predicting the scale deposition rate, providing an indication of the associated prediction uncertainty. We tackle the problem using an empirical modeling approach based on experimental data. Specifically, we implement a multi-objective genetic algorithm (namely, the non-dominated sorting genetic algorithm II (NSGA-II)) to train a neural network (NN), i.e. to find its parameters (its weights and biases), to provide the prediction intervals (PIs) of the scale deposition rate. The PIs are optimized both in terms of accuracy (coverage probability) and dimension (width). We perform k-fold cross-validation to guide the choice of the NN structure (i.e. the number of hidden neurons). We use the hypervolume indicator metric to evaluate the Pareto fronts in the validation step. A case study is considered with regard to a set of experimental observations: the NSGA-II-trained neural network is shown to be capable of providing PIs with both high coverage and small width.
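The two prediction-interval objectives described above can be sketched as follows (names are illustrative; the NN and the NSGA-II training loop themselves are not reproduced):

```python
def pi_objectives(lower, upper, y_true):
    """Return (coverage_probability, mean_width) of prediction intervals:
    coverage is the fraction of observations falling inside their interval
    (to maximize), mean_width the average interval size (to minimize)."""
    assert len(lower) == len(upper) == len(y_true)
    covered = sum(lo <= y <= up for lo, up, y in zip(lower, upper, y_true))
    coverage = covered / len(y_true)
    mean_width = sum(up - lo for lo, up in zip(lower, upper)) / len(lower)
    return coverage, mean_width
```

These two values conflict (wider intervals cover more points), which is why the study treats them as a bi-objective problem and evaluates the resulting Pareto fronts with the hypervolume indicator.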