
## Covering Pareto-optimal fronts by subswarms in multi-objective particle swarm optimization (2004)

Venue: | In 2004 Congress on Evolutionary Computation (CEC’2004) |

Citations: | 16 (1 self) |

### Citations

1831 | Multiobjective optimization using evolutionary algorithms
- Deb
- 2001
Citation Context: ... solutions Difficulties: adverse density of solutions, non-convex front and discontinuous front 3 Test function T6 has a disconnected Pareto-optimal front, which is a different version than the one described in [2]. A. Parameter Setting The parameters are selected as follows. - Inertia weight: 0.4 - Turbulence factor: 0.07 - Population size: initial run: 100 for T6, 200 for T1 and T3, 300 for T4, covering: 200 ...

672 | SPEA2: Improving the strength pareto evolutionary algorithm for multiobjective optimization
- Zitzler, Laumanns, et al.
- 2002
Citation Context: ...tput, by keeping a good diversity along the Pareto-optimal front. Diversity of output solutions is studied by applying methods like niching, clustering or truncation by several researchers [2], [14], [15]. These techniques often need a high computational time and at last we have a restricted number of solutions in the output [14], [15]. On the other hand, finding a large set of Pareto-optimal solution...

433 | Evolutionary algorithms for multiobjective optimization: methods and applications
- Zitzler
- 1999
Citation Context: ... the solutions shown in Figure 7 (1st Run) are not actually on the true Pareto-optimal front, although the MOPSO method has obtained obviously higher converged solutions than the other methods, e.g., [14] (this is also explained in [2]). One reason is the restricted number of generations. We have to notice that the obtained solutions from the first run have good diversity. ...

113 | Mopso: A proposal for multiple objective particle swarm optimization
- Coello, Lechuga
- 2003
Citation Context: ...he Pareto-optimal set. The Pareto-optimal set in the objective space is called Pareto-optimal front. II. BACKGROUND MOPSO methods are one of the utilities in solving different kinds of MOPs. In MOPSO [1], [4], [6], [10], a set of particles are initialized in the decision space at random. To each particle i, a position xi in the decision space and velocity vi is assigned. The particles change their po...
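The MOPSO scheme quoted in this context (random initialization of positions and velocities, then updates toward personal bests and a guide) can be sketched as follows. This is a minimal illustration under stated assumptions, not the cited authors' implementation: the inertia weight `w=0.4` matches the parameter setting quoted above, while the acceleration coefficients, the dictionary layout, and the function names are hypothetical.

```python
import random

def init_swarm(n_particles, bounds):
    """Randomly initialize particle positions and velocities in the decision space.

    bounds is a list of (lo, hi) pairs, one per decision variable.
    """
    swarm = []
    for _ in range(n_particles):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        v = [0.0] * len(bounds)  # particles start at rest
        swarm.append({"x": x, "v": v, "pbest": list(x)})
    return swarm

def update_particle(p, guide, w=0.4, c1=1.0, c2=1.0):
    """Standard PSO velocity/position update toward the personal best and a local guide."""
    for d in range(len(p["x"])):
        r1, r2 = random.random(), random.random()
        p["v"][d] = (w * p["v"][d]
                     + c1 * r1 * (p["pbest"][d] - p["x"][d])
                     + c2 * r2 * (guide[d] - p["x"][d]))
        p["x"][d] += p["v"][d]
```

In a multi-objective setting the `guide` would be chosen from an archive of non-dominated solutions rather than being a single global best; how that guide is selected is exactly what the cited MOPSO variants differ on.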

58 | A multi-objective algorithm based upon particle swarm optimization, an efficient data structure and turbulence
- Fieldsend, Singh
- 2002
Citation Context: ...reto-optimal set. The Pareto-optimal set in the objective space is called Pareto-optimal front. II. BACKGROUND MOPSO methods are one of the utilities in solving different kinds of MOPs. In MOPSO [1], [4], [6], [10], a set of particles are initialized in the decision space at random. To each particle i, a position xi in the decision space and velocity vi is assigned. The particles change their positio...

58 | On a Multi-Objective Evolutionary Algorithm and Its Convergence to the Pareto Set
- Rudolph
- 1998
Citation Context: ...nation criteria can be e.g., a maximum number of generations. Finding a relatively large set of Pareto-optimal solutions is possible by running the MOPSO for many generations. It is proved by Rudolph [12] that existence of elitism (keeping elite solutions in the archive) is necessary to converge to the Pareto-optimal front in MOEAs. MOPSOs have also the same structure as MOEA, with the differences in ...

57 | Strategies for finding good local guides in Multi-Objective Particle Swarm Optimization (MOPSO)
- Mostaghim, Teich
- 2003
Citation Context: ...to-optimal front during generations. By running a MOPSO with a restricted archive size, it is possible to find a well distributed set of non-dominated solutions very close to the Pareto-optimal front [10]. Here, we use this knowledge and propose another MOPSO (called covering MOPSO) to cover the gaps between the non-dominated solutions. The particles in the population of the covering MOPSO are divided...

49 | Using unconstrained elite archives for multi-objective optimisation
- Fieldsend, Everson, et al.
Citation Context: ...ve is restricted to a certain size. This is done because of the following reasons: • Most of MO methods need a high computational time, when the size of the archive increases. This is also studied by [5], [11]. • Diversity of solutions improves, when the archive size is fixed, particularly in MOPSO. • The computational time for finding the best local guides in MOPSO increases, if we store high number...
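The size-restricted elite archive discussed in this context can be sketched as a dominance filter with a capacity cap. This is a minimal sketch assuming minimization; the truncation rule (evicting the member closest to its nearest neighbour, to preserve spread) is one common choice, not necessarily the rule used in the cited works, and the function names are hypothetical.

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def insert_into_archive(archive, f, max_size):
    """Insert objective vector f if non-dominated; evict members f dominates; cap the size."""
    if any(dominates(g, f) for g in archive):
        return archive  # f is dominated by an archive member, reject it
    archive = [g for g in archive if not dominates(f, g)] + [f]
    if len(archive) > max_size:
        # crude diversity-based truncation: drop the most crowded member,
        # i.e., the one with the smallest squared distance to its nearest neighbour
        def nn_dist(g):
            return min(sum((a - b) ** 2 for a, b in zip(g, h))
                       for h in archive if h is not g)
        archive.remove(min(archive, key=nn_dist))
    return archive
```

Capping the archive like this trades completeness for the lower bookkeeping cost and better spread that the bullet points above attribute to a fixed-size archive.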

34 | Particle swarm with extended memory for multiobjective optimization
- Hu, Eberhart, et al.
- 2003
Citation Context: ...optimal set. The Pareto-optimal set in the objective space is called Pareto-optimal front. II. BACKGROUND MOPSO methods are one of the utilities in solving different kinds of MOPs. In MOPSO [1], [4], [6], [10], a set of particles are initialized in the decision space at random. To each particle i, a position xi in the decision space and velocity vi is assigned. The particles change their positions an...

23 | Archiving with guaranteed convergence and diversity in multi-objective optimization
- Laumanns, Thiele, et al.
- 2002
Citation Context: ...ering method is a combination of MOEA and a subdivision method [3]. Also, the ɛ-MOEA method is theoretically able to cover the approximated Pareto-optimal front in the case of using small values of ɛ [8], [9]. However, it also needs a high computational time to complete the covering. In this paper, we address the covering of the Pareto-optimal front by applying MOPSO. MOPSO methods have the property ...
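The ɛ-dominance relation underlying the ɛ-MOEA covering argument in this context can be written as a one-line check. This sketch assumes minimization and the additive form of ɛ-dominance; parts of the literature use a multiplicative (1+ɛ) form instead, so treat this as one common variant rather than the definition used by the cited works.

```python
def eps_dominates(a, b, eps):
    """Additive ɛ-dominance (minimization): a ɛ-dominates b if a_i - eps <= b_i for all i.

    A larger eps makes the relation coarser, so an ɛ-approximate archive keeps
    fewer points while still covering the front to within eps per objective.
    """
    return all(x - eps <= y for x, y in zip(a, b))
```

This is why small values of ɛ give a fine covering but force the archive, and hence the computation, to grow.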

16 | Covering Pareto Sets by Multilevel Evolutionary Subdivision Techniques
- Schütze, Mostaghim, et al.
Citation Context: ...ptimal solutions is to apply covering techniques. Indeed, by covering we find a finite set of solutions which are very close to each other. Covering the Pareto-optimal front is studied by Hybrid MOEA [13]. In [13], the non-dominated solutions are found by MOEA and then a recovering method is applied to cover the Pareto-optimal front. In this case, the recovering method is a combination of MOEA and a s...

10 | Covering Pareto Sets by Multilevel Subdivision Techniques
- Dellnitz, Schütze, et al.
- 2005
Citation Context: ...-dominated solutions are found by MOEA and then a recovering method is applied to cover the Pareto-optimal front. In this case, the recovering method is a combination of MOEA and a subdivision method [3]. Also, the ɛ-MOEA method is theoretically able to cover the approximated Pareto-optimal front in the case of using small values of ɛ [8], [9]. However, it also needs a high computational time to comp...

10 | Comparison of data structures for storing Pareto sets
- Mostaghim, Teich, et al.
- 2002

8 | A bicriterial optimization problem of antenna design
- Jüschke, Jahn, et al.
- 1997
Citation Context: ...ose the feeding of the antenna to optimize the performance of the antenna. The fixed geometry of the antenna and the wave propagation of the fields generated by currents on the antenna are studied by [7] as a multi-objective optimization problem. The optimization problem is a 2-objective problem: maximize the radiation efficiency in a particular direction and minimize the power radiated into other di...

2 | The role of ɛ-dominance in multiobjective particle swarm optimization
- Mostaghim, Teich
- 2003
Citation Context: ... method is a combination of MOEA and a subdivision method [3]. Also, the ɛ-MOEA method is theoretically able to cover the approximated Pareto-optimal front in the case of using small values of ɛ [8], [9]. However, it also needs a high computational time to complete the covering. In this paper, we address the covering of the Pareto-optimal front by applying MOPSO. MOPSO methods have the property that ...