
## The landscape adaptive particle swarm optimizer (2007)

### Citations

3621 | A modified particle swarm optimizer
- Shi, Eberhart
- 1998
Citation Context ...Applied Soft Computing 8 (2008) 295–304. Particle swarm optimization (PSO) is a population-based global optimization method based on a simple simulation of bird flocking or fish schooling behavior [1]. In PSO, the search points are known as particles, and each particle is initialized with a random position and random initial velocity in the D-dimensional search space. The position and velocity of a... |
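The initialization described in this snippet can be sketched in Python (the function name, swarm size, bounds, and the symmetric velocity range are illustrative assumptions, not the paper's settings):

```python
import random

# Sketch of PSO initialization: each particle gets a random position and a
# random initial velocity in the D-dimensional search space.
# x_max/v_max defaults are illustrative, not taken from the paper.
def init_swarm(n_particles, dim, x_max=100.0, v_max=100.0, seed=0):
    rng = random.Random(seed)
    positions = [[rng.uniform(-x_max, x_max) for _ in range(dim)]
                 for _ in range(n_particles)]
    velocities = [[rng.uniform(-v_max, v_max) for _ in range(dim)]
                  for _ in range(n_particles)]
    return positions, velocities
```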

985 | Practical Nonparametric Statistics
- Conover
- 1999
Citation Context ...nce of different algorithms for each function over 100 repetitions. 3.3.2. Experiment No.2 The final minimal fitness values after 5000 iterations are analyzed using the Kruskal–Wallis test (K–W test) [36]. The test statistic T is defined as: $T = \frac{1}{S^2}\left(\sum_{i=1}^{k} \frac{R_i^2}{n_i} - \frac{N(N+1)^2}{4}\right)$ (14), $S^2 = \frac{1}{N-1}\left(\sum_{\text{all ranks}} R(X_{ij})^2 - \frac{N(N+1)^2}{4}\right)$ (15), where k is the number of groups, N the total number of samples, ni... |
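Eqs. (14)–(15) can be checked with a small Python sketch (the function name and the mid-rank handling of ties are my additions; without ties T reduces to the familiar Kruskal–Wallis H statistic):

```python
# Sketch of the Kruskal–Wallis statistic T per Eqs. (14)-(15):
# T = (1/S^2) (sum_i R_i^2/n_i - N(N+1)^2/4),
# S^2 = (1/(N-1)) (sum over all ranks R(X_ij)^2 - N(N+1)^2/4).
def kruskal_wallis_T(groups):
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    N = len(pooled)
    rank_lists = [[] for _ in groups]  # ranks R(X_ij), grouped
    i = 0
    while i < N:
        j = i
        while j < N and pooled[j][0] == pooled[i][0]:
            j += 1
        mid = (i + j + 1) / 2.0        # mid-rank: average of ranks i+1 .. j
        for k in range(i, j):
            rank_lists[pooled[k][1]].append(mid)
        i = j
    c = N * (N + 1) ** 2 / 4.0         # the N(N+1)^2/4 term shared by (14) and (15)
    S2 = (sum(r * r for rl in rank_lists for r in rl) - c) / (N - 1)       # Eq. (15)
    return (sum(sum(rl) ** 2 / len(rl) for rl in rank_lists) - c) / S2    # Eq. (14)
```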

826 | The particle swarm - explosion, stability and convergence in a multidimensional complex space - Clerc, Kennedy - 2002 |

820 | Evolutionsstrategie: Optimierung Technischer Systeme nach Prinzipien der Biologischen Evolution
- Rechenberg
- 1973
Citation Context ...and analysis of consecutive experiments with stepwise variable adjustments driving a suitably flexible object/system into its optimal state in spite of environmental noise [23]. Initially, Rechenberg [24] developed the (1 + 1)-ES, a simple mutation plus selection scheme operating on one individual that creates one offspring per generation by means of Gaussian mutation. He also proposed a (μ + 1)-ES wh... |
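Rechenberg's (1 + 1)-ES as described above can be sketched minimally (the step size sigma, the generation count, and accepting offspring of equal fitness are illustrative choices, not Rechenberg's exact scheme):

```python
import random

# Minimal (1 + 1)-ES sketch: one parent, one Gaussian-mutated offspring per
# generation; the offspring replaces the parent only if it is no worse.
def one_plus_one_es(f, x, sigma=0.5, generations=200, seed=1):
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(generations):
        child = [xi + rng.gauss(0.0, sigma) for xi in x]  # Gaussian mutation
        fc = f(child)
        if fc <= fx:                                      # selection
            x, fx = child, fc
    return x, fx
```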

612 | Numerical Optimization of Computer Models
- Schwefel
- 1981
Citation Context ...hich, after mutation, eventually replaces the worst parent individual. This strategy is thought to be the foundation of the well-known (μ + λ)-ES and (μ, λ)-ES introduced and investigated by Schwefel [25,26], which became the state-of-the-art in ES research [27]. In an evolution strategy, the population is randomly initialized. Then a number of generations involving recombination, mutation and selection a... |

302 | Empirical study of particle swarm optimization
- Shi, Eberhart
- 1999
Citation Context ... velocity. The synchronous update of position is thus: $x(t+1) = x(t) + v(t+1)$ (2). The pseudo code of PSO is shown in Fig. 1. In order to improve the local search precision, Shi and Eberhart [2] introduce the inertia weight w to Eq. (1) to give the following update rule: $v(t+1) = w\,v(t) + c_1 R_1 (p_{best} - x(t)) + c_2 R_2 (g_{best} - x(t))$ (3). A value of w... |
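One synchronous step of Eqs. (2)–(3) as a hedged Python sketch (the parameter defaults and the Vmax clamp are illustrative, drawn from the parameter discussion elsewhere on this page, not from the paper's exact settings):

```python
import random

# One PSO step: inertia-weight velocity update (Eq. (3)) followed by the
# synchronous position update (Eq. (2)). Defaults are illustrative.
def pso_step(x, v, p_best, g_best, w=0.7, c1=2.0, c2=2.0, v_max=100.0):
    new_v, new_x = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()  # fresh R1, R2 per dimension
        vd = w * v[d] + c1 * r1 * (p_best[d] - x[d]) + c2 * r2 * (g_best[d] - x[d])
        vd = max(-v_max, min(v_max, vd))           # clamp to [-Vmax, Vmax]
        new_v.append(vd)
        new_x.append(x[d] + vd)                    # Eq. (2)
    return new_x, new_v
```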

245 | Comparing Inertia Weights and Constriction Factors in Particle Swarm Optimization
- Eberhart, Shi
- 2000
Citation Context ...e. Usually, Vmax may be set as Xmax. Although the constriction factor is proposed to overcome the limit of Vmax, it has been reported that better results may be obtained while setting Vmax as Xmax [32]. Learning factors: usually c1 is set equal to c2 (popularly being 2) and ranges from [0,4] following the suggestion of Carlisle and Dozier [33]. They also suggest that c1 = 2.8 and c2 = 1.3 produce go... |

223 | Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation
- Hansen, Ostermeier
- 1996
Citation Context ...an evolve during the evolution process and are needed in a self-adaptive ES. During the past years, the improvement of self-adaptation led to the development of the Covariance Matrix Adaptation (CMA) [28]. The objective of CMA is to fit the search distribution to the contour lines of the objective function f to be minimized. Here, the (μ, λ)-ES with covariance matrix adaptation (CMA-ES) is treated as ... |

220 | Numerische Optimierung von Computer-modellen mittels der Evolutionsstrategie
- Schwefel
- 1977
Citation Context ...hich, after mutation, eventually replaces the worst parent individual. This strategy is thought to be the foundation of the well-known (μ + λ)-ES and (μ, λ)-ES introduced and investigated by Schwefel [25,26], which became the state-of-the-art in ES research [27]. In an evolution strategy, the population is randomly initialized. Then a number of generations involving recombination, mutation and selection a... |

177 | The swarm and the queen: Towards a deterministic and adaptive particle swarm optimization
- Clerc
- 1999
Citation Context ...nearly from 0.95 to 0.4 is the best reduction strategy of those tested in [2], over their suite of test functions. In Chatterjee’s work [3], a dynamic change of inertia weight is suggested. Clerc [4] indicates that the use of a constriction factor K may also be necessary to ensure convergence of the particle swarm algorithm, defined as when all particles have stopped moving. Their update rule for... |

171 | The particle swarm optimization algorithm: convergence analysis and parameter selection
- Trelea
- 2003
Citation Context ...count some earlier results reported in the literature to choose parameter settings, as follows. In most cases, increasing the number of particles decreases the number of required algorithm iterations [31]. But more particles require more function evaluations. The typical range is suggested as [20–40]. Vmax determines the maximum change one particle can take. Usually, Vmax may be set as Xmax. Althoug... |

170 | Evolutionary optimization versus particle swarm optimization: philosophy and performance difference. - Angeline - 1998 |

147 | Evaluating the CMA evolution strategy on multimodal test functions
- Hansen, Kern
- 2004
Citation Context ...s 30 in order to keep the function evaluations consistent with PSO methods. The population of the parents, μ, is 15. For the other parameters we use the original calculations reported in Hansen’s work [35]. 3.3. Experiments 3.3.1. Experiment No.1 For each function, the stopping condition is the maximum number of iterations set at 5000. Test results are shown in Fig. 3 which gives the performance of dif... |

138 | Small worlds and mega-minds: Effects of neighborhood topology on particle swarm performance
- Kennedy
- 1999
Citation Context ...$v(t+1) = K\,[v(t) + c_1 R_1(p_{best} - x(t)) + c_2 R_2(g_{best} - x(t))]$ (4), $K = \frac{2}{\left|2 - \varphi - \sqrt{\varphi^2 - 4\varphi}\right|}$ (5), where $\varphi = c_1 + c_2$ and $\varphi > 4$. In addition to the global version of PSO, Kennedy [5] has also introduced the use of a local variable ($l_{best}$), which confers more ability to escape from local optima. In this approach, $g_{best}$ is replaced by $l_{best}$ in Eq. (1): $v(t+1) = v(t)$... |
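Eq. (5) is easy to evaluate directly (the function name is mine; the common setting c1 = c2 = 2.05 gives the widely quoted K ≈ 0.73):

```python
import math

# Clerc's constriction factor, Eq. (5):
# K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, with phi = c1 + c2 > 4.
def constriction(c1, c2):
    phi = c1 + c2
    if phi <= 4:
        raise ValueError("constriction requires phi = c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
```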

91 | An Off-The-Shelf PSO
- Carlisle, Dozier
- 2001
Citation Context ...tter results may be obtained while setting Vmax as Xmax [32]. Learning factors: usually c1 is set equal to c2 (popularly being 2) and ranges from [0,4] following the suggestion of Carlisle and Dozier [33]. They also suggest that c1 = 2.8 and c2 = 1.3 produce good results. Jiang explains the parameter selection guideline by stochastic convergence analysis [34]. Global version versus local version: the g... |

69 | A study of particle swarm optimization particle trajectories
- Bergh, Engelbrecht
- 2006
Citation Context ... convergence to a stable point, which is shown to be a weighted average of the personal best and global best positions, where the weights are determined by the values of the acceleration coefficients [22]. A comprehensive review is beyond the scope of this article. 1.3. Evolution Strategy In the 1960s, the evolution strategy (ES) originated as a set of rules for the automatic design and analysis o... |

58 | A multi-objective algorithm based upon particle swarm optimization, an efficient data structure and turbulence
- Fieldsend, Singh
- 2002
Citation Context ...and the velocity should still be constricted by the constriction factor: $v(t+1) = K\,[v(t) + c_1 R_1(p_{best} - x(t)) + c_2 R_2(g_{best} - x(t)) + c_3 R_3(l_{best} - x(t))]$. Fieldsend and Singh [7] introduce a stochastic variable, turbulence, into the standard PSO. He et al. [8] modify the stochastic variable to a passive congregation which improves the PSO in both accuracy and convergence ra... |

50 | Evolution strategies—a comprehensive introduction
- Beyer, Schwefel
- 2002
Citation Context ...t individual. This strategy is thought to be the foundation of the well-known (μ + λ)-ES and (μ, λ)-ES introduced and investigated by Schwefel [25,26], which became the state-of-the-art in ES research [27]. In an evolution strategy, the population is randomly initialized. Then a number of generations involving recombination, mutation and selection are performed. Selection is based on the fitness value ... |

35 | Nonlinear Inertia Weight Variation for Dynamic Adaptation in Particle Swarm Optimization
- Chatterjee, Siarry
- 2006
Citation Context ...$+ c_2 R_2(g_{best} - x(t))$ (3). A value of w decreasing linearly from 0.95 to 0.4 is the best reduction strategy of those tested in [2], over their suite of test functions. In Chatterjee’s work [3], a dynamic change of inertia weight is suggested. Clerc [4] indicates that the use of a constriction factor K may also be necessary to ensure convergence of the particle swarm algorithm, defined ... |

31 | A particle swarm optimizer with passive congregation
- He, Wu, et al.
- 2004
Citation Context ...$v(t+1) = K\,[v(t) + c_1 R_1(p_{best} - x(t)) + c_2 R_2(g_{best} - x(t)) + c_3 R_3(l_{best} - x(t))]$. Fieldsend and Singh [7] introduce a stochastic variable, turbulence, into the standard PSO. He et al. [8] modify the stochastic variable to a passive congregation which improves the PSO in both accuracy and convergence rate. Leontitsis et al. [9] also put forward a concept of repellor in their paper. T... |

31 | Stochastic convergence analysis and parameter selection of the standard particle swarm optimization algorithm
- Jiang, Luo, et al.
- 2007
Citation Context ...owing the suggestion of Carlisle and Dozier [33]. They also suggest that c1 = 2.8 and c2 = 1.3 produce good results. Jiang explains the parameter selection guideline by stochastic convergence analysis [34]. Global version versus local version: the global version is faster but might converge to a local optimum for some problems; the local version is a little bit slower but not easily trapped into a local ... |

29 | Hybrid Particle Swarm Optimiser with Breeding and Subpopulations - Løvbjerg, Rasmussen, et al. - 2001 |

28 | Effects of swarm size on cooperative particle swarm optimizers. - Bergh, Engelbrecht - 2001 |

25 | Adaptive particle swarm optimization
- Yasuda, Ide, et al.
Citation Context ...o gain a better, general understanding of the behavior of particle swarms, theoretical analyses of particle trajectories are necessary. A few theoretical studies of particle trajectories can be found [20,21]. These studies facilitated the derivation of heuristics to select parameter values for guaranteed convergence to a stable point, which is shown to be a weighted average of the personal best and globa... |

23 | Multiobjective control of power plants using particle swarm optimization techniques - Heo, Lee, et al. - 2006 |

20 | On the convergence analysis and parameter selection in particle swarm optimization
- Zheng, Ma, et al.
- 2010
Citation Context ...o gain a better, general understanding of the behavior of particle swarms, theoretical analyses of particle trajectories are necessary. A few theoretical studies of particle trajectories can be found [20,21]. These studies facilitated the derivation of heuristics to select parameter values for guaranteed convergence to a stable point, which is shown to be a weighted average of the personal best and globa... |

10 | Experimentelle Optimierung einer Zweiphasendüse Teil I
- Schwefel
- 1968
Citation Context ...es for the automatic design and analysis of consecutive experiments with stepwise variable adjustments driving a suitably flexible object/system into its optimal state in spite of environmental noise [23]. Initially, Rechenberg [24] developed the (1 + 1)-ES, a simple mutation plus selection scheme operating on one individual that creates one offspring per generation by means of Gaussian mutation. He a... |

8 | Multiobjective Particle Swarm Optimization for Parameter Estimation
- Gill, Kaheil, et al.
Citation Context ... the parameter identification of the induction motor. Lin et al. apply the modified PSO on the quantitative structure-activity relationship (QSAR) models [15]. PSO is also implemented for drug design [16]. In some research work, PSO, which initially dealt with a single objective function, has also been extended to deal with multi-objective problems [17–19]. Most of the PSO studies are empirical. To... |

6 | Microwave absorber optimal design using multi-objective particle swarm optimization - Goudos, Sahalos |

3 | Accelerating real valued genetic algorithms using mutation-with-momentum
- Temby, Vamplew, et al.
- 2005
Citation Context ...mum solution far from the global optima where many search algorithms are trapped. Moreover, the global optimum exists near the bounds of the domain. There is no correlation among its design variables [30]. Table 1 (Standard multi-modal objective functions) begins: f1, DeJong F1, $f(x) = \sum_{i=1}^{n} x_i^2$, minimum $f(0, 0, \ldots, 0) = 0$, Dimensions = 30, Xmax = 100; f2, Griewank... |
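The two benchmark functions recoverable from the Table 1 fragment, sketched in Python (DeJong F1 is given in full above; Griewank is truncated there, so its standard textbook definition is assumed):

```python
import math

def dejong_f1(x):
    # DeJong F1 (sphere): f(x) = sum x_i^2, minimum f(0, ..., 0) = 0
    return sum(xi * xi for xi in x)

def griewank(x):
    # Standard Griewank (assumed; Table 1 truncates here):
    # f(x) = 1 + sum x_i^2 / 4000 - prod cos(x_i / sqrt(i)), minimum 0 at the origin
    s = sum(xi * xi for xi in x) / 4000.0
    p = 1.0
    for i, xi in enumerate(x, start=1):
        p *= math.cos(xi / math.sqrt(i))
    return 1.0 + s - p
```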

2 | An extended particle swarm optimizer
- Xu, Xin
- 2005
Citation Context ...(6) where $l_{best}$ is the best position achieved by a ‘neighboring’ particle and the definition of the neighborhood varies in different implementations of the approach. In recent research, Xu and Xin [6] point out that the combined use of $g_{best}$ and $l_{best}$ may be helpful for the search process, and the velocity should still be constricted by the constriction factor: $v(t+1) = K\,[v(t) + c_1 R_1$... |
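The combined update that Xu and Xin's snippet points to, as a hedged sketch (applying K to the whole bracket follows Eq. (4) of this page; the coefficient values and function name are illustrative):

```python
import random

# Sketch of a combined update: p_best, g_best, and l_best all pull the
# particle, and the whole velocity is scaled by the constriction factor K.
# K = 0.729 and the c's are illustrative, not the authors' settings.
def combined_step(x, v, p_best, g_best, l_best, K=0.729, c1=1.5, c2=1.5, c3=1.5):
    new_v, new_x = [], []
    for d in range(len(x)):
        r1, r2, r3 = random.random(), random.random(), random.random()
        vd = K * (v[d]
                  + c1 * r1 * (p_best[d] - x[d])
                  + c2 * r2 * (g_best[d] - x[d])
                  + c3 * r3 * (l_best[d] - x[d]))
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```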

2 | Repel the swarm to the optimum
- Leontitsis, Kontogiorgos, et al.
- 2006
Citation Context ...c variable, turbulence, into the standard PSO. He et al. [8] modify the stochastic variable to a passive congregation which improves the PSO in both accuracy and convergence rate. Leontitsis et al. [9] also put forward the concept of a repellor in their paper. The authors believe that the worst particles have the property of repelling the particles away from local optima. 1.2. Related research Fig. 1. The... |

2 | Piecewise hypersphere modeling by particle swarm optimization in QSAR studies of bioactivities of chemical compounds
- Lin
- 2005
Citation Context ...ithm and support vector machine, etc. in the literature [10–13]. Various PSOs have been applied to different research fields. In reference [14], PSO is applied to the parameter identification of the induction motor. Lin et al. apply the modified PSO on the quantitative structure-activity relationship (QSAR) models [15]. PSO is also imple... |

2 | Particle swarms for drug design
- Cedeno, Agrafiotis
- 2005
Citation Context ...elds. In reference [14], PSO is applied to the parameter identification of the induction motor. Lin et al. apply the modified PSO on the quantitative structure-activity relationship (QSAR) models [15]. PSO is also implemented for drug design [16]. In some research work, PSO, which initially dealt with a single objective function, has also been extended to deal with multi-objective problems [17–... |

1 | Particle swarm optimization-based SVM for incipient fault classification of power transformers - Tsair-Fwu, Ming-Yuan, et al. - 2006 |

1 | QSAR analysis of substituted bis[(acridine-4-carboxamide) propyl] methylamines using optimized block-wise variable combination by particle swarm optimization for partial least squares modeling
- Li, Lin, et al.
Citation Context ...rsely influenced. Thus, the information from the leaders does not always have positive effects. Considering this, a possibility to escape the misdirecting influence should be allowed. In reference [29], some randomly flying particles have been used to avoid premature convergence. But when the dimension of the searching space is very large, the distribution of several free particles will be too ... |