Results 1–10 of 277
Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients
 IEEE Transactions on Evolutionary Computation
, 2004
Abstract

Cited by 189 (2 self)
Abstract—This paper introduces a novel parameter automation strategy for the particle swarm algorithm and two further extensions to improve its performance after a predefined number of generations. Initially, to efficiently control the local search and convergence to the global optimum solution, time-varying acceleration coefficients (TVAC) are introduced in addition to the time-varying inertia weight factor in particle swarm optimization (PSO). On the basis of TVAC, two new strategies are discussed to improve the performance of the PSO. First, the concept of “mutation” is introduced to particle swarm optimization along with TVAC (MPSO-TVAC), by adding a small perturbation to a randomly selected modulus of the velocity vector of a random particle with a predefined probability. Second, we introduce a novel particle swarm concept, the “self-organizing hierarchical particle swarm optimizer with TVAC (HPSO-TVAC).” Under this method, only the “social” and “cognitive” parts of the particle swarm strategy are considered when estimating the new velocity of each particle, and particles are reinitialized whenever they stagnate in the search space. In addition, to overcome the difficulty of selecting an appropriate mutation step size for different problems, a time-varying mutation step size was introduced. Further, for most of the benchmarks, the performance of the MPSO-TVAC method is found to be insensitive to the mutation probability. The effect of the reinitialization velocity on the performance of the HPSO-TVAC method is also observed; a time-varying reinitialization step size is found to be an efficient parameter optimization strategy for HPSO-TVAC. The HPSO-TVAC strategy outperformed all the methods considered in this investigation for most of the functions. Furthermore, both the MPSO and HPSO strategies perform poorly when the acceleration coefficients are fixed at two.
Index Terms—Acceleration coefficients, hierarchical particle swarm, mutation, particle swarm, reinitialization.
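The time-varying schedule described above can be sketched as follows; the coefficient bounds (c1 falling from 2.5 to 0.5 while c2 rises from 0.5 to 2.5) are the commonly cited settings for this method, not necessarily a verbatim transcription of the paper:

```python
import random

def tvac_coefficients(it, max_it, c1_init=2.5, c1_final=0.5,
                      c2_init=0.5, c2_final=2.5):
    """Linearly vary the acceleration coefficients over the run (TVAC).

    The cognitive coefficient c1 shrinks and the social coefficient c2
    grows, shifting the swarm from exploration toward convergence.
    """
    frac = it / max_it
    c1 = c1_init + (c1_final - c1_init) * frac  # cognitive: large -> small
    c2 = c2_init + (c2_final - c2_init) * frac  # social: small -> large
    return c1, c2

def velocity_update(v, x, pbest, gbest, it, max_it, w=0.7):
    """Standard PSO velocity update with TVAC coefficients plugged in.

    w is a fixed inertia weight here for brevity; the paper also varies
    it over time.
    """
    c1, c2 = tvac_coefficients(it, max_it)
    return [w * vi
            + c1 * random.random() * (pi - xi)
            + c2 * random.random() * (gi - xi)
            for vi, xi, pi, gi in zip(v, x, pbest, gbest)]
```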
Adaptive Particle Swarm Optimization
, 2008
Abstract

Cited by 55 (2 self)
This paper proposes an adaptive particle swarm optimization (APSO) with adaptive parameters and an elitist learning strategy (ELS) based on the evolutionary state estimation (ESE) approach. The ESE approach develops an ‘evolutionary factor’ from the population distribution and relative particle fitness information in each generation, and estimates the evolutionary state through a fuzzy classification method. According to the identified state, and taking into account the various effects of the algorithm-controlling parameters, adaptive control strategies are developed for the inertia weight and acceleration coefficients for faster convergence. Further, an adaptive ‘elitist learning strategy’ (ELS) is designed for the best particle to jump out of possible local optima and/or refine its accuracy, resulting in substantially improved quality of global solutions. The APSO algorithm is tested on six unimodal and multimodal functions, and the experimental results demonstrate that APSO generally outperforms the compared PSOs in terms of solution accuracy, convergence speed, and algorithm reliability.
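The ‘evolutionary factor’ mentioned above can be sketched as follows; the fuzzy classification and parameter-adaptation rules are omitted, and `best_index` is a hypothetical argument naming the globally best particle:

```python
import math

def evolutionary_factor(positions, best_index):
    """Compute an APSO-style 'evolutionary factor' f in [0, 1].

    d_i is the mean Euclidean distance from particle i to all others;
    f = (d_g - d_min) / (d_max - d_min), where d_g belongs to the
    globally best particle. A small f suggests convergence, a large f
    exploration; APSO then feeds f to fuzzy membership functions (not
    reproduced here) to pick the parameter adaptations.
    """
    n = len(positions)
    def mean_dist(i):
        return sum(math.dist(positions[i], positions[j])
                   for j in range(n) if j != i) / (n - 1)
    d = [mean_dist(i) for i in range(n)]
    d_min, d_max = min(d), max(d)
    if d_max == d_min:  # degenerate swarm: all mean distances equal
        return 0.0
    return (d[best_index] - d_min) / (d_max - d_min)
```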
Parallel Global Optimization with the Particle Swarm Algorithm
 JOURNAL OF NUMERICAL METHODS IN ENGINEERING
, 2003
A Dissipative Particle Swarm Optimization
 Congress on Evolutionary Computation
, 2002
Abstract

Cited by 42 (2 self)
A dissipative particle swarm optimization is developed according to the self-organization of dissipative structures. Negative entropy is introduced to construct an open dissipative system that is far from equilibrium, so as to drive the irreversible evolution process toward better fitness. Tests on two multimodal functions indicate that it improves performance effectively.
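The negative-entropy injection amounts to occasionally re-randomizing velocity and position components after the normal PSO update; a minimal sketch, with illustrative probabilities `c_v` and `c_l` rather than the paper's values:

```python
import random

def dissipative_step(v, x, v_max, x_min, x_max, c_v=0.001, c_l=0.002):
    """Inject 'negative entropy' into one particle after its PSO update.

    With small probability c_v each velocity component is re-randomized,
    and with probability c_l each position component is relocated,
    keeping the system away from equilibrium. Probabilities are
    illustrative defaults, not the paper's settings.
    """
    v = [random.uniform(-v_max, v_max) if random.random() < c_v else vi
         for vi in v]
    x = [random.uniform(x_min, x_max) if random.random() < c_l else xi
         for xi in x]
    return v, x
```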
Two improved differential evolution schemes for faster global search
 in Proc. ACM SIGEVO GECCO
, 2005
Abstract

Cited by 39 (9 self)
Differential evolution (DE) is well known as a simple and efficient scheme for global optimization over continuous spaces. In this paper we present two new, improved variants of DE. Performance comparisons of the two proposed methods are provided against (a) the original DE, (b) canonical particle swarm optimization (PSO), and (c) two PSO variants. The new DE variants are shown to be statistically significantly better on a seven-function test bed for the following performance measures: solution quality, time to find the solution, frequency of finding the solution, and scalability.
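For context, the classic DE/rand/1/bin scheme that such variants build on works as below; `F` and `CR` are common defaults, and this is the baseline scheme, not the paper's proposed variants:

```python
import random

def de_rand_1_bin(pop, i, F=0.5, CR=0.9):
    """Build one DE/rand/1/bin trial vector for individual i.

    mutant = x_r1 + F * (x_r2 - x_r3) with three distinct random
    donors, followed by binomial crossover at rate CR; index j_rand
    guarantees at least one mutant gene survives the crossover.
    """
    n, dim = len(pop), len(pop[i])
    r1, r2, r3 = random.sample([j for j in range(n) if j != i], 3)
    mutant = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
    j_rand = random.randrange(dim)
    return [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
            for d in range(dim)]
```

In a full DE loop the trial vector replaces `pop[i]` only if its fitness is no worse, which gives DE its elitist selection.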
Swarms in Dynamic Environments
 in Genetic and Evolutionary Computation — GECCO 2003, Lecture Notes in Computer Science
Abstract

Cited by 27 (3 self)
Abstract. Charged particle swarm optimization (CPSO) is well suited to the dynamic search problem, since inter-particle repulsion maintains population diversity and good tracking can be achieved with a simple algorithm. This work extends the application of CPSO to the dynamic problem by considering a bimodal parabolic environment of high spatial and temporal severity. Two types of charged swarms and an adapted neutral swarm are compared for a number of different dynamic environments which include extreme ‘needle-in-the-haystack’ cases. The results suggest that charged swarms perform best in the extreme cases, but neutral swarms are better optimizers in milder environments.
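The inter-particle repulsion that maintains diversity is Coulomb-like; a minimal sketch, assuming illustrative core and perception radii and a simple force cap rather than the paper's exact formulation:

```python
import math

def repulsion(xi, qi, others, p_core=1.0, p_perc=30.0):
    """Repulsive acceleration on one charged particle.

    Each neighbor closer than the perception radius p_perc repels with
    magnitude q_i*q_j / r^2 along the direction away from it; inside
    the core radius p_core the force is capped to avoid a singularity.
    Neutral particles (charge 0) feel and exert no repulsion.
    """
    acc = [0.0] * len(xi)
    for xj, qj in others:
        diff = [a - b for a, b in zip(xi, xj)]  # points away from xj
        r = math.sqrt(sum(d * d for d in diff))
        if r == 0.0 or r >= p_perc or qi * qj == 0.0:
            continue
        r_eff = max(r, p_core)                  # cap inside the core
        scale = qi * qj / (r_eff ** 2 * r)      # magnitude / r for unit dir
        acc = [a + scale * d for a, d in zip(acc, diff)]
    return acc
```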
Swarm Intelligence Algorithms for Data Clustering
 IN SOFT COMPUTING FOR KNOWLEDGE DISCOVERY AND DATA MINING BOOK, PART IV
Abstract

Cited by 26 (1 self)
Clustering aims at representing large datasets by a smaller number of prototypes or clusters. It brings simplicity to modeling data and thus plays a central role in the process of knowledge discovery and data mining. Data mining tasks these days require fast and accurate partitioning of huge datasets, which may come with a variety of attributes or features. This, in turn, imposes severe computational requirements on the relevant clustering techniques. A family of bio-inspired algorithms, well known as Swarm Intelligence (SI), has recently emerged that meets these requirements and has successfully been applied to a number of real-world clustering problems. This chapter explores the role of SI in clustering different kinds of datasets. It finally describes a new SI technique for partitioning any dataset into an optimal number of groups through a single run of optimization. Computer simulations undertaken in this research are also provided to demonstrate the effectiveness of the proposed algorithm.
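A fitness commonly used when a swarm searches over candidate cluster centers is the quantization error (mean intra-cluster distance); a sketch of that generic objective, not the chapter's specific algorithm for also choosing the number of clusters:

```python
import math

def quantization_error(data, centroids):
    """Mean intra-cluster distance for a candidate set of centroids.

    Each point is assigned to its nearest centroid; the mean distance
    is computed per non-empty cluster and those means are averaged.
    Lower is better, so a swarm minimizes this over centroid positions.
    """
    clusters = [[] for _ in centroids]
    for p in data:
        dists = [math.dist(p, c) for c in centroids]
        clusters[dists.index(min(dists))].append(min(dists))
    per_cluster = [sum(c) / len(c) for c in clusters if c]
    return sum(per_cluster) / len(per_cluster)
```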
Parallel asynchronous particle swarm optimization
 INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING 67(4)
, 2006
Abstract

Cited by 25 (0 self)
The high computational cost of complex engineering optimization problems has motivated the development of parallel optimization algorithms. A recent example is the parallel particle swarm optimization (PSO) algorithm, which is valuable due to its global search capabilities. Unfortunately, because existing parallel implementations are synchronous (PSPSO), they do not make efficient use of computational resources when a load imbalance exists. In this study, we introduce a parallel asynchronous PSO (PAPSO) algorithm to enhance computational efficiency. The performance of the PAPSO algorithm was compared to that of a PSPSO algorithm in homogeneous and heterogeneous computing environments for small- to medium-scale analytical test problems and a medium-scale biomechanical test problem. For all problems, the robustness and convergence rate of PAPSO were comparable to those of PSPSO. However, the parallel performance of PAPSO was significantly better than that of PSPSO for heterogeneous computing environments or heterogeneous computational tasks. For example, PAPSO was 3.5 times faster than PSPSO for the biomechanical test problem executed on a heterogeneous cluster with 20 processors. Overall, PAPSO exhibits excellent parallel performance when a large number of processors (more than about 15) is utilized and either (1) heterogeneity exists in the …
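The asynchronous idea (update and resubmit each particle as soon as its own evaluation returns, rather than waiting at a swarm-wide barrier) can be sketched with a thread pool; all parameter values and the thread-based pool are illustrative, not the paper's implementation:

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def papso_sketch(f, n_particles=8, dim=2, iters=50, workers=4):
    """Asynchronous PSO evaluation loop.

    Whenever any particle's evaluation finishes, that particle is
    updated using the current global best and resubmitted immediately,
    so fast workers never idle behind slow ones.
    """
    x = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest, pval = [xi[:] for xi in x], [float("inf")] * n_particles
    gbest, gval = x[0][:], float("inf")
    done = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pending = {pool.submit(f, x[i]): i for i in range(n_particles)}
        while pending and done < iters:
            fut = next(as_completed(pending))  # first evaluation to finish
            i = pending.pop(fut)
            val = fut.result()
            done += 1
            if val < pval[i]:
                pbest[i], pval[i] = x[i][:], val
            if val < gval:
                gbest, gval = x[i][:], val
            # update particle i right away with whatever gbest is now
            for d in range(dim):
                v[i][d] = (0.7 * v[i][d]
                           + 1.5 * random.random() * (pbest[i][d] - x[i][d])
                           + 1.5 * random.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            if done < iters:
                pending[pool.submit(f, x[i])] = i
    return gbest, gval
```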
Two ways to grow tissue for Artificial Immune Systems
Abstract

Cited by 22 (6 self)
Abstract. An immune system without tissue is like evolution without genes. Something very important is missing. Here we present the novel concept of tissue for artificial immune systems. Much like the genetic representation of genetic algorithms, tissue provides an interface between problem and immune algorithm. Two tissue-growing algorithms are presented, with experimental results illustrating their abilities to dynamically cluster data and provide useful signals. The use of tissue to provide an innate immune response driving the adaptive response of conventional immune algorithms is then discussed.