Results 1–10 of 339
Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients
IEEE Transactions on Evolutionary Computation, 2004
Abstract

Cited by 194 (2 self)
Abstract—This paper introduces a novel parameter automation strategy for the particle swarm algorithm and two further extensions to improve its performance after a predefined number of generations. Initially, to efficiently control the local search and convergence to the global optimum solution, time-varying acceleration coefficients (TVAC) are introduced in addition to the time-varying inertia weight factor in particle swarm optimization (PSO). On the basis of TVAC, two new strategies are discussed to improve the performance of the PSO. First, the concept of “mutation” is introduced to particle swarm optimization along with TVAC (MPSO-TVAC), by adding a small perturbation to a randomly selected modulus of the velocity vector of a random particle with a predefined probability. Second, we introduce a novel particle swarm concept, the “self-organizing hierarchical particle swarm optimizer with TVAC (HPSO-TVAC).” Under this method, only the “social” part and the “cognitive” part of the particle swarm strategy are considered to estimate the new velocity of each particle, and particles are reinitialized whenever they stagnate in the search space. In addition, to overcome the difficulty of selecting an appropriate mutation step size for different problems, a time-varying mutation step size was introduced. Further, for most of the benchmarks, the performance of the MPSO-TVAC method is found to be insensitive to the mutation probability. The effect of the reinitialization velocity on the performance of the HPSO-TVAC method is also observed; a time-varying reinitialization step size is found to be an efficient parameter optimization strategy for the HPSO-TVAC method. The HPSO-TVAC strategy outperformed all the methods considered in this investigation for most of the functions. Furthermore, it has also been observed that both the MPSO and HPSO strategies perform poorly when the acceleration coefficients are fixed at two.
Index Terms—Acceleration coefficients, hierarchical particle swarm, mutation, particle swarm, reinitialization.
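The TVAC schedule described in this abstract can be sketched as a linear interpolation: the cognitive coefficient c1 shrinks while the social coefficient c2 grows over the run, alongside the time-varying inertia weight. The endpoint values below (2.5 to 0.5 for c1, 0.5 to 2.5 for c2, 0.9 to 0.4 for w) are commonly quoted defaults, assumed here rather than taken from the abstract:

```python
import numpy as np

def tvac_coefficients(t, t_max, c1_i=2.5, c1_f=0.5, c2_i=0.5, c2_f=2.5,
                      w_i=0.9, w_f=0.4):
    """Linearly interpolate acceleration coefficients and inertia weight.

    c1 (cognitive) decreases and c2 (social) increases with iteration t,
    so the swarm explores early and converges late.  The endpoint values
    are illustrative defaults, not prescribed by the abstract.
    """
    frac = t / t_max
    c1 = c1_i + (c1_f - c1_i) * frac
    c2 = c2_i + (c2_f - c2_i) * frac
    w = w_i + (w_f - w_i) * frac
    return w, c1, c2

def velocity_update(v, x, pbest, gbest, t, t_max, rng):
    """Standard PSO velocity update with the TVAC schedule plugged in."""
    w, c1, c2 = tvac_coefficients(t, t_max)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```

Early in the run c1 dominates (particles chase their own best), late in the run c2 dominates (particles contract toward the global best), which is the exploration-to-exploitation trade-off the abstract describes.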
Particle Swarm Optimization: Basic Concepts, Variants and Applications in Power Systems
2008
Abstract

Cited by 90 (12 self)
Many areas in power systems require solving one or more nonlinear optimization problems. While analytical methods might suffer from slow convergence and the curse of dimensionality, heuristics-based swarm intelligence can be an efficient alternative. Particle swarm optimization (PSO), part of the swarm intelligence family, is known to effectively solve large-scale nonlinear optimization problems. This paper presents a detailed overview of the basic concepts of PSO and its variants. It also provides a comprehensive survey of the power system applications that have benefited from the powerful nature of PSO as an optimization technique. For each application, the technical details required for applying PSO, such as its type, particle formulation (solution representation), and the most efficient fitness functions, are discussed.
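As a reference point for the basic concepts this survey covers, a minimal global-best PSO loop might look like the following. The parameter values and the sphere test function are illustrative choices, not taken from the paper:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO for minimizing f over a box.

    A sketch of the canonical algorithm only; parameter defaults are
    conventional choices, not values prescribed by the survey.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))     # positions
    v = np.zeros((n_particles, dim))                # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()              # global best
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f                       # update personal bests
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()          # update global best
    return g, pbest_f.min()
```

The power-system applications the survey catalogs differ mainly in how the "particle" encodes a candidate solution and in the fitness function f, while this core loop stays essentially the same.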
Matching Algorithms to Problems: An Experimental Test of the Particle Swarm and Some Genetic Algorithms on the Multimodal Problem Generator
In: Proceedings of the IEEE Congress on Evolutionary Computation (CEC), 1998
Abstract

Cited by 85 (3 self)
A multimodal problem generator was used to test three versions of the genetic algorithm and the binary particle swarm algorithm in a factorial time-series experiment. Specific strengths and weaknesses of the various algorithms were identified.

1. Introduction

This paper will compare the performance of the binary particle swarm and several varieties of genetic algorithm on sets of problems produced by a multimodality problem generator. The study is constructed in the form of a repeated-measures factorial experiment, reporting results from multivariate analysis of variance. The research questions involve the effects of various aspects of the problems on the performance of the particle swarm and of genetic algorithms with mutation, crossover, or both. One difficulty with empirical comparisons of search algorithms is that results may not generalize beyond the test problems used. For instance, a new algorithm may be carefully tuned so that it outperforms some existing algorithms on a few problems. Unfort...
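The binary particle swarm tested here differs from the real-valued version in its position update: velocities are kept continuous but squashed through a sigmoid and used as bit-sampling probabilities. A sketch of one update step, assuming the conventional velocity clamp v_max (a common choice, not specified in this abstract):

```python
import numpy as np

def binary_pso_step(v, x, pbest, gbest, rng, w=1.0, c1=2.0, c2=2.0, v_max=4.0):
    """One step of a binary PSO in the style of Kennedy & Eberhart.

    Velocities update as in real-valued PSO, then each bit of the new
    position is drawn with probability sigmoid(v).  Parameter defaults
    are conventional, assumed rather than taken from the paper.
    """
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = np.clip(w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x),
                -v_max, v_max)                      # clamp keeps probs off 0/1
    prob = 1.0 / (1.0 + np.exp(-v))                 # sigmoid squashing
    x_new = (rng.random(x.shape) < prob).astype(int)
    return v, x_new
```

Because positions are bit strings, the same generator-produced multimodal problems can be fed to this swarm and to the bit-string genetic algorithms on equal footing, which is what makes the factorial comparison possible.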
Particle swarm optimization: An Overview
Swarm Intelligence, 2007
Abstract

Cited by 84 (0 self)
Particle swarm optimization (PSO) has undergone many changes since its introduction in 1995. As researchers have learned about the technique, they have derived new versions, developed new applications, and published theoretical studies of the effects of the various parameters and aspects of the algorithm. This paper comprises a snapshot of particle swarming from the authors’ perspective, including variations in the algorithm, current and ongoing research, applications and open problems.
Particle Swarm Optimization: Surfing the Waves
Proceedings of the Congress on Evolutionary Computation, 1999
Abstract

Cited by 57 (2 self)
A new optimization method has been proposed by Kennedy et al. in [7, 8], called Particle Swarm Optimization (PSO). This approach combines social psychology principles from the socio-cognition of human (and artificial) agents with evolutionary computation. It has been successfully applied to nonlinear function optimization and neural network training. Preliminary formal analyses of a simple PSO system show that a particle in such a system follows a path defined by a sinusoidal wave, randomly deciding on both its amplitude and frequency [12]. This paper takes the next step, generalizing to obtain closed-form equations for the trajectories of particles in a multidimensional search space.

1 Introduction

Evolutionary computation techniques are search methods based on natural systems. For example, Genetic Algorithms (GAs) use principles of genetics and natural selection [4]. "Particle Swarm Optimization" (PSO) [7, 8] is a recently proposed algorithm, motivated by the behavior of organisms s...
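The sinusoidal trajectory mentioned above is easiest to see in the deterministic, one-particle, one-dimensional simplification commonly used in such analyses: the attractor p is fixed and the random coefficients are replaced by a constant phi. The sketch below follows that simplification, not the paper's full multidimensional derivation; for 0 < phi < 4 the particle oscillates around p instead of diverging:

```python
import numpy as np

def simple_pso_trajectory(x0, v0, p, phi, steps):
    """Deterministic one-dimensional PSO recursion:

        v(t+1) = v(t) + phi * (p - x(t))
        x(t+1) = x(t) + v(t+1)

    For 0 < phi < 4 the position traces a bounded, wave-like oscillation
    around the attractor p, matching the sinusoidal behavior the formal
    analyses describe.  Illustrative simplification only.
    """
    xs = [x0]
    x, v = x0, v0
    for _ in range(steps):
        v = v + phi * (p - x)
        x = x + v
        xs.append(x)
    return np.array(xs)
```

With phi = 2, for example, the recursion is exactly periodic, while values of phi at or above 4 make the oscillation grow without bound, which is why the randomized coefficients in full PSO are drawn from a range below that limit.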
A hierarchical particle swarm optimizer and its adaptive variant
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2005
Abstract

Cited by 52 (0 self)
Abstract—A hierarchical version of the particle swarm optimization (PSO) metaheuristic is introduced in this paper. In the new method, called H-PSO, the particles are arranged in a dynamic hierarchy that is used to define a neighborhood structure. Depending on the quality of their so-far best-found solution, the particles move up or down the hierarchy. This gives good particles that move up in the hierarchy a larger influence on the swarm. We introduce a variant of H-PSO in which the shape of the hierarchy is dynamically adapted during the execution of the algorithm. Another variant assigns different behavior to the individual particles with respect to their level in the hierarchy. H-PSO and its variants are tested on a commonly used set of optimization functions and are compared to PSO using different standard neighborhood schemes.
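The hierarchy update described in this abstract, with better particles drifting upward, can be sketched using an array-encoded binary tree where node i's parent is (i-1)//2. This is an illustrative reading of the abstract, not the paper's exact procedure:

```python
def hierarchy_sweep(fitness, particle_id):
    """One top-down sweep over an array-encoded binary tree.

    Whenever a child's personal-best fitness beats its parent's (lower
    is better), the two particles swap tree positions, so good particles
    migrate toward the root, the most influential position in the swarm.
    A sketch of the hierarchical update, not the authors' exact code.
    """
    fitness, particle_id = list(fitness), list(particle_id)
    for i in range(1, len(fitness)):
        p = (i - 1) // 2                    # parent index in heap layout
        if fitness[i] < fitness[p]:
            fitness[i], fitness[p] = fitness[p], fitness[i]
            particle_id[i], particle_id[p] = particle_id[p], particle_id[i]
    return fitness, particle_id
```

In the velocity update, each particle would then be attracted by its parent's best position rather than a single global best, so influence flows down the tree from the root.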
Parallel Global Optimization with the Particle Swarm Algorithm
Journal of Numerical Methods in Engineering, 2003
Dynamic clustering using particle swarm optimization with application in unsupervised image segmentation
 2005
Abstract

Cited by 37 (0 self)
A new dynamic clustering approach (DCPSO), based on particle swarm optimization, is proposed and applied to unsupervised image classification. The proposed approach automatically determines the "optimum" number of clusters and simultaneously clusters the data set with minimal user interference. The algorithm starts by partitioning the data set into a relatively large number of clusters to reduce the effects of initial conditions. Using binary particle swarm optimization, the "best" number of clusters is selected. The centers of the chosen clusters are then refined via the K-means clustering algorithm. The experiments conducted show that the proposed approach generally found the "optimum" number of clusters on the tested images.
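The final refinement phase mentioned above, K-means applied to the centers that the binary-PSO stage selected, can be sketched with plain Lloyd iterations. The function below is an illustrative reconstruction of that last phase, not the authors' implementation:

```python
import numpy as np

def kmeans_refine(data, centers, iters=10):
    """Lloyd-style K-means refinement of pre-selected cluster centers.

    `data` is an (n, d) array of points; `centers` is the (k, d) array
    of centers switched on by the binary-PSO selection stage.  A sketch
    of DCPSO's final phase only.
    """
    centers = centers.astype(float).copy()
    labels = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for k in range(len(centers)):
            pts = data[labels == k]
            if len(pts):
                centers[k] = pts.mean(axis=0)
    return centers, labels
```

In the full pipeline, the binary PSO explores which of the many initial centers to keep (each particle is a bit mask over candidate centers), and this refinement step only polishes the surviving centers.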