Results 1–10 of 98
Population structure and particle swarm performance
 In: Proceedings of the Congress on Evolutionary Computation (CEC 2002), 2002
Cited by 188 (6 self)
The effects of various population topologies on the particle swarm algorithm were systematically investigated. Random graphs were generated to specifications, and their performance on several criteria was compared. What makes a good population structure? We discovered that previous assumptions may not have been correct.
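A topology determines which neighbors a particle learns from, which is what this study varies. As a minimal illustration (the function names and the tiny fitness list are hypothetical, not taken from the paper), a ring (lbest) neighborhood can be contrasted with a fully connected (gbest) one:

```python
def ring_neighbors(i, n, k=1):
    """Indices of particle i's neighborhood in a ring (lbest) topology of size n."""
    return [(i + d) % n for d in range(-k, k + 1)]

def gbest_neighbors(i, n):
    """In the fully connected (gbest) topology every particle sees every other."""
    return list(range(n))

def best_neighbor(i, neighbors, fitness):
    """Index of the best performer in particle i's neighborhood (minimization)."""
    return min(neighbors(i, len(fitness)), key=lambda j: fitness[j])

fitness = [3.0, 1.0, 4.0, 1.5, 5.0]
# Under gbest, particle 3 is drawn toward particle 1 (the global best);
# under a ring, it only sees particles 2, 3, and 4 and stays near itself.
assert best_neighbor(3, gbest_neighbors, fitness) == 1
assert best_neighbor(3, ring_neighbors, fitness) == 3
```

Sparser topologies slow the spread of information through the swarm, which is the trade-off the paper's random-graph experiments probe.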
The fully informed particle swarm: Simpler, maybe better
 IEEE Transactions on Evolutionary Computation, 2004
Cited by 128 (5 self)
The canonical particle swarm algorithm is a new approach to optimization, drawing inspiration from group behavior and the establishment of social norms. It is gaining popularity, especially because of its speed of convergence and the fact that it is easy to use. However, we feel that each individual is not simply influenced by the best performer among its neighbors. We thus decided to make the individuals “fully informed.” The results are very promising, as informed individuals seem to find better solutions in all the benchmark functions.
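A hedged one-dimensional sketch of the fully-informed idea: instead of a single social term toward the best neighbor, every neighbor's personal best contributes a randomly weighted share of the acceleration. The constants are Clerc's commonly cited constriction settings; the function name and framing are illustrative, not the paper's code:

```python
import random

CHI, PHI = 0.7298, 4.1  # Clerc's constriction coefficient and total acceleration

def fips_velocity(v, x, neighbor_bests):
    """Fully-informed velocity update (1-D): every neighbor's personal best
    contributes a randomly weighted share, instead of only the single best."""
    n = len(neighbor_bests)
    social = sum(random.uniform(0, PHI / n) * (p - x) for p in neighbor_bests)
    return CHI * (v + social)

random.seed(0)
v_new = fips_velocity(v=0.5, x=2.0, neighbor_bests=[1.0, 1.5, 2.5])
# With all neighbor bests equal to x the social term vanishes,
# leaving pure constricted momentum: CHI * v.
assert abs(fips_velocity(1.0, 2.0, [2.0, 2.0]) - CHI) < 1e-9
```

Dividing the total acceleration PHI among all neighbors keeps the overall pull comparable to the canonical two-term update.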
Adaptive Particle Swarm Optimization, 2008
Cited by 67 (2 self)
This paper proposes an adaptive particle swarm optimization (APSO) with adaptive parameters and an elitist learning strategy (ELS) based on the evolutionary state estimation (ESE) approach. The ESE approach computes an ‘evolutionary factor’ from the population distribution and relative particle fitness in each generation, and estimates the evolutionary state through a fuzzy classification method. According to the identified state, and taking into account the various effects of the algorithm-controlling parameters, adaptive control strategies are developed for the inertia weight and acceleration coefficients for faster convergence. Further, an adaptive ‘elitist learning strategy’ (ELS) is designed for the best particle to jump out of possible local optima and/or to refine its accuracy, resulting in substantially improved quality of global solutions. The APSO algorithm is tested on six unimodal and multimodal functions, and the experimental results demonstrate that APSO generally outperforms the compared PSOs in terms of solution accuracy, convergence speed, and algorithm reliability.
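The evolutionary-factor computation described above can be sketched as follows. This is an illustrative 1-D reconstruction under assumptions: the toy positions are hypothetical, and while the sigmoid mapping below matches a published form of the inertia adaptation, treat the exact constants as an assumption rather than the paper's definitive values:

```python
import math

def mean_distances(positions):
    """Mean distance of each particle to all the others (1-D for brevity)."""
    n = len(positions)
    return [sum(abs(x - y) for y in positions) / (n - 1) for x in positions]

def evolutionary_factor(positions, best_index):
    """f = (d_g - d_min) / (d_max - d_min), where d_g is the globally best
    particle's mean distance to the rest of the swarm."""
    d = mean_distances(positions)
    d_min, d_max = min(d), max(d)
    return (d[best_index] - d_min) / (d_max - d_min) if d_max > d_min else 0.0

def adaptive_inertia(f):
    """Sigmoid mapping of f to the inertia weight, ranging over (0.4, 0.9)."""
    return 1.0 / (1.0 + 1.5 * math.exp(-2.6 * f))

positions = [0.0, 0.1, 0.2, 5.0]  # the best particle sits far from the cluster
f = evolutionary_factor(positions, best_index=3)
w = adaptive_inertia(f)           # large f (exploration state) -> large inertia
assert abs(f - 1.0) < 1e-9
```

A best particle far from the crowd yields f near 1 (exploration, high inertia); a best particle inside a converged cluster yields f near 0 (convergence, low inertia).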
Solving Constrained Nonlinear Optimization Problems with Particle Swarm Optimization
 6th World Multiconference on Systemics, Cybernetics and Informatics (SCI 2002), 2002
Cited by 62 (1 self)
This paper presents a Particle Swarm Optimization (PSO) algorithm for constrained nonlinear optimization problems. In PSO, the potential solutions, called particles, are "flown" through the problem space by learning from the current optimal particle and from their own memory. In this paper, a feasibility-preserving strategy is employed to deal with constraints: PSO is started with a group of feasible solutions, and a feasibility function is used to check whether newly explored solutions satisfy all the constraints. All particles keep only feasible solutions in their memory. Eleven test cases were evaluated, showing that PSO is an efficient and general approach to solving most nonlinear optimization problems with nonlinear inequality constraints.
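The feasibility-preserving memory rule is simple to state: a particle's personal best is only ever replaced by a feasible improvement. A minimal sketch, assuming minimization and a hypothetical linear constraint (none of the names or numbers come from the paper):

```python
def update_memory(pbest, pbest_fit, x, fit, feasible):
    """Keep a particle's personal best only among feasible solutions,
    as in the feasibility-preserving strategy described above."""
    if feasible(x) and fit < pbest_fit:  # minimization
        return x, fit
    return pbest, pbest_fit

# Hypothetical constraint g(x) = x[0] + x[1] <= 1 on a 2-D problem.
feasible = lambda x: x[0] + x[1] <= 1.0

pbest, pbest_fit = (0.2, 0.3), 5.0
# A better but infeasible point is rejected...
pbest, pbest_fit = update_memory(pbest, pbest_fit, (0.9, 0.9), 1.0, feasible)
assert pbest == (0.2, 0.3)
# ...while a better feasible point replaces the memory.
pbest, pbest_fit = update_memory(pbest, pbest_fit, (0.1, 0.2), 2.0, feasible)
assert pbest == (0.1, 0.2)
```

Because every remembered attractor is feasible, the swarm's social learning can never pull particles toward an infeasible region's optimum.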
Adaptive Particle Swarm Optimization: Detection And Response to Dynamic Systems
 In: Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2002). IEEE, 2002
Cited by 58 (1 self)
This paper introduces an adaptive PSO, which automatically tracks various changes in a dynamic system. Different environment detection and response techniques are tested on the parabolic and Rosenbrock benchmark functions, and rerandomization is introduced to respond to the dynamic changes. Performance on the benchmark functions with various severities is analyzed.
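One way to realize detection and response, sketched under assumptions (the paper's exact detection test may differ): re-evaluate the remembered global best, treat a fitness mismatch as evidence that the landscape has moved, and re-randomize part of the swarm. The moving parabola and all names here are hypothetical stand-ins:

```python
import random

def environment_changed(gbest, gbest_fit, objective, tol=1e-12):
    """Detect a dynamic change by re-evaluating the remembered global best:
    if its fitness no longer matches, the landscape has moved."""
    return abs(objective(gbest) - gbest_fit) > tol

def rerandomize(swarm, fraction, low, high, rng=random):
    """Respond by re-randomizing a fraction of the swarm's positions."""
    k = int(len(swarm) * fraction)
    for i in rng.sample(range(len(swarm)), k):
        swarm[i] = rng.uniform(low, high)
    return swarm

# Hypothetical moving parabola f(x) = (x - c)**2 whose optimum c jumps.
c = 0.0
objective = lambda x: (x - c) ** 2
gbest, gbest_fit = 0.1, objective(0.1)

c = 2.0  # the environment shifts
assert environment_changed(gbest, gbest_fit, objective)
swarm = rerandomize([0.1, 0.2, 0.3, 0.4], fraction=0.5, low=-5.0, high=5.0)
```

Re-randomizing only a fraction of particles preserves some accumulated knowledge while restoring enough diversity to track the new optimum.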
An Approach to Multimodal Biomedical Image Registration Utilizing Particle Swarm Optimization
 IEEE Transactions on Evolutionary Computation, 2004
Cited by 51 (0 self)
Biomedical image registration, or geometric alignment of two-dimensional and/or three-dimensional (3-D) image data, is becoming increasingly important in diagnosis, treatment planning, functional studies, computer-guided therapies, and in biomedical research. Registration based on intensity values usually requires optimization of some similarity metric between the images. Local optimization techniques frequently fail because functions of these metrics with respect to transformation parameters are generally nonconvex and irregular, and therefore global methods are often required. In this paper, a new evolutionary approach, particle swarm optimization, is adapted for single-slice 3-D-to-3-D biomedical image registration. A new hybrid particle swarm technique is proposed that incorporates initial user guidance. Multimodal registrations with initial orientations far from the ground truth were performed on three volumes from different modalities. Results of optimizing the normalized mutual information similarity metric were compared with various evolutionary strategies. The hybrid particle swarm technique produced more accurate registrations than the evolutionary strategies in many cases, with comparable convergence. These results demonstrate that particle swarm approaches, along with evolutionary techniques and local methods, are useful in image registration, and emphasize the need for hybrid approaches for difficult registration problems.
Parallel Global Optimization with the Particle Swarm Algorithm
 Journal of Numerical Methods in Engineering, 2003
A note on the learning automata based algorithms for adaptive parameter selection in PSO
 Applied Soft Computing, 2011
A Study of Global Optimization using Particle Swarms
 J. Global Opt., 2004
Cited by 30 (2 self)
A number of recently proposed variants of the particle swarm optimization algorithm (PSOA) are applied to an extended Dixon–Szegö bound-constrained test set in global optimization. Of the variants considered, it is shown that constriction as proposed by Clerc, and dynamic inertia and maximum velocity reduction as proposed by Fourie and Groenwold, represent the main contenders from a cost-efficiency point of view. A parameter sensitivity analysis is then performed for these two variants in the interest of finding a reliable general-purpose ‘off-the-shelf’ PSOA for global optimization. In doing so, it is shown that inclusion of dynamic inertia renders the PSOA relatively insensitive to the values of the cognitive and social scaling factors.
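Clerc's constriction, one of the two main contenders above, has a closed form: chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)| for total acceleration phi > 4. A small sketch (the function name is ours, not from the paper):

```python
import math

def constriction(phi):
    """Clerc's constriction coefficient for total acceleration phi > 4."""
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

chi = constriction(4.1)  # the commonly used setting
assert abs(chi - 0.7298) < 1e-3
```

The coefficient multiplies the whole velocity update, damping oscillations so that no explicit maximum-velocity clamp is needed.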
Parallel asynchronous particle swarm optimization
 International Journal for Numerical Methods in Engineering 67(4), 2006
Cited by 26 (0 self)
The high computational cost of complex engineering optimization problems has motivated the development of parallel optimization algorithms. A recent example is the parallel particle swarm optimization (PSO) algorithm, which is valuable due to its global search capabilities. Unfortunately, because existing parallel implementations are synchronous (PSPSO), they do not make efficient use of computational resources when a load imbalance exists. In this study, we introduce a parallel asynchronous PSO (PAPSO) algorithm to enhance computational efficiency. The performance of the PAPSO algorithm was compared to that of a PSPSO algorithm in homogeneous and heterogeneous computing environments for small- to medium-scale analytical test problems and a medium-scale biomechanical test problem. For all problems, the robustness and convergence rate of PAPSO were comparable to those of PSPSO. However, the parallel performance of PAPSO was significantly better than that of PSPSO for heterogeneous computing environments or heterogeneous computational tasks. For example, PAPSO was 3.5 times faster than PSPSO for the biomechanical test problem executed on a heterogeneous cluster with 20 processors. Overall, PAPSO exhibits excellent parallel performance when a large number of processors (more than about 15) is utilized and either (1) heterogeneity exists in the
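The synchronous/asynchronous distinction can be illustrated with a toy master-worker loop (the objective and swarm here are hypothetical stand-ins, not the authors' implementation): the global best is refreshed the moment any evaluation finishes, so fast workers never wait at a per-iteration barrier.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
import random

def sphere(x):
    """Toy objective standing in for an expensive engineering analysis."""
    return sum(xi * xi for xi in x)

random.seed(1)
swarm = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(8)]
gbest, gbest_fit = None, float("inf")

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(sphere, x): i for i, x in enumerate(swarm)}
    # Asynchronous update: results are incorporated as each evaluation
    # completes, in whatever order the workers return them.
    for fut in as_completed(futures):
        i, fit = futures[fut], fut.result()
        if fit < gbest_fit:
            gbest, gbest_fit = swarm[i], fit

assert gbest_fit == min(sphere(x) for x in swarm)
```

In a synchronous scheme the equivalent loop would block until all eight evaluations returned before any position or velocity update, which is exactly where load imbalance wastes processor time.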