### Table 1: Results of 30 independent runs on 8 benchmark tests using Augmented Lagrange Particle Swarm Optimization. Column 2 shows the number of particles np and column 3 the number of function calls nf. Details about the test functions can be found in the Appendix.

2005

"... In PAGE 6: ...(18), we maintain the magnitude of the penalty factors such that an e ective change in Lagrange multipliers is possible. This lower bound is formulated by rp;i 1 2 s j ij g;h : (20) Table1 summarizes the experimental results using ALPSO for solving eight constrained benchmark problems. All results show the average values of 30 independent runs on each test function.... In PAGE 7: ...with [13] the results from ALPSO are comparable or superior with less function evaluations required. The number of function evaluations listed in Table1 represents an upper limit where we stopped the optimization process. However, the best solution of each run was usually found much earlier.... ..."

### Table 1. Existing frameworks for multi-objective optimization (hybrid. stands for hybridization features, // for parallel features, lang. for programming language, ref. for reference, EA for Evolutionary Algorithm, LS for Local Search, SA for Simulated Annealing, TS for Tabu Search, ACO for Ant Colony Optimization and PSO for Particle Swarm Optimization).

### Table 10: PSO searching process on the Lung dataset (columns: Iter, Best Solution, Fitness).

"... In PAGE 35: ...35 Example 5 The process of the particle swarms searching for optimal solutions for dataset Lung is given in Table10 and Figure 7. Table 10 PSO searching process on Lung Iter Best Solution Fitness ... ..."

### Table 11: PSO searching process on the DNA dataset (columns: Iter, Best Solution, Fitness).

"... In PAGE 36: ...36 Example 6 The process of the particle swarms searching for optimal solutions for dataset DNA is given in Table11 and Figure 8. Table 11 PSO searching process on DNA Iter Best Solution Fitness ... ..."

### Table 1: Feature selection techniques based on evolutionary approach (columns: Technique, Researcher; first entry: Genetic algorithm, Zhang, P. et al. (2005);

"... In PAGE 2: ... Therefore to find the optimal features, some researchers use evolutionary approach such as genetic algorithm, particle swarm optimization, ant colony optimization, artificial fish swarm algorithm and others. Table1 gives some studies on feature selection techniques based on evolutionary approach. Table 1: Feature selection techniques based on evolutionary approach.... ..."

### Table 1. Parameters of the particle swarms.

### Table 4: Classification results with different reducts (1: number of rules; 2: classification accuracy) for POSAR, CEAR, DISMAR, GAAR and PSORSAR.

"... In PAGE 25: ... So, all the particles have a powerful search capability, which can help the swarm avoid dead ends. The comparison of the number of decision rules and the classification accuracy with different reducts are shown in Table4... ..."

### Table 2: Lagrange multiplier estimates of the 30 runs solving the eight benchmark problems. See the Appendix for their description.

2005

"... In PAGE 7: ...ptimization process. However, the best solution of each run was usually found much earlier. The Aug- mented Lagrange Particle Swarm Optimization method also reliably detects the active constraints and computes accurate Lagrange multiplier estimates. Table2 lists the mean values and the corresponding standard deviation of the multipliers obtained during the 30 test runs on the eight benchmark functions. Table 2: Lagrange multiplier estimates of the 30 runs solving the eight benchmark problems.... ..."

### Table 8: PSO searching process on the Mushroom dataset (columns: Iter, Best Solution, Fitness Value, Feature Subset Length).

"... In PAGE 33: ...33 Example 3 The process of the particle swarms searching for optimal solutions for dataset Mushroom is given in Table8 and Figure 5. Table 8 PSO searching process on Mushroom Iter Best Solution Fitness Value Feature Subset Length ... ..."