### Table 6: Performance of the best templates found during genetic algorithm search. Results for greedy search are also presented, for comparison.

1998

"... In PAGE 8: ... This behavior suggests that the genetic algorithm is working correctly but that it is not difficult to find individual templates with low prediction errors. As shown in Table 6, the best templates found during the genetic algorithm search provide mean errors that are 2 to 12 percent less than the best templates found during the greedy search. The largest improvements are obtained on the CTC and SDSC95 workloads.... ..."

Cited by 80

### Table 1 presents the average RMSE for each configuration of the greedy ensemble selection algorithm on each dataset. We notice that forward-prune is the best performing configuration in three cases out of four. An interesting fact is that the two configurations that use the pruning set for evaluation (FP, BP) have better performance than the other two that use the training set. This strongly indicates that using a separate dataset for evaluation offers increased predictive accuracy to the greedy ensemble selection algorithm.

"... In PAGE 5: ... Table 1. Average errors of each algorithm on each dataset.... ..."
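The forward-prune (FP) configuration described in the excerpt can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the function name, the averaged-ensemble combination rule, and the input layout are all assumptions; the one point taken from the excerpt is that candidates are scored on a separate pruning set rather than the training set.

```python
import numpy as np

def forward_selection(preds, y_prune, max_models=None):
    """Greedy forward ensemble selection, evaluated on a pruning set.

    preds   : (n_models, n_samples) array of each model's predictions
              on the pruning set (hypothetical input layout)
    y_prune : (n_samples,) true targets of the pruning set
    """
    n_models = preds.shape[0]
    max_models = max_models or n_models
    selected = []
    best_rmse = np.inf
    while len(selected) < max_models:
        best_m, best_m_rmse = None, best_rmse
        for m in range(n_models):
            if m in selected:
                continue
            # RMSE of the averaged ensemble if model m were added
            ens = preds[selected + [m]].mean(axis=0)
            rmse = np.sqrt(np.mean((ens - y_prune) ** 2))
            if rmse < best_m_rmse:
                best_m, best_m_rmse = m, rmse
        if best_m is None:  # no candidate improves the ensemble: stop
            break
        selected.append(best_m)
        best_rmse = best_m_rmse
    return selected, best_rmse
```

The backward-prune (BP) variant would start from the full ensemble and greedily drop models by the same pruning-set criterion.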

### Table 3: The pseudo-code for Greedy Search.

"... In PAGE 10: ... Typically, this algorithm demands fewer policy updates before convergence, but each updating step is more time-consuming than in the previous algorithm. A pseudo-code of Greedy Search is presented in Table 3. During each policy updating it is advantageous to distribute flows as follows: Suppose dj is assigned to root-clique Rj, and assume the policy for dj is the last policy that has been updated.... ..."

### Table 4: Average nonlinearity for combination of PJ PDBF and greedy algorithms.

"... In PAGE 8: ...values {0, 1} are allowed. Thus the resulting function can have fewer undefined values (i.e. ?) than the starting function.

PJ_PDBF(size)
    choose random PDBF f of given size
    while (f is a new one)
        compute (any) permutation m satisfying (1)
        find a pair (x, v) ∈ M such that D is maximum
        f ← f[x ↦ v]
    return f attaining the highest nonlinearity

The following Table 4 presents the average nonlinearity of Boolean functions obtained by the combination of the PJ PDBF and greedy algorithms. Since we tried to maximize nonlinearity, the property P in the greedy algorithm was nonlinearity.... ..."
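The nonlinearity used as the greedy criterion here is a standard Boolean-function measure: the distance to the nearest affine function, computable via a fast Walsh-Hadamard transform as nl(f) = 2^(n-1) - max_a |W_f(a)| / 2. A minimal stand-alone sketch (the truth-table input format is an assumption, and this is generic textbook code rather than the paper's implementation):

```python
def nonlinearity(truth_table):
    """Nonlinearity of a Boolean function given as a truth table of
    length 2^n with values 0/1, via the fast Walsh-Hadamard transform:
    nl(f) = 2^(n-1) - max_a |W_f(a)| / 2.
    """
    n_pts = len(truth_table)
    # encode f as signs (-1)^f(x)
    w = [1 - 2 * b for b in truth_table]
    h = 1
    while h < n_pts:                      # in-place fast WHT
        for i in range(0, n_pts, 2 * h):
            for j in range(i, i + h):
                w[j], w[j + h] = w[j] + w[j + h], w[j] - w[j + h]
        h *= 2
    return n_pts // 2 - max(abs(v) for v in w) // 2
```

For example, any affine function has nonlinearity 0, while the 2-variable AND function (a bent function for n = 2) has nonlinearity 1.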

### Table 7: Comparison of different sizing algorithms when sizing 16-bit buses under 2x-6x minimum pitch-spacing. SISS is single-net interconnect sizing and spacing algorithm applied to multiple nets in a greedy order; GISS/FAF and GISS/VAF are bottom-up dynamic programming algorithms; GISS/EPLR is the algorithm presented in this paper.

1999

"... In PAGE 24: ... The SISS algorithm obtains a local-optimal solution for the GISS problem. We compare the average HSPICE delay for solutions given by these algorithms in Table 7 (average delay is our objective function). As seen from the table, the GISS/EPLR algorithm always achieves results better than the SISS solutions, with up to 39% delay reduction.... In PAGE 25: ... However, the capacitance model is no longer monotonically constrained in the case of large pitch-spacings, as shown by cef for center-to-edge spacing = 2.2 µm in Figure 2(b). Therefore, GISS/EPLR obtains better results for large pitch-spacings (5x and 6x cases in Table 7). Because the GISS/EPLR algorithm is much faster and always achieves the best results, we suggest that the GISS/EPLR algorithm should be used instead of the other algorithms.... ..."

Cited by 9

### Table 1: Comparing Different Stochastic-Greedy Algorithms.

2005

"... In PAGE 19: ...2 Evaluation of the Optimization Algorithm Experiment D (Stochastic Algorithms). This experiment compared the performance of different stochastic-greedy algorithms (Table 1). Each of the stochastic algorithms is run for 1000 iterations, after the reduction rules have been applied.... In PAGE 19: ... Note that this experiment compares the elimination costs found by the algorithms for the case where T = 1, namely no conditioning is performed. The results presented in Table 1 are a representative sample from the experiment that was performed on 100 data sets. In 100 data sets, the distribution of algorithms that found the lowest cost was as follows: Min-Weight - 4%, MCS - 9%, WMCS - 7%, Min-Fill - 25%, and Weighted Min-Fill - 76%.... ..."
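The Min-Fill heuristic compared in this excerpt is a classic greedy elimination-ordering rule: repeatedly eliminate the vertex whose elimination adds the fewest fill-in edges. A minimal sketch of the unweighted variant (the adjacency-dict representation and the tie-breaking behaviour are illustrative assumptions, not taken from the paper):

```python
def min_fill_order(adj):
    """Greedy Min-Fill elimination ordering.

    adj maps each vertex to the set of its neighbours; the graph is
    copied and then destroyed as vertices are eliminated.
    """
    adj = {v: set(nb) for v, nb in adj.items()}
    order = []
    while adj:
        def fill(v):
            # number of edges that must be added among v's neighbours
            nbs = list(adj[v])
            return sum(1 for i in range(len(nbs))
                         for j in range(i + 1, len(nbs))
                         if nbs[j] not in adj[nbs[i]])
        v = min(adj, key=fill)
        # make v's neighbourhood a clique, then remove v
        for a in adj[v]:
            for b in adj[v]:
                if a != b:
                    adj[a].add(b)
        for a in adj[v]:
            adj[a].discard(v)
        del adj[v]
        order.append(v)
    return order
```

Weighted Min-Fill, the best performer in the table, would score candidates by the weight of the added fill edges rather than their count.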

Cited by 10

### Table 1: The Greedy algorithm.

2005

"... In PAGE 18: ... Additional lines of pseudocode for iGreedyLB. Lines 5.1, 9.1 and 13.1-13.4 are new and should be added to the pseudocode of Table 1 after lines 5, 9 and 13, while lines 7 and 10 are substitutes for the corresponding lines of Table 1.... In PAGE 18: ... Additional lines of pseudocode for iGreedyLB. Lines 5.1, 9.1 and 13.1-13.4 are new and should be added to the pseudocode of Table 1 after lines 5, 9 and 13, while lines 7 and 10 are substitutes for the corresponding lines of Table 1.... ..."

Cited by 7

### Table 1. The greedy algorithm

2006

Cited by 1