### Table 1. Average total error in 1000 iterations for graphs with existing solution

### Table 2: Iteration period of the graph assuming timing information in pattern 1

"... In PAGE 8: ... In particular, the difference can be classified as the change of node n1's computation time. Table 2 shows the iteration periods (IP) of the outcomes of the graph in this scenario as well as their probabilities... ..."
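The excerpt describes a table listing the possible iteration periods of the graph together with their probabilities. As a minimal sketch of how such a table is summarized (the outcome values below are illustrative, not the paper's Table 2 data), the expected iteration period is simply the probability-weighted sum:

```python
# Hypothetical (IP, probability) outcomes in the style of Table 2;
# these numbers are made up for illustration.
outcomes = [(10, 0.5), (12, 0.3), (15, 0.2)]

# Expected iteration period: sum of IP_i * p_i over all outcomes.
expected_ip = sum(ip * p for ip, p in outcomes)
print(expected_ip)  # 11.6
```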

### Table 3: Graph-based iterative Group Analysis of gene expression during the yeast diauxic shift.

2004

"... In PAGE 4: ... This connectivity between genes and functional classes is provided by GiGA. Table 3 summarizes the results for the 20.5 hour time point, using two different networks, one for GeneOntology classes, and one for enzyme substrates, extracted from the SwissProt catalytic activity descriptors of yeast proteins.... In PAGE 9: ... central biological processes detected by DeRisi et al. (1997) and by iGA (see Tables 1 and 2). N, number of genes in each subgraph. Table 3: Graph-based iterative Group Analysis of gene expression during the yeast diauxic shift. (Continued)... ..."

### Table 1: Results from the high-level synthesis benchmarks

"... In PAGE 4: ... This is consistent with Hopfield Neural Networks, which deterministically converge to a minimum energy after a series of iterations (Table 1). Table 1 shows that different numbers of iterations were obtained in different simulation runs. The reason is that... ..."
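The excerpt attributes the varying iteration counts to the update dynamics of the Hopfield network. A minimal sketch of this behavior (generic bipolar Hopfield dynamics with a Hebbian weight matrix, not the paper's synthesis formulation; the pattern, seed, and sweep limit are assumptions):

```python
import numpy as np

def hopfield_run(W, s, max_sweeps=100, seed=0):
    """Asynchronously update a bipolar (+1/-1) Hopfield state until no
    unit changes. With symmetric W and zero diagonal, the energy
    E(s) = -0.5 * s^T W s never increases, so this halts at a local
    minimum; the sweep count can depend on the random update order."""
    rng = np.random.default_rng(seed)
    s = s.copy()
    for sweep in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                 # fixed point: local energy minimum
            return s, sweep + 1
    return s, max_sweeps

# Store one pattern with a Hebbian outer product (illustrative).
p = np.array([1, -1, 1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0)
state, sweeps = hopfield_run(W, np.array([1, -1, 1, 1]))
print(state, sweeps)   # the noisy start state settles back to p
```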

### Table 1. Average results in 1000 iterations for graphs with existing solution

2005

"... In PAGE 5: ... A higher number of connections can increase the speed of convergence. Table 1 shows the results of the three methods on the first set of problems. The graphs are named after their number of vertices.... In PAGE 5: ... The parameter is equal to 0.005 for these results. Higher values have caused the tension vector algorithm to diverge. Table 2 shows the total error from Table 1 as a percentage of the sum of all of the weights in the graph. From this table we can see that the error is less than 3% in all of the cases, and at least half of the time less than 1%.... ..."
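The normalization described in the excerpt (Table 2 entries derived from Table 1) is a straightforward percentage. A sketch with made-up numbers, not the paper's data:

```python
# Illustrative edge weights and total error; not taken from the paper.
weights = [4, 7, 2, 5, 9, 3]     # all edge weights in the graph
total_error = 0.84               # a Table-1-style total error

# Table-2-style entry: error as a percentage of the weight sum.
error_pct = 100.0 * total_error / sum(weights)
print(f"{error_pct:.2f}%")  # 2.80%
```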

Cited by 3

### Table 3. Search time and number of iterations for instances with generic task graphs

2006

"... In PAGE 13: ... Differently from the pipelined instances, here we schedule a single repetition of each task. Table 3 summarizes the results. Each instance presented has been solved optimally.... ..."

Cited by 1

### Table 1. Number of iterations and running time (in seconds) for various graphs

"... In PAGE 10: ... Note that sometimes increasing B improves the quality of the layouts. Table 1 shows the number of iterations for several graphs. For each graph we have computed a 2-D layout, so the results refer to the computation of two axes.... In PAGE 11: ... In [10] we describe another important advantage of this technique that enables an additional substantial reduction of time and space complexity. The actual running times of the algorithm are slightly longer than the results given in Table 1 (mainly because of the added time for recomputing the residual distances at each iteration). The results in Figs.... ..."
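The excerpt computes a 2-D layout one axis at a time and reports an iteration count per axis. As a generic stand-in for such an axis-by-axis scheme (this is plain power iteration on the classical-MDS Gram matrix with deflation, not necessarily the paper's algorithm; the function name, tolerance, and test graph are all assumptions):

```python
import numpy as np

def layout_axes(D, n_axes=2, tol=1e-9, max_iters=1000, seed=0):
    """Compute a layout axis by axis: power iteration on the
    double-centered squared-distance (Gram) matrix, deflating the
    found eigenpair before computing the next axis. Returns the
    n x n_axes coordinate matrix and the iteration count per axis."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J          # Gram matrix of the points
    rng = np.random.default_rng(seed)
    axes, iters = [], []
    for _ in range(n_axes):
        v = rng.standard_normal(n)
        v /= np.linalg.norm(v)
        for it in range(max_iters):
            w = B @ v
            w /= np.linalg.norm(w)
            if np.linalg.norm(w - v) < tol:
                break
            v = w
        lam = v @ B @ v                  # dominant eigenvalue
        axes.append(np.sqrt(max(lam, 0.0)) * v)
        iters.append(it + 1)
        B -= lam * np.outer(v, v)        # deflate for the next axis
    return np.column_stack(axes), iters

# A 3-4-5 right triangle: the recovered layout reproduces D.
pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
L, iters = layout_axes(D)
```

The per-axis iteration counts in `iters` play the role of the counts reported in the quoted Table 1: they vary with the graph because the convergence rate of the inner loop depends on the spectrum of the distance matrix.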