### Table 3: Results of the comparison of different clustering approaches

2005

Cited by 21

### Table 4 Cluster accuracy and stability on yeast galactose data

2003

"... In PAGE 8: ... It is interesting that the spherical model of the IMM approach produces unstable clusters at both high and low noise levels. Yeast galactose data Table 4a,b show selected results on cluster accuracy and cluster stability on real yeast galactose data. The true mean column in Table 4a refers to clustering the true mean data R34.... In PAGE 9: ... The highest level of cluster accuracy (adjusted Rand index = 0.968 in Table 4a) was obtained with several algorithms: centroid linkage hierarchical algorithm with Euclidean distance and averaging over the repeated measurements; hierarchical model-based algorithm (MCLUST-HC); complete linkage hierarchical algorithm with SD-weighted distance; and IMM with complete linkage. Clustering with repeated measurements produced more accurate clusters than clustering with the estimated true mean data in most cases. Table 4b shows that different clustering approaches lead to different cluster stability with respect to remeasured data. Similar to the results from the completely synthetic data, Euclidean distance tends to produce more stable clusters than correlation (both variability-weighted and average over repeated measurements).... ..."
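The snippets above score clusterings by the adjusted Rand index (e.g. 0.968 for the best-performing algorithms). As a point of reference, here is a minimal sketch of that index computed from its standard contingency-table definition; the function names are illustrative and not taken from the cited papers.

```python
# Illustrative helper (not from the cited papers): the adjusted Rand index,
# the chance-corrected clustering agreement measure the snippets report.
from collections import Counter

def comb2(n: int) -> int:
    """Number of unordered pairs among n items: C(n, 2)."""
    return n * (n - 1) // 2

def adjusted_rand_index(labels_true, labels_pred):
    """Agreement between two flat clusterings; 1 is perfect, 0 is chance level."""
    n = len(labels_true)
    # Contingency counts: how many items share each (true, predicted) label pair.
    contingency = Counter(zip(labels_true, labels_pred))
    index = sum(comb2(c) for c in contingency.values())
    sum_rows = sum(comb2(c) for c in Counter(labels_true).values())
    sum_cols = sum(comb2(c) for c in Counter(labels_pred).values())
    expected = sum_rows * sum_cols / comb2(n)
    max_index = (sum_rows + sum_cols) / 2
    return (index - expected) / (max_index - expected)

# Relabeling the same partition still scores a perfect 1.0.
print(adjusted_rand_index([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0
```

Because the index is corrected for chance, it can go slightly negative for clusterings that agree less than random labelings would.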

### Table 3 Cluster accuracy and stability on the completely synthetic data with four repeated measurements at high noise level

2003

"... In PAGE 7: ... cluster quality over the approach of averaging over repeated measurements using the same algorithms at high noise level. In terms of cluster stability (see Table 3b), the following three approaches yield average adjusted Rand index above 0.900: the elliptical model of the IMM approach; the ... ..."

### Table 2 Cluster accuracy and stability on the completely synthetic data with four repeated measurements at low noise level

2003

"... In PAGE 6: ... The external knowledge is not used in computing cluster stability. Completely synthetic data at low noise level Table 2a,b show selected results on cluster accuracy and cluster stability on the completely synthetic datasets with four simulated repeated measurements. Table 2a,b show results from average linkage, complete linkage and centroid linkage hierarchical algorithms, k-means, MCLUST-HC (a hierarchical model-based clustering algorithm from MCLUST) and IMM. Both single linkage and DIANA produce very low-quality and unstable clusters and their adjusted Rand indices are not shown.... ..."

### Table 2: Comparison of ADCM approach and K-means clustering approach

"... In PAGE 8: ... has good ability to resist noise. In Table 2, a detailed comparison of our global algorithm (ADCM), K-means, hierarchical clustering, and SOM is given. In conclusion, ADCM can be considered an intuitively appealing, user-friendly (no need to predefine the number of clusters or the cluster radius) and fast clustering algorithm.... ..."

### Table 1: Required time (in seconds) for the clustering approaches

1998

Cited by 123