### Table 4. Performances on the test set of the modular MCS architecture.

2003

"... In PAGE 8: ... Table 3. List of selected features for each service:

| Service | Intrinsic | Content | Traffic |
|---|---|---|---|
| Http | 6 | 6 | 19 |
| Ftp | 5 | 9 | 19 |
| Mail | 5 |  | 19 |
| ICMP | 3 | - | 13 |
| Private & Other | 6 | - | 18 |
| Misc | 7 | 11 | 19 |

Table 4 shows the classification results of the modular MCS architecture illustrated in Section 3. For each service, and for each feature subset, a number of classifiers have been trained and the one that attained the minimum overall error has been chosen.... In PAGE 8: ... In addition, Table 4 also shows the results of the MCS architecture obtained by choosing the classifiers and the fusion rules that minimized the sum of the false and missed alarm rates. It can easily be seen that both the classification cost and the false and missed alarm rates decrease dramatically with respect to the best KDD99 results, at the price of a slightly higher error rate.... ..."

Cited by 3
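The selection step the excerpt describes (per service, keep the classifier with minimum overall error, and separately the one minimizing the sum of false-alarm and missed-alarm rates) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the classifier names and prediction vectors are made up.

```python
# Hedged sketch: per-service classifier selection by two criteria.
# Label convention (an assumption): 0 = normal traffic, 1 = attack.

def error_rates(predictions, labels):
    """Return (overall_error, false_alarm_rate, missed_alarm_rate)."""
    n = len(labels)
    errors = sum(p != y for p, y in zip(predictions, labels))
    normals = [i for i, y in enumerate(labels) if y == 0]
    attacks = [i for i, y in enumerate(labels) if y == 1]
    false_alarms = sum(predictions[i] == 1 for i in normals)
    missed = sum(predictions[i] == 0 for i in attacks)
    return (errors / n,
            false_alarms / max(len(normals), 1),
            missed / max(len(attacks), 1))

def select_classifiers(candidates, labels):
    """candidates: {name: prediction list}. Returns (name with minimum
    overall error, name with minimum FA+MA sum); ties resolve in
    favour of the first candidate."""
    by_error = min(candidates,
                   key=lambda c: error_rates(candidates[c], labels)[0])
    by_alarm = min(candidates,
                   key=lambda c: sum(error_rates(candidates[c], labels)[1:]))
    return by_error, by_alarm

# Toy example for one service:
labels = [0, 0, 0, 0, 1, 1]
candidates = {
    "rbf": [0, 0, 0, 0, 0, 1],  # one missed alarm
    "mlp": [0, 0, 0, 1, 1, 1],  # one false alarm
}
best_err, best_alarm = select_classifiers(candidates, labels)
```

Both toy classifiers have the same overall error (1/6), so `best_err` falls to the first candidate, while `best_alarm` prefers the MLP because a false alarm on 4 normal samples (0.25) costs less than a miss on 2 attacks (0.5).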

### Table 18: Function approximation: parameters used in the growing algorithm and architectures of modular trees. The architecture of subnetworks in resulting modular trees is the MLP with either 2-2-1 or 2-3-1.

1997

"... In PAGE 13: ... To visualize the results, we selected a function as f(x, y) = (x^2 - y^2) sin(x/2), -10 <= x, y <= 10 (32). In the experiment, we used a training set with 625 samples to learn the mapping. All parameters used in the growing algorithm, architectures of both subnetworks and resulting modular trees are listed in Table 18. Obviously, the two resulting modular trees with different architectures of subnetworks share the same architecture.... ..."

Cited by 2
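The 625-sample training set in the excerpt can be reproduced by sampling the target function on a regular 25 × 25 grid over [-10, 10]^2 (the grid layout is an assumption; the excerpt only states the sample count):

```python
# Illustrative sketch (not the authors' code): sampling
# f(x, y) = (x^2 - y^2) * sin(x / 2) on a 25 x 25 grid, which yields
# exactly the 625 training samples mentioned in the excerpt.
import math

def f(x, y):
    return (x * x - y * y) * math.sin(x / 2.0)

def make_training_set(n_per_axis=25, lo=-10.0, hi=10.0):
    step = (hi - lo) / (n_per_axis - 1)
    grid = [lo + i * step for i in range(n_per_axis)]
    return [((x, y), f(x, y)) for x in grid for y in grid]

samples = make_training_set()
assert len(samples) == 625  # 25 * 25 grid points
```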

### Table 15: The image segmentation problem: architectures of resulting modular trees, (NH; NMLP), with different architectures of subnetworks.

"... In PAGE 13: ... In our experiments, the architecture of subnetworks was chosen as the MLP with either 19-17-7 or 19-21-7. As a result, all parameters used in the growing algorithm are listed in Table 14 and the architectures of resulting modular trees based upon different overlapping factors in the splitting rule are also shown in Table 15. The testing results of these modular trees corresponding to different overlapping factors are shown in Figure 14.... ..."
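The "19-17-7" notation denotes a fully connected MLP with 19 inputs (image features), one hidden layer of 17 units, and 7 outputs (one per segmentation class). A minimal forward-pass sketch, with illustrative random weights and an assumed tanh/softmax activation pair:

```python
# Hedged sketch of a 19-17-7 MLP forward pass. Weight values and the
# tanh/softmax choices are assumptions, not details from the paper.
import math
import random

def make_mlp(sizes, seed=0):
    rng = random.Random(seed)
    layers = []
    for n_in, n_out in zip(sizes, sizes[1:]):
        w = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
             for _ in range(n_out)]
        b = [0.0] * n_out
        layers.append((w, b))
    return layers

def forward(layers, x):
    for i, (w, b) in enumerate(layers):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(w, b)]
        if i < len(layers) - 1:          # hidden layers: tanh
            x = [math.tanh(v) for v in x]
    m = max(x)                           # output layer: softmax
    e = [math.exp(v - m) for v in x]
    s = sum(e)
    return [v / s for v in e]

mlp = make_mlp([19, 17, 7])              # the 19-17-7 architecture
probs = forward(mlp, [0.1] * 19)
assert len(probs) == 7 and abs(sum(probs) - 1.0) < 1e-9
```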

### Table 1: CLB usage and minimal clock cycle time of modular exponentiation architectures on Xilinx FPGAs

2001

"... In PAGE 10: ...Modular Exponentiation Table1 shows our results in terms of used CLBs (C) and the clock cycle time (T). Operands and exponents have the same bit lengths in all cases.... ..."

Cited by 25
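For context, modular exponentiation (the operation the FPGA architectures in Table 1 implement) computes base^exponent mod modulus; the classic left-to-right square-and-multiply method processes one exponent bit per iteration. This is a software sketch of the arithmetic only, not the hardware design; FPGA implementations typically realize the same loop with dedicated modular multipliers (often Montgomery multiplication, though that detail is an assumption here).

```python
# Left-to-right binary (square-and-multiply) modular exponentiation.
def mod_exp(base, exponent, modulus):
    """Compute base**exponent mod modulus, one exponent bit per step."""
    result = 1
    base %= modulus
    for bit in bin(exponent)[2:]:        # scan exponent MSB -> LSB
        result = (result * result) % modulus     # always square
        if bit == "1":
            result = (result * base) % modulus   # multiply on 1-bits
    return result

assert mod_exp(7, 560, 561) == pow(7, 560, 561)
```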

### Table 11: The speaker independent vowel recognition problem: architectures of resulting modular trees, (NH; NMLP), with different architectures of subnetworks.

"... In PAGE 12: ... We generated several modular trees corresponding to different overlapping factors with the same data used by Robinson [47] and thereafter used his test data to evaluate the generalization ability of modular trees. As a result, all architectures of resulting modular trees are shown in Table 11 and error rates produced by modular trees corresponding to different overlapping factors are illustrated in Figure 12. The architecture of the modular tree, MT (10-18-11), producing the best result (corresponding to the overlapping factor = 0.55) is illustrated in Figure 13.... ..."
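The selection procedure in the excerpt (train one modular tree per overlapping factor, score each on held-out test data, keep the factor with the lowest error) reduces to a one-dimensional sweep. The error function below is a stand-in with an arbitrary minimum placed near 0.55 to mirror the excerpt; real values would come from Robinson's test set.

```python
# Toy sketch of selecting the best overlapping factor by test error.
def best_overlap_factor(factors, test_error):
    """Return (factor, error) minimizing test_error over the grid."""
    return min(((f, test_error(f)) for f in factors),
               key=lambda fe: fe[1])

# Stand-in error curve (arbitrary shape, minimum near 0.55):
toy_error = lambda f: (f - 0.55) ** 2 + 0.40
factors = [0.05 * i for i in range(1, 20)]   # grid 0.05 .. 0.95
f_star, err = best_overlap_factor(factors, toy_error)
assert abs(f_star - 0.55) < 1e-9
```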