Results 1 - 10 of 926
Table 6: running times of the parameterization algorithms
2004
Table 3. Example Extensible Operating Systems. The Exokernel and Solaris systems support multiple technologies for extensibility; each technology is shown independently. Most extensibility in Exokernel is achieved by modifying user-level libraries (the first entry), but Exokernel also provides the ability to download code into the kernel (the second entry). Solaris supports build time parameterization in the form of the addition of new file systems or scheduling classes (the first entry). At run-time, the super-user can dynamically assign processes to different scheduling classes (the second entry) and can dynamically download device drivers (the third entry).
1997
Cited by 12
Table 2: Statistics and timings.
"... In PAGE 8: ... Some examples of textured models are shown in Figure 10. Table2 shows the sizes of the data sets, the number of created charts, and the following statistics, obtained on a 1.3 GHz Pentium III (note that the timings for the packing algorithm are not included, since they are negligible): time to segment the model into charts; time to parameterize the charts.... ..."
Table 1. Percentage of error rates (columns: Bayes, Parzen, KNN, F. KNN). Table 2. Features and the parameters used (columns: Bayes, Parzen, KNN, F. KNN)
1997
"... In PAGE 20: ...ombination of the results of all the feature extraction algorithms. This is called the hybrid features. Since the cost of false alarm and missed defects are different, the total error can be taken as weighted combination of these two errors. Table1 and table 3 shows these errors in two different rows. In these tables the first row in a cell corresponds to error rate of missed defects.... ..."
Cited by 1
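As a minimal sketch of the weighted error combination described in the excerpt above (the weight values and names are hypothetical, not taken from the paper):

```python
# Hypothetical cost weights; the excerpt does not give the actual values.
W_FALSE_ALARM = 1.0
W_MISSED_DEFECT = 5.0   # missed defects assumed to be costlier than false alarms

def total_error(false_alarm_rate, missed_defect_rate):
    """Weighted combination of the two error rates, as described in the excerpt."""
    return W_FALSE_ALARM * false_alarm_rate + W_MISSED_DEFECT * missed_defect_rate
```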
Table 3. The boosted methods against the weak learner KNN for force X (columns: Compare with KNN, Adaboost-M(KNN), A-boosting-M(KNN), Adaboost.M1(KNN))
2006
"... In PAGE 28: ...1.1 The Comparisons of Boosted Methods against the Corresponded Weak Learners Table3 gives the comparisons of Adaboost-M(KNN), A-boosting-M(KNN), Adaboost.M1(KNN) against the weak leaner KNN.... ..."
Table III. Parameter tuning in WORD, kNN and LLSF on Reuters version 3, validation test set (columns: Fixed Parameters, Tested Values, Best Choice(s), Best 11-ptAvgp, Best microavgF1)
1999
Cited by 347
Table 4: k-NN Query.
2000
"... In PAGE 8: ....2.3 k Nearest Neighbor Queries A k-NN query retrieves a set of objects such that for any two objects , . The algorithm for k-NN queries is shown in Table4 . Like the ba- sic k-NN algorithm, the algorithm uses a priority queue to navigate the nodes/objects in the database in increasing order of their distances from .... ..."
Cited by 91
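The excerpt above describes a best-first traversal driven by a priority queue ordered by distance to the query. A minimal Python sketch of that pattern, assuming a hypothetical hierarchical index where internal nodes expose a `children` iterable and `dist` lower-bounds the distance to a node (exact for data objects), might look like:

```python
import heapq
import itertools

def knn_query(root, query, k, dist):
    """Best-first k-NN search driven by a priority queue (illustrative sketch).

    Assumptions (not the cited paper's API): internal index nodes expose a
    `children` iterable, data objects do not, and dist(query, x) returns a
    lower-bound distance for nodes and the exact distance for objects.
    """
    tie = itertools.count()                        # tiebreaker so the heap never compares items
    heap = [(dist(query, root), next(tie), root)]  # priority queue keyed by distance to the query
    results = []
    while heap and len(results) < k:
        d, _, item = heapq.heappop(heap)           # closest candidate first
        children = getattr(item, "children", None)
        if children:                               # index node: expand its children
            for child in children:
                heapq.heappush(heap, (dist(query, child), next(tie), child))
        else:                                      # data object: reported in increasing distance
            results.append((item, d))
    return results
```

Because node distances never overestimate, the first k data objects popped from the queue are the true k nearest neighbors, which is why the traversal can stop as soon as `results` reaches k.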