### Table V shows the variation in performance of the Joint classifier with neighbourhood size N and texton dictionary size S.

Cited by 1

### Table 6.2: Comparison of differing sized dictionaries on the thyroid dataset for LP-SVMs. We used the RBF kernels σ1 = −3, σ2 = −1 and σ3 = 2 and combined these widths in a kernel dictionary in all seven possible combinations (dictionaries of size one, two and three; i.e., x,y denotes a kernel dictionary of size two with widths x and y). Larger dictionaries match or improve on the performance of their best subset and still result in sparse solutions.

1999

Cited by 5

### Table 3-2: Comparison of Dictionaries

"... In PAGE 17: ...

```c
/* h is in range 0..65535 */
h = ((unsigned short int)h1 << 8) | (unsigned short int)h2;
/* use division method to scale */
return h % HASH_TABLE_SIZE;
```

Assuming n data items, the hash table size should be large enough to accommodate a reasonable number of entries. As seen in Table 3-1, a small table size substantially increases the average time to find a key. A hash table may be viewed as a collection of linked lists. ... In PAGE 17: ... There is considerable leeway in the choice of table size.

| size | time | size | time |
| --- | --- | --- | --- |
| 1 | 869 | 128 | 9 |
| 2 | 432 | 256 | 6 |
| 4 | 214 | 512 | 4 |
| 8 | 106 | 1024 | 4 |
| 16 | 54 | 2048 | 3 |
| 32 | 28 | 4096 | 3 |
| 64 | 15 | 8192 | 3 |

Table 3-1 ... In PAGE 25: ... time. The algorithm should be efficient. This is especially true if a large dataset is expected. Table 3-2 compares the search time for each algorithm. Note that worst-case behavior for hash tables and skip lists is extremely unlikely. ... In PAGE 25: ... Table 3-2: Comparison of Dictionaries. Average time for insert, search, and delete operations on a database of 65,536 (2^16) randomly input items may be found in Table 3-3. For this test the hash table size was 10,009 and 16 index levels were allowed for the skip list. ... In PAGE 25: ... Although there is some variation in the timings for the four methods, they are close enough so that other considerations should come into play when selecting an algorithm.

| method | insert | search | delete |
| --- | --- | --- | --- |
| hash table | 18 | 8 | 10 |
| unbalanced tree | 37 | 17 | 26 |
| red-black tree | 40 | 16 | 37 |
| skip list | 48 | 31 | 35 |

Table 3-3: Average Time (µs), 65,536 Items, Random Input

Table 3-4 shows the average search time for two sets of data: a random set, where all values are unique, and an ordered set, where values are in ascending order. Ordered input creates a worst-case scenario for unbalanced tree algorithms, as the tree ends up being a simple linked list. ... In PAGE 25: ... 6 seconds, while an unbalanced tree algorithm would take 1 hour.

| input | count | hash table | unbalanced tree | red-black tree | skip list |
| --- | --- | --- | --- | --- | --- |
| random | 16 | 4 | 3 | 2 | 5 |
| random | 256 | 3 | 4 | 4 | 9 |
| random | 4,096 | 3 | 7 | 6 | 12 |
| random | 65,536 | 8 | 17 | 16 | 31 |
| ordered | 16 | 3 | 4 | 2 | 4 |
| ordered | 256 | | 74 | | 7 |
| ordered | 4,096 | 3 | 1,033 | 6 | 11 |
| ordered | 65,536 | 7 | 55,019 | 9 | 15 |

Table 3-4 ... ..."

### Table 2: Communication volume for the sparse loops of the Conjugate Gradient Algorithm on 16 processors. The variations across processors are reflected in the Min, Max and Avg columns.

1997

"... In PAGE 21: ... 7.1 Communication Volume in Executor. Table 2 shows the communication volume in the executor for 16 processors in a 4 × 4 processor mesh when computing the sparse loops of the CG algorithm. This communication is necessary for accumulating the local partial products in the array Q.... In PAGE 32: ... Lower-right: BRS distribution and BCSSTK29 input matrix. Table 1: Characteristics of benchmark matrices. Table 2: Communication volume for the sparse loops of the Conjugate Gradient Algorithm on 16 processors. The variations across processors are reflected in the Min, Max and Avg columns.... ..."

Cited by 33

### Tables VI and VII present the Wire cap and Total cap variations due to CMP for designs having dense and sparse fills, respectively. We observe that the variations in the wire and total capacitances due ...

### Table 4. Coverage of dictionary (in %) before and after application of the spelling variation algorithm on the pronunciation dictionaries described in Table 1 (FN=forenames, SN=surnames, ST=streetnames, PL=placenames).

"... In PAGE 4: ... These rewrite rule sets were then used to make predictions for the remaining OOV words for each domain. Table 4 shows the percentage improvement in coverage for the dictionaries obtained by using the algorithm. Clearly Table 3.... ..."

### Table 1: Examples of near-synonymic variation.

"... In PAGE 2: ... discrimination dictionaries, Edmonds (1999) gives a classification of near-synonymic variation into 35 subcategories of the above four broad categories. Table 1 gives several examples, which we will discuss briefly. Collocational variation involves the words or concepts with which a word can be combined, possibly idiomatically, in a well-formed sentence.... In PAGE 2: ... Many stylistic dimensions have been proposed by Hovy (1988), Nirenburg & Defrise (1992), Stede (1993), and others. Table 1 illustrates two of the most common dimensions: inebriated is formal while pissed is informal; annihilate is a more forceful way of saying ruin. Expressive variation can be used by a speaker to express his or her attitude towards a participant of the situation being spoken about.... ..."