### TABLE II Codewords corresponding to the 1-out-of-6 code and the 5-out-of-6 code

2004

### TABLE I SIZE GROWTH OF THE SUBSET I1 WITH SNR FOR THE LB-S BOUND AND COMPARISON OF THE KAT (WITH THE UPPER BOUND ON ij) AND LB-S BOUNDS FOR THE BCH (63, 10) CODE. s_max IS THE LARGEST WEIGHT AND B_{s_max}(I1) IS ITS CORRESPONDING NUMBER OF CODEWORDS IN I1.

### TABLE I HUFFMAN CONVERSION TABLE FOR p1 = 0.16921. OML IS THE OUTPUT MEAN LENGTH OF THAT CODEWORD, AND P(INPUT) IS THE PROBABILITY OF OCCURRENCE OF THE CORRESPONDING INPUT TO THE HUFFMAN ENCODER.

### Table 3: Codewords and T-depletion code elements for the T-Code set S

1997

"... In PAGE 11: ... 2.3 Conversion between Variable-Length T-Code Codewords and T-Depletion Codewords In Table 3, we listed the T-Code codewords from S(1;1;3)(0;1;01) and their corresponding T-depletion codewords, but we did not explain how we obtained one format from the other. We will now discuss these conversion issues for both directions:... ..."

Cited by 8

### Table 3 summarizes an experiment in which we took the probability distributions of English, German, Finnish and French (as in Table 1), added to them the character distribution in Gadsby, and checked what happens if they are mutually interchanged. The rows correspond to the distributions used to generate the codewords (the assumed distribution), and the columns correspond to the distribution that actually occurs (the true distribution). For each pair of distributions (A, B), both Huffman and arithmetic codes were computed, and the table lists at the intersection of row A with column B by how much (in percent) the average Huffman codeword is longer than the average arithmetic codeword, i.e., the value (A_h/A_a − 1) × 100. A negative value thus means that for the given pair, Huffman codes do better than arithmetic codes.

1993

"... In PAGE 11: ... Huffman codes could have dealt easily also with probabilities that are zero, but for arithmetic codes, a probability of 0 causes problems. An interesting point about Table 3 is the fact that so many entries are negative. The diagonal corresponds to assuming the true distribution, so clearly all the values there must be positive.... ..."
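The mismatch measure the excerpt describes can be sketched in a few lines of Python. The helper names and the toy distributions used below are illustrative assumptions, not taken from the paper; the arithmetic-coding average is approximated by the cross-entropy of the true distribution under the assumed one:

```python
import heapq
import math

def huffman_lengths(probs):
    """Build a binary Huffman code for `probs`; return one codeword
    length per symbol (the lengths fully determine the average)."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tiebreak, symbols)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # merging two subtrees deepens every symbol in them
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

def mismatch_percent(assumed, true):
    """(A_h/A_a - 1) * 100: the Huffman code is built from `assumed`,
    both averages are taken under `true`, and the arithmetic average
    is approximated by the cross-entropy H(true, assumed)."""
    lengths = huffman_lengths(assumed)
    avg_huffman = sum(t * l for t, l in zip(true, lengths))
    avg_arith = -sum(t * math.log2(a) for t, a in zip(true, assumed))
    return (avg_huffman / avg_arith - 1) * 100
```

On a dyadic distribution such as (1/2, 1/4, 1/8, 1/8) the Huffman code is exactly optimal, so the diagonal value is 0; for typical non-dyadic language distributions the diagonal is strictly positive, consistent with the excerpt's remark.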

Cited by 13

### Table 1 presents a part of the whole 16×16 frequency table; rows correspond to IN_i codewords, and columns to OUT_j ones. This table reads as follows: none of the possible evolutions for data looking like the ones in the first class of the IN map are present in classes 1 to 8 of the OUT map (in fact, 95% of those are in class number 13); 24% of the possible evolutions for data looking like the ones in class two of the IN map are in class two of the OUT map, 13% belong to class four of the OUT map, 3% to class five, etc.
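The row-wise reading the caption describes amounts to normalizing each row of a count table into percentages. A minimal sketch, with invented counts (the real table is 16×16 and its values are not reproduced here):

```python
def row_percentages(counts):
    """Convert a table of raw evolution counts into row percentages,
    so entry [i][j] is the percent of IN-class-(i+1) evolutions that
    land in OUT class (j+1)."""
    result = []
    for row in counts:
        total = sum(row)
        result.append([100.0 * c / total for c in row] if total else list(row))
    return result

# Hypothetical 2x4 excerpt of raw counts (not the paper's data).
raw_counts = [
    [0, 0, 19, 1],    # evolutions observed from IN class 1
    [24, 13, 3, 60],  # evolutions observed from IN class 2
]
pct = row_percentages(raw_counts)
# pct[0][2] -> 95.0, i.e. 95% of IN-class-1 evolutions fall in OUT class 3
```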

2003

Cited by 6

### Table 2: Variable Length Codeword (VLC) Table (s denotes the sign bit)

"... In PAGE 5: ... In our method, only inverse entropy coding must be executed since our system is applied in the VLC domain. After a cover (compressed) video is inversely entropy coded, an MPEG compressed bitstream is represented using variable-length codewords, as tabulated in Table 2. In the VLC table, each codeword corresponds to a run-level pair, denoted as (r, l).... In PAGE 10: ... Here, we shall explain how the problem of bit-rate control can be dealt with. According to the VLC table (Table 2), the number of bits used to represent a run-level pair satisfies the following inequality: VLC_length(r, l) ≤ VLC_length(r, l+1), (9) where the function VLC_length(·, ·) reports the number of bits used to represent the VLC codeword of a run-level pair. Eq.... ..."
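The monotonicity property in Eq. (9) can be checked mechanically over any table of codeword lengths. The miniature table below uses made-up lengths for illustration, not the actual MPEG VLC values:

```python
# Hypothetical miniature VLC table: (run, level) -> codeword length in bits.
# The lengths are invented; real MPEG tables assign longer codewords to
# rarer (higher-level) run-level pairs.
VLC_LENGTH = {
    (0, 1): 2, (0, 2): 4, (0, 3): 5,
    (1, 1): 3, (1, 2): 6,
}

def level_monotone(table):
    """Verify VLC_length(r, l) <= VLC_length(r, l + 1) wherever both
    run-level pairs appear in the table."""
    return all(
        table[(r, l)] <= table[(r, l + 1)]
        for (r, l) in table
        if (r, l + 1) in table
    )
```

This is the property that makes bit-rate control tractable in the VLC domain: incrementing a level never shrinks the codeword, so the rate impact of an embedding change can be bounded in advance.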

### Table 1: Weight-2 inputs generating turbo codewords of weights 6, 8, and 10.

1997

"... In PAGE 5: ... In our search for mappings of bit patterns corresponding to powers of the feedback polynomial that mapped into a bit pattern corresponding to a power of the feedback polynomial, we found none which would result in codeword weight 16 or less. Table 1 describes in more detail the search results for codewords of weight 10 or less. The table specifies the positions of the non-zero bits in information frames of weight 2 that generate code sequences with weight 6, 8, or 10.... In PAGE 7: ... The simulated BER is reduced by a factor of about 2 relative to the original BER. Also shown is the distance-8 asymptote reflecting the weight-8 codewords listed in Table 1... In PAGE 8: ...odification is shown in Fig. 5. The simulated BER is reduced by an additional factor of about 3, yielding a total reduction factor of approximately 7 relative to the original BER. The figure also shows the distance-10 asymptote reflecting the contributions of the weight-10 codewords in Table 1. The rate loss incurred in achieving this reduced BER was slight, amounting to only 34 information bits per frame, or about 0.... ..."

Cited by 7

### Table 2: Corresponding Frequencies of largest diagonal SEC entries for 8-connectivity neighborhood

"... In PAGE 5: ... Thus, the largest diagonal values correspond to the homogeneous regions of the dataset. Table 2 lists the diagonal entries with the largest values. These entries correspond to pasture and forest codewords.... ..."