### Table 2: Compression results in bytes per number required to encode 10,000 random Gaussian numbers with different coding schemes

In *Optimal alphabet partitioning for semi-adaptive coding of sources with unknown sparse distributions*

2003

"... In PAGE 8: ... To show the applicability of our approach to a wider class of applications, we then generated random Gaussian numbers with different variances and used the greedy heuristic, Huffman coding of bytes, Huffman coding of pairs of bytes, and Huffman coding of four-byte words. Table 2 and Table 3 show the results obtained with source sequences of length 10,000 and 100,000 respectively. As before, all costs, including that of storing the Huffman table, are included in the results.... ..."

Cited by 7
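The snippet above describes Huffman-coding the individual bytes of a stream of Gaussian integers, with the cost of the Huffman table counted against the total. A minimal Python sketch of that idea (the 13-bits-per-table-entry estimate and the mean-0, standard-deviation-10 source are illustrative assumptions, not the paper's exact accounting):

```python
import heapq
import random
import struct
from collections import Counter

def huffman_code_lengths(freqs):
    """Return the Huffman code length of each symbol in freqs (symbol -> weight)."""
    # Heap items: (weight, tiebreak, symbols-in-subtree); tiebreak keeps tuples comparable.
    heap = [(w, i, [s]) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in freqs}
    if len(heap) == 1:  # degenerate one-symbol alphabet
        return {next(iter(freqs)): 1}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # every symbol under the merged node gets one bit deeper
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, tie, s1 + s2))
        tie += 1
    return lengths

def huffman_cost_bits(data_bytes, table_bits_per_symbol=13):
    """Coded-data bits plus a simple table estimate (fixed bits per distinct symbol)."""
    freqs = Counter(data_bytes)
    lengths = huffman_code_lengths(freqs)
    data_bits = sum(freqs[s] * lengths[s] for s in freqs)
    return data_bits + len(freqs) * table_bits_per_symbol

random.seed(0)
n = 10_000
samples = [round(random.gauss(0, 10)) for _ in range(n)]
raw = b"".join(struct.pack("<i", x) for x in samples)  # each number as a 32-bit word
total = huffman_cost_bits(raw)
print(f"{total / n:.2f} bits per number (Huffman over bytes, table included)")
```

Because the high-order bytes of small integers are highly skewed, Huffman coding of bytes should land far below the raw 32 bits per number even after paying for the table.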

### TABLE II: AVERAGE BER (Pbe) AND PROBABILITY OF DECODING FAILURE (Pdf = P[BER > 0 AFTER 24 ITERATIONS]), UNIFORM RANDOM Y, DISCRETE GAUSSIAN NOISE, RANDOM INTERLEAVER, CODE FROM [11], 1000 REPETITIONS FOR N = 1e4, 100 FOR N = 1e5.

2004

### Table 3: Compression results in bits per number required to encode 10,000 random Gaussian numbers with different coding schemes, where (a, b) means the mean is a and the standard deviation is b. For each coding scheme involving a Huffman code, we list the cost of storing the Huffman table ("-table"), the cost of storing the data ("-data"), and the total cost ("-all") for each compression result. We also show the corresponding 32-bit entropy in bits per number.

"... In PAGE 17: ... As before, we used the option of the best compression ("gzip -9") for gzip, and we also computed the 32-bit entropy. Table 3 and Table 4 show the results obtained with source sequences of length 10,000 and 100,000 respectively. As before, each result is shown as the cost of storing the Huffman table, the cost of storing the data, and the total cost of storing both, for each coding scheme involving a Huffman code.... ..."

Cited by 2
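The "32-bit entropy" reported alongside these results is, as the captions suggest, the empirical order-0 entropy obtained by treating each 32-bit number as a single symbol. A short sketch of that computation (the mean-0, standard-deviation-10 source is an illustrative assumption):

```python
import math
import random
from collections import Counter

def empirical_entropy_bits(symbols):
    """Empirical (order-0) entropy of a symbol sequence, in bits per symbol."""
    n = len(symbols)
    counts = Counter(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
samples = [round(random.gauss(0, 10)) for _ in range(10_000)]
# "32-bit entropy": treat each number (one 32-bit word) as one symbol.
h = empirical_entropy_bits(samples)
print(f"32-bit entropy: {h:.2f} bits per number")
```

This per-number entropy is a lower bound on what any order-0 coder over whole 32-bit words can achieve on the sample, which is why the tables use it as a reference column.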

### Table 4: Compression results in bits per number required to encode 10,000 random Gaussian numbers with different coding schemes, where (a, b) means the mean is a and the standard deviation is b. For each coding scheme involving a Huffman code, we list the cost of storing the Huffman table ("-table"), the cost of storing the data ("-data"), and the total cost ("-all") for each compression result. We also show the corresponding 32-bit entropy in bits per number.

"... In PAGE 20: ... As before, we used the option of the best compression ("gzip -9") for gzip, and we also computed the 32-bit entropy. Table 4 and Table 5 show the results obtained with source sequences of length 10,000 and 100,000 respectively. As before, each result is shown as the cost of storing the Huffman table, the cost of storing the data, and the total cost of storing both, for each coding scheme... ..."

Cited by 2

### Table 5: Compression results in bits per number required to encode 100,000 random Gaussian numbers with different coding schemes, where (a, b) means the mean is a and the standard deviation is b. For each coding scheme involving a Huffman code, we list the cost of storing the Huffman table ("-table"), the cost of storing the data ("-data"), and the total cost ("-all") for each compression result. We also show the corresponding 32-bit entropy in bits per number. We omit the entries for "32-bit" since those results are much worse than all other methods, similar to what we have seen in Table 4.

Cited by 2