Results 1 – 10 of 240
The length of a typical Huffman codeword
 IEEE Trans. Inform. Theory
, 1994
Abstract

Cited by 3 (0 self)
If pi (i = 1, ..., N) is the probability of the ith letter of a memoryless source, the length li of the corresponding binary Huffman codeword can be very different from the value − log pi. For a typical letter, however, li ≈ − log pi. More precisely, P{li ≥ − log pi + m} = ∑ pj < 2^(−c(m−2)+2), where the sum runs over the letters j with lj ≥ − log pj + m and c ≈ 2.27.
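The typical-length claim in this abstract can be illustrated numerically. The sketch below (an illustration only, not the paper's analysis) computes binary Huffman codeword lengths with a heap and compares them with − log2 pi for a hypothetical five-letter distribution:

```python
import heapq
import math

def huffman_lengths(probs):
    """Binary Huffman codeword lengths for a list of probabilities."""
    # Heap items: (subtree weight, insertion counter for tie-breaking,
    # list of symbol indices contained in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to these symbols
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
for p, l in zip(probs, huffman_lengths(probs)):
    print(f"p = {p:.2f}  l = {l}  -log2 p = {-math.log2(p):.2f}")
```

For this distribution the lengths stay within one bit of − log2 pi; tie-breaking between equal weights can change individual lengths but not the average.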
Maximal codeword lengths in Huffman codes
 Computers & Mathematics with Applications
, 2000
Tales of Huffman
Abstract
We study the new problem of Huffman-like codes subject to individual restrictions on the codeword lengths of a subset of the source words. These are prefix codes with minimal expected codeword length for a random source where additionally the codeword lengths of a subset of the source words is pr ...
Bidirectional Huffman Coding
, 1989
Abstract

Cited by 12 (2 self)
Under what conditions can Huffman codes be efficiently decoded in both directions? The usual decoding procedure also works for backward decoding only if the code has the affix property, i.e., both the prefix and suffix properties. Some affix Huffman codes are exhibited, and necessary conditions for the existence of such codes are given. An algorithm is presented which, for a given set of codeword lengths, constructs an affix code if one exists. Since for many distributions there is no affix code giving the same compression as the Huffman code, a new algorithm for backward decoding of non ...
Twenty (or so) questions: bounded-length Huffman coding
, 2006
"... The game of Twenty Questions has long been used to illustrate binary source coding. Recently, a physical device has been developed which mimics the process of playing Twenty Questions, with the device supplying the questions and the user providing the answers. However, this game differs from Twenty ..."
Abstract

Cited by 1 (0 self)
... quasi-arithmetic convex coding penalty. In the case of minimizing average codeword length, both time and space complexity can be improved via an alternative reduction. This has, as a special case, a method for non-binary length-limited Huffman coding, which was previously solved via dynamic programming with O ...
Twenty (or so) questions: D-ary bounded-length Huffman coding
, 2006
"... The game of Twenty Questions has long been used to illustrate binary source coding. Recently, a physical device has been developed that mimics the process of playing Twenty Questions, with the device supplying the questions and the user providing the answers. However, this game differs from Twenty ..."
Abstract

Cited by 3 (2 self)
... of minimizing average codeword length, time complexity can often be improved via an alternative graph-based reduction. This has, as a special case, a method for non-binary length-limited Huffman coding, which was previously solved via dynamic programming with O(n² l_max log D) time and O(n² log D) space ...
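The length-limited Huffman problem referenced in the two entries above is classically solved by the package-merge algorithm of Larmore and Hirschberg. The sketch below implements that classical algorithm (not the alternative reductions these papers propose) for the binary case:

```python
def length_limited_lengths(weights, L):
    """Optimal binary codeword lengths with every length <= L (package-merge)."""
    n = len(weights)
    assert 2 ** L >= n, "L too small to encode all symbols"
    # A "coin" is (weight, tuple of symbol indices it contains).
    singletons = sorted((w, (i,)) for i, w in enumerate(weights))
    prev = singletons                       # deepest level (depth L) coin list
    for _ in range(L - 1):                  # move up one level per iteration
        # Package adjacent pairs; an odd leftover item is discarded.
        paired = [(prev[i][0] + prev[i + 1][0], prev[i][1] + prev[i + 1][1])
                  for i in range(0, len(prev) - 1, 2)]
        prev = sorted(paired + singletons)  # merge with fresh singletons
    lengths = [0] * n
    for _, syms in prev[:2 * n - 2]:        # select the 2n-2 cheapest packages
        for s in syms:
            lengths[s] += 1                 # each occurrence deepens the symbol
    return lengths

print(length_limited_lengths([0.4, 0.2, 0.2, 0.1, 0.1], 3))  # → [2, 2, 2, 3, 3]
```

This runs in O(nL) packages after sorting, which is what makes the dynamic-programming bounds quoted above worth improving for large alphabets.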
The Rényi redundancy of generalized Huffman codes
 IEEE Trans. Inf. Theory
, 1988
Abstract

Cited by 22 (0 self)
If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average ...
Self-Synchronization of Huffman Codes
, 2003
"... Variable length binary codes have been frequently used for communications since Huffman’s important paper on constructing minimum average length codes. One drawback of variable length codes is the potential loss of synchronization in the presence of channel errors. However, many variable length code ..."
Abstract

Cited by 4 (0 self)
... is corrupted by one or more bit errors, then as soon as the receiver by random chance correctly detects a self-synchronizing string, the receiver can continue properly parsing the bit sequence into codewords. Most commonly used binary prefix codes, including Huffman codes, are “complete”, in the sense ...
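The resynchronization behavior described in this abstract can be seen with a toy complete prefix code (a hypothetical three-symbol code chosen for illustration, not one from the paper): after a bit error, greedy parsing mis-decodes a short stretch and then falls back into step with the codeword boundaries.

```python
# Toy complete prefix code: Kraft sum 1/2 + 1/4 + 1/4 = 1.
code = {'a': '0', 'b': '10', 'c': '11'}
decode_map = {v: k for k, v in code.items()}

def decode(bits):
    """Greedy prefix-code parsing of a bit string."""
    out, cur = [], ''
    for b in bits:
        cur += b
        if cur in decode_map:
            out.append(decode_map[cur])
            cur = ''
    return ''.join(out)

msg = 'abcabca'
bits = ''.join(code[ch] for ch in msg)
corrupted = ('1' if bits[0] == '0' else '0') + bits[1:]  # flip the first bit

print(decode(bits))        # the original message
print(decode(corrupted))   # garbled start, then decoding re-synchronizes
```

Here the corrupted stream decodes wrongly only at the beginning; the tail of the output matches the tail of the original message once a codeword boundary is hit by chance.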
Algorithms for Updating Huffman Codes
Abstract
Given a list W = [w1, ..., wn] of n positive symbol weights and a list L = [l1, ..., ln] of n corresponding integer codeword lengths, it is required to find the new list L when a new value x is inserted into, or an existing value is deleted from, the list of weights W. The presented algorithm ...
Quantum-inspired Huffman Coding
"... Huffman compression, also known as Huffman coding, is one of many compression techniques in use today. The two important features of Huffman coding are instantaneousness, that is, the codes can be interpreted as soon as they are received, and variable length, that is, a most frequent symbol has ..."
Abstract

... the lengths of all the codewords are the same. A Huffman-coding-inspired scheme for the storage of quantum information takes O(N (log N)^a) computational steps for a sequential implementation on non-parallel machines. The proposed algorithm, Quantum-inspired Huffman coding of symbols with equal frequencies ...