Results 1 – 7 of 7
Universal estimation of erasure entropy
 IEEE Trans. Inf. Theory
Abstract

Cited by 3 (3 self)
Abstract—Erasure entropy rate differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, consistent universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Simulation results for those algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders. Index Terms—Bidirectional context tree, context-tree weighting, data compression, entropy rate, universal algorithms, universal modeling.
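The fixed-order plug-in estimators that the abstract uses as a baseline admit a compact sketch: estimate H(X_0 | X_{-k}^{-1}, X_1^{k}) by replacing the true conditional distribution with empirical counts. The function name and the assumption of a small finite alphabet are illustrative, not taken from the paper:

```python
# Minimal sketch of a fixed-order plug-in erasure-entropy estimator:
# empirical H(X_t | k past symbols, k future symbols) in bits.
# This is the baseline the paper's CTW-based estimators are compared to,
# not the CTW algorithms themselves.
from collections import Counter
from math import log2

def plugin_erasure_entropy(x, k=1):
    """Empirical conditional entropy of x[t] given k past and k future symbols."""
    ctx_counts = Counter()      # counts of (past, future) contexts
    joint_counts = Counter()    # counts of (past, future, symbol) triples
    n = len(x)
    for t in range(k, n - k):
        past = tuple(x[t - k:t])
        future = tuple(x[t + 1:t + 1 + k])
        ctx_counts[(past, future)] += 1
        joint_counts[(past, future, x[t])] += 1
    total = sum(joint_counts.values())
    h = 0.0
    for (past, future, sym), c in joint_counts.items():
        p_joint = c / total                      # empirical joint probability
        p_cond = c / ctx_counts[(past, future)]  # empirical conditional probability
        h -= p_joint * log2(p_cond)
    return h
```

On a deterministic alternating sequence the estimate is 0, since each symbol is fully determined by its neighbors; for genuinely random sources the fixed order k biases the estimate, which is the shortcoming the universal CTW approach addresses.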
Universal Estimation of Directed Information via Sequential Probability Assignments
Abstract

Cited by 1 (0 self)
Abstract—We propose four approaches to estimating the directed information rate between a pair of jointly stationary ergodic processes with the help of universal probability assignments. The four approaches yield estimators with different merits such as nonnegativity and boundedness. We establish consistency of these estimators in various senses and derive near-optimal rates of convergence in the minimax sense under mild conditions. The estimators carry over directly to estimating other information measures of stationary ergodic processes, such as entropy rate and mutual information rate, and provide alternatives to classical approaches in the existing literature. Guided by the theoretical results, we use context tree weighting as the vehicle for the implementations of the proposed estimators. Experiments on synthetic and real data are presented, demonstrating the potential of the proposed schemes in practice and the efficacy of directed information estimation as a tool for detecting and measuring causality and delay. Index Terms—Causal influence, context tree weighting, directed information, rate of convergence, universal probability assignment.
Discrete Universal DEnoiser
Abstract
3. (slang) A term of address for a man. 4. (archaic) A dandy, a man who is very concerned about his dress and appearance. 5. (slang) A cool person of either sex.
Universal Lossless Compression of Erased Symbols
Abstract
Abstract—A source X goes through an erasure channel whose output is Z. The goal is to compress X losslessly when the compressor knows X and Z and the decompressor knows Z. We propose a universal algorithm based on context-tree weighting (CTW), parameterized by a memory-length parameter ℓ. We show that if the erasure channel is stationary and memoryless, and X is stationary and ergodic, then the proposed algorithm achieves a compression rate of H(X_0 | X_{-ℓ}^{-1}, Z_{-ℓ}^{ℓ}) bits per erasure. Index Terms—Context-tree weighting, discrete memoryless erasure channel, entropy, erasure entropy, side information, universal lossless compression.
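The setting can be illustrated with a toy sketch: only the symbols erased by the channel need to be coded, and each is coded against a context the decompressor already has (non-erased symbols from Z, earlier erased symbols from prior decoding). A KT assignment over fixed-length binary contexts stands in for the paper's CTW algorithm; the function name and the binary alphabet are assumptions for illustration:

```python
# Toy sketch: ideal KT code length, in bits per erasure, when only
# erased symbols are coded and each is conditioned on the l preceding
# source symbols (all available to the decompressor). A fixed-order KT
# model replaces the paper's CTW; binary alphabet assumed.
from collections import defaultdict
from math import log2

def erased_symbol_code_length(x, erased, l=2):
    """Average ideal code length per erased symbol of x, given erasure flags."""
    counts = defaultdict(lambda: [0, 0])  # KT counts per length-l context
    bits, n_erased = 0.0, 0
    for t in range(l, len(x)):
        ctx = tuple(x[t - l:t])
        c = counts[ctx]
        if erased[t]:
            # Pay the sequential KT code length only at erased positions.
            bits -= log2((c[x[t]] + 0.5) / (c[0] + c[1] + 1.0))
            n_erased += 1
        # The model updates on every symbol, since the decoder knows them all.
        c[x[t]] += 1
    return bits / max(n_erased, 1)
```

For a deterministic source the per-erasure rate decays toward zero, reflecting that the conditional entropy of an erased symbol given its context vanishes; for random sources it approaches the conditional entropy in the rate expression above.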