Results 1–10 of 16
Representation of Mutual Information Via Input Estimates
Cited by 26 (4 self)
Abstract—A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language. Index Terms—Computation of mutual information, extrinsic information, input estimation, low-density parity-check (LDPC) codes, minimum mean-square error (MMSE), mutual information, soft channel decoding.
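For reference, the Gaussian-channel relation this abstract builds on (the I-MMSE formula of Guo, Shamai, and Verdú) reads, for Y = √snr·X + N with standard Gaussian noise N independent of X:

```latex
\frac{d}{d\,\mathrm{snr}}\, I\!\left(X;\, \sqrt{\mathrm{snr}}\,X + N\right)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr})
  \;=\; \frac{1}{2}\,\mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid Y]\bigr)^{2}\right].
```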
The Information Lost in Erasures
, 2008
Cited by 19 (3 self)
We consider sources and channels with memory observed through erasure channels. In particular, we examine the impact of sporadic erasures on the fundamental limits of lossless data compression, lossy data compression, channel coding, and denoising. We define the erasure entropy of a collection of random variables as the sum of entropies of the individual variables conditioned on all the rest. The erasure entropy measures the information content carried by each symbol knowing its context. The erasure entropy rate is shown to be the minimal number of bits per erasure required to recover the lost information in the limit of small erasure probability. When we allow recovery of the erased symbols within a prescribed degree of distortion, the fundamental tradeoff is described by the erasure rate–distortion function, which we characterize. We show that in the regime of sporadic erasures, knowledge at the encoder of the erasure locations does not lower the rate required to achieve a given distortion. When no additional encoded information is available, the erased information is reconstructed solely on the basis of its context by a denoiser. Connections between erasure entropy and discrete denoising are developed. The decrease of the capacity of channels with memory due to sporadic memoryless erasures is also characterized in wide generality.
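As a concrete instance of the definition above, the erasure entropy rate of a stationary first-order Markov source reduces to H(X_0 | X_{-1}, X_1), computable from the joint law of three consecutive symbols. A minimal sketch for a binary symmetric Markov source (our own illustration, not code from the paper; function names are ours):

```python
import numpy as np

def erasure_entropy_rate_markov(p):
    """Erasure entropy rate H(X_0 | X_{-1}, X_1), in bits, of a binary
    symmetric Markov source with transition probability p."""
    P = np.array([[1 - p, p], [p, 1 - p]])   # transition kernel
    # Stationary distribution is uniform; joint law of (X_{-1}, X_0, X_1).
    joint = np.zeros((2, 2, 2))
    for a in range(2):
        for b in range(2):
            for c in range(2):
                joint[a, b, c] = 0.5 * P[a, b] * P[b, c]

    def H(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))

    # H(X_0 | X_{-1}, X_1) = H(X_{-1}, X_0, X_1) - H(X_{-1}, X_1)
    return H(joint.ravel()) - H(joint.sum(axis=1).ravel())

def entropy_rate_markov(p):
    """Shannon entropy rate H(X_0 | X_{-1}) of the same source, for comparison."""
    return -(1 - p) * np.log2(1 - p) - p * np.log2(p) if 0 < p < 1 else 0.0
```

For p = 0.5 the source is i.i.d. uniform and both rates equal 1 bit; for p < 0.5 the erasure entropy rate is strictly below the Shannon entropy rate, since two-sided conditioning can only reduce uncertainty.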
Multidirectional context sets with applications to universal denoising and compression
 in Proc. IEEE Int. Symp. Information Theory
, 2005
Cited by 12 (4 self)
Abstract—The classical framework of context-tree models used in sequential decision problems such as compression and prediction is generalized to a setting in which the observations are multi-tracked or multi-directional, and for which it may be beneficial to consider contexts comprised of possibly differing numbers of symbols from each track or direction. Context set definitions, tree representations, and pruning algorithms are all extended from the classical unidirectional setting to the m-directional setting, with an emphasis on the case of m = 2. We provide a simple example suggesting that determining (pruning) the best m-directional context set for m ≥ 3 is substantially more complex than in the case of m = 2. After briefly describing how the multi-directional framework can be applied to universal data compression, we focus on its application to universal denoising, where we pair the proposed framework with a new technique for estimating the loss of a denoising algorithm based only on noisy observations.
Universal algorithms for channel decoding of uncompressed sources
 IEEE Trans. Inform. Theory
, 2008
Cited by 11 (5 self)
In many applications, an uncompressed source stream is systematically encoded by a channel code (which ignores the source redundancy) for transmission over a discrete memoryless channel. The decoder knows the channel and the code but does not know the source statistics. This paper proposes several universal channel decoders that take advantage of the source redundancy without requiring prior knowledge of its statistics.
Universal erasure entropy estimation
 In Proc. of the 2006 IEEE Intl. Symp. on Inform. Theory (ISIT’06)
, 2006
Cited by 7 (3 self)
Abstract—Erasure entropy rate (introduced recently by Verdú and Weissman) differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Consistency results are shown for these CTW-based algorithms. Simulation results for these algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders. An estimate of the erasure entropy of English texts based on the proposed algorithms is about 0.22 bits per letter, which can be compared to an estimate of about 1.3 bits per letter for the entropy rate of English texts by a similar CTW-based algorithm.
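The fixed-order plug-in estimators used as a baseline above estimate H(X_0 | X_{-k}..X_{-1}, X_1..X_k) from empirical counts of two-sided contexts. A minimal sketch (our own illustration, not the paper's code; the CTW estimators it proposes are more involved):

```python
from collections import Counter
from math import log2

def plugin_erasure_entropy(x, k=1):
    """Order-k plug-in estimate of erasure entropy rate from a sample
    sequence x: the empirical conditional entropy of each symbol given
    its k-symbol past and k-symbol future."""
    ctx_center = Counter()   # counts of (two-sided context, center symbol)
    ctx = Counter()          # counts of two-sided contexts alone
    n = len(x)
    for i in range(k, n - k):
        c = (tuple(x[i - k:i]), tuple(x[i + 1:i + k + 1]))
        ctx_center[(c, x[i])] += 1
        ctx[c] += 1
    total = sum(ctx_center.values())
    h = 0.0
    for (c, _), m in ctx_center.items():
        h -= (m / total) * log2(m / ctx[c])
    return h
```

On an alternating sequence the center symbol is determined by its neighbors, so the estimate is 0; on a period-4 sequence such as 0,0,1,1,... the two-sided context leaves one bit of uncertainty, so the estimate is close to 1 bit per symbol.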
A Universal Scheme for Wyner–Ziv Coding of Discrete Sources
, 2010
Cited by 7 (0 self)
We consider the Wyner–Ziv (WZ) problem of lossy compression where the decompressor observes a noisy version of the source, whose statistics are unknown. A new family of WZ coding algorithms is proposed and their universal optimality is proven. Compression consists of sliding-window processing followed by Lempel–Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm to take advantage of side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text, using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
Discrete denoising with shifts
 IEEE Trans. Inf. Theory
, 2007
Cited by 4 (2 self)
We introduce S-DUDE, a new algorithm for denoising DMC-corrupted data. The algorithm, which generalizes the recently introduced DUDE (Discrete Universal DEnoiser) of Weissman et al., aims to compete with a genie that has access not only to the noisy data but also to the underlying clean data, and can choose to switch, up to m times, between sliding-window denoisers in a way that minimizes the overall loss. When the underlying data form an individual sequence, we show that the S-DUDE performs essentially as well as this genie, provided that m is sublinear in the size of the data. When the clean data are emitted by a piecewise stationary process, we show that the S-DUDE achieves the optimum distribution-dependent performance, provided that the same sublinearity condition is imposed on the number of switches. To further substantiate the universal optimality of the S-DUDE, we show that when the number of switches is allowed to grow linearly with the size of the data, any (sequence of) scheme(s) fails to compete in the above senses. Using dynamic programming, we derive an efficient implementation of the S-DUDE, whose complexity (time and memory) grows only linearly with the data size and the number of switches m. Preliminary experimental results are presented, suggesting that S-DUDE has the capacity to significantly improve on the performance attained by the original DUDE in applications where the nature of the data changes abruptly in time (or space), as is often the case in practice. Index Terms—Discrete denoising, competitive analysis, individual sequence, universal algorithms, piecewise stationary processes, dynamic programming, discrete memoryless channel (DMC), switching experts, forward-backward recursions.
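The sliding-window denoisers referred to above are instances of the original DUDE rule, which S-DUDE switches between. A minimal binary sketch for a BSC with known crossover probability delta and Hamming loss (our own illustration, not the authors' code; function and variable names are ours):

```python
import numpy as np
from collections import defaultdict

def dude(z, delta, k=2):
    """Minimal binary DUDE: two-sided context radius k, BSC(delta),
    Hamming loss. Boundary symbols are passed through unchanged."""
    n = len(z)
    Pi = np.array([[1 - delta, delta], [delta, 1 - delta]])  # channel matrix
    Pi_inv = np.linalg.inv(Pi)
    Lam = 1.0 - np.eye(2)                                    # Hamming loss matrix
    # First pass: count center symbols for each two-sided noisy context.
    counts = defaultdict(lambda: np.zeros(2))
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        counts[c][z[i]] += 1
    # Second pass: apply the DUDE rule
    #   x_hat = argmin_xh  m(c)^T Pi^{-1} (Lam[:, xh] * Pi[:, z_i]).
    x_hat = list(z)
    for i in range(k, n - k):
        c = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        m = counts[c]
        pi_z = Pi[:, z[i]]
        scores = [m @ Pi_inv @ (Lam[:, xh] * pi_z) for xh in (0, 1)]
        x_hat[i] = int(np.argmin(scores))
    return x_hat
```

On a long all-zeros sequence with a few isolated flips, the count vector for the all-zeros context overwhelmingly favors 0, so the rule flips the corrupted bits back.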
Universal estimation of erasure entropy
 IEEE Trans. Inf. Theory
Cited by 3 (3 self)
Abstract—Erasure entropy rate differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, consistent universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Simulation results for these algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders. Index Terms—Bidirectional context tree, context-tree weighting, data compression, entropy rate, universal algorithms, universal modeling.
Adaptive Filtering of Raster Map Images Using Optimal Context Selection
 in 2011 18th IEEE International Conference on Image Processing
Filtering of raster map images, or more generally of palette-indexed images, is considered as a discrete denoising problem with a finite color output. Statistical features of the local context are used to avoid the damage that conventional filters cause to specific but frequently occurring contexts. Several context-based approaches have been developed using either fixed context templates or context-tree modeling. However, these algorithms fail to reveal the local geometrical structures when the underlying contexts are also contaminated. To address this problem, we propose a novel context-based voting method to identify possible noisy pixels, which are then excluded from context selection and optimization. Experimental results show that the proposed context-based filtering outperforms all other existing filters for both impulsive and additive Gaussian noise. Index Terms—Nonlinear filters, context modeling.
6. (algorithm) Discrete Universal DEnoiser
3. (slang) A term of address for a man. 4. (archaic) A dandy, a man who is very concerned about his dress and appearance. 5. (slang) A cool person of either sex.