Results 1–10 of 12
Schemes for Bidirectional Modeling of Discrete Stationary Sources
, 2005
Cited by 16 (9 self)
Abstract:
Adaptive models are developed to deal with bidirectional modeling of unknown discrete stationary sources, which can be generally applied to statistical inference problems such as noncausal universal discrete denoising that exploits bidirectional dependencies. Efficient algorithms for constructing those models are developed and implemented. Denoising is a primary focus of the application of those models, and we compare their performance to that of the DUDE algorithm [1] for universal discrete denoising.
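The DUDE algorithm [1] that this abstract compares against runs in two passes: count the center symbols seen in each two-sided context, then denoise each position by inverting the channel statistics. A minimal sketch for a binary symmetric channel under Hamming loss is below; the function name, the context radius `k`, and the pure-Python list representation are my own choices for illustration, not the authors' code.

```python
from collections import defaultdict

def dude_binary(z, delta, k):
    """Two-pass DUDE sketch: binary sequence z corrupted by a BSC(delta),
    Hamming loss, context radius k (a hedged illustration, not [1]'s code)."""
    n = len(z)
    # Pass 1: count center symbols for every two-sided context (left, right).
    counts = defaultdict(lambda: [0, 0])
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        counts[ctx][z[i]] += 1
    # BSC channel matrix Pi = [[1-d, d], [d, 1-d]] and its inverse.
    d = delta
    det = 1.0 - 2.0 * d
    pi = [[1 - d, d], [d, 1 - d]]
    inv = [[(1 - d) / det, -d / det], [-d / det, (1 - d) / det]]
    # Pass 2: denoise each interior symbol; boundary symbols are left as-is.
    out = list(z)
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        m = counts[ctx]
        # u[x] ~ (Pi^{-T} m)[x]: estimated weight of clean symbol x in this context.
        u = [inv[0][x] * m[0] + inv[1][x] * m[1] for x in (0, 1)]
        # Hamming-loss score of reconstructing xhat:
        # cost(xhat) = sum_x u[x] * 1{x != xhat} * Pi[x][z_i].
        zi = z[i]
        cost = [u[1] * pi[1][zi], u[0] * pi[0][zi]]  # cost for xhat = 0, 1
        out[i] = 0 if cost[0] <= cost[1] else 1
    return out
```

On a long run of zeros with a few isolated flips, the context counts make the flipped symbols cheap to correct, so the sketch restores the all-zeros sequence.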
Universal erasure entropy estimation
 In Proc. of the 2006 IEEE Intl. Symp. on Inform. Theory (ISIT’06)
, 2006
Cited by 7 (3 self)
Abstract:
Erasure entropy rate (introduced recently by Verdú and Weissman) differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Consistency results are shown for those CTW-based algorithms. Simulation results for those algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders. An estimate of the erasure entropy of English texts based on the proposed algorithms is about 0.22 bits per letter, which can be compared to an estimate of about 1.3 bits per letter for the entropy rate of English texts by a similar CTW-based algorithm.
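The fixed-order plug-in baseline mentioned in the abstract is straightforward to state: estimate the conditional law of each symbol given k symbols on each side from empirical counts, then average the resulting code lengths. A sketch, assuming the sequence is a Python list and using in-sample counts (the function name and interface are hypothetical, and this is the plug-in baseline, not the CTW method of the paper):

```python
import math
from collections import Counter, defaultdict

def erasure_entropy_plugin(x, k):
    """Order-k plug-in estimate of erasure entropy rate in bits/symbol:
    the average of -log2 Phat(x_i | k symbols on each side of i)."""
    n = len(x)
    # Empirical counts of center symbols per two-sided context.
    ctx_counts = defaultdict(Counter)
    for i in range(k, n - k):
        ctx = (tuple(x[i - k:i]), tuple(x[i + 1:i + 1 + k]))
        ctx_counts[ctx][x[i]] += 1
    # Average code length under the empirical conditional distribution.
    bits, total = 0.0, 0
    for i in range(k, n - k):
        ctx = (tuple(x[i - k:i]), tuple(x[i + 1:i + 1 + k]))
        c = ctx_counts[ctx]
        bits += -math.log2(c[x[i]] / sum(c.values()))
        total += 1
    return bits / total
```

For a deterministic source such as the alternating sequence 0101…, every two-sided context pins down its center symbol, so the estimate is exactly 0 bits per symbol.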
A Universal Scheme for Wyner–Ziv Coding of Discrete Sources
, 2010
Cited by 7 (0 self)
Abstract:
We consider the Wyner–Ziv (WZ) problem of lossy compression where the decompressor observes a noisy version of the source, whose statistics are unknown. A new family of WZ coding algorithms is proposed and their universal optimality is proven. Compression consists of sliding-window processing followed by Lempel–Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm to take advantage of side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
Discrete denoising with shifts
 IEEE Trans. Inf. Theory
, 2007
Cited by 4 (2 self)
Abstract:
We introduce S-DUDE, a new algorithm for denoising DMC-corrupted data. The algorithm, which generalizes the recently introduced DUDE (Discrete Universal DEnoiser) of Weissman et al., aims to compete with a genie that has access, in addition to the noisy data, to the underlying clean data, and can choose to switch, up to m times, between sliding-window denoisers in a way that minimizes the overall loss. When the underlying data form an individual sequence, we show that the S-DUDE performs essentially as well as this genie, provided that m is sublinear in the size of the data. When the clean data are emitted by a piecewise stationary process, we show that the S-DUDE achieves the optimum distribution-dependent performance, provided that the same sublinearity condition is imposed on the number of switches. To further substantiate the universal optimality of the S-DUDE, we show that when the number of switches is allowed to grow linearly with the size of the data, any (sequence of) scheme(s) fails to compete in the above senses. Using dynamic programming, we derive an efficient implementation of the S-DUDE, which has complexity (time and memory) growing only linearly with the data size and the number of switches m. Preliminary experimental results are presented, suggesting that S-DUDE has the capacity to significantly improve on the performance attained by the original DUDE in applications where the nature of the data abruptly changes in time (or space), as is often the case in practice.
Index Terms—Discrete denoising, competitive analysis, individual sequence, universal algorithms, piecewise stationary processes, dynamic programming, discrete memoryless channel (DMC), switching experts, forward-backward recursions.
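The genie in the abstract above solves a small dynamic program: given the per-position loss each sliding-window denoiser would incur against the clean data, find the best assignment of denoisers that switches at most m times. A sketch of that DP, with running time O(n·J·m) for n positions and J denoisers (the loss table is hypothetical illustrative input, not part of the paper):

```python
def best_switching_loss(losses, m):
    """Minimum total loss of a scheme that uses one denoiser per position and
    switches denoisers at most m times; losses[t][j] is the loss of denoiser j
    at position t (a sketch of the genie's DP, not the paper's implementation)."""
    n, num_denoisers = len(losses), len(losses[0])
    INF = float("inf")
    # dp[s][j]: min loss so far, ending at denoiser j, having used s switches.
    dp = [[INF] * num_denoisers for _ in range(m + 1)]
    for j in range(num_denoisers):
        dp[0][j] = losses[0][j]
    for t in range(1, n):
        new = [[INF] * num_denoisers for _ in range(m + 1)]
        for s in range(m + 1):
            # Switching into denoiser j consumes one unit of the budget;
            # min over all previous denoisers is safe (a self-switch never helps).
            best_prev = min(dp[s - 1]) if s > 0 else INF
            for j in range(num_denoisers):
                cand = min(dp[s][j], best_prev)  # stay with j, or switch to j
                if cand < INF:
                    new[s][j] = cand + losses[t][j]
        dp = new
    return min(min(row) for row in dp)
```

With two denoisers whose losses swap halfway through the data, a single switch (m = 1) recovers zero loss, whereas m = 0 is forced to pay for one of the halves.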
Universal estimation of erasure entropy
 IEEE Trans. Inf. Theory
Cited by 3 (3 self)
Abstract:
Erasure entropy rate differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, consistent universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Simulation results for those algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders.
Index Terms—Bidirectional context tree, context-tree weighting, data compression, entropy rate, universal algorithms, universal modeling.
The iDUDE framework for grayscale image denoising
 IEEE Trans. IP
Cited by 3 (2 self)
Abstract:
Image denoising, impulse noise, discrete universal denoising, DUDE
Adaptive Filtering of Raster Map Images Using Optimal Context Selection
 In Proc. 2011 18th IEEE International Conference on Image Processing
Abstract:
Filtering of raster map images, or the more general class of palette-indexed images, is considered as a discrete denoising problem with a finite color output. Statistical features of the local context are used to avoid the damage that conventional filters cause to some specific but frequently occurring contexts. Several context-based approaches have been developed using either fixed context templates or context-tree modeling. However, these algorithms fail to reveal the local geometrical structures when the underlying contexts are also contaminated. To address this problem, we propose a novel context-based voting method to identify the possibly noisy pixels, which are excluded from the context selection and optimization. Experimental results show that the proposed context-based filtering outperforms all other existing filters for both impulsive and additive Gaussian noise.
Index Terms—Nonlinear filters, context modeling