Results 1–10 of 12
A universal Wyner–Ziv scheme for discrete sources
Proc. IEEE Int. Symp. Inform. Theory, 2007
Abstract — We consider the Wyner–Ziv (WZ) problem of rate-distortion coding with decoder side information, for the case where the source statistics are unknown or nonexistent. A new family of WZ coding algorithms is proposed and its universal optimality is proven. Encoding is based on a sliding-window operation followed by LZ compression, while decoding is based on a natural extension of the Discrete Universal DEnoiser (DUDE) algorithm to the case where side information is present. The effectiveness of our approach is illustrated with experiments on binary images using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
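The two-stage encoder described above (sliding-window processing followed by LZ compression) can be sketched in a few lines. The window length `k`, the `|` separator, and the use of `zlib` as a stand-in LZ compressor are illustrative assumptions, not the paper's actual construction:

```python
import zlib

def sliding_window_encode(source: str, k: int = 4) -> bytes:
    """Emit one symbol per sliding-window position, then compress the
    resulting stream with a Lempel-Ziv-style compressor (zlib stands in
    for the LZ-type compressor analyzed in the paper)."""
    # Each output symbol here is the length-k window itself; a real WZ
    # encoder would apply a rate-reducing map to each window first.
    windows = [source[i:i + k] for i in range(len(source) - k + 1)]
    return zlib.compress("|".join(windows).encode("ascii"))

x = "0110100110010110" * 16          # a repetitive binary source
enc = sliding_window_encode(x)
```

The point of the sketch is only the pipeline shape: a repetitive source yields a repetitive window stream, which the LZ stage compresses well.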
A Universal Scheme for Wyner–Ziv Coding of Discrete Sources
2010
We consider the Wyner–Ziv (WZ) problem of lossy compression where the decompressor observes a noisy version of the source, whose statistics are unknown. A new family of WZ coding algorithms is proposed and their universal optimality is proven. Compression consists of sliding-window processing followed by Lempel–Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm that takes advantage of side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
Efficient Online Schemes for Encoding Individual Sequences with Side Information at the Decoder
2009
We present adaptive online schemes for lossy encoding of individual sequences under the conditions of the Wyner–Ziv (WZ) problem, i.e., the decoder has access to side information whose statistical dependency on the source is known. Both the source sequence and the side information consist of symbols taking values in a finite alphabet X.

In the first part of this article, a set of fixed-rate scalar source codes with zero delay is presented. We propose a randomized online coding scheme that asymptotically achieves, with high probability, the performance of the best source code in the set, uniformly over all source sequences, while using the same rate and incurring zero delay. We then present an efficient algorithm for implementing our online coding scheme for a relatively small set of encoders, as well as an efficient algorithm for a larger, structured set of encoders based on a weighted-graph representation and the Weight Pushing Algorithm (WPA).

In the second part of this article, we extend our results to variable-rate coding. A set of variable-rate scalar source codes is presented, and the randomized online coding scheme is generalized to this case. Here, performance is measured by the Lagrangian cost, defined as a weighted sum of the distortion and the length of the encoded sequence. We present an efficient algorithm for implementing our online variable-rate coding scheme for a relatively small set of encoders. We then consider the special case of lossless variable-rate coding, presenting an online scheme that uses Huffman codes and showing that it can be implemented efficiently with the same graph-based methods from the first part. Combining the results of the earlier sections, we build a generalized efficient algorithm for a structured set of variable-rate encoders. Finally, we show how to generalize all the results to general distortion measures. The complexity of all the algorithms is no more than linear in the sequence length.
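The randomized expert-selection idea in the first part — pick an encoder at random with probability proportional to an exponentially weighted function of its past distortion — can be illustrated with a minimal sketch. The learning rate `eta`, the two toy quantizers, and the absolute-error distortion are hypothetical choices, and this omits the rate bookkeeping and delay analysis of the actual scheme:

```python
import math
import random

def exp_weights_online(source, encoders, distortion, eta=0.5, seed=0):
    """Randomized online selection among a finite set of scalar encoders:
    at each symbol, draw an encoder with probability proportional to its
    weight, then shrink each weight exponentially in its incurred loss."""
    rng = random.Random(seed)
    w = [1.0] * len(encoders)          # one weight per expert (encoder)
    cum = [0.0] * len(encoders)        # cumulative loss of each expert
    total = 0.0                        # loss of the randomized scheme
    for x in source:
        i = rng.choices(range(len(encoders)), weights=w)[0]
        total += distortion(x, encoders[i](x))
        for j, enc in enumerate(encoders):
            loss = distortion(x, enc(x))
            cum[j] += loss
            w[j] *= math.exp(-eta * loss)
    return total, cum

# Two toy 1-bit scalar quantizers for symbols in {0, 1, 2, 3}.
enc_low = lambda x: 0 if x < 2 else 2
enc_high = lambda x: 1 if x < 2 else 3
dist = lambda x, y: abs(x - y)

total, cum = exp_weights_online([0, 1, 2, 3] * 25, [enc_low, enc_high], dist)
```

The standard regret bound for exponential weighting guarantees that `total` tracks `min(cum)` up to a sublinear term, which is the sense in which the scheme competes with the best encoder in the set.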
Universal Wyner–Ziv coding of discrete memoryless sources with known side information statistics
2008
Abstract — We consider a universal variant of the Wyner–Ziv problem of lossy source coding with decoder side information, where the marginal distribution of the side information is known, while the conditional distribution of the source sequence given the side information is only assumed to lie in a given parametric family. Our approach combines minimum-distance channel estimation with a recent scheme of Jalali, Verdú and Weissman based on discrete universal denoising with auxiliary information. Our scheme is universal under mild regularity conditions. We also give a concrete example of a family of sources satisfying the regularity conditions.
Achievability Results for Learning Under Communication Constraints
Abstract — The problem of statistical learning is to construct an accurate predictor of a random variable as a function of a correlated random variable on the basis of an i.i.d. training sample from their joint distribution. Allowable predictors are constrained to lie in some specified class, and the goal is to approach asymptotically the performance of the best predictor in the class. We consider two settings in which the learning agent only has access to rate-limited descriptions of the training data, and present information-theoretic bounds on the predictor performance achievable in the presence of these communication constraints. Our proofs do not assume any separation structure between compression and learning, and rely on a new class of operational criteria specifically tailored to the joint design of encoders and learning algorithms in rate-constrained settings. These operational criteria naturally lead to a learning-theoretic generalization of the rate-distortion function introduced recently by Kramer and Savari in the context of rate-constrained communication of probability distributions.
Perfectly Secure Encryption of Individual Sequences
In analogy to the well-known notion of finite-state compressibility of individual sequences, due to Lempel and Ziv, we define a similar notion of "finite-state encryptability" of an individual plaintext sequence, as the minimum asymptotic key rate that must be consumed by finite-state encrypters so as to guarantee perfect secrecy in a well-defined sense. Our main basic result is that the finite-state encryptability is equal to the finite-state compressibility for every individual sequence. This parallels Shannon's classical probabilistic counterpart result, asserting that the minimum required key rate is equal to the entropy rate of the source. However, the redundancy, defined as the gap between the upper bound (direct part) and the lower bound (converse part) in the encryption problem, turns out to decay at a different rate (in fact, much slower) than the analogous redundancy associated with the compression problem. We also extend our main theorem in several directions, allowing: (i) availability of side information (SI) at the encrypter/decrypter/eavesdropper, (ii) lossy reconstruction at the decrypter, and (iii) the combination of both lossy reconstruction and SI, in the spirit of the Wyner–Ziv problem. Index Terms: information-theoretic security, Shannon's cipher system, secret key, perfect secrecy, individual sequences, finite-state machine, compressibility, incremental parsing, Lempel–Ziv.
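The direct part of the main result has a simple operational reading: compress the plaintext first, then one-time-pad only the compressed bits, so the key consumed per source symbol approaches the compressibility. The sketch below uses `zlib` as a stand-in for a finite-state-style compressor and illustrates this reading only; it is not the paper's encrypter construction:

```python
import os
import zlib

def encrypt_individual_sequence(plaintext: bytes):
    """Compress first, then one-time-pad the compressed stream: the key
    bytes consumed equal the compressed length, not the plaintext length."""
    compressed = zlib.compress(plaintext, 9)
    key = os.urandom(len(compressed))                 # truly random pad
    ciphertext = bytes(c ^ k for c, k in zip(compressed, key))
    return ciphertext, key

def decrypt_individual_sequence(ciphertext: bytes, key: bytes) -> bytes:
    # Undo the pad, then decompress to recover the plaintext exactly.
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, key)))

msg = b"abab" * 64        # a highly compressible "individual sequence"
ct, key = encrypt_individual_sequence(msg)
```

For this repetitive input the key is far shorter than the plaintext, mirroring the result that the required key rate is the compressibility rather than one key symbol per source symbol.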
Universal Coding for Lossless and Lossy Complementary Delivery Problems
2008
This paper deals with a coding problem called complementary delivery, where messages from two correlated sources are jointly encoded and each decoder reproduces one of the two messages using the other message as side information. Both lossless and lossy universal complementary delivery coding schemes are investigated. In the lossless case, it is demonstrated that a universal complementary delivery code can be constructed by simply combining two Slepian–Wolf codes. In particular, it is shown that a universal lossless complementary delivery code, for which the error probability is exponentially tight, can be constructed from two linear Slepian–Wolf codes. In the lossy case, a universal complementary delivery coding scheme based on Wyner–Ziv codes is proposed. While the proposed scheme cannot attain the optimal rate-distortion tradeoff in general, the rate loss is upper bounded by a universal constant under some mild conditions. The proposed schemes allow us to apply any Slepian–Wolf and Wyner–Ziv codes to complementary delivery coding.
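For equal-length binary messages, the simplest lossless complementary delivery code is the bitwise XOR of the two messages: each decoder recovers the missing message by XORing the broadcast with its own side information. This textbook special case (not the paper's Slepian–Wolf-based construction) shows why a single broadcast can serve both decoders:

```python
def cd_encode(x: bytes, y: bytes) -> bytes:
    """Broadcast the bitwise XOR of the two correlated messages."""
    assert len(x) == len(y)
    return bytes(a ^ b for a, b in zip(x, y))

def cd_decode(broadcast: bytes, side_info: bytes) -> bytes:
    """XOR the broadcast with the locally known message to get the other."""
    return bytes(a ^ b for a, b in zip(broadcast, side_info))

x, y = b"\x0f\xf0\xaa\x55", b"\x33\x55\xcc\xff"
z = cd_encode(x, y)       # one broadcast, two recoverable messages
```

The rate of this code is one broadcast bit per source bit regardless of the correlation; the schemes in the paper improve on this by exploiting the joint statistics through Slepian–Wolf and Wyner–Ziv coding.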