Results 1 - 10 of 42
Generalized multiple description coding with correlating transforms
- IEEE Trans. Inform. Theory, 2001
- Cited by 82 (2 self)
Abstract—Multiple description (MD) coding is source coding in which several descriptions of the source are produced such that various reconstruction qualities are obtained from different subsets of the descriptions. Unlike multiresolution or layered source coding, there is no hierarchy of descriptions; thus, MD coding is suitable for packet erasure channels or networks without priority provisions. Generalizing work by Orchard, Wang, Vaishampayan, and Reibman, a transform-based approach is developed for producing M descriptions of an n-tuple source, M ≤ n. The descriptions are sets of transform coefficients, and the transform coefficients of different descriptions are correlated so that missing coefficients can be estimated. Several transform optimization results are presented for memoryless Gaussian sources, including a complete solution of the two-description, two-tuple case with arbitrary weighting of the descriptions. The technique is effective only when independent components of the source have differing variances. Numerical studies show that this method performs well at low redundancies, as compared to uniform MD scalar quantization. Index Terms—Erasure channels, integer-to-integer transforms, packet networks, robust source coding.
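As a rough numerical sketch of the idea in this abstract (the transform, source variances, and linear estimator below are illustrative choices, not the paper's optimized design), a 2×2 correlating transform makes the two descriptions statistically dependent, so a lost coefficient can be estimated from the received one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Source: two independent Gaussian components with differing variances
# (the regime in which the abstract says the technique is effective).
x = rng.normal(size=(2, n)) * np.array([[3.0], [1.0]])

# A 2x2 correlating transform (a 45-degree rotation, one common choice).
T = np.array([[1.0, 1.0], [-1.0, 1.0]]) / np.sqrt(2.0)
y = T @ x                          # y[0] and y[1] are the two descriptions

# The transform correlates the descriptions:
rho = np.corrcoef(y)[0, 1]

# If description 2 is lost, estimate y[1] from y[0] by linear MMSE,
# then invert the transform to reconstruct the source.
a = np.cov(y)[0, 1] / np.var(y[0])
y1_hat = a * y[0]
x_hat = np.linalg.inv(T) @ np.vstack([y[0], y1_hat])
side_mse = np.mean((x - x_hat) ** 2)   # well below the source power of 5
```

With these variances the descriptions have correlation about −0.8, so the one-description (side) reconstruction error stays far below the source power.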
Multiple Description Coding with Many Channels
- IEEE Trans. Inform. Theory, 2003
- Cited by 74 (1 self)
An achievable region for the L-channel multiple description coding problem is presented. This region generalizes two-channel results of El Gamal and Cover and of Zhang and Berger. It further generalizes three-channel results of Gray and Wyner and of Zhang and Berger. A source that is successively refinable on chains is shown to be successively refinable on trees. A new outer bound on the rate-distortion (RD) region for memoryless Gaussian sources with mean squared error distortion is also derived. The achievable region meets this outer bound for certain symmetric cases.
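For reference, the two-channel El Gamal–Cover achievable region that this work generalizes can be stated as follows: a rate pair $(R_1, R_2)$ is achievable with distortions $(D_0, D_1, D_2)$ if

```latex
\begin{align*}
R_1 &> I(X; \hat{X}_1), \\
R_2 &> I(X; \hat{X}_2), \\
R_1 + R_2 &> I(X; \hat{X}_0, \hat{X}_1, \hat{X}_2) + I(\hat{X}_1; \hat{X}_2),
\end{align*}
\text{for some reconstructions } \hat{X}_0, \hat{X}_1, \hat{X}_2
\text{ with } \mathbb{E}\, d(X, \hat{X}_k) \le D_k,\; k = 0, 1, 2.
```

Here $\hat{X}_1, \hat{X}_2$ are the side (single-description) reconstructions and $\hat{X}_0$ the central one; the cross term $I(\hat{X}_1; \hat{X}_2)$ is the rate penalty for making the two descriptions individually useful.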
Source-channel diversity for parallel channels
- IEEE Trans. Inform. Theory, 2005
- Cited by 49 (5 self)
We consider transmitting a source across a pair of independent, nonergodic channels with random states (e.g., slow-fading channels) so as to minimize the average distortion. The general problem is unsolved. Hence, we focus on comparing two commonly used source and channel encoding systems which correspond to exploiting diversity either at the physical layer through parallel channel coding or at the application layer through multiple description (MD) source coding. For on–off channel models, source coding diversity offers better performance. For channels with a continuous range of reception quality, we show the reverse is true. Specifically, we introduce a new figure of merit called the distortion exponent which measures how fast the average distortion decays with signal-to-noise ratio. For continuous-state models such as additive white Gaussian noise (AWGN) channels with multiplicative Rayleigh fading, optimal channel coding diversity at the physical layer is more efficient than source coding diversity at the application layer in that the former achieves a better distortion exponent. Finally, we consider a third decoding architecture: MD encoding with joint source–channel decoding. We show that this architecture achieves the same distortion exponent as systems with optimal channel coding diversity for continuous-state channels, and maintains the advantages of MD systems for on–off channels. Thus, the MD system with joint decoding achieves the best performance from among the three architectures considered, on both continuous-state and on–off channels.
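The distortion exponent introduced in this abstract is conventionally defined as

```latex
\Delta \;=\; -\lim_{\mathsf{SNR} \to \infty}
\frac{\log \mathbb{E}\,[\,D(\mathsf{SNR})\,]}{\log \mathsf{SNR}},
```

so that the average end-to-end distortion decays as $\mathbb{E}[D] \approx \mathsf{SNR}^{-\Delta}$ at high signal-to-noise ratio, and a larger exponent means faster decay.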
Vector Gaussian multiple description with individual and central receivers
- IEEE Trans. Inform. Theory, 2007
- Cited by 42 (3 self)
The problem of L multiple descriptions of a stationary and ergodic Gaussian source with two levels of receivers is investigated. Each of the first-level receivers receives an arbitrary subset of k of the L descriptions (k < L). The second-level receiver receives all L descriptions. All the receivers, at both the first and the second level, reconstruct the source using the subset of descriptions they receive. The corresponding reconstructions are subject to quadratic distortion constraints. Our main result is the derivation of an outer bound on the sum rate of the descriptions so that the distortion constraints are met. We show that a natural analog-digital separation architecture involving joint Gaussian vector quantizers and a binning scheme meets this outer bound with equality for several scenarios. These scenarios include the case when the distortion constraints are symmetric and the case of general distortion constraints with k = 2 and L = 3. We also show the robustness of this architecture: the distortions achieved are no larger when it is used to describe any non-Gaussian source with the same covariance matrix.
Information Theoretic Proofs of Entropy Power Inequalities
- 2007
- Cited by 29 (2 self)
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon’s entropy power inequality (EPI) is an exception: Existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn’s identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman’s Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder’s generalized EPI for linear transformations of the random variables, Takano and Johnson’s EPI for dependent variables, Liu and Viswanath’s covariance-constrained EPI, and Costa’s concavity inequality for the entropy power.
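For reference, Shannon's EPI discussed here states that for independent random vectors $X, Y$ in $\mathbb{R}^n$ with densities,

```latex
N(X + Y) \;\ge\; N(X) + N(Y),
\qquad \text{where} \qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}
```

is the entropy power and $h(\cdot)$ is differential entropy, with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.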
Multiple Description Quantization Via Gram-Schmidt Orthogonalization
- 2005
- Cited by 29 (11 self)
The multiple description (MD) problem has received considerable attention as a model of information transmission over unreliable channels. A general framework for designing efficient multiple description quantization schemes is proposed in this paper. We provide a systematic treatment of the El Gamal-Cover (EGC) achievable MD rate-distortion region, and show that any point in the EGC region can be achieved via a successive quantization scheme along with quantization splitting. For the quadratic Gaussian case, the proposed scheme has an intrinsic connection with the Gram-Schmidt orthogonalization, which implies that the whole Gaussian MD rate-distortion region is achievable with a sequential dithered lattice-based quantization scheme as the dimension of the (optimal) lattice quantizers becomes large. Moreover, this scheme is shown to be universal for all i.i.d. smooth sources with performance no worse than that for an i.i.d. Gaussian source with the same variance and asymptotically optimal at high resolution. A class of low-complexity MD scalar quantizers within the proposed general framework is also constructed and illustrated geometrically; its performance, analyzed in the high-resolution regime, exhibits a noticeable improvement over existing MD scalar quantization schemes.
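The Gram-Schmidt connection mentioned above rests on successively removing, from each new quantity, the part already explained by earlier ones. A minimal sketch of the orthogonalization procedure itself (illustrative only, not the authors' quantization scheme):

```python
import numpy as np

def gram_schmidt(rows):
    """Classical Gram-Schmidt: orthonormalize the rows of `rows`,
    dropping vectors that are (numerically) linearly dependent."""
    basis = []
    for v in np.asarray(rows, dtype=float):
        w = v.copy()
        for b in basis:
            w -= np.dot(v, b) * b   # remove the component along b
        norm = np.linalg.norm(w)
        if norm > 1e-10:            # keep only the new, unexplained part
            basis.append(w / norm)
    return np.array(basis)

# Three linearly independent vectors in R^3.
A = [[1.0, 1.0, 0.0],
     [1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
Q = gram_schmidt(A)   # rows of Q are orthonormal and span the same space
```

In the MD setting, the analogous step is quantizing only the innovation of each description given the previously generated ones, which is what lets the scheme sweep out the whole EGC region.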
Universal Multiple Description Scalar Quantization: Analysis and Design
- IEEE Trans. Inform. Theory, 2004
- Cited by 22 (3 self)
This paper introduces a new high-rate analysis of the entropy-constrained multiple description scalar quantizer (ECMDSQ). The analysis provides insight into the structure of the ECMDSQ, suggesting the non-optimality of uniform central quantizer cells with finite diagonals in the index assignment matrix, as well as a method to approximate optimal cell sizes. Based on these insights, a Universal Multiple Description Scalar Quantizer (UMDSQ) is proposed which can achieve nearly the same performance as the fully optimized ECMDSQ [1], at much lower design complexity. The design requires selection of only two parameters, and the resulting UMDSQ can provide a continuum of trade-off points between the central and side distortions as the two parameters are varied.
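To make the index assignment matrix mentioned above concrete, here is a toy diagonal assignment in the spirit of MD scalar quantization (a simplified illustration, not the optimized ECMDSQ design): central-quantizer cells are placed on the diagonals of a matrix whose row and column are the two side indices, so a decoder that receives only one description still narrows the central index to a small range.

```python
def diagonal_index_assignment(m, bandwidth=1):
    """Toy MDSQ-style index assignment: place central-quantizer indices on
    the diagonals (|i - j| <= bandwidth) of an m x m side-index matrix,
    ordered by increasing i + j.  Returns {central_index: (i, j)}."""
    cells = [(i, j) for i in range(m) for j in range(m)
             if abs(i - j) <= bandwidth]
    cells.sort(key=lambda ij: (ij[0] + ij[1], ij[0]))
    return {c: ij for c, ij in enumerate(cells)}

assignment = diagonal_index_assignment(4)

# If description 2 is lost, the decoder sees only the row index i.  Because
# the assignment hugs the main diagonal, central indices sharing a row are
# close together, which is what keeps the side distortion bounded.
rows = {}
for c, (i, j) in assignment.items():
    rows.setdefault(i, []).append(c)
row_spread = max(max(v) - min(v) for v in rows.values())
```

Widening `bandwidth` trades the other way: more central cells (finer central resolution) but larger spread per row, i.e. higher side distortion.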
Optimal Filter Banks for Multiple Description Coding: Analysis and Synthesis
- IEEE Trans. Inform. Theory
- Cited by 20 (4 self)
Multiple Description (MD) coding is a source coding technique for information transmission over unreliable networks. In MD coding, the coder generates several different descriptions of the same signal and the decoder can produce a useful reconstruction of the source with any received subset of these descriptions. In this paper we study the problem of MD coding of stationary Gaussian sources with memory. First, we compute an approximate MD rate distortion region for these sources, which we prove to be asymptotically tight at high rates. This region generalizes the MD rate distortion region of El Gamal, Cover and Ozarow for memoryless Gaussian sources. Then, we develop an algorithm for the design of optimal two-channel biorthogonal filter banks for MD coding of Gaussian sources. We show that optimal filters are obtained by allocating the redundancy over frequency with a reverse "water-filling" strategy. Finally, we present experimental results which show the effectiveness of our filter banks in the low complexity, low rate regime.
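The reverse "water-filling" allocation the authors refer to is the classical solution of the Gaussian rate-distortion problem over frequency: for a stationary Gaussian source with power spectral density $S(\omega)$ and a water level $\theta$ chosen to meet the target distortion,

```latex
D(\theta) = \frac{1}{2\pi} \int_{-\pi}^{\pi} \min\{\theta,\, S(\omega)\}\, d\omega,
\qquad
R(\theta) = \frac{1}{2\pi} \int_{-\pi}^{\pi}
\max\Bigl\{0,\; \tfrac{1}{2}\log \frac{S(\omega)}{\theta}\Bigr\}\, d\omega .
```

Frequencies where $S(\omega) \le \theta$ receive no rate; the filter-bank design in the paper distributes redundancy across frequency by an analogous rule.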
Dithered Lattice-Based Quantizers for Multiple Descriptions
- IEEE Trans. Inform. Theory, 2002
- Cited by 18 (1 self)
(Show Context)
Multiple description source coding is aimed at achieving graceful degradation of the reconstruction when portions of the code are lost, at the cost of some redundancy. We examine multiple description schemes which use entropy-coded dithered lattice quantizers (ECDQ). We propose two techniques, one based on successive refinement (SR), and the other a dithered and periodic version of the multiple description scalar quantizer (MDSQ) with distributed cells proposed by Vaishampayan. As in the single-description case, both techniques are universal in nature and are equivalent to additive noise channels. This allows us to derive analytical...
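The "equivalent to additive noise channels" property comes from subtractive dithering, which can be sketched in one dimension (the step size and source below are arbitrary illustrative choices): quantizing x + u and subtracting the shared dither u yields an error that is uniformly distributed and independent of the source.

```python
import numpy as np

rng = np.random.default_rng(1)
delta = 0.5                        # quantizer step size (illustrative)
x = rng.normal(size=200_000)       # source samples

# Subtractive dither: u ~ Uniform(-delta/2, delta/2), known to the decoder.
u = rng.uniform(-delta / 2, delta / 2, size=x.shape)
q = delta * np.round((x + u) / delta)   # uniform (1-D lattice) quantization
x_hat = q - u                           # decoder subtracts the dither

# Key ECDQ property: the error is uniform on (-delta/2, delta/2) and
# independent of x, so the quantizer acts as an additive noise channel.
err = x_hat - x
mse = np.mean(err ** 2)            # close to delta**2 / 12
```

The empirical error power matches the additive-noise model delta²/12 regardless of the source distribution, which is what makes the schemes in the paper universal.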