Results 1–10 of 43
Distributed video coding
 Proc. of the IEEE 93 (2005), 71–83
Cited by 311 (11 self)
Distributed coding is a new paradigm for video compression, ...
Design of Optimal Quantizers for Distributed Source Coding
 In Proc. IEEE Data Compression Conf. (DCC), Snowbird, UT, 2003
Cited by 40 (5 self)
We address the problem of designing optimal quantizers for distributed source coding. The generality of our formulation includes both the symmetric and asymmetric scenarios, together with a number of coding schemes, such as ideal coding achieving a rate equal to the joint conditional entropy of the quantized sources given the side information. We show the optimality conditions quantizers must satisfy, and generalize the Lloyd algorithm for their design.
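As a rough illustration of the centroid condition described above, the sketch below (a toy construction of mine, not the paper's algorithm) designs a scalar partition with Lloyd threshold updates and then lets the decoder reconstruct with centroids that condition on side information as well as the index:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (mine): scalar Gaussian source X, decoder side information
# Y = X + N. The decoder reconstructs from the index *and* Y.
n = 20000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)
ybin = (y > 0).astype(int)          # crude 1-bit discretization of Y

K = 4
edges = np.quantile(x, np.linspace(0, 1, K + 1)[1:-1])  # initial thresholds

for _ in range(20):
    idx = np.digitize(x, edges)     # encoder sees only X
    # Lloyd threshold step: midpoints between marginal cell centroids
    marg = np.array([x[idx == k].mean() for k in range(K)])
    edges = 0.5 * (marg[:-1] + marg[1:])

idx = np.digitize(x, edges)
marg = np.array([x[idx == k].mean() for k in range(K)])
# Generalized centroid step: E[X | quantizer index, side-information bin]
cond = np.array([[x[(idx == k) & (ybin == b)].mean() for b in range(2)]
                 for k in range(K)])
mse_cond = float(np.mean((x - cond[idx, ybin]) ** 2))  # decoder uses Y
mse_marg = float(np.mean((x - marg[idx]) ** 2))        # decoder ignores Y
print(mse_cond, mse_marg)
```

On the training data the side-information-aware centroids can never do worse than the marginal ones, since the conditional mean minimizes squared error within each finer cell.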
Lossless and near-lossless source coding for multiple access networks
 IEEE Trans. Inform. Theory, 2003
Cited by 24 (3 self)
A multiple access source code (MASC) is a source code designed for the following network configuration: a pair of correlated information sequences ...
High-rate quantization and transform coding with side information at the decoder
 EURASIP Journal on Signal Processing, 2006
Cited by 20 (1 self)
We extend high-rate quantization theory to Wyner-Ziv coding, i.e., lossy source coding with side information at the decoder. Ideal Slepian-Wolf coders are assumed; thus rates are conditional entropies of quantization indices given the side information. This theory is applied to the analysis of orthonormal block transforms for Wyner-Ziv coding. A formula for the optimal rate allocation and an approximation to the optimal transform are derived. The case of noisy high-rate quantization and transform coding is included in our study, in which a noisy observation of source data is available at the encoder, but we are interested in estimating the unseen data at the decoder, with the help of side information. We implement a transform-domain Wyner-Ziv video coder that encodes frames independently but decodes them conditionally. Experimental results show that using the discrete cosine transform results in a rate-distortion improvement with respect to the pixel-domain coder. Transform coders of noisy images for different communication constraints are compared. Experimental results show that the noisy Wyner-Ziv transform coder achieves a performance close to the case in which the side information is also available at the encoder.
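The assumption that rates equal conditional entropies of quantization indices given the side information can be checked numerically. The snippet below is my own toy example (a 4-level quantizer on a jointly Gaussian pair, with the side information discretized so that the conditional entropy can be estimated from counts; discretizing Y makes this an upper bound on the ideal rate H(Q(X)|Y)):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 200_000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)     # decoder side information

cuts = np.array([-1.0, 0.0, 1.0])
qx = np.digitize(x, cuts)            # 4-level quantizer index (the coded symbol)
qy = np.digitize(y, cuts)            # discretized side information

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

h_joint = entropy(qx * 4 + qy)
h_cond = h_joint - entropy(qy)       # H(Q(X)|Q(Y)) = H(Q(X),Q(Y)) - H(Q(Y))
h_marg = entropy(qx)                 # rate needed without side information
print(h_cond, h_marg)
```

The conditional entropy comes out well below the marginal index entropy, which is exactly the rate saving an ideal Slepian-Wolf coder would realize.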
Tracking the best quantizer
2008
Cited by 15 (5 self)
An algorithm is presented for online prediction that makes it possible to track the best expert efficiently even when the number of experts is exponentially large, provided that the set of experts has a certain additive structure. As an example, we work out the case where each expert is represented by a path in a directed graph and the loss of each expert is the sum of the weights over the edges in the path. These results are then used to construct universal limited-delay schemes for lossy coding of individual sequences. In particular, we consider the problem of tracking the best scalar quantizer that is adaptively matched to a source sequence with piecewise-varying behavior. A randomized algorithm is presented which can perform, on any source sequence, asymptotically as well as the best scalar quantization algorithm that is matched to the sequence and is allowed to change the employed quantizer a given number of times. The complexity of the algorithm is quadratic in the sequence length, but at the price of some deterioration in performance, the complexity can be made linear. Analogous results are obtained for sequential multiresolution and multiple-description scalar quantization of individual sequences.
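A minimal sketch of the tracking idea, using exponential weights with a fixed-share mixing step over two hypothetical codebooks (the codebooks, the piecewise-stationary source, and the expected-loss accounting are all mine; the paper's algorithm operates efficiently over an exponentially large, structured expert class):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical 3-point codebooks act as the "experts".
codebooks = [np.array([-1.5, 0.0, 1.5]),   # matched to a wide source
             np.array([-0.3, 0.0, 0.3])]   # matched to a narrow source
N = len(codebooks)

def loss(cb, s):
    return float(np.min((cb - s) ** 2))    # quantization distortion

eta, alpha = 1.0, 0.05
w = np.ones(N) / N
total = 0.0                                # forecaster's expected loss
fixed = np.zeros(N)                        # cumulative loss of each fixed expert

# Piecewise-stationary source: wide for 500 samples, then narrow for 1000.
xs = np.concatenate([rng.normal(0, 1.2, 500), rng.normal(0, 0.3, 1000)])
for s in xs:
    ls = np.array([loss(cb, s) for cb in codebooks])
    total += float(w @ ls)
    fixed += ls
    w = w * np.exp(-eta * ls)              # exponential weight update
    w = w / w.sum()
    w = (1 - alpha) * w + alpha / N        # fixed-share step: enables tracking
print(total, fixed.min())
```

Because some weight is always mixed back toward uniform, the forecaster re-identifies the better codebook soon after the source switches, so its total loss beats every *fixed* expert on this sequence; a plain exponential-weights update would recover only slowly.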
Joint Entropy-Constrained Multiterminal Quantization
2002
Cited by 14 (0 self)
As a rate-distortion extension of the Slepian-Wolf problem, we study the entropy-constrained design of a multiterminal quantizer for coding two correlated continuous sources. The designed quantizer can then be combined with a lossless encoder operating close to the Slepian-Wolf bound. Two design methods are presented, both optimizing a Lagrangian cost measure involving the distortion and the information rate. The first method is a simple descent algorithm, while the second is based on index reuse of a high-resolution quantizer. Numerical results are displayed. The soundness of the index-reuse method is shown, confirming the advantages of entropy constraints over simple entropy limitations.
On efficient quantizer design for robust distributed source coding
 In Proc. DCC, 2006
Cited by 11 (10 self)
This paper considers the design of efficient quantizers for a distributed source coding system. The information is encoded at independent terminals and transmitted across separate channels, any of which may fail. The scenario subsumes a wide range of vector quantization problems. Greedy descent methods rely heavily on initialization, and the presence of numerous 'poor' local optima on the distortion cost surface strongly motivates the use of a global design algorithm. We propose a deterministic annealing approach for the design of all components of a generic distributed source coding system. Our approach avoids many poor local optima, is independent of initialization, and does not assume any prior information on the underlying source distribution. Simulation results show significant gains over an iterative greedy algorithm.
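The deterministic annealing idea can be sketched for a plain scalar quantizer (my toy reduction; the paper anneals all components of the distributed system jointly). All codewords start essentially coincident, so the outcome does not depend on a lucky initialization:

```python
import numpy as np

rng = np.random.default_rng(5)

x = rng.normal(size=4000)
K = 4
cw = 1e-3 * rng.normal(size=K)            # all codewords start near one point

for T in [4.0, 2.0, 1.0, 0.5, 0.25, 0.1, 0.05]:   # cooling schedule
    cw = cw + 1e-6 * rng.normal(size=K)   # tiny jitter breaks codeword symmetry
    for _ in range(60):
        d = (x[:, None] - cw[None, :]) ** 2
        d = d - d.min(axis=1, keepdims=True)       # numerical stability
        w = np.exp(-d / T)
        w = w / w.sum(axis=1, keepdims=True)       # Gibbs soft assignment
        mass = w.sum(axis=0)
        new = (w * x[:, None]).sum(axis=0) / np.maximum(mass, 1e-8)
        cw = np.where(mass > 1e-8, new, cw)        # guard near-empty codewords

# Hard-assignment distortion of the annealed codebook
idx = np.abs(x[:, None] - cw[None, :]).argmin(axis=1)
mse = float(np.mean((x - cw[idx]) ** 2))
print(mse)
```

At high temperature every sample is shared among all codewords and they collapse to the mean; as the temperature drops below successive critical values the codewords split in phase transitions, so the final codebook is reached without a hand-picked initialization.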
Distributed predictive coding for spatiotemporally correlated sources
 In Proc. IEEE Int. Symp. Information Theory, 2007
Cited by 9 (8 self)
Distributed coding of correlated sources with memory poses a number of considerable challenges that threaten its practical application, particularly (but not only) in the context of sensor networks. This problem is strongly motivated by the obvious observation that most common sources exhibit temporal correlations that may be at least as important as spatial or inter-source correlations. This paper presents an analysis of the underlying tradeoffs, paradigms for coding systems, and approaches for distributed predictive coder design optimization. Motivated by practical limitations on both complexity and delay (especially for dense sensor networks), the focus here is on predictive coding. From the source coding perspective, the most basic tradeoff (and difficulty) is due to conflicts that arise between distributed coding and prediction, wherein “standard” distributed quantization of the prediction errors, if coupled with the imposition of zero decoder drift, would drastically compromise the predictor performance and hence the ability to exploit temporal correlations. Another challenge arises from instabilities in the design of closed-loop predictors, whose impact has been observed in the past but is greatly exacerbated in the case of distributed coding. In the distributed predictive coder design, we highlight the fundamental tradeoffs encountered within a more general paradigm where decoder drift is allowable or unavoidable, and must be effectively accounted for and controlled. We derive an overall design optimization method for distributed predictive coding that avoids the pitfalls of naive distributed predictive quantization and produces an optimized low-complexity and low-delay coding system. The proposed iterative algorithms for distributed predictive coding subsume traditional single-source predictive coding and memoryless distributed coding as extreme special cases. Index Terms—Distributed quantization, predictive coding, sensor networks.
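The closed-loop-versus-drift conflict can be seen already in a toy single-source DPCM example (my construction; the paper's setting is distributed): predicting from the reconstruction keeps encoder and decoder in sync, while predicting from the true past lets decoder error accumulate.

```python
import numpy as np

rng = np.random.default_rng(6)

a, T = 0.95, 2000
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, 0.1)   # AR(1) source

def quantize(e, step=0.05):
    return step * np.round(e / step)           # uniform scalar quantizer

def transmit(closed_loop):
    """Produce quantized residuals; `rec` mirrors the decoder state."""
    rec, qs = 0.0, []
    for t in range(T):
        if closed_loop:
            pred = a * rec                     # predict from reconstruction
        else:
            pred = a * x[t - 1] if t > 0 else 0.0   # predict from true past
        q = quantize(x[t] - pred)
        qs.append(q)
        rec = a * rec + q                      # decoder recursion
    return qs

def decode(qs):
    rec, out = 0.0, []
    for q in qs:
        rec = a * rec + q
        out.append(rec)
    return np.array(out)

mse_closed = float(np.mean((x - decode(transmit(True))) ** 2))
mse_drift = float(np.mean((x - decode(transmit(False))) ** 2))
print(mse_closed, mse_drift)
```

In the closed-loop case the reconstruction error stays bounded by the quantizer step; in the open-loop case the per-step quantization mismatch is fed through the predictor and accumulates as drift.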
State estimation over packet dropping networks using multiple description coding
 Automatica
Cited by 9 (1 self)
For state estimation over a communication network, the efficiency and reliability of the network are critical issues. The presence of packet dropping and communication delay can greatly impair our ability to measure and predict states. In this paper, multiple description (MD) codes, a type of network source code, are used to compensate for this effect in Kalman filtering. We consider two packet dropping models: in one model, packet dropping occurs according to an independent and identically distributed (i.i.d.) Bernoulli random process, and in the other, packet dropping is bursty and occurs according to a Markov chain. We show that MD codes greatly improve the statistical stability and performance of the Kalman filter over a large set of packet loss scenarios in both cases. Our conclusions are verified by simulation results.
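A heavily idealized sketch of why MD coding helps under the i.i.d. Bernoulli model (my construction, not the paper's scheme): with two descriptions sent over independent links, the measurement is lost only when both packets drop, so a per-link loss probability p behaves like p²; a surviving single description is treated here as full quality.

```python
import numpy as np

a, q, r = 0.95, 0.1, 0.2     # state, process-noise and measurement-noise params

def run(loss_prob, seed, T=5000):
    """Scalar Kalman filter; on a dropped packet, only the time update runs."""
    rng = np.random.default_rng(seed)
    x, xh, P, se = 0.0, 0.0, 1.0, 0.0
    for _ in range(T):
        x = a * x + rng.normal(0, np.sqrt(q))
        z = x + rng.normal(0, np.sqrt(r))
        xh, P = a * xh, a * a * P + q          # time update
        if rng.random() >= loss_prob:          # measurement delivered
            K = P / (P + r)                    # Kalman gain
            xh, P = xh + K * (z - xh), (1 - K) * P
        se += (x - xh) ** 2
    return se / T

p = 0.4
mse_single = run(p, seed=7)      # one description, lost with probability p
mse_md = run(p * p, seed=7)      # idealized MD: lost only if both links drop
print(mse_single, mse_md)
```

Running both cases on the same seed gives a paired comparison: every packet the single-description filter receives is also received under the MD model, so the improvement isolates the effect of the reduced loss probability.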
Distributed Beamforming in Wireless Multiuser Relay-Interference Networks with Quantized Feedback
 IEEE Trans. on Information Theory