The Lossy Common Information of Correlated Sources
"... Abstract—The two most prevalent notions of common infor-mation (CI) are due to Wyner and Gács-Körner and both the notions can be stated as two different characteristic points in the lossless Gray-Wyner region. Although the information theoretic characterizations for these two CI quantities can be ea ..."
Abstract
- Add to MetaCart
(Show Context)
Abstract—The two most prevalent notions of common information (CI) are due to Wyner and Gács-Körner, and both notions can be stated as two different characteristic points in the lossless Gray-Wyner region. Although the information theoretic characterizations of these two CI quantities can be easily evaluated for random variables with infinite entropy (e.g., continuous random variables), their operational significance applies only to the lossless framework. The primary objective of this paper is to generalize these two CI notions to the lossy Gray-Wyner network, which hence extends the theoretical foundation to general sources and distortion measures. We begin by deriving a single-letter characterization for the lossy generalization of Wyner's CI, defined as the minimum rate on the shared branch of the Gray-Wyner network while maintaining the minimum sum transmit rate when the two decoders reconstruct the sources subject to individual distortion constraints. To demonstrate its use, we compute the CI of bivariate Gaussian random variables for the entire regime of distortions. We then similarly generalize Gács and Körner's definition to the lossy framework. The latter half of the paper focuses on studying the tradeoff between the total transmit rate and the receive rate in the Gray-Wyner network. We show that this tradeoff yields a contour of points on the surface of the Gray-Wyner region, which passes through both the Wyner and Gács-Körner operating points, and thereby provides a unified framework to understand the different notions of CI. We further show that this tradeoff generalizes the two notions of CI to the excess sum transmit rate and receive rate regimes, respectively.
Index Terms—Common information, Gray-Wyner network, multiterminal source coding
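For reference, the two lossless CI quantities that the abstract refers to are standardly written as below; these are the classical definitions from the literature (not the paper's lossy extensions), and the Gaussian value is quoted as a widely known closed form for correlation coefficient rho:

  C_W(X;Y) = \min_{P_{W|X,Y}\,:\, X - W - Y} I(X,Y;W)   (Wyner's CI: smallest shared rate W making X and Y conditionally independent)

  C_{GK}(X;Y) = \max_{f,g\,:\, f(X)=g(Y)\ \text{a.s.}} H(f(X))   (Gács-Körner CI: entropy of the largest common part extractable from each source separately)

  For bivariate Gaussian (X,Y) with correlation coefficient \rho, |\rho| < 1:
  C_W(X;Y) = \tfrac{1}{2}\log\frac{1+\rho}{1-\rho}, \qquad C_{GK}(X;Y) = 0.

In the Gray-Wyner network these correspond, respectively, to the smallest and largest common-branch rate R_0 among points achieving the minimum sum rate; the paper's lossy quantities reduce to these in the lossless limit.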
ON RELAXING THE STRICT HIERARCHICAL CONSTRAINTS IN LAYERED CODING OF AUDIO SIGNALS
"... Scalable coders generate hierarchically layered bitstreams to serve content at different quality levels, wherein the base layer provides a coarse quality reconstruction and successive layers incrementally refine the quality. However, it is widely recognized that there is an inherent performance pena ..."
Abstract
- Add to MetaCart
(Show Context)
Scalable coders generate hierarchically layered bitstreams to serve content at different quality levels, wherein the base layer provides a coarse-quality reconstruction and successive layers incrementally refine the quality. However, it is widely recognized that there is an inherent performance penalty due to the scalable coding structure when compared to independently encoded copies. To mitigate this loss we propose a layered compression framework, having roots in information theoretic concepts, which relaxes the strict hierarchical constraints, wherein only a "subset" of the information of a lower quality level is shared with higher quality levels. In other words, there is flexibility to also have "private" information at each quality level, besides information that is common to multiple levels. We employ this framework within MPEG Scalable AAC and propose an optimization scheme to jointly select the parameters of all the layers. Experimental evaluation results demonstrate the utility of the flexibility provided by the proposed framework.
Index Terms—Audio compression, audio streaming, layered coding, scalable audio coding, joint optimization
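One way to read the relaxation, stated here in the Gray-Wyner notation of the previous abstract rather than in this paper's own notation (an illustrative interpretation, not taken from the paper): let R_0 be the rate of the part shared across quality levels and R_1, R_2 the private rates of levels 1 and 2. Then the receive rates compare roughly as

  \text{strict hierarchy:}\quad R_{\mathrm{lvl\,1}} = R_{\mathrm{base}}, \qquad R_{\mathrm{lvl\,2}} = R_{\mathrm{base}} + R_{\mathrm{refine}} \quad (\text{every base-layer bit is reused}),

  \text{relaxed layering:}\quad R_{\mathrm{lvl\,1}} = R_0 + R_1, \qquad R_{\mathrm{lvl\,2}} = R_0 + R_2 \quad (\text{only the common part } R_0 \text{ is shared; } R_1 \text{ stays private to level 1}).

The symbols R_{\mathrm{base}}, R_{\mathrm{refine}}, R_0, R_1, R_2 are illustrative placeholders for this sketch.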