Results 1-10 of 19
Source Coding With a Side Information “Vending Machine”
Cited by 17 (7 self)
Abstract—We study source coding in the presence of side information, when the system can take actions that affect the availability, quality, or nature of the side information. We begin by extending the Wyner-Ziv problem of source coding with decoder side information to the case where the decoder is allowed to choose actions affecting the side information. We then consider the setting where actions are taken by the encoder, based on its observation of the source. Actions may have costs that are commensurate with the quality of the side information they yield, and an overall per-symbol cost constraint may be imposed. We characterize the achievable tradeoffs between rate, distortion, and cost in some of these problem settings. Among our findings is the fact that even in the absence of a cost constraint, greedily choosing the action associated with the “best” side information is, in general, suboptimal. A few examples are worked out. Index Terms—Actions, data acquisition, rate distortion, side information, source coding, vending machine, Wyner-Ziv coding.
Chaos forgets and remembers: Measuring information creation, destruction, and storage.
Phys. Lett. A, 2014
Cited by 8 (3 self)
The hallmark of deterministic chaos is that it creates information, the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system's intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information, the ephemeral information, is forgotten, and a portion, the bound information, is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute. The world is replete with systems that generate information, information that is then encoded in a variety of ways: erratic ant behavior eventually leads to intricate, structured colony nests [1, 2]; thermally fluctuating magnetic spins form complex domain structures. To ground this claim, consider two systems. The first, a fair coin: each flip is independent of the others, leading to a simple uncorrelated randomness. As a result, no statistical fluctuation is predictively informative. For the second system, consider a stock traded via a financial market: while its price is unpredictable, the direction and magnitude of fluctuations can hint at its future behavior. (This, at least, is the guiding assumption of the now-global financial engineering industry.) We make this distinction rigorous here, dividing a system's information generation into a component that is relevant to temporal structure and a component divorced from it.
We show that the temporal component captures the system's internal information processing and, therefore, is of practical interest when harnessing the chaotic nature of physical systems to build novel machines and devices. We observe these systems via an optimal measuring instrument, called a generating partition, that encodes all of their behaviors in a stationary process: a distribution Pr(..., X_-2, X_-1, X_0, X_1, X_2, ...) over a bi-infinite sequence of random variables with shift-invariant statistics. A contiguous block of observations X_{t:t+ℓ} begins at index t and extends for length ℓ. (The index is inclusive on the left and exclusive on the right.) If an index is infinite, we leave it blank. So, a process is compactly denoted Pr(X_:). Our analysis splits X_: into three segments: the present X_0, a single observation; the past X_{:0}, everything prior; and the future X_{1:}, everything that follows. The information-theoretic relationships between these three random-variable segments are graphically expressed in a Venn-like diagram, known as an I-diagram [12]; see the accompanying figure.
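For a first-order Markov chain the quantities in this abstract are exactly computable: the entropy rate is h = H[X_1 | X_0], the ephemeral information reduces to the two-sided conditional entropy H[X_0 | X_-1, X_1], and the bound information is their difference. A minimal sketch in Python (the function name and parameterization are ours, for illustration, not the paper's):

```python
from math import log2

def H(ps):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(p * log2(p) for p in ps if p > 0)

def markov_information_anatomy(a, b):
    """For a binary first-order Markov chain with flip probabilities a, b,
    return (h, r, bnd): entropy rate h = H[X1|X0], ephemeral information
    r = H[X0|X-1, X1], and bound information bnd = h - r."""
    P = [[1 - a, a], [b, 1 - b]]
    pi = [b / (a + b), a / (a + b)]          # stationary distribution
    h = sum(pi[i] * H(P[i]) for i in range(2))
    # two-step transition probabilities P2[i][k] = sum_j P[i][j] * P[j][k]
    P2 = [[sum(P[i][j] * P[j][k] for j in range(2)) for k in range(2)]
          for i in range(2)]
    r = 0.0
    for i in range(2):
        for k in range(2):
            w = pi[i] * P2[i][k]             # Pr(X-1 = i, X1 = k)
            cond = [P[i][j] * P[j][k] / P2[i][k] for j in range(2)]
            r += w * H(cond)                 # H[X0 | X-1 = i, X1 = k]
    return h, r, h - r
```

For a fair coin (a = b = 0.5) the bound information is zero: every created bit is ephemeral. For a sticky chain (a = b = 0.1) part of the generated information is stored in the temporal structure, so the bound component is strictly positive.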
The Capacity of the Gaussian Erasure Channel
Cited by 7 (2 self)
Abstract — This paper finds the capacity of linear time-invariant systems observed in additive Gaussian noise through a memoryless erasure channel. This problem requires obtaining the asymptotic spectral distribution of a submatrix of a nonnegative definite Toeplitz matrix obtained by retaining each column/row independently and with identical probability. We show that the optimum normalized power spectral density is the water-filling solution for a reduced signal-to-noise ratio, where the gap to the actual signal-to-noise ratio depends on both the erasure probability and the channel transfer function. We find asymptotic expressions for the capacity in the sporadic erasure and sporadic non-erasure regimes as well as the low and high signal-to-noise regimes.
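The water-filling solution referred to above can be fixed in mind with a discrete sketch over parallel Gaussian subchannels. This is plain water-filling, not the paper's erasure-adjusted variant, which reduces the effective signal-to-noise ratio as a function of the erasure probability; the function name and bisection scheme are illustrative:

```python
def water_filling(gains, power):
    """Water-filling power allocation over parallel Gaussian subchannels.
    gains[i] is the channel gain-to-noise ratio of subchannel i; returns the
    per-channel powers p_i = max(mu - 1/gains[i], 0) whose sum is `power`."""
    # Bisect on the water level mu until sum_i max(mu - 1/g_i, 0) == power.
    lo, hi = 0.0, power + max(1.0 / g for g in gains)
    for _ in range(100):
        mu = (lo + hi) / 2
        used = sum(max(mu - 1.0 / g, 0.0) for g in gains)
        if used > power:
            hi = mu
        else:
            lo = mu
    return [max(mu - 1.0 / g, 0.0) for g in gains]
```

With equal gains the power splits evenly; with a strong and a weak subchannel and a small power budget, the weak subchannel sits below the water level and receives nothing.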
Lossy Source Coding with Gaussian or Erased Side-Information
Cited by 5 (0 self)
Abstract—In this paper we find properties that are shared between two seemingly unrelated lossy source coding setups with side-information. The first setup is when the source and side-information are jointly Gaussian and the distortion measure is quadratic. The second setup is when the side-information is an erased version of the source. We begin with the observation that in both these cases the Wyner-Ziv and conditional rate-distortion functions are equal. We further find that there is a continuum of optimal strategies for the conditional rate-distortion problem in both these setups. Next, we consider the case when there are two decoders with access to different side-information sources. For the case when the encoder has access to the side-information we establish bounds on the rate-distortion function and a sufficient condition for tightness. Under this condition, we find a characterization of the rate-distortion function for physically degraded side-information. This characterization holds for both the Gaussian and erasure setups.
WYNER-ZIV CODING WITH UNCERTAIN SIDE INFORMATION QUALITY
Cited by 5 (3 self)
Wyner-Ziv coding of continuous sources with uncertain side information quality is defined by modeling the correlation noise as a Gaussian mixture. The analysis of the theoretical rate-distortion performance is presented, along with a coding solution not relying on the presence of a feedback channel. The attainable performance of the coding scheme is derived, and a brief discussion on implementation issues concludes the paper.
Heegard-Berger and cascade source coding problems with common reconstruction constraints, arXiv:1112.1762v3
, 2011
Steganography using Gibbs random fields
 Proceedings of the 12th ACM Multimedia & Security Workshop, MM&Sec ’10
, 2010
Cited by 4 (1 self)
Many steganographic algorithms for empirical covers are designed to minimize embedding distortion. In this work, we provide a general framework and practical methods for embedding with an arbitrary distortion function that does not have to be additive over pixels and thus can consider interactions among embedding changes. The framework evolves naturally from a parallel made between steganography and statistical physics. The Gibbs sampler is the key tool for simulating the impact of optimal embedding and for constructing practical embedding algorithms. The proposed framework reduces the design of secure steganography in empirical covers to the problem of finding suitable local potentials for the distortion function that correlate with statistical detectability in practice. We work out the proposed methodology in detail for a specific choice of the distortion function and validate the approach through experiments.
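As a toy illustration of how a Gibbs sampler can simulate embedding under a non-additive distortion, the sketch below resamples a 1-D pattern of embedding changes whose cost includes a pairwise interaction term between adjacent changes. The cost model (per-change cost rho, interaction penalty tau) and the function name are ours, for illustration; the paper works with general local potentials over images:

```python
import random
from math import exp

def gibbs_sweep(changes, lam, rho=1.0, tau=0.5, rng=random):
    """One Gibbs-sampler sweep over a 1-D 'image'. changes[i] in {0, 1} marks
    an embedding change at pixel i. Distortion is non-additive: each change
    costs rho, plus tau for every adjacent pair of changes (interaction term).
    Each site is resampled from its conditional Gibbs distribution
    pi(c_i = 1 | rest) = exp(-lam*cost1) / (1 + exp(-lam*cost1))."""
    n = len(changes)
    for i in range(n):
        nbr = (changes[i - 1] if i > 0 else 0) + (changes[i + 1] if i < n - 1 else 0)
        cost1 = rho + tau * nbr          # local cost of setting c_i = 1
        p1 = exp(-lam * cost1) / (1.0 + exp(-lam * cost1))
        changes[i] = 1 if rng.random() < p1 else 0
    return changes
```

The parameter lam plays the role of inverse temperature: as it grows, the sampled change patterns concentrate on low-distortion configurations, which is the mechanism the framework uses to simulate the impact of near-optimal embedding.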
Universal estimation of erasure entropy
 IEEE Trans. Inf. Theory
Cited by 3 (3 self)
Abstract—Erasure entropy rate differs from Shannon’s entropy rate in that the conditioning occurs with respect to both the past and the future, as opposed to only the past (or the future). In this paper, consistent universal algorithms for estimating erasure entropy rate are proposed based on the basic and extended context-tree weighting (CTW) algorithms. Simulation results for those algorithms applied to Markov sources, tree sources, and English texts are compared to those obtained by fixed-order plug-in estimators with different orders. Index Terms—Bidirectional context tree, context-tree weighting, data compression, entropy rate, universal algorithms, universal modeling.
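A fixed-order plug-in estimator of the kind the abstract compares against is easy to state: estimate the conditional entropy of a symbol given k symbols of past and k symbols of future context from empirical counts. A minimal sketch (the function name is ours; no smoothing is applied, so the estimate is biased for short sequences):

```python
from collections import Counter
from math import log2

def plugin_erasure_entropy(seq, k=1):
    """Order-k plug-in estimate of the erasure entropy rate
    H(X_0 | X_{-k}..X_{-1}, X_1..X_k): empirical conditional entropy of a
    symbol given k symbols of past and k symbols of future context."""
    ctx_sym = Counter()   # counts of (context, symbol) pairs
    ctx = Counter()       # counts of each two-sided context
    for i in range(k, len(seq) - k):
        c = (tuple(seq[i - k:i]), tuple(seq[i + 1:i + 1 + k]))
        ctx_sym[(c, seq[i])] += 1
        ctx[c] += 1
    n = sum(ctx.values())
    h = 0.0
    for (c, s), m in ctx_sym.items():
        h -= (m / n) * log2(m / ctx[c])   # -Pr(c, s) * log2 Pr(s | c)
    return h
```

For any periodic sequence the middle symbol is determined by its two-sided context, so the estimate is exactly zero; for a genuinely random source it converges to the order-k erasure entropy as the sequence grows.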
Shitz), “The two-tap input-erasure Gaussian channel and its application to cellular communications
 in Proc. Allerton Conference on Communication, Control, and Computing
Cited by 3 (2 self)
Abstract — This paper considers the input-erasure Gaussian channel. In contrast to the output-erasure channel, where erasures are applied to the output of a linear time-invariant (LTI) system, here erasures, known to the receiver, are applied to the inputs of the LTI system. Focusing on the case where the input symbols are independent and identically distributed (i.i.d.), it is shown that the two channels (input- and output-erasure) are equivalent. Furthermore, assuming that the LTI system consists of a two-tap finite impulse response (FIR) filter, and using simple properties of tridiagonal matrices, an achievable rate expression is presented in the form of an infinite sum. The results are then used to study the benefits of joint multi-cell processing (MCP) over single-cell processing (SCP) in a simple linear cellular uplink, where each mobile terminal is received by only the two nearby base stations (BSs). Specifically, the analysis accounts for ergodic shadowing that simultaneously blocks the mobile terminal (MT) signal from being received by the two BSs. It is shown that the resulting ergodic per-cell capacity with optimal MCP is equivalent to that of the two-tap input-erasure channel. Finally, the same cellular uplink is addressed by accounting for dynamic user activity, which is modelled by assuming that each MT is randomly selected to be active or to remain silent throughout the whole transmission block. For this alternative model, a similar equivalence result to the input-erasure channel is reported.
Universal Lossless Compression of Erased Symbols
Abstract—A source X goes through an erasure channel whose output is Z. The goal is to compress X losslessly when the compressor knows X and Z and the decompressor knows Z. We propose a universal algorithm based on context-tree weighting (CTW), parameterized by a memory-length parameter ℓ. We show that if the erasure channel is stationary and memoryless, and X is stationary and ergodic, then the proposed algorithm achieves a compression rate of H(X_0 | X_{-ℓ}^{-1}, Z^ℓ) bits per erasure. Index Terms—Context-tree weighting, discrete memoryless erasure channel, entropy, erasure entropy, side information, universal lossless compression.