
## Cascade multiterminal source coding (2009)

Venue: IEEE International Symposium on Information Theory

Citations: 22 (6 self)

### Citations

1247 | Noiseless coding of correlated information sources
- Slepian, Wolf
- 1973
Citation Context: ...n in the information theory community over the years. The results of Slepian-Wolf encoding and communication through the Multiple Access Channel (MAC) are surprising and encouraging. Slepian and Wolf [8] showed that separate encoders can compress correlated sources losslessly at the same rate as a single encoder. Ahlswede [9] and Liao [10] fully characterized the capacity region for the general memor...

1044 | The rate-distortion function for source coding with side information at the decoder II
- Wyner
- 1978
Citation Context: ...[2] and the multiple access channel setting in [3]. In the cascade multiterminal network, the answer breaks down quite intuitively. For the message from Encoder 1 to Encoder 2, use Wyner-Ziv encoding [4] to communicate the function values. Then apply lossless compression to the function values at Encoder 2. Computing functions of data in a Wyner-Ziv setting was introduced by Yamamoto [5], and the opt...

209 | Multi-way communication channels
- Ahlswede
- 1973
Citation Context: ...tiple Access Channel (MAC) are surprising and encouraging. Slepian and Wolf [8] showed that separate encoders can compress correlated sources losslessly at the same rate as a single encoder. Ahlswede [9] and Liao [10] fully characterized the capacity region for the general memoryless MAC, making it the only multi-user memoryless channel setting that is solved in its full generality. Thus, the fea...

139 | Computation over multiple-access channels
- Nazer, Gastpar
- 2007
Citation Context: ...decoder? Computing functions of observations in a network has been considered in various other settings, such as the two-node back-and-forth setting of [2] and the multiple access channel setting in [3]. In the cascade multiterminal network, the answer breaks down quite intuitively. For the message from Encoder 1 to Encoder 2, use Wyner-Ziv encoding [4]...

138 | Coding for computing
- Orlitsky, Roche
- 2001
Citation Context: ...n to the function values at Encoder 2. Computing functions of data in a Wyner-Ziv setting was introduced by Yamamoto [5], and the optimal rate for lossless computation was shown by Orlitsky and Roche [6] to be the conditional graph entropy on an appropriate graph. A particular function for which the optimal rates are easy to identify is the encoding of binary sums of binary symmetric X and Y that are...

131 | Multiple access channels
- Liao
- 1972
Citation Context: ...Channel (MAC) are surprising and encouraging. Slepian and Wolf [8] showed that separate encoders can compress correlated sources losslessly at the same rate as a single encoder. Ahlswede [9] and Liao [10] fully characterized the capacity region for the general memoryless MAC, making it the only multi-user memoryless channel setting that is solved in its full generality. Thus, the feasibility of de...

101 | Lossy source coding
- Berger, Gibson
- 1980
Citation Context: ...the feasibility of describing two correlated data sources through a noisy MAC is not solved. Furthermore, allowing the source coding to be done with loss raises even more uncertainty. Berger and Tung [11] first considered the multiterminal source coding problem, where correlated sources are encoded separately with loss. Even when no noisy channel is involved, the optimal rate region is not known, but ...

91 | How to encode the modulo-two sum of binary sources
- Korner, Marton
- 1979
Citation Context: .... A particular function for which the optimal rates are easy to identify is the encoding of binary sums of binary symmetric X and Y that are equal with probability p, as proposed by Korner and Marton [7]. For this computation, the required rates are R1 ≥ h(p) and R2 ≥ h(p), where h is the binary entropy function. Curiously, the same rates are required in the standard multiterminal source coding setti...
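The Korner-Marton rate bound above depends only on the binary entropy function h. As a quick numerical sketch (the crossover value p = 0.11 is an arbitrary illustration, not taken from the paper):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits, with h(0) = h(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# Korner-Marton: encoding the mod-2 sum of binary symmetric sources
# that disagree with probability p requires R1 >= h(p) and R2 >= h(p).
p = 0.11  # illustrative value only
rate_bound = binary_entropy(p)
print(f"h({p}) = {rate_bound:.4f} bits per symbol at each encoder")
```

At p = 0.5 the sources' sum is maximally unpredictable and the bound reaches its peak of 1 bit, matching the fact that h attains its maximum at 1/2.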

75 | Toward a theory of in-network computation in wireless sensor networks
- Gharavi, Kumar
- 2006
Citation Context: ...airs if the variance of X is greater than the variance of Y. I. INTRODUCTION Distributed data collection, such as aggregating measurements in a sensor network, has been investigated from many angles [1]. Various algorithms exist for passing messages to neighbors in order to collect information or compute functions of data. Here we join in the investigation of the minimum descriptions needed to quant...

68 | Rate region of the quadratic Gaussian two-encoder source-coding problem
- Wagner, Tavildar, et al.
- 2008
Citation Context: ...terminal source coding problem, where correlated sources are encoded separately with loss. Even when no noisy channel is involved, the optimal rate region is not known, but ongoing progress continues [12] [13]. The cascade multiterminal source coding setting is similar to multiterminal source coding considered by Berger and Tung in that two sources of information are encoded in a distributed fashion w...

40 | Wyner-Ziv theory for a general function of the correlated sources
- Yamamoto
- 1982
Citation Context: ...Ziv encoding [4] to communicate the function values. Then apply lossless compression to the function values at Encoder 2. Computing functions of data in a Wyner-Ziv setting was introduced by Yamamoto [5], and the optimal rate for lossless computation was shown by Orlitsky and Roche [6] to be the conditional graph entropy on an appropriate graph. A particular function for which the optimal rates are e...

16 | Two-terminal distributed source coding with alternating messages for function computation
- Ma, Ishwar
- 2008
Citation Context: ...ded to reliably calculate Zi = f(Xi, Yi) at the decoder? Computing functions of observations in a network has been considered in various other settings, such as the two-node back-and-forth setting of [2] and the multiple access channel setting in [3]. In the cascade multiterminal network, the answer breaks down quite intuitively. For the message from Encoder 1 to Encoder 2, use Wyner-Ziv encoding [4]...

15 | Lossy source coding for a cascade communication system with side-informations
- Vasudevan, Tian, et al.
- 2006
Citation Context: ...r may need to estimate both X and Y, X only, Y only, or some other function of both, such as the sum of two jointly Gaussian random variables, considered in Section V-A. Vasudevan, Tian, and Diggavi [14] looked at a similar cascade communication system with a relay. In their setting, the decoder has side information, and the relay has access to a physically degraded version of it. Because of the degr...

8 | A dichotomy of functions f(x,y) of correlated sources (x,y) from the viewpoint of the achievable rate region
- Han, Kobayashi
- 1987
Citation Context: ...t are tight. One obvious condition is when X, Y, and Z form the Markov chain X − Y − Z. (Footnote: The optimal rate region for computing functions of data in the standard multiterminal source coding network is currently an open problem [17].) In this case, there is no need to send a message I from Encoder 1, and the only requirement for achievability is that R2 ≥ I(Y; Z). Another class of joint distributions p0...
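The bound R2 ≥ I(Y; Z) in this context is an ordinary mutual information over the joint distribution of Y and Z. A minimal numeric sketch, assuming purely for illustration that Z is Y passed through a binary symmetric channel with crossover 0.1 (an invented toy distribution, not from the paper):

```python
import math

def mutual_information(p_yz):
    """I(Y;Z) in bits from a joint pmf given as {(y, z): probability}."""
    p_y, p_z = {}, {}
    for (y, z), p in p_yz.items():  # accumulate the two marginals
        p_y[y] = p_y.get(y, 0.0) + p
        p_z[z] = p_z.get(z, 0.0) + p
    return sum(p * math.log2(p / (p_y[y] * p_z[z]))
               for (y, z), p in p_yz.items() if p > 0)

# Toy joint pmf: Y uniform on {0,1}, Z = Y flipped with probability 0.1.
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}
print(f"I(Y;Z) = {mutual_information(joint):.4f} bits")  # equals 1 - h(0.1)
```

For this channel the value reduces to 1 − h(0.1), the familiar binary symmetric channel capacity expression, which is a convenient sanity check on the computation.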

8 | Capacity of Coordinated Actions
- Cover, Permuter
- 2007
Citation Context: ...tropy limit. C. Markov Coordination It is possible to talk about achieving a joint distribution of coordinated actions p(x, y, z) = p0(x, y)p(z|x, y) without referring to a distortion function, as in [18]. Under some conditions of the joint distribution, the bounds Rin and Rout are tight. One obvious condition is when X, Y, and Z form the Markov chain X − Y − Z. (Footnote: The optimal rate region for computing functions of data in the s...

6 | On network coding of independent and dependent sources in line networks
- Bakshi, Effros, et al.
- 2007
Citation Context: ...ch involving similar network settings can be found in [15], where Gu and Effros consider a more general network but with the restriction that the information Y is a function of the information X, and [16], where Bakshi et al. identify the optimal rate region for lossless encoding of independent sources in a longer cascade (line) network. In this paper we present inner and outer bounds on the general ...

5 | An achievable rate region for distributed source coding with reconstruction of an arbitrary function of the sources
- Krithivasan, Pradhan
- 2008
Citation Context: ...nal source coding problem, where correlated sources are encoded separately with loss. Even when no noisy channel is involved, the optimal rate region is not known, but ongoing progress continues [12] [13]. The cascade multiterminal source coding setting is similar to multiterminal source coding considered by Berger and Tung in that two sources of information are encoded in a distributed fashion with l...

5 | On multi-resolution coding and a two-hop network
- Gu, Effros
- 2006
Citation Context: ...decoder does not have side information. Thus, the relay is faced with coalescing the two pieces of information into a single message. Other research involving similar network settings can be found in [15], where Gu and Effros consider a more general network but with the restriction that the information Y is a function of the information X, and [16], where Bakshi et al. identify the optimal rate regio...
