Results 1–10 of 21
Coordination Capacity, 2009
Cited by 47 (17 self)
We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates {Ri,j} between the nodes, we ask what is the set of all achievable joint distributions p(x1,..., xm) of actions at the nodes of the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems, such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.
Network Coding for Computing: Cut-Set Bounds, 2011
Cited by 19 (4 self)
The following network computing problem is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the “computing capacity”. The network coding problem for a single-receiver network is a special case of the network computing problem in which all of the source messages must be reproduced at the receiver. For network coding with a single receiver, routing is known to achieve the capacity by achieving the network min-cut upper bound. We extend the definition of min-cut to the network computing problem and show that the min-cut is still an upper bound on the maximum achievable rate and is tight for computing (using coding) any target function in multi-edge tree networks. It is also tight for computing linear target functions in any network. We also study the bound’s tightness for different classes of target functions. In particular, we give a lower bound on the computing capacity in terms of the Steiner tree packing number and a different bound for symmetric functions. We also show that for certain networks and target functions, the computing capacity can be less than an arbitrarily small fraction of the min-cut bound.
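The min-cut quantity that this bound generalizes is the classical max-flow/min-cut value of a single-receiver network. As an illustrative sketch (the toy network below is my own, not one from the paper), the following Python code computes that classical value with the Edmonds-Karp algorithm:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max-flow on an adjacency-matrix network; by
    max-flow/min-cut duality the result equals the capacity of a
    minimum source-sink cut."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[source] = source
        q = deque([source])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[sink] == -1:
            return total  # no augmenting path left: flow is maximal
        # Bottleneck along the path, then augment.
        bottleneck = float("inf")
        v = sink
        while v != source:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Toy directed acyclic network: node 0 is the source, node 3 the receiver.
cap = [[0, 2, 1, 0],
       [0, 0, 1, 1],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))  # → 3, the min-cut value of this network
```

For plain network coding with a single receiver this value is the capacity achieved by routing; the paper's contribution is extending the cut definition so it also bounds rates for computing a function f at the receiver.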
COMMUNICATION IN NETWORKS FOR COORDINATING BEHAVIOR, 2009
Distributed Lossy Averaging, 2009
Cited by 8 (0 self)
An information-theoretic formulation of distributed averaging is presented. We assume a network with m nodes, each observing an i.i.d. source; the nodes communicate and perform local processing with the goal of computing the average of the sources to within a prescribed mean squared error distortion. The network rate distortion function R*(D) for a 2-node network with correlated Gaussian sources is established. A general cut-set lower bound on R*(D) with independent Gaussian sources is established and shown to be achievable to within a factor of 2 via a centralized protocol. A lower bound on the network rate distortion function for distributed weighted-sum protocols that is larger than the cut-set bound by a factor of log m is established. An upper bound on the expected network rate distortion function for gossip-based weighted-sum protocols that is only a factor of log log m larger than this lower bound is established. The results suggest that using distributed protocols results in a factor of log m increase in communication relative to centralized protocols.
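The gossip-based weighted-sum protocols referred to above build on randomized pairwise averaging. As a minimal sketch (ignoring quantization and rate, which are the paper's actual subject, and with the node count and initial values chosen arbitrarily), the following Python simulation shows the averaging dynamics such protocols rely on:

```python
import random

def gossip_average(values, rounds=2000, seed=0):
    """Randomized gossip: at each step a uniformly random pair of nodes
    replaces both of their values with the pair's average. The sum is
    preserved at every step, so all nodes converge to the global mean."""
    rng = random.Random(seed)
    x = list(values)
    m = len(x)
    for _ in range(rounds):
        i, j = rng.sample(range(m), 2)  # two distinct nodes
        avg = (x[i] + x[j]) / 2
        x[i] = x[j] = avg
    return x

vals = [0.0, 1.0, 2.0, 3.0, 4.0]
out = gossip_average(vals)
print(max(abs(v - 2.0) for v in out))  # ≈ 0: every node is near the mean
```

Each pairwise exchange here is free; the paper's question is how many bits those exchanges cost when the values must be quantized, which is where the log m gap to centralized protocols appears.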
Cascade, Triangular, and Two-Way Source Coding With Degraded Side Information at the Second User
Cited by 6 (2 self)
In this paper, we consider the cascade and triangular rate-distortion problems where the same side information is available at the source node and user 1, and the side information available at user 2 is a degraded version of the side information at the source node and user 1. We characterize the rate-distortion region for these problems. For the cascade setup, we show that, at user 1, decoding and rebinning the codeword sent by the source node for user 2 is optimum. We then extend our results to the two-way cascade and triangular setting, where the source node is interested in lossy reconstruction of the side information at user 2 via a rate-limited link from user 2 to the source node. We characterize the rate-distortion regions for these settings. Complete explicit characterizations for all settings are given in the quadratic Gaussian case. We conclude with two further extensions: a triangular source coding problem with a helper, and an extension of our two-way cascade setting in the quadratic Gaussian case. Index Terms: cascade source coding, triangular source coding, two-way source coding, quadratic Gaussian, source coding with a helper.
On cooperation in multiterminal computation and rate distortion
In Proc. IEEE International Symposium on Information Theory (ISIT), 2012
Cited by 6 (1 self)
A receiver wants to compute a function of two correlated sources separately observed by two transmitters. In the system model of interest, one of the transmitters may send some data to the other transmitter in a cooperation phase before both transmitters convey data to the receiver. What is the minimum number of noiseless bits that need to be communicated by each transmitter to the receiver for a given number of cooperation bits? This paper investigates both the function computation and the rate distortion versions of this problem; in the first case, the receiver wants to compute the function exactly, and in the second case the receiver wants to compute the function within some distortion. For the function computation version, a general inner bound to the rate region is exhibited and shown to be tight in a number of cases: the function is partially invertible, full cooperation, one-round point-to-point communication, two-round point-to-point communication, and cascade. As a corollary, it is shown that one bit of cooperation may arbitrarily reduce the amount of information both transmitters need to convey to the receiver. For the rate distortion version, an inner bound to the rate region is exhibited which always includes, and sometimes strictly includes, the convex hull of Kaspi-Berger's related inner bounds.
Cascade and triangular source coding with side information at the first two nodes
In Proc. Inf. Theory Appl. Workshop, 2012
Cited by 5 (2 self)
We consider the cascade and triangular rate-distortion problems where side information is known to the source encoder and to the first user but not to the second user. We characterize the rate-distortion region for these problems, as well as some of their extensions. For the quadratic Gaussian case, we show that it is sufficient to consider jointly Gaussian distributions, which leads to an explicit solution.
Heegard-Berger and cascade source coding problems with common reconstruction constraints, arXiv:1112.1762v3, 2011
Two-Way Source Coding Through a Relay
Cited by 3 (0 self)
A 3-node lossy source coding problem for a 2-DMS (X1, X2) is considered. Source nodes 1 and 2 observe X1 and X2, respectively, and each wishes to reconstruct the other source with a prescribed distortion. To achieve these goals, nodes 1 and 2 send descriptions of their sources to relay node 3. The relay node then broadcasts a joint description to the source nodes. A cut-set outer bound and a compress–linear code inner bound are established and shown to coincide in several special cases. A compute–compress inner bound is then presented and shown to outperform the compress–linear code in some cases. An outer bound based on Kaspi's converse for the two-way source coding problem is shown to be strictly tighter than the cut-set outer bound.
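A standard special case gives intuition for having the relay compute rather than merely forward: for binary sources, the relay can broadcast the mod-2 sum Z = X1 xor X2, and each node combines Z with its own source to recover the other exactly. The Python toy below illustrates this; the crossover probability 0.1 and the block length are my own assumptions, and this lossless binary case is only an analogue of, not, the paper's lossy coding scheme:

```python
import random

rng = random.Random(1)
n = 10_000
# Doubly symmetric binary source pair: X2 is X1 passed through a BSC(0.1).
x1 = [rng.randint(0, 1) for _ in range(n)]
x2 = [b ^ (rng.random() < 0.1) for b in x1]

# "Compute" step: the relay forms the mod-2 sum Z = X1 xor X2 and
# broadcasts this single low-entropy sequence to both source nodes.
z = [a ^ b for a, b in zip(x1, x2)]

# Each node recovers the other's source exactly from its own source and Z.
x2_at_node1 = [a ^ c for a, c in zip(x1, z)]
x1_at_node2 = [b ^ c for b, c in zip(x2, z)]
assert x2_at_node1 == x2 and x1_at_node2 == x1
print(sum(z) / n)  # ≈ 0.1: Z is sparse, so it compresses well
```

Because Z is a Bernoulli(0.1) sequence, its entropy is far below one bit per symbol, which is why computing at the relay can beat compressing and forwarding both descriptions.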
Empirical Coordination in a Triangular Multiterminal Network
Cited by 2 (0 self)
In this paper, we investigate the problem of empirical coordination in a triangular multiterminal network. A triangular multiterminal network consists of three terminals, where two terminals observe two external i.i.d. correlated sequences. The third terminal wishes to generate a sequence with a desired empirical joint distribution. For this problem, we derive inner and outer bounds on the empirical coordination capacity region. It is shown that the capacity region of the degraded source network and the inner and outer bounds on the capacity region of the cascade multiterminal network can be directly obtained from our inner and outer bounds. For a cipher system, we establish key distribution over a network with a reliable terminal, using the results of the empirical coordination. As another example, the problem of rate distortion in the triangular multiterminal network is investigated, in which a distributed doubly symmetric binary source is available.