Results 1–10 of 112
The capacity of a quantum channel for simultaneous transmission of classical and quantum information
2008
Coordination Capacity
2009
Abstract

Cited by 47 (17 self)
We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates {Ri,j} between the nodes, we ask what is the set of all achievable joint distributions p(x1,..., xm) of actions at the nodes on the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.
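The achievability criterion this abstract alludes to can be made precise; the following is a hedged formalization of empirical coordination (the empirical-distribution notation is mine, not quoted from the paper):

```latex
% Sketch: a joint distribution p(x_1,\dots,x_m) is achievable under
% rates \{R_{i,j}\} if there exist block codes of length n whose
% induced actions X_1^n,\dots,X_m^n have empirical (type) distribution
% converging to p in total variation:
\[
\bigl\| P_{X_1^n,\dots,X_m^n} - p \bigr\|_{\mathrm{TV}}
\;\longrightarrow\; 0
\quad \text{in probability as } n \to \infty .
\]
```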
Communication requirements for generating correlated random variables
In Proc. IEEE Int. Symp. Information Theory (ISIT), 2008
Abstract

Cited by 40 (10 self)
Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner’s “common information” coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon’s mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
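The two extreme operating points named in the abstract have standard closed forms; as a reminder (standard definitions, not quoted from the paper), for a pair (X, Y) ~ p(x, y):

```latex
% Wyner's common information: minimize over an auxiliary W that makes
% X and Y conditionally independent (Markov chain X - W - Y):
\[
C(X;Y) \;=\; \min_{p(w \mid x,y)\,:\,X - W - Y} I(X,Y;W),
\]
% Shannon's mutual information, the rate when unlimited common
% randomness independent of the input is available:
\[
I(X;Y) \;=\; H(X) + H(Y) - H(X,Y).
\]
```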
On quantum statistical inference
J. Roy. Statist. Soc. B, 2001
Abstract

Cited by 36 (5 self)
[Read before The Royal Statistical Society at a meeting organized by the Research Section
The Communication Complexity of Correlation
Abstract

Cited by 34 (12 self)
Let X and Y be finite nonempty sets and (X, Y) a pair of random variables taking values in X × Y. We consider communication protocols between two parties, Alice and Bob, for generating X and Y. Alice is provided an x ∈ X generated according to the distribution of X, and is required to send a message to Bob in order to enable him to generate y ∈ Y, whose distribution is the same as that of Y | X = x. Both parties have access to a shared random string generated in advance. Let T(X : Y) be the minimum (over all protocols) of the expected number of bits Alice needs to transmit to achieve this. We show that ...
Distilling common randomness from bipartite quantum states
2008
Abstract

Cited by 20 (10 self)
The problem of converting noisy quantum correlations between two parties into noiseless classical ones using a limited amount of one-way classical communication is addressed. A single-letter formula for the optimal tradeoff between the extracted common randomness and classical communication rate is obtained for the special case of classical-quantum correlations. The resulting curve is intimately related to the quantum compression with classical side information tradeoff curve Q*(R) of Hayden, Jozsa and Winter. For a general initial state we obtain a similar result, with a single-letter formula, when we impose a tensor product restriction on the measurements performed by the sender; without this restriction the tradeoff is given by the regularization of this function. Of particular interest is a quantity we call “distillable common randomness” of a state: the maximum overhead of the common randomness over the one-way classical communication if the latter is unbounded. It is an operational measure of (total) correlation in a quantum state. For classical-quantum correlations it is given by the Holevo mutual information of its associated ensemble; for pure states it is the entropy of entanglement. In general, it is given by an optimization problem over measurements and regularization; for the case of separable states we show that this can be single-letterized.
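The two special cases quoted above have well-known closed forms (standard definitions, stated here for reference, not taken from the paper):

```latex
% Holevo mutual information of the ensemble \{p_i, \rho_i\}, with
% S(\rho) = -\mathrm{Tr}\,\rho \log \rho the von Neumann entropy:
\[
\chi \;=\; S\!\Bigl(\sum_i p_i \rho_i\Bigr) \;-\; \sum_i p_i\, S(\rho_i).
\]
% Entropy of entanglement of a bipartite pure state |\psi\rangle_{AB}:
\[
E(\psi) \;=\; S\bigl(\mathrm{Tr}_B\, |\psi\rangle\langle\psi|\bigr).
\]
```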
On the theory of network equivalence
In IEEE Inform. Theory Workshop (ITW), 2009
Abstract

Cited by 18 (6 self)
We describe an equivalence result for network capacity. Roughly, our main result is as follows. Given a network of noisy, independent, memoryless links, a collection of demands can be met on the given network if and only if it can be met on another network where each noisy link is replaced by a noiseless bit pipe with throughput equal to the noisy link capacity. This result was previously known only for multicast connections.
Entanglement-assisted capacity of quantum multiple-access channels
IEEE Transactions on Information Theory, 2008