Results 1–10 of 1,239
Network information flow
 IEEE Trans. Inform. Theory
, 2000
"... We introduce a new class of problems called network information flow which is inspired by computer network applications. Consider a pointtopoint communication network on which a number of information sources are to be mulitcast to certain sets of destinations. We assume that the information source ..."
Abstract

Cited by 1961 (24 self)
We introduce a new class of problems called network information flow which is inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize the admissible coding rate region. This model subsumes all previously studied models along the same line. In this paper, we study the problem with one information source, and we have obtained a simple characterization of the admissible coding rate region. Our result can be regarded as the Max-flow Min-cut Theorem for network information flow. Contrary to one’s intuition, our work reveals that it is in general not optimal to regard the information to be multicast as a “fluid” which can simply be routed or replicated. Rather, by employing coding at the nodes, which we refer to as network coding, bandwidth can in general be saved. This finding may have significant impact on the future design of switching systems.
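As a concrete illustration of the abstract's point that routing alone can be suboptimal, the sketch below works through the classic two-source, two-sink butterfly network (the standard textbook example, not taken from this abstract itself): the bottleneck node forwards the XOR of its two incoming bits, and each sink combines the coded bit with the bit it receives directly.

```python
# Hypothetical sketch of network coding on the butterfly network.
# Two source bits b1, b2 must both reach two sinks, but the middle
# link can carry only one bit per use; XOR coding at the bottleneck
# node lets both sinks recover both bits.

def butterfly(b1: int, b2: int):
    # Bottleneck node transmits the XOR of its two inputs.
    coded = b1 ^ b2
    # Sink 1 receives b1 directly plus the coded bit; XOR recovers b2.
    sink1 = (b1, b1 ^ coded)
    # Sink 2 receives b2 directly plus the coded bit; XOR recovers b1.
    sink2 = (b2 ^ coded, b2)
    return sink1, sink2

print(butterfly(1, 0))  # → ((1, 0), (1, 0)): both sinks get (b1, b2)
```

Pure routing cannot deliver both bits over the single bottleneck link; the XOR is the simplest case of the "coding at the nodes" the abstract describes.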
Cooperative diversity in wireless networks: efficient protocols and outage behavior
 IEEE Trans. Inform. Theory
, 2004
"... We develop and analyze lowcomplexity cooperative diversity protocols that combat fading induced by multipath propagation in wireless networks. The underlying techniques exploit space diversity available through cooperating terminals’ relaying signals for one another. We outline several strategies ..."
Abstract

Cited by 1940 (31 self)
We develop and analyze low-complexity cooperative diversity protocols that combat fading induced by multipath propagation in wireless networks. The underlying techniques exploit space diversity available through cooperating terminals’ relaying signals for one another. We outline several strategies employed by the cooperating radios, including fixed relaying schemes such as amplify-and-forward and decode-and-forward, selection relaying schemes that adapt based upon channel measurements between the cooperating terminals, and incremental relaying schemes that adapt based upon limited feedback from the destination terminal. We develop performance characterizations in terms of outage events and associated outage probabilities, which measure robustness of the transmissions to fading, focusing on the high signal-to-noise ratio (SNR) regime. Except for fixed decode-and-forward, all of our cooperative diversity protocols are efficient in the sense that they achieve full diversity (i.e., second-order diversity in the case of two terminals), and, moreover, are close to optimum (within 1.5 dB) in certain regimes. Thus, using distributed antennas, we can provide the powerful benefits of space diversity without need for physical arrays, though at a loss of spectral efficiency due to half-duplex operation and possibly at the cost of additional receive hardware. Applicable to any wireless setting, including cellular or ad hoc networks—wherever space constraints preclude the use of physical arrays—the performance characterizations reveal that large power or energy savings result from the use of these protocols.
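The diversity-order claim can be checked numerically. The sketch below uses the standard high-SNR Rayleigh-fading outage approximation P_out ≈ (t/SNR)^d with an illustrative spectral efficiency R = 1 bit/s/Hz; the formula and constants are textbook approximations chosen for illustration, not the paper's exact expressions.

```python
import math

# Illustrative high-SNR outage approximation under Rayleigh fading:
# P_out ≈ (t / SNR)^d, where t = 2**R - 1 is the outage threshold
# and d is the diversity order (assumed model, not the paper's exact
# characterization).
R = 1.0
t = 2 ** R - 1  # = 1.0

def p_out(snr_db, diversity):
    snr = 10 ** (snr_db / 10)
    return (t / snr) ** diversity

# A 10 dB SNR increase cuts outage 10x for direct transmission
# (diversity 1) but 100x for two-terminal cooperation (diversity 2).
for d in (1, 2):
    print(d, round(p_out(20, d) / p_out(30, d)))
```

The doubled slope on a log-log outage-vs-SNR plot is what "second-order diversity in the case of two terminals" means in the abstract.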
The rate-distortion function for source coding with side information at the decoder
 IEEE Trans. Inform. Theory
, 1976
"... AbstractLet {(X,, Y,J}r = 1 be a sequence of independent drawings of a pair of dependent random variables X, Y. Let us say that X takes values in the finite set 6. It is desired to encode the sequence {X,} in blocks of length n into a binary stream*of rate R, which can in turn be decoded as a seque ..."
Abstract

Cited by 1055 (1 self)
Abstract—Let {(X_k, Y_k)}, k = 1, 2, …, be a sequence of independent drawings of a pair of dependent random variables X, Y. Let us say that X takes values in the finite set 𝒳. It is desired to encode the sequence {X_k} in blocks of length n into a binary stream of rate R, which can in turn be decoded as a sequence {X̂_k}, where X̂_k ∈ 𝒳̂, the reproduction alphabet. The average distortion level is (1/n) Σ_{k=1}^{n} E[D(X_k, X̂_k)], where D(x, x̂) ≥ 0, x ∈ 𝒳, x̂ ∈ 𝒳̂, is a preassigned distortion measure. The special assumption made here is that the decoder has access to the side information {Y_k}. In this paper we determine the quantity R*(d), defined as the infimum of rates R such that (with ε > 0 arbitrarily small and with suitably large n) communication is possible in the above setting at an average distortion level (as defined above) not exceeding d + ε. The main result is that R*(d) = inf[I(X;Z) − I(Y;Z)], where the infimum is with respect to all auxiliary random variables Z (which take values in a finite set 𝒵) that satisfy: i) Y, Z conditionally independent given X; ii) there exists a function f: 𝒴 × 𝒵 → 𝒳̂ such that E[D(X, f(Y,Z))] ≤ d. Let R_{X|Y}(d) be the rate-distortion function which results when the encoder as well as the decoder has access to the side information {Y_k}. In nearly all cases it is shown that when d > 0 then R*(d) > R_{X|Y}(d), so that knowledge of the side information at the encoder permits transmission of the {X_k} at a given distortion level using a smaller transmission rate. This is in contrast to the situation treated by Slepian and Wolf [5] where, for arbitrarily accurate reproduction of {X_k}, i.e., d = ε for any ε > 0, knowledge of the side information at the encoder does not allow a reduction of the transmission rate.
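For a toy binary instance of the objective I(X;Z) − I(Y;Z), take X uniform, Y the output of a binary symmetric channel BSC(p) with input X, and a BSC(q) test channel from X to Z. All parameters and names below are illustrative choices, not from the paper; the point is only that better side information (smaller p) lowers the objective.

```python
import math

# Hypothetical numeric check of the Wyner-Ziv objective I(X;Z) - I(Y;Z)
# for X ~ Bernoulli(1/2), Y = X through BSC(p), Z = X through BSC(q).
# (Illustrative toy model; the paper's result is for general sources.)

def h2(p):
    # Binary entropy function in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def wz_objective(p, q):
    # I(X;Z) = 1 - h2(q) for a BSC(q) test channel on uniform X.
    i_xz = 1 - h2(q)
    # Y -> Z is a cascade of BSC(p) and BSC(q): crossover r.
    r = p * (1 - q) + q * (1 - p)
    i_yz = 1 - h2(r)
    return i_xz - i_yz

# Better side information at the decoder (smaller p) reduces the
# rate objective for the same test channel q.
print(wz_objective(0.1, 0.25) < wz_objective(0.4, 0.25))  # → True
```

The actual R*(d) takes the infimum of this quantity over all admissible test channels and reconstruction functions; this sketch evaluates the objective for one fixed choice.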
Distributed Source Coding Using Syndromes (DISCUS): Design and Construction
 IEEE Trans. Inform. Theory
, 1999
"... We address the problem of distributed source coding, i.e. compression of correlated sources that are not colocated and/or cannot communicate with each other to minimize their joint description cost. In this work we tackle the related problem of compressing a source that is correlated with anothe ..."
Abstract

Cited by 407 (9 self)
We address the problem of distributed source coding, i.e. compression of correlated sources that are not co-located and/or cannot communicate with each other to minimize their joint description cost. In this work we tackle the related problem of compressing a source that is correlated with another source which is however available only at the decoder. In contrast to prior information-theoretic approaches, we introduce a new constructive and practical framework for tackling the problem based on the judicious incorporation of channel coding principles into this source coding problem. We dub our approach DIstributed Source Coding Using Syndromes (DISCUS). We focus in this paper on trellis-structured constructions of the framework to illustrate its utility. Simulation results confirm the power of DISCUS, opening up a new and exciting constructive playing ground for the distributed source coding problem. For the distributed coding of correlated i.i.d. Gaussian sources that are ...
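A minimal sketch of the syndrome-binning idea, using the length-3 repetition code rather than the paper's trellis-structured codes (an illustrative construction, not DISCUS itself): the encoder sends only the 2-bit syndrome of a 3-bit word X, and the decoder recovers X from the syndrome plus side information Y that differs from X in at most one position.

```python
from itertools import product

# Toy syndrome binning with the length-3 repetition code {000, 111}.
# Rate saving: 2 syndrome bits are transmitted instead of 3 data bits.

H = [(1, 1, 0), (0, 1, 1)]  # parity-check matrix of the repetition code

def syndrome(x):
    # Syndrome s = H x (mod 2) identifies the coset containing x.
    return tuple(sum(h[i] * x[i] for i in range(3)) % 2 for h in H)

def decode(s, y):
    # Pick the member of coset s closest in Hamming distance to the
    # side information y; coset members are distance 3 apart, so the
    # choice is unique whenever y is within one bit flip of x.
    return min((x for x in product((0, 1), repeat=3) if syndrome(x) == s),
               key=lambda x: sum(a != b for a, b in zip(x, y)))

x = (1, 0, 1)
y = (1, 1, 1)                  # side information: one bit flipped
print(decode(syndrome(x), y))  # → (1, 0, 1), i.e. x is recovered
```

Each coset of the repetition code contains two words at Hamming distance 3, so side information within distance 1 always selects the right one; the trellis codes in the paper scale this idea to long blocks.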
Nested Linear/Lattice Codes for Structured Multiterminal Binning
, 2002
"... Network information theory promises high gains over simple pointtopoint communication techniques, at the cost of higher complexity. However, lack of structured coding schemes limited the practical application of these concepts so far. One of the basic elements of a network code is the binning sch ..."
Abstract

Cited by 352 (15 self)
Network information theory promises high gains over simple point-to-point communication techniques, at the cost of higher complexity. However, the lack of structured coding schemes has so far limited the practical application of these concepts. One of the basic elements of a network code is the binning scheme. Wyner and other researchers proposed various forms of coset codes for efficient binning, yet these schemes were applicable only for lossless source (or noiseless channel) network coding. To extend the algebraic binning approach to lossy source (or noisy channel) network coding, recent work proposed the idea of nested codes, or more specifically, nested parity-check codes for the binary case and nested lattices in the continuous case. These ideas connect network information theory with the rich areas of linear codes and lattice codes, and have strong potential for practical applications. We review these recent developments and explore their tight relation to concepts such as combined shaping and precoding, coding for memories with defects, and digital watermarking. We also propose a few novel applications adhering to a unified approach.
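A one-dimensional toy version of nested-lattice binning (the paper treats general lattices; the lattices ℤ and 4ℤ and the tolerances below are illustrative assumptions): the encoder quantizes to the fine lattice and transmits only the coset index modulo the coarse lattice, and the decoder resolves the coset using side information.

```python
# Toy nested-lattice binning in one dimension.
# Fine lattice: the integers Z. Coarse lattice: 4Z.
# The encoder sends 2 bits (the coset index mod 4) instead of the
# full quantized value; the decoder recovers the value from side
# information y assumed to lie within distance < 2 of x.

def encode(x):
    q = round(x)   # quantize to the fine lattice
    return q % 4   # coset index within the coarse lattice

def decode(index, y):
    # Choose the coset member nearest the side information y.
    base = round(y)
    candidates = [base + k for k in range(-4, 5) if (base + k) % 4 == index]
    return min(candidates, key=lambda c: abs(c - y))

x = 10.3
y = 11.0                      # side information close to x
print(decode(encode(x), y))   # → 10, the fine-lattice point for x
```

The coarse-lattice spacing (4) must exceed twice the combined quantization and side-information error for the decoder to select the correct coset member; the nested parity-check codes mentioned above play the same role in the binary case.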
Distributed video coding
 Proc. of the IEEE 93 (2005) 71–83
, 2005
"... Distributed coding is a new paradigm for video compression, ..."
Abstract

Cited by 311 (11 self)
Distributed coding is a new paradigm for video compression,
Common Randomness in Information Theory and Cryptography Part II: CR capacity
 IEEE Trans. Inform. Theory
, 1993
"... The CR capacity of a twoteminal model is defined as the maximum rate of common randomness that the terminals can generate using resources specified by the given model. We determine CR capacity for several models, including those whose statistics depend on unknown parameters. The CR capacity is show ..."
Abstract

Cited by 311 (13 self)
The CR capacity of a two-terminal model is defined as the maximum rate of common randomness that the terminals can generate using resources specified by the given model. We determine CR capacity for several models, including those whose statistics depend on unknown parameters. The CR capacity is shown to be achievable robustly, by common randomness of nearly uniform distribution no matter what the unknown parameters are. Our CR capacity results are relevant for the problem of identification capacity, and also yield a new result on the regular (transmission) capacity of arbitrarily varying channels with feedback.
Key words: common randomness, identification capacity, correlated sources, arbitrarily varying channel, feedback, randomization.
I. Csiszár was partially supported by the Hungarian National Foundation for Scientific Research, Grant T16386.
1 Introduction. Suppose two terminals, called Terminal X and Terminal Y, have resources such as access to side information and communica...
Sum Capacity of a Gaussian Vector Broadcast Channel
 IEEE Trans. Inform. Theory
, 2002
"... This paper characterizes the sum capacity of a class of nondegraded Gaussian vectB broadcast channels where a singletransmitter with multiple transmit terminals sends independent information to multiple receivers. Coordinat+[ is allowed among the transmit teminals, but not among the different recei ..."
Abstract

Cited by 280 (21 self)
This paper characterizes the sum capacity of a class of non-degraded Gaussian vector broadcast channels where a single transmitter with multiple transmit terminals sends independent information to multiple receivers. Coordination is allowed among the transmit terminals, but not among the different receivers. The sum capacity is shown to be a saddle point of a Gaussian mutual information game, where a signal player chooses a transmit covariance matrix to maximize the mutual information, and a noise player chooses a fictitious noise correlation to minimize the mutual information. This result holds for the class of Gaussian channels whose saddle point satisfies a full-rank condition. Further, the sum capacity is achieved using a precoding method for Gaussian channels with additive side information noncausally known at the transmitter. The optimal precoding structure is shown to correspond to a decision-feedback equalizer that decomposes the broadcast channel into a series of single-user channels with interference pre-subtracted at the transmitter.