Compute-and-forward: Harnessing interference through structured codes
 IEEE Trans. Inf. Theory, 2009
The approximate capacity of the many-to-one and one-to-many Gaussian interference channels
 in Proc. Allerton Conf. Commun. Control Comput., 2007
Cited by 137 (9 self)
Abstract: "... region of the two-user Gaussian interference channel to within 1 bit/s/Hz. A natural goal is to apply this approach to the Gaussian interference channel with an arbitrary number of users. We make progress towards this goal by finding the capacity region of the many-to-one and one-to-many Gaussian interference channels to within a constant number of bits. The result makes use of a deterministic model to provide insight into the Gaussian channel. The deterministic model makes explicit the dimension of signal level. A central theme emerges: the use of lattice codes for alignment of interfering signals on the signal level. Index Terms: capacity, interference alignment, interference channel, lattice codes, multiuser channels."
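The "alignment on the signal level" theme in this abstract can be illustrated with a toy binary deterministic model (an illustrative assumption; the paper's actual analysis is for Gaussian channels): if every interferer uses the same linear code, the XOR of their codewords is itself a codeword, so the aggregate interference occupies only one codebook's worth of sequences no matter how many interferers combine.

```python
# Toy sketch of lattice/linear-code alignment (hypothetical binary model,
# not the paper's Gaussian derivation): the sum of codewords from one linear
# code stays inside that code, so aligned interference "costs" only one
# codebook of signal levels.
from itertools import product

# A tiny (4,2) binary linear code: codewords are GF(2) combinations of G's rows.
G = [[1, 0, 1, 1],
     [0, 1, 0, 1]]

def encode(bits):
    # Codeword = bits[0]*G[0] + bits[1]*G[1] over GF(2).
    return [(bits[0] * g0 + bits[1] * g1) % 2 for g0, g1 in zip(G[0], G[1])]

codebook = {tuple(encode(m)) for m in product([0, 1], repeat=2)}

# Any two interferers' codewords XOR to another codeword (linearity).
for m1 in product([0, 1], repeat=2):
    for m2 in product([0, 1], repeat=2):
        agg = tuple(a ^ b for a, b in zip(encode(m1), encode(m2)))
        assert agg in codebook  # aggregate interference looks like one codeword

print(len(codebook))  # the aligned interference lives in just these 4 sequences
```

With independently drawn random codebooks, by contrast, the set of possible interference sums would generally grow with the number of interferers.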
The case for structured random codes in network capacity theorems
 in Proc. IEEE Information Theory Workshop (ITW 2007), Lake Tahoe, CA, 2007
Cited by 54 (10 self)
Abstract: Random coding arguments are the backbone of most channel capacity achievability proofs. In this paper, we show that in their standard form, such arguments are insufficient for proving some network capacity theorems: structured coding arguments, such as random linear or lattice codes, attain higher rates. Historically, structured codes have been studied as a stepping stone to practical constructions. However, Körner and Marton demonstrated their usefulness for capacity theorems through the derivation of the optimal rate region of a distributed functional source coding problem. Here, we use multicasting over finite-field and Gaussian multiple-access networks as canonical examples to demonstrate that even if we want to send bits over a network, structured codes succeed where simple random codes fail. Beyond network coding, we also consider distributed computation over noisy channels and a special relay-type problem.
Lattices are Everywhere
Cited by 40 (3 self)
Abstract: As bees and crystals (and people selling oranges in the market) have known for many years, lattices provide efficient structures for packing, covering, quantization and channel coding. In recent years, interesting links have been found between lattices and coding schemes for multi-terminal networks. This tutorial paper covers close to 20 years of my research in the area: enjoying the beauty of lattice codes, and discovering their power in dithered quantization, dirty paper coding, Wyner-Ziv DPCM, modulo-lattice modulation, distributed interference cancelation, and more.
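One of the lattice tools named in this abstract, dithered quantization, admits a compact demonstration (parameters here are illustrative, not taken from the paper): subtracting a uniform dither before rounding and adding it back afterward confines the quantization error to one cell and makes it statistically independent of the input.

```python
# Toy demo of subtractive dithered uniform quantization (a one-dimensional
# lattice quantizer; step size d is an assumed illustrative parameter).
import random

d = 0.5  # quantizer step

def dithered_quantize(x, u):
    q = round((x - u) / d) * d   # uniform (lattice) quantizer applied to x - u
    return q + u                 # add the dither back at the reconstruction

random.seed(0)
errors = []
for _ in range(10000):
    x = random.uniform(-3, 3)            # arbitrary source sample
    u = random.uniform(-d / 2, d / 2)    # dither shared by encoder and decoder
    errors.append(dithered_quantize(x, u) - x)

# The error d*round((x-u)/d) - (x-u) always lies in [-d/2, d/2],
# and with uniform dither it is uniform there, independent of x.
assert all(-d / 2 - 1e-9 <= e <= d / 2 + 1e-9 for e in errors)
print(min(errors), max(errors))
```

This input-independence of the error is what lets dithered lattice quantizers be analyzed as if they were additive-noise channels.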
Nested lattice codes for Gaussian relay networks with interference
 IEEE Trans. Inf. Theory, 2011
Compute-and-forward: Harnessing interference with structured codes
 in Proc. IEEE International Symposium on Information Theory (ISIT 2008), 2008
Cited by 23 (4 self)
Abstract: For a centralized encoder and decoder, a channel matrix is simply a set of linear equations that can be transformed into parallel channels. We develop a similar approach to multiuser networks: we view interference as creating linear equations of codewords, and a receiver's goal is to collect a full-rank set of such equations. Our new relaying technique, compute-and-forward, uses structured codes to reliably compute functions over channels. This allows the relays to efficiently recover linear functions of codewords without recovering the individual codewords. Thus, our scheme can work with the structure of the interference while removing the effects of the noise at the relay. We apply our scheme to a Gaussian relay network with interference and achieve better rates than either compress-and-forward or decode-and-forward in certain regimes. Traditionally, structured codes have been studied for their ...
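The "full-rank set of equations" idea in this abstract can be sketched with a toy finite-field model (an assumption made here for illustration; the paper itself works over Gaussian channels with lattice codes): each relay decodes one linear combination of the transmitted messages, and the destination inverts the resulting system.

```python
# Toy compute-and-forward sketch over GF(5) (hypothetical symbols and
# coefficients, not the paper's Gaussian scheme): relays forward linear
# equations of the messages; the destination solves the full-rank system.
p = 5  # prime field size (illustrative)

def mod_inv(a, p):
    return pow(a, p - 2, p)  # Fermat inverse in GF(p)

# Transmitters' messages (symbols in GF(5)).
w1, w2 = 3, 4

# Coefficient matrix of the equations the relays decode; it must be full
# rank over GF(5) for the destination to recover w1 and w2.
a11, a12 = 1, 1
a21, a22 = 1, 2

# Thanks to the code's linear structure, each relay can decode its
# combination directly, without decoding w1 or w2 individually.
eq1 = (a11 * w1 + a12 * w2) % p
eq2 = (a21 * w1 + a22 * w2) % p

# Destination: invert the 2x2 system over GF(5) via Cramer's rule.
det = (a11 * a22 - a12 * a21) % p
det_inv = mod_inv(det, p)
r1 = (det_inv * (a22 * eq1 - a12 * eq2)) % p
r2 = (det_inv * (a11 * eq2 - a21 * eq1)) % p

print(r1, r2)  # recovers (3, 4)
```

The Gaussian version replaces GF(p) arithmetic with nested lattice codes, whose integer-combination property plays the role of the field's linearity.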
The Rate Loss of Single-Letter Characterization: The "Dirty" Multiple Access Channel
Cited by 21 (2 self)
Abstract: For general memoryless systems, the typical information-theoretic solution, when it exists, has a "single-letter" form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme), generated using independent and identically distributed copies of some single-letter distribution. Is that the form of the solution of any (information-theoretic) problem? In fact, some counterexamples are known. The most famous is the "two-help-one" problem: Körner and Marton showed that if we want to decode the modulo-two sum of two binary sources from their independent encodings, then linear coding is better than random coding. In this paper we provide another counterexample, the "doubly-dirty" multiple access channel (MAC). Like the Körner-Marton problem, this is a multi-terminal scenario where side information is distributed among several terminals; each transmitter knows part of the channel interference, but the receiver is not aware of any part of it. We give an explicit solution for the capacity region of a binary version of the doubly-dirty MAC, demonstrate how the capacity region can be approached using a linear coding scheme, and prove that the "best known single-letter region" is strictly contained in it. We also state a conjecture regarding a similar rate loss of single-letter characterization in the Gaussian case.
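The Körner-Marton "two-help-one" scheme mentioned in this abstract has a short concrete illustration (the block length, code, and correlation model below are assumed for the example): both encoders apply the same parity-check matrix, and the decoder recovers the modulo-two sum from the XOR of the two syndromes alone.

```python
# Toy Körner-Marton sketch (illustrative parameters): x and y are correlated
# 7-bit blocks whose XOR z = x ^ y has Hamming weight at most 1. Both encoders
# send only the 3-bit syndrome H*v of their block, using the SAME parity-check
# matrix H of the (7,4) Hamming code. Since syndromes are linear, XORing them
# gives H*z, and a weight-<=1 pattern is uniquely determined by its syndrome.
# Neither x nor y is ever decoded individually.

# Columns of H are the binary expansions of 1..7 (Hamming code convention).
H = [[(col >> bit) & 1 for col in range(1, 8)] for bit in range(3)]

def syndrome(vec):
    return tuple(sum(h * v for h, v in zip(row, vec)) % 2 for row in H)

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]   # differs from x in exactly one position

# Decoder: XOR the two received syndromes to obtain syndrome(x ^ y).
s = tuple(a ^ b for a, b in zip(syndrome(x), syndrome(y)))

# Look up the unique weight-<=1 pattern with that syndrome.
z = [0] * 7
if s != (0, 0, 0):
    pos = sum(bit << i for i, bit in enumerate(s)) - 1  # offending column
    z[pos] = 1

assert z == [a ^ b for a, b in zip(x, y)]
print(z)
```

Each encoder sends 3 bits instead of 7, which is the kind of rate gain over unstructured random binning that the Körner-Marton result formalizes.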
Multiple access channels with states causally known at transmitters
 submitted to IEEE Trans. Inf. Theory, November 2010; available online at http://arxiv.org/abs/1011.6639
Cited by 21 (3 self)
Abstract: It has been recently shown by Lapidoth and Steinberg that strictly causal state information can be beneficial in multiple access channels (MACs). Specifically, it was proved that the capacity region of a two-user MAC with independent states, each known strictly causally to one encoder, can be enlarged by letting the encoders send compressed past state information to the decoder. In this study, a generalization of that strategy is proposed whereby the encoders also compress the past transmitted codewords along with the past state sequences. The proposed scheme uses a combination of long-message encoding, compression of the past state sequences and codewords without binning, and joint decoding over all transmission blocks. The proposed strategy has been recently shown by Lapidoth and Steinberg to strictly improve upon the original one. Capacity results are then derived for a class of channels that includes two-user modulo-additive state-dependent MACs. Moreover, the proposed scheme is extended to state-dependent MACs with an arbitrary number of users. Finally, output feedback is introduced, and an example is provided to illustrate the interplay between feedback and the availability of strictly causal state information in enlarging the capacity region. Index Terms: long-message encoding, multiple access channels (MACs), output feedback, quantize-forward, state-dependent channels, strictly causal state information.
Distributed Source Coding using Abelian Group Codes: Extracting Performance from Structure
Cited by 20 (5 self)
Abstract: In this work, we consider a distributed source coding problem with a joint distortion criterion depending on the sources and the reconstruction. This includes as special cases the problem of computing a function of the sources to within some distortion, and also the classic Slepian-Wolf problem [12], Berger-Tung problem [5], Wyner-Ziv problem [4], Yeung-Berger problem [6] and the Ahlswede-Körner-Wyner problem [3], [13]. While the prevalent trend in information theory has been to prove achievability results using Shannon's random coding arguments, structured random codes offer rate gains over unstructured random codes for many problems. Motivated by this, we present a new achievable rate-distortion region (an inner bound to the performance limit) for this problem for discrete memoryless sources, based on "good" structured random nested codes built over abelian groups. We demonstrate rate gains for this problem over traditional coding schemes using random unstructured codes. For certain sources and distortion functions, the new rate region is strictly bigger than the Berger-Tung rate region, which has been the best known achievable rate region for this problem until now. Further, there is no known unstructured random coding scheme that achieves these rate gains. Achievable performance limits for single-user source coding using abelian group codes are also obtained as part of the proof of the main coding theorem. As a corollary, we also prove that nested linear codes achieve the Shannon rate-distortion bound in the single-user setting. Note that while group codes retain some structure, they are more general than linear codes, which can be built only over finite fields, and fields exist only for certain sizes.
The finite-dimensional Witsenhausen counterexample
 2009
Cited by 13 (8 self)
Abstract: Recently, we considered a vector version of Witsenhausen's counterexample and connected the problem formulation to an information-theoretic problem called "assisted interference suppression" that was itself inspired by work on the so-called "cognitive radio channel." We used a new lower bound to show that in the limit of infinite vector length, certain quantization-based strategies are provably within a constant factor of the optimal cost for all possible problem parameters. In this paper, finite vector lengths are considered, with the vector length viewed as an additional problem parameter. By applying the "sphere-packing" philosophy, a lower bound to the optimal cost for this finite-length problem is derived that uses appropriate shadows of the infinite-length bounds. We also introduce lattice-based quantization strategies for any finite length. Using the new finite-length lower bound, we show that the lattice-based strategies achieve within a constant factor of the optimal cost uniformly over all possible problem parameters, including the vector length. For Witsenhausen's original problem, which corresponds to the scalar case, lattice-based strategies attain within a factor of 8 of the optimal cost. Based on observations in the scalar case and the infinite-dimensional case, we also conjecture what the optimal strategies could be for any finite vector length.