Results 11–20 of 125
A Layered Lattice Coding Scheme for a Class of Three User Gaussian Interference Channels
Allerton Conf. on Communication, Control, and Computing, 2008
Cited by 45 (4 self)
Abstract—The paper studies a class of three-user Gaussian interference channels. A new layered lattice coding scheme is introduced as a transmission strategy. The use of lattice codes allows for an “alignment” of the interference observed at each receiver. The layered lattice coding scheme is shown to achieve more than one degree of freedom for a class of interference channels, and also achieves rates better than those obtained using the Han–Kobayashi coding scheme.
Lattices for distributed source coding: Jointly Gaussian sources and reconstruction of a linear function
IEEE Transactions on Information Theory, submitted, 2007
Cited by 45 (2 self)
Consider a pair of correlated Gaussian sources (X1, X2). Two separate encoders observe the two components and communicate compressed versions of their observations to a common decoder. The decoder is interested in reconstructing a linear combination of X1 and X2 to within a mean-square distortion of D. We obtain an inner bound to the optimal rate–distortion region for this problem. A portion of this inner bound is achieved by a scheme that reconstructs the linear function directly, rather than first reconstructing the individual components X1 and X2. This results in a better rate region for certain parameter values. Our coding scheme relies on lattice coding techniques, in contrast to the more prevalent random coding arguments used to demonstrate achievable rate regions in information theory. We then consider the case of linear reconstruction of K sources and provide an inner bound to the optimal rate–distortion region. Some parts of the inner bound are achieved using the following coding structure: lattice vector quantization followed by “correlated” lattice-structured binning.
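The advantage of reconstructing the linear function directly, rather than the components, can be seen even with plain scalar quantizers. The sketch below is a toy illustration only: it uses uniform quantizers rather than the paper's lattice binning, and the correlation value and quantizer ranges are arbitrary assumed choices. Spending a fixed bit budget on quantizing Z = X1 + X2 directly beats quantizing X1 and X2 separately and adding the reconstructions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
rho = 0.9  # assumed correlation, chosen only for illustration

# Jointly Gaussian sources with correlation rho; the decoder wants Z = X1 + X2.
x1, x2 = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T
z = x1 + x2

def quantize(v, bits, lo, hi):
    """Uniform scalar quantizer with 2**bits levels on [lo, hi]."""
    step = (hi - lo) / 2 ** bits
    idx = np.clip(np.floor((v - lo) / step), 0, 2 ** bits - 1)
    return lo + (idx + 0.5) * step

# Scheme A: 4 bits per component (8 bits total), then add the reconstructions.
mse_components = np.mean((quantize(x1, 4, -4, 4) + quantize(x2, 4, -4, 4) - z) ** 2)

# Scheme B: spend the same 8 bits quantizing the function Z directly.
mse_direct = np.mean((quantize(z, 8, -6, 6) - z) ** 2)

print(mse_components, mse_direct)  # direct reconstruction does much better here
```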
Capacity of Symmetric K-User Gaussian Very Strong Interference Channels
In IEEE Global Telecommunications Conference, 2008
Cited by 43 (11 self)
Abstract—This paper studies a symmetric K-user Gaussian interference channel with K transmitters and K receivers. A “very strong” interference regime is derived for this channel setup. A “very strong” interference regime is one where the capacity region of the interference channel is the same as the capacity region of the channel with no interference. In this regime, the interference can be perfectly canceled by all the receivers without incurring any rate penalties. A “very strong” interference condition for an example symmetric K-user deterministic interference channel is also presented.
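The statement that interference can be perfectly canceled has a simple operational reading: when the interference is strong enough, a receiver can decode it first, subtract it, and then decode its own signal over an effectively clean channel. A minimal sketch with BPSK symbols follows; the interference gain and noise level are arbitrary assumed values, not conditions taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
g = 10.0  # "very strong" interference gain (assumed toy value)

x = rng.choice([-1.0, 1.0], n)           # desired BPSK signal
w = rng.choice([-1.0, 1.0], n)           # interfering BPSK signal
y = x + g * w + rng.normal(0.0, 0.3, n)  # received signal

# Decode the dominant interference first, subtract it, then decode the
# desired signal as if the channel were interference-free.
w_hat = np.sign(y)
x_hat = np.sign(y - g * w_hat)

print(np.mean(w_hat == w), np.mean(x_hat == x))  # both near 1.0
```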
Lattices are Everywhere
Cited by 40 (3 self)
As bees and crystals (and people selling oranges in the market) have known for many years, lattices provide efficient structures for packing, covering, quantization, and channel coding. In recent years, interesting links have been found between lattices and coding schemes for multi-terminal networks. This tutorial paper covers close to 20 years of my research in the area: enjoying the beauty of lattice codes, and discovering their power in dithered quantization, dirty-paper coding, Wyner–Ziv DPCM, modulo-lattice modulation, distributed interference cancellation, and more.
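Of the techniques listed, dithered quantization is the easiest to demonstrate in a few lines. With subtractive dither drawn uniformly over one quantizer cell, the reconstruction error stays confined to that cell and is statistically independent of the input, which is what makes the additive-noise model of quantization exact. A small sketch (the step size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
step = 0.5                                    # quantizer step, arbitrary choice
x = rng.normal(0.0, 1.0, 100_000)
d = rng.uniform(-step / 2, step / 2, x.size)  # dither known to encoder and decoder

# Subtractive dithered quantization: quantize x + d, then subtract d.
y = step * np.round((x + d) / step) - d
err = y - x

# The error is uniform on [-step/2, step/2] and independent of the input.
print(float(err.min()), float(err.max()), float(np.corrcoef(err, x)[0, 1]))
```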
Superposition coding for side-information channels
Cited by 31 (3 self)
Abstract—We present simple, practical codes designed for the binary and Gaussian dirty-paper channels. We show that the dirty-paper decoding problem can be transformed into an equivalent multiple-access decoding problem, to which we apply superposition coding. Our concept is a generalization of the nested-lattices approach of Zamir, Shamai, and Erez. In a theoretical setting, our constructions are capable of achieving capacity using random component codes and maximum-likelihood decoding. We also present practical implementations of the constructions, and simulation results for both dirty-paper channels. Our results for the Gaussian dirty-paper channel are on par with the best known results for nested lattices. We discuss the binary dirty-tape channel, for which we present a simple, effective coding technique. Finally, we propose a framework for extending our approach to general Gel’fand–Pinsker channels. Index Terms—Dirty paper, dirty tape, multiple-access channel (MAC), side information, superposition coding.
Nested lattice codes for Gaussian relay networks with interference
IEEE Trans. Inf. Theory, 2011
The Gaussian interference relay channel: Improved achievable rates and sum rate upper bounds using a potent relay
IEEE Transactions on Information Theory, Special Issue on Interference Networks, 2011
Cited by 23 (5 self)
Abstract—We consider the Gaussian interference channel with an intermediate relay as a main building block for cooperative interference networks. On the achievability side, we consider compress-and-forward based strategies. Specifically, a generalized compress-and-forward strategy, where the destinations jointly decode the compression indices and the source messages, is shown to improve upon the compress-and-forward strategy that sequentially decodes the compression indices and source messages, as well as upon the recently proposed generalized hash-and-forward strategy. We also construct a nested-lattice-code based compute-and-forward relaying scheme, which outperforms other relaying schemes when the direct link is weak. In this case, it is shown that, with a relay, the interference link can be useful for decoding the source messages. Noting the need for upper-bounding the capacity of this channel, we propose a new technique with which the sum rate can be bounded. In particular, the sum capacity is upper-bounded by considering the channel when the relay node has abundant power; the relay is named potent for that reason. For the Gaussian interference relay channel (GIFRC) with a potent relay, we study the strong and weak interference regimes and establish the sum capacity, which in turn serves as an upper bound for the sum capacity of the GIFRC with finite relay power. Numerical results demonstrate that these upper bounds are tighter than the cut-set bound, and coincide with known achievable sum rates for many scenarios of interest. Additionally, the degrees of freedom of the GIFRC are shown to be 2 when the relay has large power, achievable using compress-and-forward. Index Terms—Generalized compress-and-forward, interference relay channel, lattice codes, potent-relay outer bound, sum capacity.
Joint Source Channel Coding with Side Information Using Hybrid Digital Analog Codes
Cited by 23 (1 self)
Abstract—We study the joint source-channel coding problem of transmitting an analog source over a Gaussian channel in two cases: (i) in the presence of interference known only to the transmitter, and (ii) in the presence of side information known only to the receiver. We introduce hybrid digital-analog forms of the Costa and Wyner–Ziv coding schemes. Our schemes are based on random coding arguments and differ from the nested-lattice schemes of Kochman and Zamir, which use dithered quantization. We also discuss superimposed digital and analog schemes for the above problems, which show that there are infinitely many schemes achieving the optimal distortion for these problems. This provides an extension of the schemes of Bross et al. to the interference/side-information case.
Compute-and-forward: Harnessing interference with structured codes
In Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2008
Cited by 23 (4 self)
Abstract—For a centralized encoder and decoder, a channel matrix is simply a set of linear equations that can be transformed into parallel channels. We develop a similar approach to multiuser networks: we view interference as creating linear equations of codewords, and a receiver’s goal is to collect a full-rank set of such equations. Our new relaying technique, compute-and-forward, uses structured codes to reliably compute functions over channels. This allows the relays to efficiently recover linear functions of codewords without recovering the individual codewords. Thus, our scheme can work with the structure of the interference while removing the effects of the noise at the relay. We apply our scheme to a Gaussian relay network with interference and achieve better rates than either compress-and-forward or decode-and-forward in certain regimes.
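The algebraic core of compute-and-forward is that a linear combination of codewords from the same linear code is itself a codeword, so a relay can decode the combination directly. A minimal noiseless sketch over Z_q follows; the field size, coefficients, and block length are arbitrary toy choices, and the actual scheme uses nested lattices and handles channel noise.

```python
import numpy as np

rng = np.random.default_rng(0)
q, n = 11, 8  # toy field size and block length (assumed values)

# Two transmitters send codewords from the same linear code over Z_q
# (here, for simplicity, the whole space Z_q^n).
x1 = rng.integers(0, q, n)
x2 = rng.integers(0, q, n)

# The channel hands the relay an integer combination of the codewords;
# noise is omitted to isolate the algebraic point.
a1, a2 = 2, 3
y = a1 * x1 + a2 * x2

# By linearity, reducing mod q yields the desired equation of codewords
# directly -- the relay never recovers x1 or x2 individually.
equation = y % q
print(equation)
```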
Secret key generation for correlated Gaussian sources
IEEE Trans. Inform. Theory, 2012
Cited by 18 (3 self)
Abstract—Secret key generation by multiple terminals is considered, based on their observations of jointly distributed Gaussian signals followed by public communication among themselves. Exploiting an inherent connection between secrecy generation and lossy data compression, two main contributions are made. The first is a characterization of strong secret key capacity, which entails a converse proof technique valid for real-valued (and not necessarily Gaussian) as well as finite-valued signals. The capacity formula acquires a simple form when the terminals observe “symmetrically correlated” jointly Gaussian signals. For the latter setup with two terminals, considering schemes that involve quantization at one terminal, the best achievable secret key rate is characterized as a function of the quantization rate; secret key capacity is attained as the quantization rate tends to infinity. Structured codes are shown to attain the optimum tradeoff between secret key rate and quantization rate, constituting our second main contribution. Index Terms—Linear code, multi-terminal Gaussian source model, nested lattice code, public communication, quantization, secret key capacity, strong secrecy.