Results 1–10 of 79
The secrecy capacity of the MIMO wiretap channel
, 2008
"... We consider the MIMO wiretap channel, that is a MIMO broadcast channel where the transmitter sends some confidential information to one user which is a legitimate receiver, while the other user is an eavesdropper. Perfect secrecy is achieved when the the transmitter and the legitimate receiver can c ..."
Abstract

Cited by 166 (1 self)
We consider the MIMO wiretap channel, that is, a MIMO broadcast channel in which the transmitter sends confidential information to one user, the legitimate receiver, while the other user is an eavesdropper. Perfect secrecy is achieved when the transmitter and the legitimate receiver can communicate at some positive rate while ensuring that the eavesdropper gets zero bits of information. In this paper, we compute the perfect secrecy capacity of the multiple-antenna MIMO broadcast channel, where the number of antennas is arbitrary for both the transmitter and the two receivers.
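As an illustrative special case (not the paper's MIMO result), the scalar Gaussian wiretap channel has secrecy capacity equal to the gap between the two point-to-point capacities, clipped at zero. A minimal sketch, with the function name and SNR values chosen for illustration:

```python
import math

def scalar_secrecy_capacity(snr_main: float, snr_eve: float) -> float:
    """Secrecy capacity (bits/channel use) of the scalar Gaussian
    wiretap channel: the gap between the legitimate receiver's and
    the eavesdropper's point-to-point capacities, clipped at zero."""
    gap = 0.5 * math.log2(1.0 + snr_main) - 0.5 * math.log2(1.0 + snr_eve)
    return max(gap, 0.0)

# A positive secrecy rate exists only when the legitimate channel is stronger.
print(scalar_secrecy_capacity(15.0, 3.0))   # legitimate receiver at higher SNR
print(scalar_secrecy_capacity(3.0, 15.0))   # eavesdropper stronger -> 0.0
```

The MIMO version considered in the paper replaces the two scalar SNRs by channel matrices and optimizes over the input covariance.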
Capacity bounds for the Gaussian interference channel
 IEEE Trans. Inf. Theory
"... The capacity region of the twouser Gaussian Interference Channel (IC) is studied. Three classes of channels are considered: weak, onesided, and mixed Gaussian ICs. For the weak Gaussian IC, a new outer bound on the capacity region is obtained that outperforms previously known outer bounds. The cha ..."
Abstract

Cited by 158 (6 self)
The capacity region of the two-user Gaussian Interference Channel (IC) is studied. Three classes of channels are considered: weak, one-sided, and mixed Gaussian ICs. For the weak Gaussian IC, a new outer bound on the capacity region is obtained that outperforms previously known outer bounds. The channel sum capacity for a certain range of the channel parameters is derived. It is shown that when Gaussian codebooks are used, the full Han–Kobayashi achievable rate region can be obtained by using the naive Han–Kobayashi achievable scheme over three frequency bands (equivalently, three subspaces). For the one-sided Gaussian IC, a new proof of Sato's outer bound is presented. We derive the full Han–Kobayashi achievable rate region when Gaussian codebooks are utilized. For the mixed Gaussian IC, a new outer bound is obtained that again outperforms previously known outer bounds. For this case, the channel sum capacity for all ranges of parameters is derived. It is proved that the full Han–Kobayashi achievable rate region using Gaussian codebooks is equivalent to that of the one-sided Gaussian IC for a particular range of the channel gains.
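For orientation, the standard-form channel model underlying such bounds can be sketched as follows (the usual normalization, not the paper's own notation):

```latex
\begin{align}
Y_1 &= X_1 + \sqrt{a}\,X_2 + Z_1,\\
Y_2 &= \sqrt{b}\,X_1 + X_2 + Z_2,
\end{align}
% Z_1, Z_2 ~ N(0,1) i.i.d., power constraints E[X_i^2] <= P_i.
% Weak IC: 0 < a, b < 1;  one-sided IC: a = 0 or b = 0;
% mixed IC: one cross gain below 1 and the other at least 1.
```

The three classes in the abstract correspond to the indicated ranges of the cross gains a and b.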
Gaussian interference network: Sum capacity . . .
, 2008
"... Establishing the capacity region of a Gaussian interference network is an open problem in information theory. Recent progress on this problem has led to the characterization of the capacity region of a general two user Gaussian interference channel within one bit. In this paper, we develop new, impr ..."
Abstract

Cited by 133 (5 self)
Establishing the capacity region of a Gaussian interference network is an open problem in information theory. Recent progress on this problem has led to the characterization of the capacity region of a general two-user Gaussian interference channel within one bit. In this paper, we develop new, improved outer bounds on the capacity region. Using these bounds, we show that treating interference as noise achieves the sum capacity of the two-user Gaussian interference channel in a low-interference regime, where the interference parameters are below certain thresholds. We then generalize our techniques and results to Gaussian interference networks with more than two users. In particular, we demonstrate that the total interference threshold, below which treating interference as noise achieves the sum capacity, increases with the number of users.
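The rate that "treating interference as noise" achieves is easy to write down. A small sketch under illustrative names (`snr` and `inr` are the per-receiver signal- and interference-to-noise ratios), showing the sum rate that the paper proves optimal in the low-interference regime:

```python
import math

def tin_sum_rate(snr, inr):
    """Sum rate (bits/channel use) when every receiver decodes its own
    signal while treating all interference as extra Gaussian noise."""
    return sum(0.5 * math.log2(1.0 + s / (1.0 + i))
               for s, i in zip(snr, inr))

# Two-user example: strong direct links, weak cross links.
print(tin_sum_rate(snr=[20.0, 20.0], inr=[0.1, 0.1]))
```

The paper's contribution is showing when this simple strategy is sum-capacity optimal, not the rate expression itself.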
The Secrecy Capacity Region of the Gaussian MIMO Multi-Receiver Wiretap Channel
, 2009
"... In this paper, we consider the Gaussian multipleinput multipleoutput (MIMO) multireceiver wiretap channel in which a transmitter wants to have confidential communication with an arbitrary number of users in the presence of an external eavesdropper. We derive the secrecy capacity region of this ch ..."
Abstract

Cited by 70 (23 self)
In this paper, we consider the Gaussian multiple-input multiple-output (MIMO) multi-receiver wiretap channel, in which a transmitter wants to have confidential communication with an arbitrary number of users in the presence of an external eavesdropper. We derive the secrecy capacity region of this channel for the most general case. We first show that, even for the single-input single-output (SISO) case, existing converse techniques for the Gaussian scalar broadcast channel cannot be extended to this secrecy context, which emphasizes the need for a new proof technique. Our new proof technique makes use of the relationships between the minimum mean-square error and the mutual information, and, equivalently, the relationships between the Fisher information and the differential entropy. Using the intuition gained from the converse proof of the SISO channel, we first prove the secrecy capacity region of the degraded MIMO channel, in which all receivers have the same number of antennas and the noise covariance matrices can be arranged according to a positive semidefinite order. We then generalize this result to the aligned case, in which all receivers have the same number of antennas but there is no order among the noise covariance matrices. We accomplish this task by using the channel enhancement technique. Finally, we find the secrecy capacity region of the general MIMO channel by using some limiting arguments on the secrecy capacity region of the aligned MIMO channel. We show that the capacity-achieving coding scheme is a variant of dirty-paper coding with Gaussian signals.
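The Fisher-information/differential-entropy link the proof leans on can be stated compactly; in scalar form (a standard statement, not quoted from the paper), de Bruijn's identity reads:

```latex
% de Bruijn's identity: Z ~ N(0,1) independent of X,
% J(.) the Fisher information of the indicated random variable.
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right), \qquad t > 0 .
```

Differentiating entropy along a Gaussian perturbation path is exactly what lets estimation-theoretic quantities enter the converse.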
A note on the secrecy capacity of the multi-antenna wiretap channel, arXiv preprint arXiv:0710.4105
, 2007
"... Recently, the secrecy capacity of the multiantenna wiretap channel was characterized by Khisti and Wornell [1] using a Satolike argument. This note presents an alternative characterization using a channel enhancement argument. This characterization relies on an extremal entropy inequality recently ..."
Abstract

Cited by 62 (4 self)
Recently, the secrecy capacity of the multi-antenna wiretap channel was characterized by Khisti and Wornell [1] using a Sato-like argument. This note presents an alternative characterization using a channel enhancement argument. This characterization relies on an extremal entropy inequality recently proved in the context of multi-antenna broadcast channels, and is directly built on the physical intuition regarding the optimal transmission strategy in this communication scenario.
Generalized entropy power inequalities and monotonicity properties of information
 IEEE Trans. Inf. Theory
, 2007
"... New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary ..."
Abstract

Cited by 53 (8 self)
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands and in the more general setting of independent summands with variance-standardized sums.
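For orientation (standard definitions, not taken from the paper): with the entropy power of X defined below, Shannon's EPI bounds the entropy power of an independent sum, and the monotonicity result says the normalized sums in the i.i.d. central limit theorem have non-decreasing entropy:

```latex
% Entropy power and Shannon's EPI for independent X, Y:
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}, \qquad N(X + Y) \;\ge\; N(X) + N(Y).
% Monotonicity of information in the i.i.d. CLT:
h\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right) \;\ge\; h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right).
```

The paper's subset-sum inequalities interpolate between these two endpoints.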
Providing Secrecy With Structured Codes: Tools and Applications to Two-User Gaussian Channels
, 2009
"... Recent results have shown that structured codes can be used to construct good channel codes, source codes and physical layer network codes for Gaussian channels. For Gaussian channels with secrecy constraints, however, efforts to date rely on random codes. In this work, we advocate that structured c ..."
Abstract

Cited by 45 (17 self)
Recent results have shown that structured codes can be used to construct good channel codes, source codes, and physical-layer network codes for Gaussian channels. For Gaussian channels with secrecy constraints, however, efforts to date rely on random codes. In this work, we advocate that structured codes are useful for providing secrecy, and show how to compute the secrecy rate when structured codes are used. In particular, we solve the problem of bounding equivocation rates with one important class of structured codes, namely nested lattice codes. Having established this result, we next demonstrate the use of structured codes for secrecy in two-user Gaussian channels. In particular, with structured codes, we prove that a positive secure degree of freedom is achievable for a large class of fully connected Gaussian channels as long as the channel is not degraded. By way of this, for these channels, we establish that structured codes outperform Gaussian random codes at high SNR. This class of channels includes the two-user multiple access wiretap channel, the two-user interference channel with confidential messages, and the two-user interference wiretap channel. A notable consequence of this result is that, unlike the case with Gaussian random codes, using structured codes for both transmission and cooperative jamming, it is possible to achieve an arbitrarily large secrecy rate given enough power.
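A one-dimensional toy of the nested-lattice ingredient (fine lattice Z nested in the coarse lattice qZ; the names, dither value, and parameters are illustrative, not the paper's construction): messages are the q cosets of the fine lattice, and the mod-coarse-lattice operation folds everything into one fundamental region.

```python
import math

def mod_lattice(x: float, q: int) -> float:
    """Reduce x modulo the coarse lattice q*Z, i.e. fold it into
    the fundamental region [0, q)."""
    return x - q * math.floor(x / q)

def encode(message: int, dither: float, q: int) -> float:
    """Map a coset index (0..q-1) of the fine lattice Z inside the
    coarse lattice q*Z to a dithered point in [0, q)."""
    return mod_lattice(message + dither, q)

# Round-trip: after removing the dither mod q*Z, the coset index survives.
q, d = 4, 0.7
x = encode(3, d, q)
recovered = round(mod_lattice(x - d, q)) % q
print(recovered)  # 3
```

Real nested-lattice wiretap codes work in high dimension, where the coarse lattice both enforces the power constraint and randomizes what the eavesdropper sees.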
Inner and outer bounds for the Gaussian cognitive interference channel and new capacity results
 IEEE Trans. Inf. Theory
"... Abstract—The capacity of the Gaussian cognitive interference channel, a variation of the classical twouser interference channel where one of the transmitters (referred to as cognitive) has knowledge of both messages, is known in several parameter regimes but remains unknown in general. This paper ..."
Abstract

Cited by 30 (13 self)
The capacity of the Gaussian cognitive interference channel, a variation of the classical two-user interference channel where one of the transmitters (referred to as cognitive) has knowledge of both messages, is known in several parameter regimes but remains unknown in general. This paper provides a comparative overview of this channel model as it proceeds through the following contributions. First, several outer bounds are presented: a) a new outer bound based on the idea of a broadcast channel with degraded message sets, and b) an outer bound obtained by transforming the channel into channels with known capacity. Next, a compact Fourier–Motzkin-eliminated version of the largest known inner bound derived for the discrete memoryless cognitive interference channel is presented and specialized to the Gaussian noise case, where several simplified schemes with jointly Gaussian input are evaluated in closed form and later used to prove a number of results. These include a new set of capacity results for: a) the “primary decodes cognitive” regime, a subset of the “strong interference” regime that is not included in the “very strong interference” regime for which capacity was known, and b) the “S-channel in strong interference”, in which the primary transmitter does not interfere with the cognitive receiver and the primary receiver experiences strong interference. Next, for a general Gaussian channel the capacity is determined to within one bit/s/Hz and to within a factor of two regardless of the channel parameters, thus establishing rate performance guarantees at high and low SNR, respectively. The paper concludes with numerical evaluations and comparisons of the various simplified achievable rate regions and outer bounds in parameter regimes where capacity is unknown, leading to further insight on the capacity region.
Index Terms—Broadcast channel with degraded message sets, capacity in the primary decodes cognitive regime, capacity for the Z-channel in strong interference, capacity to within one bit, capacity to within a factor of two, cognitive interference channel, inner bound, outer bound.
Information Theoretic Proofs of Entropy Power Inequalities
, 2007
"... While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon’s entropy power inequality (EPI) is an exception: Existing information theoretic proofs of the EPI hinge on representations of differential entropy using ..."
Abstract

Cited by 30 (2 self)
While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon’s entropy power inequality (EPI) has been an exception: existing information-theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn’s identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman’s Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder’s generalized EPI for linear transformations of the random variables, Takano and Johnson’s EPI for dependent variables, Liu and Viswanath’s covariance-constrained EPI, and Costa’s concavity inequality for the entropy power.
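The MMSE representation the new proof replaces is the I-MMSE relation of Guo, Shamai and Verdú, which in scalar form (a standard statement, in nats) reads:

```latex
% I-MMSE relation: N ~ N(0,1) independent of X, and
% mmse(\gamma) = E\!\left[\left(X - E[X \mid \sqrt{\gamma}\,X + N]\right)^2\right].
\frac{d}{d\gamma}\, I\!\left(X;\ \sqrt{\gamma}\,X + N\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\gamma).
```

Integrating either this identity or de Bruijn’s along an SNR path is the “continuous Gaussian perturbation” ingredient named in the abstract.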
A vector generalization of Costa’s entropy-power inequality with applications
 IEEE Trans. Inf. Theory
, 2010
"... This paper considers an entropypower inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. This new inequality is proved using a perturbation approach via a fundamental relationship between the derivative of mutual information and ..."
Abstract

Cited by 27 (1 self)
This paper considers an entropy-power inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. This new inequality is proved using a perturbation approach via a fundamental relationship between the derivative of mutual information and the minimum mean-square error (MMSE) estimate in linear vector Gaussian channels. As an application, a new extremal entropy inequality is derived from the generalized Costa EPI and then used to establish the secrecy capacity regions of the degraded vector Gaussian broadcast channel with layered confidential messages.
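Costa’s scalar EPI, which the paper generalizes by promoting the scalar parameter t to a positive semidefinite matrix, can be stated as follows (the scalar form below is standard; the matrix form is only sketched by the abstract):

```latex
% Costa's EPI: Z standard Gaussian independent of X, t \in [0, 1]:
N\!\left(X + \sqrt{t}\,Z\right) \;\ge\; (1 - t)\, N(X) + t\, N(X + Z).
% Equivalently, t \mapsto N(X + \sqrt{t}\,Z) is concave on [0, 1].
```

Concavity of the entropy power along the Gaussian perturbation path is what makes the inequality useful in broadcast-channel converses.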