Results 1–10 of 94
The Secrecy Capacity Region of the Gaussian MIMO Multi-Receiver Wiretap Channel
, 2009
Abstract

Cited by 70 (23 self)
In this paper, we consider the Gaussian multiple-input multiple-output (MIMO) multi-receiver wiretap channel in which a transmitter wants to have confidential communication with an arbitrary number of users in the presence of an external eavesdropper. We derive the secrecy capacity region of this channel for the most general case. We first show that even for the single-input single-output (SISO) case, existing converse techniques for the Gaussian scalar broadcast channel cannot be extended to this secrecy context, to emphasize the need for a new proof technique. Our new proof technique makes use of the relationships between the minimum mean-square error and the mutual information, and equivalently, the relationships between the Fisher information and the differential entropy. Using the intuition gained from the converse proof of the SISO channel, we first prove the secrecy capacity region of the degraded MIMO channel, in which all receivers have the same number of antennas, and the noise covariance matrices can be arranged according to a positive semidefinite order. We then generalize this result to the aligned case, in which all receivers have the same number of antennas; however, there is no order among the noise covariance matrices. We accomplish this task by using the channel enhancement technique. Finally, we find the secrecy capacity region of the general MIMO channel by using some limiting arguments on the secrecy capacity region of the aligned MIMO channel. We show that the capacity-achieving coding scheme is a variant of dirty-paper coding with Gaussian signals.
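The converse technique above builds on the I-MMSE relation of Guo, Shamai, and Verdú, dI/dsnr = ½·mmse(snr). A minimal numeric sketch of that identity for a unit-power Gaussian input, where both sides are known in closed form (values here are illustrative):

```python
import numpy as np

# I-MMSE identity (Guo-Shamai-Verdu): dI/dsnr = 0.5 * mmse(snr).
# For a unit-power Gaussian input in Gaussian noise both sides are known
# in closed form, so the identity can be checked by finite differences.
def mi(snr):
    return 0.5 * np.log1p(snr)      # I(snr) = 0.5 * ln(1 + snr), in nats

def mmse(snr):
    return 1.0 / (1.0 + snr)        # MMSE of a unit-power Gaussian input

snr, h = 2.0, 1e-6
derivative = (mi(snr + h) - mi(snr - h)) / (2 * h)
assert abs(derivative - 0.5 * mmse(snr)) < 1e-6
```

The same identity, stated with Fisher information via de Bruijn's identity, is what the converse proof manipulates in the vector case.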
Weighted Sum-Rate Maximization Using Weighted MMSE for MIMO-BC Beamforming Design
 IEEE Trans. on Wireless Comm
, 2008
Abstract

Cited by 59 (2 self)
Abstract—This paper studies linear transmit filter design for Weighted Sum-Rate (WSR) maximization in the Multiple-Input Multiple-Output Broadcast Channel (MIMO-BC). The problem of finding the optimal transmit filter is non-convex and intractable to solve using low-complexity methods. Motivated by recent results highlighting the relationship between mutual information and Minimum Mean-Square Error (MMSE), this paper establishes a relationship between weighted sum-rate and weighted MMSE in the MIMO-BC. The relationship is used to propose two low-complexity algorithms for finding a local weighted sum-rate optimum based on alternating optimization. Numerical results studying sum-rate show that the proposed algorithms achieve high performance with few iterations. Index Terms—MIMO systems, transceiver design, smart antennas, antennas and propagation.
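The identity underlying the rate–WMMSE connection is that, at the MMSE receive filter, the per-user error e_k and achievable rate satisfy R_k = log2(1/e_k), since e_k = 1/(1 + SINR_k). A toy numeric check (the channel sizes and values below are illustrative, not from the paper):

```python
import numpy as np

# Two-user MISO broadcast channel with fixed transmit beamformers.
# At the MMSE receiver, user 1's MSE is e1 = 1/(1 + SINR1), so its
# rate log2(1 + SINR1) equals log2(1/e1) -- the hook that lets WSR
# maximization be rewritten as a weighted-MMSE problem.
rng = np.random.default_rng(1)
Nt, sigma2 = 4, 1.0
h1, h2 = rng.normal(size=Nt), rng.normal(size=Nt)   # real channels for simplicity
v1, v2 = rng.normal(size=Nt), rng.normal(size=Nt)   # fixed transmit beamformers

sinr1 = (h1 @ v1) ** 2 / ((h1 @ v2) ** 2 + sigma2)
e1 = 1.0 / (1.0 + sinr1)                            # MSE at the MMSE receiver
rate1 = np.log2(1.0 + sinr1)

assert abs(rate1 - np.log2(1.0 / e1)) < 1e-9
```

The paper's algorithms alternate between MMSE receiver updates, weight updates w_k = 1/e_k, and transmit filter updates; the sketch above only verifies the identity that makes that alternation monotone in WSR.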
Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error
 IEEE Trans. Inf. Theory
, 2011
Abstract

Cited by 44 (12 self)
Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian. Index Terms—Entropy, estimation, Gaussian broadcast channel, Gaussian noise, Gaussian wiretap channel, minimum mean-square error (MMSE), mutual information.
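The MMSE curves in question can be probed numerically. A small sketch comparing the MMSE of a unit-power Gaussian input, 1/(1 + snr), with that of an equal-power BPSK input, computed from the standard conditional-mean (tanh) formula by grid integration (grid limits and tolerances are illustrative):

```python
import numpy as np

def mmse_gauss(snr):
    # Unit-power Gaussian input: mmse(snr) = 1 / (1 + snr).
    return 1.0 / (1.0 + snr)

def mmse_bpsk(snr):
    # X = +/-1 equiprobable, Y = sqrt(snr)*X + N with N ~ N(0,1):
    # mmse(snr) = 1 - E[tanh(snr - sqrt(snr)*N)], integrated on a grid.
    y = np.linspace(-8.0, 8.0, 4001)
    phi = np.exp(-y ** 2 / 2) / np.sqrt(2 * np.pi)
    integrand = phi * np.tanh(snr - np.sqrt(snr) * y)
    return 1.0 - integrand.sum() * (y[1] - y[0])

# Gaussian input is the hardest to estimate at any positive SNR...
assert mmse_bpsk(2.0) < mmse_gauss(2.0)
# ...while both curves approach the prior variance 1 as snr -> 0.
assert abs(mmse_bpsk(1e-4) - 1.0) < 1e-3
```

The single-crossing property proved in the paper is the stronger statement behind the first assertion: for equal-power inputs the curves meet only at snr = 0.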
MIMO Gaussian Channels With Arbitrary Inputs: Optimal Precoding and Power Allocation
, 2010
Abstract

Cited by 34 (6 self)
In this paper, we investigate the linear precoding and power allocation policies that maximize the mutual information for general multiple-input multiple-output (MIMO) Gaussian channels with arbitrary input distributions, by capitalizing on the relationship between mutual information and minimum mean-square error (MMSE). The optimal linear precoder satisfies a fixed-point equation as a function of the channel and the input constellation. For non-Gaussian inputs, a non-diagonal precoding matrix in general increases the information transmission rate, even for parallel non-interacting channels. Whenever precoding is precluded, the optimal power allocation policy also satisfies a fixed-point equation; we put forth a generalization of the mercury/waterfilling algorithm, previously proposed for parallel non-interfering channels, in which the mercury level accounts not only for the non-Gaussian input distributions, but also for the
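For Gaussian inputs the mercury level vanishes and the power allocation reduces to classical water-filling; a minimal sketch of that special case (the general mercury/waterfilling correction for discrete constellations is not reproduced here):

```python
import numpy as np

def waterfill(gains, P, tol=1e-10):
    """Classical water-filling: maximize sum_i log(1 + g_i * p_i)
    subject to sum_i p_i = P. Bisect on the water level mu, with
    p_i = max(mu - 1/g_i, 0)."""
    lo, hi = 0.0, P + 1.0 / min(gains)
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - 1.0 / gains, 0.0).sum() > P:
            hi = mu
        else:
            lo = mu
    return np.maximum(lo - 1.0 / gains, 0.0)

gains = np.array([2.0, 1.0, 0.25])       # illustrative subchannel gains
p = waterfill(gains, P=3.0)
assert abs(p.sum() - 3.0) < 1e-6
assert p[0] > p[1] > p[2] >= 0.0         # stronger subchannels get more power
```

For the gains above the water level settles at mu = 2.25, so the weakest subchannel (1/g = 4) is switched off entirely; mercury/waterfilling perturbs exactly this threshold behavior for non-Gaussian inputs.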
MIMO radar waveform design based on mutual information and minimum mean-square error estimation
 IEEE Transactions on Aerospace and Electronic Systems
, 2007
Abstract

Cited by 30 (4 self)
Abstract — This paper addresses the problem of radar waveform design for target identification and classification. Both the ordinary radar with a single transmitter and receiver and the recently proposed multiple-input multiple-output (MIMO) radar are considered. A random target impulse response is used to model the scattering characteristics of the extended (non-point) target, and two radar waveform design problems with constraints on waveform power are investigated. The first is to design waveforms that maximize the conditional mutual information (MI) between the random target impulse response and the reflected waveforms, given knowledge of the transmitted waveforms. The second is to find transmitted waveforms that minimize the mean-square error (MSE) in estimating the target impulse response. Our analysis indicates that, under the same total power constraint, these two criteria lead to the same solution for a matrix which specifies the essential part of the optimum waveform design. The solution employs water-filling to allocate the limited power appropriately. We also present an asymptotic formulation which requires less knowledge of the statistical model of the target. Index Terms — Multiple-input multiple-output (MIMO) radar, radar waveform design, identification, classification, extended radar targets, mutual information (MI), minimum mean-square error (MMSE), waveform diversity.
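In a diagonalized toy version of such a target model (the mode variances and energy budget below are illustrative, not from the paper), the MI criterion becomes water-filling of transmit energy over the target eigenmodes:

```python
import numpy as np

# Target modes with variances lam_i, noise level sigma2, energy budget P.
# The conditional MI is 0.5 * sum_i ln(1 + lam_i * p_i / sigma2), and the
# optimal energies water-fill: p_i = max(mu - sigma2/lam_i, 0).
lam = np.array([4.0, 1.0, 0.1])   # illustrative target-mode variances
sigma2, P = 1.0, 2.0

def mi(p):
    return 0.5 * np.log1p(lam * p / sigma2).sum()

lo, hi = 0.0, P + sigma2 / lam.min()
for _ in range(200):               # bisect on the water level mu
    mu = 0.5 * (lo + hi)
    if np.maximum(mu - sigma2 / lam, 0.0).sum() > P:
        hi = mu
    else:
        lo = mu
wf = np.maximum(lo - sigma2 / lam, 0.0)

assert abs(wf.sum() - P) < 1e-9
assert mi(wf) > mi(np.full(3, P / 3))   # beats a uniform energy split
```

With this budget the weakest mode receives no energy at all, matching the paper's observation that limited power should be concentrated on the informative part of the target response.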
Information Theoretic Proofs of Entropy Power Inequalities
, 2007
Abstract

Cited by 29 (2 self)
While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon’s entropy power inequality (EPI) is an exception: existing information-theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn’s identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman’s Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder’s generalized EPI for linear transformations of the random variables, Takano and Johnson’s EPI for dependent variables, Liu and Viswanath’s covariance-constrained EPI, and Costa’s concavity inequality for the entropy power.
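For reference, the inequality in question, together with the Gaussian special case in which it holds with equality:

```latex
% Entropy power of a scalar random variable X with differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
N(X + Y) \;\ge\; N(X) + N(Y) \quad \text{for independent } X, Y.
% Gaussian check: X \sim \mathcal{N}(0, \sigma_1^2) gives
% h(X) = \tfrac{1}{2}\ln(2\pi e \sigma_1^2), hence N(X) = \sigma_1^2.
% For independent Gaussians, X + Y \sim \mathcal{N}(0, \sigma_1^2 + \sigma_2^2),
% so N(X + Y) = \sigma_1^2 + \sigma_2^2 = N(X) + N(Y): the EPI is tight.
```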
A Tour of Modern Image Filtering: New Insights and Methods, Both Practical and Theoretical
 IEEE Signal Processing Magazine
, 2013
Abstract

Cited by 27 (2 self)
Recent developments in computational imaging and restoration have heralded the arrival and convergence of several powerful methods for adaptive processing of multidimensional data. Examples include moving least squares (from graphics); the bilateral filter (BF) and anisotropic diffusion (from computer vision); boosting, kernel, and spectral methods (from machine learning); nonlocal means (NLM) and its variants (from signal processing); Bregman iterations (from applied math); and kernel regression and iterative scaling (from statistics). While these approaches found their inspirations in diverse fields of nascence, they are deeply connected. In this article, I present a practical and accessible framework to understand some of the basic underpinnings of these methods, with the intention of leading the reader to a broad understanding of how they interrelate. I also illustrate connections between these techniques and more classical (empirical) Bayesian approaches. The proposed framework is used to arrive at new insights and methods, both practical and theoretical. In particular, several novel optimality properties of algorithms in wide use, such as block-matching and three-dimensional (3D) filtering (BM3D), and methods for their iterative improvement (or nonexistence thereof), are discussed. A general approach is laid out to enable the performance analysis and subsequent improvement of many existing filtering algorithms. While much of the material discussed is applicable to the wider class of linear degradation models beyond noise (e.g., blur), to keep matters focused, we consider the problem of denoising here.
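As one concrete instance of the filters surveyed, a minimal 1-D bilateral filter; the signal, kernel widths, and window radius below are illustrative:

```python
import numpy as np

def bilateral_1d(x, sigma_s=2.0, sigma_r=0.5, radius=5):
    """Minimal 1-D bilateral filter: each sample becomes a weighted average
    of its neighbors, with weights that decay both with spatial distance
    (sigma_s) and with intensity difference (sigma_r)."""
    y = np.empty_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - radius), min(len(x), i + radius + 1)
        d = np.arange(lo, hi) - i
        w = np.exp(-d ** 2 / (2 * sigma_s ** 2)) * \
            np.exp(-(x[lo:hi] - x[i]) ** 2 / (2 * sigma_r ** 2))
        y[i] = (w * x[lo:hi]).sum() / w.sum()
    return y

# A noisy step: the filter smooths within each flat region, but the
# range kernel keeps the edge from being blurred across.
rng = np.random.default_rng(0)
step = np.r_[np.zeros(50), np.ones(50)] + 0.1 * rng.normal(size=100)
out = bilateral_1d(step)
assert np.var(out[10:40]) < np.var(step[10:40])   # noise reduced
assert out[45] < 0.5 < out[55]                    # edge preserved
```

The data-dependent weights w are exactly the kind of adaptive kernel the article's framework analyzes; setting sigma_r to infinity recovers a plain Gaussian smoother.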
Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels
 IEEE Trans. Inf. Theory
, 2009
Abstract

Cited by 27 (5 self)
Abstract—Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobians of the minimum mean-square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum mean-square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the entropy are derived. These expressions are then used to assess the concavity properties of mutual information and entropy under different channel conditions, and also to derive a multivariate version of an entropy power inequality due to Costa. Index Terms—Concavity properties, differential entropy, entropy power, Fisher information matrix, Gaussian noise, Hessian matrices, linear vector Gaussian channels, minimum mean-square
Representation of Mutual Information Via Input Estimates
Abstract

Cited by 26 (4 self)
Abstract—A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language. Index Terms—Computation of mutual information, extrinsic information, input estimation, low-density parity-check (LDPC) codes, minimum mean-square error (MMSE), mutual information, soft channel decoding.
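As a baseline for such numerical computations, the mutual information of BPSK over an AWGN channel can be estimated directly by Monte Carlo from the channel densities (a generic sketch, not the paper's estimator; the SNR and sample count are illustrative):

```python
import numpy as np

# Monte Carlo estimate of I(X;Y) for equiprobable BPSK over AWGN:
#   I(X;Y) = E[ log2( p(Y|X) / p(Y) ) ],
# averaging over jointly drawn (X, Y) pairs.
rng = np.random.default_rng(0)
snr, n = 4.0, 200000
x = rng.choice([-1.0, 1.0], size=n)
y = np.sqrt(snr) * x + rng.normal(size=n)

def lik(y, s):                    # p(y | x = s), unit-variance noise
    return np.exp(-(y - np.sqrt(snr) * s) ** 2 / 2) / np.sqrt(2 * np.pi)

p_y = 0.5 * (lik(y, 1.0) + lik(y, -1.0))
mi = np.mean(np.log2(lik(y, x) / p_y))   # bits per channel use
assert 0.0 < mi < 1.0                    # BPSK carries at most 1 bit
```

The paper's representation instead expresses dI/dsnr through the posterior input distribution p(x|y), which is useful precisely when p(y) is too complex to evaluate directly, as with coded inputs.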
A vector generalization of Costa’s entropy-power inequality with applications
 IEEE Trans. Inf. Theory
, 2010
Abstract

Cited by 26 (1 self)
This paper considers an entropy-power inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. This new inequality is proved using a perturbation approach, via a fundamental relationship between the derivative of mutual information and the minimum mean-square error (MMSE) estimate in linear vector Gaussian channels. As an application, a new extremal entropy inequality is derived from the generalized Costa EPI and then used to establish the secrecy capacity regions of the degraded vector Gaussian broadcast channel with layered confidential messages.