Results 1 - 10 of 96
Estimation in Gaussian Noise: Properties of the minimum mean-square error
- IEEE Trans. Inf. Theory
, 2011
"... Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable t ..."
Cited by 44 (12 self)
Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian. Index Terms—Entropy, estimation, Gaussian broadcast channel, Gaussian noise, Gaussian wiretap channel, minimum mean square error (MMSE), mutual information.
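For reference, the functional studied above is, in standard notation (not quoted from this entry),

    $\mathrm{mmse}(X, \gamma) = E\big[(X - E[X \mid \sqrt{\gamma}\,X + N])^2\big], \qquad N \sim \mathcal{N}(0,1)$ independent of $X$,

which for Gaussian $X$ with variance $\sigma^2$ evaluates to $\sigma^2 / (1 + \gamma \sigma^2)$; the single-crossing property above says the MMSE curve of any non-Gaussian input of the same variance crosses this one at most once. By the I-MMSE relation of Guo, Shamai, and Verdú, $\frac{d}{d\gamma} I(X; \sqrt{\gamma}\,X + N) = \frac{1}{2}\,\mathrm{mmse}(X, \gamma)$ in nats, which is how the derivative and crossing properties translate into the capacity statements mentioned.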
MIMO Gaussian Channels With Arbitrary Inputs: Optimal Precoding and Power Allocation
, 2010
"... In this paper, we investigate the linear precoding and power allocation policies that maximize the mutual information for general multiple-input–multiple-output (MIMO) Gaussian channels with arbitrary input distributions, by capitalizing on the relationship between mutual information and minimum me ..."
Cited by 34 (6 self)
In this paper, we investigate the linear precoding and power allocation policies that maximize the mutual information for general multiple-input–multiple-output (MIMO) Gaussian channels with arbitrary input distributions, by capitalizing on the relationship between mutual information and minimum mean-square error (MMSE). The optimal linear precoder satisfies a fixed-point equation as a function of the channel and the input constellation. For non-Gaussian inputs, a nondiagonal precoding matrix in general increases the information transmission rate, even for parallel noninteracting channels. Whenever precoding is precluded, the optimal power allocation policy also satisfies a fixed-point equation; we put forth a generalization of the mercury/waterfilling algorithm, previously proposed for parallel noninterfering channels, in which the mercury level accounts not only for the non-Gaussian input distributions, but also for the …
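The fixed-point structure mentioned above can be sketched from the known gradient result for linear vector Gaussian channels ($\nabla_H I = H E$, due to Palomar and Verdú): for $y = HPx + n$ with white Gaussian $n$ and MMSE matrix $E = E[(x - \hat{x})(x - \hat{x})^H]$, the chain rule gives

    $\nabla_P I = H^H H P E$,

so a stationary precoder under the power constraint $\mathrm{tr}(PP^H) \le P_T$ satisfies $H^H H P E = \nu P$ for some multiplier $\nu$. This is a fixed-point equation because $E$ itself depends on $P$ and the input constellation. (The multiplier notation $\nu$ is ours, not the paper's.)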
Monotonic decrease of the non-Gaussianness of the sum of independent random variables: a simple proof.
- IEEE Trans. Inf. Theory
, 2006
"... ..."
(Show Context)
Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels
- IEEE Trans. Inf. Theory
, 2009
"... Abstract—Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobian of the minimum mean square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum m ..."
Cited by 27 (5 self)
Abstract—Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobian of the minimum mean square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum mean square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessian of the mutual information and the entropy are derived. These expressions are then used to assess the concavity properties of mutual information and entropy under different channel conditions and also to derive a multivariate version of an entropy power inequality due to Costa. Index Terms—Concavity properties, differential entropy, entropy power, Fisher information matrix, Gaussian noise, Hessian matrices, linear vector Gaussian channels, minimum mean-square …
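For context, the entropy power of an $n$-dimensional $X$ is $N(X) = \frac{1}{2\pi e} e^{2h(X)/n}$, and Costa's inequality, whose multivariate version is derived above, states (standard form, not quoted from the entry) that

    $t \mapsto N(X + \sqrt{t}\,Z)$ is concave in $t \ge 0$

when $Z$ is white Gaussian and independent of $X$.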
Representation of Mutual Information Via Input Estimates
"... Abstract—A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channe ..."
Cited by 26 (4 self)
Abstract—A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language. Index Terms—Computation of mutual information, extrinsic information, input estimation, low-density parity-check (LDPC) codes, minimum mean square error (MMSE), mutual information, soft channel decoding.
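As a minimal illustration of computing mutual information numerically from posterior (input-estimate) quantities, here is a hedged Python sketch for BPSK over an AWGN channel. It relies only on the generic identity $I(X;Y) = E[\ln P(X \mid Y) - \ln P(X)]$, not on the paper's representation; the function name and parameters are illustrative.

    import numpy as np

    def bpsk_awgn_mi(snr, n=200_000, seed=0):
        """Monte Carlo estimate of I(X;Y) in bits for equiprobable BPSK over
        y = sqrt(snr)*x + z, z ~ N(0,1), via the posterior P(x|y)."""
        rng = np.random.default_rng(seed)
        x = rng.choice([-1.0, 1.0], size=n)
        y = np.sqrt(snr) * x + rng.standard_normal(n)
        ll_pos = -0.5 * (y - np.sqrt(snr)) ** 2   # log-likelihood for x = +1
        ll_neg = -0.5 * (y + np.sqrt(snr)) ** 2   # log-likelihood for x = -1
        ll_tx = np.where(x > 0, ll_pos, ll_neg)   # log-likelihood of the true x
        # ln P(x|y); equal priors and Gaussian normalizers cancel
        log_post = ll_tx - np.logaddexp(ll_pos, ll_neg)
        # I(X;Y) = E[ln P(x|y) - ln P(x)] / ln 2, with P(x) = 1/2
        return np.mean(log_post) / np.log(2.0) + 1.0

    print(f"{bpsk_awgn_mi(1.0):.3f} bits at 0 dB")  # approx. 0.49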
Mismatched estimation and relative entropy
- IEEE Trans. Inf. Theory
, 2010
"... Abstract—A random variable with distribution is observed in Gaussian noise and is estimated by a mismatched minimum meansquare estimator that assumes that the distribution is, instead of. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation e ..."
Cited by 25 (3 self)
Abstract—A random variable with distribution $P$ is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is $Q$, instead of $P$. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy $D(P \| Q)$ (in nats). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability. Index Terms—Divergence, free probability, minimum mean-square error (MMSE) estimation, mutual information, relative entropy, Shannon theory, statistics.
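In symbols, writing $\mathrm{mse}_{P,Q}(\gamma)$ for the mean-square error incurred under the true law $P$ by the mismatched estimator $E_Q[X \mid \sqrt{\gamma}\,X + N]$ (notation assumed here), the result described above reads

    $\int_0^\infty \big[\mathrm{mse}_{P,Q}(\gamma) - \mathrm{mmse}_P(\gamma)\big]\, d\gamma = 2\, D(P \| Q)$  (nats).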
MMSE dimension
- in Proc. 2010 IEEE Int. Symp. Inf. Theory
, 2010
"... Abstract—If is standard Gaussian, the minimum meansquare error (MMSE) of estimating a random variable based on + vanishes at least as fast as 1 as.We define the MMSE dimension of as the limit as of the product of and the MMSE. MMSE dimension is also shown to be the asymptotic ratio of nonlinear MMSE ..."
Cited by 21 (9 self)
Abstract—If $N$ is standard Gaussian, the minimum mean-square error (MMSE) of estimating a random variable $X$ based on $\sqrt{snr}\,X + N$ vanishes at least as fast as $1/snr$ as $snr \to \infty$. We define the MMSE dimension of $X$ as the limit as $snr \to \infty$ of the product of $snr$ and the MMSE. MMSE dimension is also shown to be the asymptotic ratio of nonlinear MMSE to linear MMSE. For discrete, absolutely continuous, or mixed distributions we show that MMSE dimension equals Rényi's information dimension. However, for a class of self-similar singular distributions (e.g., the Cantor distribution), we show that the product of $snr$ and MMSE oscillates around the information dimension periodically in $snr$ (dB). We also show that these results extend considerably beyond Gaussian noise under various technical conditions. Index Terms—Additive noise, Bayesian statistics, Gaussian noise, high-SNR asymptotics, minimum mean-square error (MMSE), mutual information, non-Gaussian noise, Rényi information dimension.
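Spelled out, the definition above is $D(X) = \lim_{snr \to \infty} snr \cdot \mathrm{mmse}(X, snr)$. As a worked instance consistent with the abstract (not quoted from it): a law mixing a discrete part with weight $1 - \rho$ and an absolutely continuous part with weight $\rho$ has Rényi information dimension $\rho$, so its MMSE decays like $\rho / snr$ at high SNR.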
Communications-inspired projection design with application to compressive sensing
- SIAM Journal on Imaging Sciences
, 2012
"... ar ..."
(Show Context)
On optimal precoding in linear vector Gaussian channels with arbitrary input distribution
, 2009
"... Abstract—The design of the precoder the maximizes the mutual information in linear vector Gaussian channels with an arbitrary input distribution is studied. Precisely, the precoder optimal left singular vectors and singular values are derived. The characterization of the right singular vectors is le ..."
Cited by 14 (1 self)
Abstract—The design of the precoder that maximizes the mutual information in linear vector Gaussian channels with an arbitrary input distribution is studied. Precisely, the precoder's optimal left singular vectors and singular values are derived. The characterization of the right singular vectors is left, in general, as an open problem whose computational complexity is then studied in three cases: Gaussian signaling, low SNR, and high SNR. For the Gaussian signaling case and the low-SNR regime, the dependence of the mutual information on the right singular vectors vanishes, making the optimal precoder design problem easy to solve. In the high-SNR regime, however, the dependence on the right singular vectors cannot be avoided, and we show the difficulty of computing the optimal precoder through an NP-hardness analysis.
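Concretely, the structure described above can be written as follows (a sketch in notation assumed here, not quoted from the entry): with channel SVD $H = U_H \Sigma_H V_H^H$, the optimal precoder takes the form

    $P = V_H \,\mathrm{diag}(\sqrt{p_1}, \ldots, \sqrt{p_k})\, Q^H$,

where the left factor $V_H$ and the power loading $p_i$ are the characterized parts, and the unitary right factor $Q$ (the precoder's right singular vectors) is the part left open.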
Bit and power loading for OFDM-based three-node relaying communications
- IEEE Trans. Signal Process.
, 2008
"... Abstract—Bit and power loading (BPL) techniques have been in-tensively investigated for the single-link communications. In this paper, we propose a margin-adaptive BPL approach for orthog-onal frequency-division multiplexing (OFDM) systems assisted by a single cooperative relay. This orthogonal half ..."
Cited by 9 (0 self)
Abstract—Bit and power loading (BPL) techniques have been intensively investigated for single-link communications. In this paper, we propose a margin-adaptive BPL approach for orthogonal frequency-division multiplexing (OFDM) systems assisted by a single cooperative relay. This orthogonal half-duplex relay operates either in the selection detection-and-forward (SDF) mode or in the amplify-and-forward (AF) mode. Maximum-ratio combining is employed at the destination to attain the achievable distributed spatial diversity gain. Assuming perfect channel knowledge is available at all nodes, the proposed approach minimizes the transmit-power consumption at the target throughput (average number of bits/symbol) and the target link performance. With respect to various power-constraint conditions, we investigate two distributed resource-allocation strategies, namely flexible power ratio (FLPR) and fixed power ratio (FIPR). The FLPR strategy is proposed for scenarios without an individual local power constraint; the source power and relay power have a flexible ratio for each subcarrier. The FIPR strategy is proposed for scenarios with an individual local power constraint; the source power and relay power have a fixed ratio for each subcarrier. Computer simulations are carried out to evaluate the proposed approach with respect to the relay location. Significant performance improvement is observed in terms of both the symbol error rate and the transmit-power efficiency. Index Terms—Amplify-and-forward (AF), bit and power loading (BPL), detection-and-forward, orthogonal frequency-division multiplexing (OFDM), relay.
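For readers unfamiliar with margin-adaptive loading, here is a minimal single-link greedy (Hughes-Hartogs-style) Python sketch: each bit is placed on the subcarrier where its incremental power cost is smallest. This illustrates the generic principle only; it is not the relay-aware FLPR/FIPR algorithm proposed in the paper, and the SNR-gap power model is an assumption.

    import numpy as np

    def greedy_bit_loading(channel_gains, target_bits, gap_db=6.0):
        """Margin-adaptive loading: meet a total bit target with minimum power.
        Power to carry b bits on subcarrier k (uncoded QAM approximation):
        P_k(b) = gap * (2**b - 1) / g_k, where g_k is the channel power gain."""
        gap = 10.0 ** (gap_db / 10.0)
        g = np.asarray(channel_gains, dtype=float)
        bits = np.zeros(len(g), dtype=int)
        power = np.zeros(len(g))
        for _ in range(target_bits):
            # incremental power of adding one more bit on each subcarrier
            delta = gap * (2.0 ** (bits + 1) - 2.0 ** bits) / g
            k = int(np.argmin(delta))  # cheapest subcarrier gets the bit
            bits[k] += 1
            power[k] += delta[k]
        return bits, power

    rng = np.random.default_rng(1)
    gains = rng.exponential(size=8)  # Rayleigh-faded power gains
    b, p = greedy_bit_loading(gains, target_bits=16)
    print(b, p.sum())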