Results 1–10 of 15
Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error
IEEE Trans. Inf. Theory, 2011
Cited by 46 (13 self)
Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian. Index Terms—Entropy, estimation, Gaussian broadcast channel, Gaussian noise, Gaussian wiretap channel, minimum mean-square error (MMSE), mutual information.
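The Gaussian-input case that anchors the single-crossing property above has a well-known closed form: for X ~ N(0, σ²) observed as Y = √snr·X + N with standard Gaussian N, the MMSE is σ²/(1 + σ²·snr). A minimal sketch (function names are my own, not from the paper):

```python
def mmse_gaussian(snr, var=1.0):
    """MMSE of estimating X ~ N(0, var) from Y = sqrt(snr)*X + N, N ~ N(0, 1).
    Closed form for a Gaussian input: var / (1 + var * snr)."""
    return var / (1.0 + var * snr)

# Smooth and strictly decreasing in SNR, consistent with the abstract's
# general regularity results.
assert mmse_gaussian(0.0) == 1.0   # no observation: MMSE equals the prior variance
assert mmse_gaussian(1.0) == 0.5
```

Plotting this curve against the MMSE of any non-Gaussian input of the same variance illustrates the claimed at-most-one crossing.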
Dispersion of Gaussian channels
IEEE Int. Symp. Inform. Theory, 2009
Cited by 17 (3 self)
Abstract—The minimum blocklength required to achieve a given rate and error probability can be easily and tightly approximated from two key channel parameters: the capacity and the channel dispersion. The channel dispersion gauges the variability of the channel relative to a deterministic bit pipe with the same capacity. This paper finds the dispersion of the additive white Gaussian noise (AWGN) channel, the parallel AWGN channel, and the Gaussian channel with non-white noise and intersymbol interference.
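The approximation the abstract refers to is the normal approximation n ≈ (Q⁻¹(ε))²·V/(C − R)², with capacity C = ½log₂(1+SNR) and dispersion V = SNR(SNR+2)/(2(SNR+1)²)·(log₂e)² for the AWGN channel. A self-contained sketch (function names and the bisection-based Q⁻¹ are mine, for illustration only):

```python
import math

def q_inv(eps, lo=-10.0, hi=10.0, iters=200):
    """Inverse of the Gaussian tail Q(x) = 0.5*erfc(x/sqrt(2)), by bisection."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2)) > eps:
            lo = mid  # Q is decreasing: tail still too large, move right
        else:
            hi = mid
    return 0.5 * (lo + hi)

def awgn_capacity(snr):
    """AWGN capacity in bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

def awgn_dispersion(snr):
    """AWGN channel dispersion, in bits^2 per channel use."""
    return snr * (snr + 2.0) / (2.0 * (snr + 1.0) ** 2) * math.log2(math.e) ** 2

def approx_blocklength(snr, rate, eps):
    """Normal-approximation estimate of the blocklength needed to support
    `rate` (bits/use, below capacity) at error probability `eps`."""
    c, v = awgn_capacity(snr), awgn_dispersion(snr)
    return (q_inv(eps) ** 2) * v / (c - rate) ** 2
```

For example, `approx_blocklength(1.0, 0.45, 1e-3)` estimates how many channel uses are needed to operate 0.05 bits below the SNR = 1 capacity of 0.5 bits at block error rate 10⁻³.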
Functional Properties of Minimum Mean-Square Error and Mutual Information
Cited by 9 (1 self)
Abstract—In addition to exploring its various regularity properties, we show that the minimum mean-square error (MMSE) is a concave functional of the input–output joint distribution. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained. Several applications to information theory and the central limit theorem are discussed. Index Terms—Bayesian statistics, central limit theorem, Gaussian noise, minimum mean-square error (MMSE), mutual information, non-Gaussian noise.
On Discrete Alphabets for the Two-User Gaussian Interference Channel with One Receiver Lacking Knowledge of the Interfering Codebook
Cited by 3 (3 self)
Abstract—In multi-user information theory it is often assumed that every node in the network possesses all codebooks used in the network. This assumption is, however, impractical in distributed ad hoc and cognitive networks. This work considers the two-user Gaussian Interference Channel with one Oblivious Receiver (G-IC-OR), i.e., one receiver lacks knowledge of the interfering codebook while the other receiver knows both codebooks. We ask whether, and if so by how much, the channel capacity of the G-IC-OR is reduced compared to that of the classical G-IC, where both receivers know all codebooks. Intuitively, the oblivious receiver should not be able to jointly decode its intended message along with the unintended interfering message whose codebook is unavailable. We demonstrate that in strong and very strong interference, where joint decoding is capacity achieving for the classical G-IC, lack of codebook knowledge does not reduce performance in terms of generalized degrees of freedom (gDoF). Moreover, we show that the sum-capacity of the symmetric G-IC-OR is to within O(log(log(SNR))) of that of the classical G-IC. The key novelty of the proposed achievable scheme is the use of a discrete input alphabet for the non-oblivious transmitter, whose cardinality is appropriately chosen as a function of SNR.
On Gaussian Interference Channels with Mixed Gaussian and Discrete Inputs
Cited by 2 (2 self)
Abstract—This paper studies the sum-rate of a class of memoryless, real-valued additive white Gaussian noise interference channels (IC) achievable by treating interference as noise (TIN). We develop and analytically characterize the rates achievable by a new strategy that uses superpositions of Gaussian and discrete random variables as channel inputs. Surprisingly, we demonstrate that TIN is sum generalized degrees of freedom optimal and can achieve to within an additive gap of O(1) or O(log log(SNR)) of the symmetric sum-capacity of the classical IC. We also demonstrate connections to other channels such as the IC with partial codebook knowledge and the block asynchronous IC.
Achievable Second-Order Coding Rates for the Wiretap Channel
Cited by 2 (0 self)
Abstract—We derive lower bounds to the second-order coding rates for the wiretap channel. The decoding error probability and the information leakage, measured in terms of the variational distance secrecy metric, are fixed at some constants r and s respectively. We leverage the connection between wiretap channel coding and channel resolvability to derive tighter secrecy bounds than those available in the literature. We then use central limit theorem-style analysis to evaluate these bounds for the discrete memoryless wiretap channel with cost constraints and the Gaussian wiretap channel. Index Terms—Second-order coding rates, dispersion analysis, wiretap channel, information-theoretic secrecy.
The Capacity Loss of Dense Constellations
Cited by 1 (0 self)
Abstract—We determine the loss in capacity incurred by using signal constellations with a bounded support over general complex-valued additive-noise channels for suitably high signal-to-noise ratio. Our expression for the capacity loss recovers the power loss of 1.53 dB for square signal constellations.
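The 1.53 dB figure quoted above is the classical asymptotic shaping loss of a uniform input relative to a Gaussian one, equal to πe/6 as a power ratio; a one-line check:

```python
import math

# Asymptotic power penalty of a uniform (square) constellation relative to
# a Gaussian input: pi*e/6 in linear scale, expressed here in dB.
shaping_loss_db = 10.0 * math.log10(math.pi * math.e / 6.0)
print(round(shaping_loss_db, 2))  # → 1.53
```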
Functional Properties of Minimum Mean-Square Error and Mutual Information
Abstract—In addition to exploring its various regularity properties, we show that the minimum mean-square error (MMSE) is a concave functional of the input–output joint distribution. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained. Several applications to information theory and the central limit theorem are discussed. Index Terms—Bayesian statistics, minimum mean-square error (MMSE), mutual information, Gaussian noise, non-Gaussian noise, central limit theorem.
On the TwoUser Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver
Abstract—In multi-user information theory, it is often assumed that every node in the network possesses all codebooks used in the network. This assumption may be impractical in distributed ad hoc, cognitive, or heterogeneous networks. This paper considers the two-user interference channel with one oblivious receiver (IC-OR), i.e., one receiver lacks knowledge of the interfering codebook, whereas the other receiver knows both codebooks. This paper asks whether, and if so by how much, the channel capacity of the IC-OR is reduced compared with that of the classical IC, where both receivers know all codebooks. A novel outer bound is derived and shown to be achievable to within a gap for the class of injective semi-deterministic IC-ORs; the gap is shown to be zero for injective fully deterministic IC-ORs. An exact capacity result is shown for the general memoryless IC-OR when the non-oblivious receiver experiences very strong interference. For the linear deterministic IC-OR that models the Gaussian noise channel at high SNR, non-i.i.d. Bernoulli(1/2) input bits are shown to achieve points not achievable by i.i.d. Bernoulli(1/2) input bits used in the same achievability scheme. For the real-valued Gaussian IC-OR, the gap is shown to be at most 1/2 bit per channel use, even though the set of optimal input distributions for the derived outer bound could not be determined. Toward understanding the Gaussian IC-OR, an achievability strategy is evaluated in which the input alphabet at the non-oblivious transmitter is a mixture of discrete and Gaussian random variables, where the cardinality of the discrete part is appropriately chosen as a function of the channel parameters. Surprisingly, even though the oblivious receiver intuitively should not be able to jointly decode the intended and interfering messages (whose codebook is unavailable), it is shown that with this choice of input, the capacity region of the symmetric Gaussian IC-OR is to within 1/2 log(12πe) ≈ 3.34 bits (per channel use per user) of an outer bound for the classical Gaussian IC with full codebook knowledge at both receivers.
CapacityAchieving Probabilistic Shaping for Noisy and Noiseless Channels
This dissertation is available online on the website of the university library. Acknowledgments: I want to thank Prof. Rudolf Mathar for the freedom to pursue my ideas during my time at his institute. The TI group was my second home for four and a half years; thank you all. Thanks to Daniel and Gernot, Chunhui and Milan, Fabian and Steven, Andreas and Martijn for collaboration, trips around the world, coffee, and friendship. Special thanks to Prof. Valdemar Cardoso da Rocha Junior and Prof. Cecilio Pimentel for their continuous support. I am grateful to my father, Prof. Siegfried Böcherer, for all the telephone calls that helped me to get the math at least partially right. Prof. Gerhard Kramer read my dissertation cover to cover, which is the best reward I can think of. Finally, I thank my wife Noêmia and our children Izabel and Rafael for reminding me on a daily basis that work is not the only thing that matters.