Results 1 - 5 of 5
Performance of Polar Codes for Channel and Source Coding
"... Polar codes, introduced recently by Arıkan, are the first family of codes known to achieve capacity of symmetric channels using a low complexity successive cancellation decoder. Although these codes, combined with successive cancellation, are optimal in this respect, their finite-length performance ..."
Abstract
-
Cited by 72 (3 self)
- Add to MetaCart
Polar codes, introduced recently by Arıkan, are the first family of codes known to achieve the capacity of symmetric channels using a low-complexity successive cancellation decoder. Although these codes, combined with successive cancellation, are optimal in this respect, their finite-length performance is not record-breaking. We discuss several techniques through which their finite-length performance can be improved. We also study the performance of these codes in the context of source coding, both lossless and lossy, in the single-user context as well as for distributed applications.
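For readers unfamiliar with the construction, here is a minimal sketch of the encoding step only (an illustration of the standard Arıkan transform, not code from the paper): the generator matrix is the n-fold Kronecker power of the 2x2 kernel F = [[1, 0], [1, 1]] over GF(2), and a block of length N = 2^n is encoded as x = uG. The frozen-bit selection and the successive cancellation decoder referred to in the abstract are not shown.

```python
import numpy as np

def polar_generator(n):
    """n-fold Kronecker power of Arıkan's 2x2 kernel F = [[1,0],[1,1]] over GF(2)."""
    F = np.array([[1, 0], [1, 1]], dtype=int)
    G = np.array([[1]], dtype=int)
    for _ in range(n):
        G = np.kron(G, F) % 2
    return G

def polar_encode(u, n):
    """Encode a length-2**n vector u (frozen bits assumed already placed) as x = u G mod 2."""
    return u.dot(polar_generator(n)) % 2

# Toy usage: block length N = 8; no frozen-set design is attempted here.
u = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=int)
print(polar_encode(u, 3))
```

Both encoding and successive cancellation decoding admit O(N log N) implementations; the dense matrix product above is only for readability.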
A new approach for mutual information analysis of large dimensional multi-antenna channels
- 2008
"... This paper adresses the behaviour of the mutual information of correlated MIMO Rayleigh channels when the numbers of transmit and receive antennas converge to + ∞ at the same rate. Using a new and simple approach based on Poincaré-Nash inequality and on an integration by parts formula, it is rigorou ..."
Abstract
-
Cited by 29 (7 self)
- Add to MetaCart
(Show Context)
This paper addresses the behaviour of the mutual information of correlated MIMO Rayleigh channels when the numbers of transmit and receive antennas converge to +∞ at the same rate. Using a new and simple approach based on the Poincaré-Nash inequality and an integration-by-parts formula, it is rigorously established that the mutual information, when properly centered and rescaled, converges to a Gaussian random variable whose mean and variance are evaluated. These results confirm previous evaluations based on the powerful but non-rigorous replica method. It is believed that the tools used in this paper are simple, robust, and of interest to the communications engineering community.
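As a rough numerical companion to this statement (an illustration under the simplest uncorrelated i.i.d. Rayleigh assumption, not the correlated model or the Poincaré-Nash machinery of the paper), the mutual information in question is log det(I + (rho/N_t) H H^*), and its empirical fluctuations across channel draws can be inspected directly:

```python
import numpy as np

rng = np.random.default_rng(0)

def mimo_mutual_information(nt, nr, snr, trials=2000):
    """Monte-Carlo samples of log2 det(I + (snr/nt) * H H^H) for i.i.d. Rayleigh H."""
    samples = np.empty(trials)
    for t in range(trials):
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        M = np.eye(nr) + (snr / nt) * H @ H.conj().T
        # slogdet is numerically safer than log(det(...)) for larger matrices
        _, logdet = np.linalg.slogdet(M)
        samples[t] = logdet / np.log(2)
    return samples

# As nt and nr grow at the same rate, the centered samples look increasingly Gaussian.
for n in (4, 16, 64):
    s = mimo_mutual_information(nt=n, nr=n, snr=10.0)
    print(f"n={n:3d}  mean={s.mean():8.2f} bits  std={s.std():6.3f} bits")
```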
Tight bounds on the capacity of binary input random CDMA systems
- IEEE TRANS. INFORM. THEORY, 2008
"... We consider multiple access communication on a binary input additive white Gaussian noise channel using randomly spread code division. For a general class of symmetric distributions for spreading coefficients, in the limit of a large number of users, we prove an upper bound on the capacity, which ma ..."
Abstract
-
Cited by 15 (4 self)
- Add to MetaCart
(Show Context)
We consider multiple access communication on a binary-input additive white Gaussian noise channel using randomly spread code division. For a general class of symmetric distributions of the spreading coefficients, in the limit of a large number of users, we prove an upper bound on the capacity which matches a formula that Tanaka obtained using the replica method. We also show concentration of various relevant quantities, including the mutual information, capacity, and free energy. The mathematical methods are quite general and allow us to discuss extensions to other multiuser scenarios.
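For context, a hedged sketch of the channel model in question (an illustration of randomly spread binary-input CDMA, not the bounding argument of the paper): K users send antipodal symbols through random ±1/√N spreading sequences of length N, the receiver observes their superposition in Gaussian noise, and the large-system limit keeps the load beta = K/N fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_cdma_observation(K, N, sigma):
    """One use of a randomly spread binary-input CDMA channel: y = S b + sigma * w."""
    # Spreading matrix with i.i.d. +/- 1/sqrt(N) entries (one symmetric choice among many).
    S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)
    b = rng.choice([-1.0, 1.0], size=K)   # binary user symbols
    w = rng.standard_normal(N)            # unit-variance Gaussian noise
    return S, b, S @ b + sigma * w

# Large-system regime: K and N grow together with load beta = K/N fixed.
S, b, y = random_cdma_observation(K=64, N=128, sigma=0.5)
print(y.shape, "load beta =", 64 / 128)
```

Tanaka's replica formula predicts the per-user capacity of this system as a function of the load and the noise level; the result described above is a rigorous upper bound that matches it.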
The Martingale Approach for Concentration and Applications in Information Theory, Communications and Coding
"... ar ..."
(Show Context)
Polar codes for channel . . .
- 2009
"... The two central topics of information theory are the compression and the transmission of data. Shannon, in his seminal work, formalized both these problems and determined their fundamental limits. Since then the main goal of coding theory has been to find practical schemes that approach these limits ..."
Abstract
- Add to MetaCart
The two central topics of information theory are the compression and the transmission of data. Shannon, in his seminal work, formalized both of these problems and determined their fundamental limits. Since then, the main goal of coding theory has been to find practical schemes that approach these limits. Polar codes, recently invented by Arıkan, are the first “practical” codes that are known to achieve the capacity for a large class of channels. Their code construction is based on a phenomenon called “channel polarization”. The encoding as well as the decoding operation of polar codes can be implemented with O(N log N) complexity, where N is the blocklength of the code. We show that polar codes are suitable not only for channel coding but also achieve optimal performance for several other important problems in information theory. The first problem we consider is lossy source compression. We construct polar codes that asymptotically approach Shannon’s rate-distortion bound for a large class of sources. We achieve this performance by design-
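As a hedged aside on the lossy-compression benchmark mentioned here (a standard textbook formula, not the construction of the thesis), Shannon's rate-distortion function for a symmetric Bernoulli source under Hamming distortion is:

```latex
% Rate-distortion function of a Ber(1/2) source under Hamming distortion.
% h_2 denotes the binary entropy function.
\[
  R(D) \;=\;
  \begin{cases}
    1 - h_2(D), & 0 \le D \le \tfrac{1}{2},\\[2pt]
    0,          & D > \tfrac{1}{2},
  \end{cases}
  \qquad
  h_2(p) \;=\; -p\log_2 p - (1-p)\log_2(1-p).
\]
```

The claim above is that polar codes approach this bound asymptotically while retaining O(N log N) encoding and decoding complexity.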