Capacity-Achieving Ensembles for the Binary Erasure Channel with Bounded Complexity
IEEE Trans. Information Theory, 2004
Cited by 63 (17 self)
We present two sequences of ensembles of nonsystematic irregular repeat-accumulate codes which asymptotically (as their block length tends to infinity) achieve capacity on the binary erasure channel (BEC) with bounded complexity. This is in contrast to all previous constructions of capacity-achieving sequences of ensembles, whose complexity grows at least like the log of the inverse of the gap to capacity. The new bounded-complexity result is achieved by allowing a sufficient number of state nodes in the Tanner graph representing the codes.
Iterative Decoding Threshold Analysis for LDPC Convolutional Codes
Accepted for publication in IEEE Transactions on Information Theory
Cited by 62 (9 self)
An iterative decoding threshold analysis for terminated regular LDPC convolutional (LDPCC) codes is presented. Using density evolution techniques, the convergence behavior of an iterative belief propagation decoder is analyzed for the binary erasure channel and the AWGN channel with binary inputs. It is shown that for a terminated LDPCC code ensemble, the thresholds are better than for corresponding regular and irregular LDPC block codes.
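The block-code version of this density-evolution analysis fits in a few lines. On the BEC, the erasure probability carried along the edges of a (dv, dc)-regular LDPC ensemble evolves as x_{t+1} = eps * (1 - (1 - x_t)^(dc-1))^(dv-1), and the BP threshold is the largest channel erasure probability eps for which this recursion converges to zero. The sketch below covers plain block ensembles only; the terminated convolutional ensembles analyzed in the paper need a position-dependent recursion.

```python
def bec_de_converges(eps, dv, dc, iters=5000, tol=1e-10):
    """Run BEC density evolution for the (dv, dc)-regular LDPC ensemble;
    return True if the edge erasure probability converges to zero."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bp_threshold(dv, dc, steps=30):
    """Bisect for the BP decoding threshold on the BEC."""
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if bec_de_converges(mid, dv, dc):
            lo = mid
        else:
            hi = mid
    return lo

# The (3,6)-regular ensemble (rate 1/2) has BP threshold ~0.4294 on the
# BEC, short of the Shannon limit eps = 0.5.
threshold = bp_threshold(3, 6)
```

Near the threshold the recursion slows down critically, so the iteration cap makes the detected threshold slightly conservative; in practice the bisection still lands within a fraction of a percent of the true value.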
Density evolution, thresholds and the stability condition for nonbinary LDPC codes
IEE Proc. Commun., 2005
Capacity-achieving codes with bounded graphical complexity on noisy channels
In Proc. Allerton Conf. Commun., Control, 2005
Cited by 26 (3 self)
We introduce a new family of concatenated codes with an outer low-density parity-check (LDPC) code and an inner low-density generator-matrix (LDGM) code, and prove that these codes can achieve capacity under any memoryless binary-input output-symmetric (MBIOS) channel using maximum-likelihood (ML) decoding with bounded graphical complexity, i.e., the number of edges per information bit in their graphical representation is bounded. We also show that these codes can achieve capacity for the special case of the binary erasure channel (BEC) under belief-propagation (BP) decoding with bounded decoding complexity per information bit for all erasure probabilities in (0, 1). By deriving and analyzing the average weight distribution (AWD) and the corresponding asymptotic growth rate of these codes with a rate-1 inner LDGM code, we also show that these codes achieve the Gilbert-Varshamov bound with asymptotically high probability. This result can be attributed to the presence of the inner rate-1 LDGM code, which is demonstrated to help eliminate high-weight codewords in the LDPC code while maintaining a vanishingly small number of low-weight codewords.
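The Gilbert-Varshamov tradeoff invoked above is the curve R = 1 - h2(delta) for relative distance delta in [0, 1/2], which is easy to evaluate numerically. A quick sketch of the bound itself (not the paper's AWD analysis):

```python
import math

def h2(x):
    """Binary entropy function, in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def gv_relative_distance(rate, steps=60):
    """Solve rate = 1 - h2(delta) for delta in [0, 1/2] by bisection:
    the relative distance guaranteed by the Gilbert-Varshamov bound."""
    lo, hi = 0.0, 0.5
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if 1.0 - h2(mid) > rate:   # 1 - h2 is decreasing on [0, 1/2]
            lo = mid
        else:
            hi = mid
    return lo

# A rate-1/2 code on the GV bound has relative distance ~0.110.
delta_gv = gv_relative_distance(0.5)
```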
Complexity versus performance of capacity-achieving irregular repeat-accumulate codes on the binary erasure channel
IEEE Trans. Information Theory, 2004
Cited by 23 (7 self)
We derive upper and lower bounds on the encoding and decoding complexity of two capacity-achieving ensembles of irregular repeat-accumulate (IRA1 and IRA2) codes on the binary erasure channel (BEC). These bounds are expressed in terms of the gap between the channel capacity and the rate of a typical code from the ensemble for which reliable communication is achievable under message-passing iterative (MPI) decoding. The complexity of the ensemble of IRA1 codes grows like the negative logarithm of the gap to capacity. On the other hand, the complexity of the ensemble of IRA2 codes, with any choice of the degree distribution, grows at least like the inverse square root of the gap to capacity, and at most like the inverse of the gap to capacity.
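To make these scalings concrete, here is a hypothetical numeric comparison with all constants normalized to 1 (the paper's actual bounds carry ensemble-dependent constants):

```python
import math

# gap = (capacity - rate): how complexity per information bit scales.
def ira1_complexity(gap):
    return math.log(1.0 / gap)     # IRA1: ~ log(1/gap)

def ira2_lower(gap):
    return 1.0 / math.sqrt(gap)    # IRA2 lower bound: ~ gap^(-1/2)

def ira2_upper(gap):
    return 1.0 / gap               # IRA2 upper bound: ~ gap^(-1)

# As the gap shrinks, the logarithmic IRA1 scaling wins by a widening margin.
rows = [(g, ira1_complexity(g), ira2_lower(g), ira2_upper(g))
        for g in (1e-2, 1e-4, 1e-6)]
```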
An improved sphere-packing bound for finite-length codes over symmetric memoryless channels
IEEE Transactions on Information Theory, 2008
Cited by 21 (2 self)
This paper derives an improved sphere-packing (ISP) bound for finite-length error-correcting codes whose transmission takes place over symmetric memoryless channels, where the codes are decoded with an arbitrary list decoder. We first review classical results, i.e., the 1959 sphere-packing (SP59) bound of Shannon for the Gaussian channel, and the 1967 sphere-packing (SP67) bound of Shannon et al. for discrete memoryless channels. An improvement on the SP67 bound, as suggested by Valembois and Fossorier, is also discussed. These concepts are used for the derivation of a new lower bound on the error probability of list decoding (referred to as the ISP bound) which is uniformly tighter than the SP67 bound and its improved version. The ISP bound is applicable to symmetric memoryless channels, and some of its applications are exemplified. Its tightness under ML decoding is studied by comparing the ISP bound to previously reported upper and lower bounds on the ML decoding error probability, and also to computer simulations of iteratively decoded turbo-like codes. This paper also presents a technique which performs the entire calculation of the SP59 bound in the logarithmic domain, thus facilitating the exact calculation of this bound for moderate to large block lengths without the need for the asymptotic approximations provided by Shannon.
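The log-domain computation mentioned in the last sentence boils down to the standard log-sum-exp primitive: every sum of astronomically large or small terms is carried as a sum of logarithms. A generic sketch of that primitive (not the SP59 formula itself):

```python
import math

def log_sum_exp(log_terms):
    """Stable computation of log(sum(exp(t))) without ever forming exp(t)."""
    m = max(log_terms)
    if m == float("-inf"):
        return m
    return m + math.log(sum(math.exp(t - m) for t in log_terms))

def log_binom(n, k):
    """log of the binomial coefficient C(n, k) via lgamma."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

# Sanity check: sum_k C(n, k) = 2^n, so the log-domain sum over all k
# should come out to n * log(2) even for n far beyond float range.
n = 10000
log_total = log_sum_exp([log_binom(n, k) for k in range(n + 1)])
```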
Low-Density Graph Codes That Are Optimal for Binning and Coding With Side Information
Cited by 14 (0 self)
In this paper, we describe and analyze the source and channel coding properties of a class of sparse graphical codes based on compounding a low-density generator-matrix (LDGM) code with a low-density parity-check (LDPC) code. Our first pair of theorems establishes that there exist codes from this ensemble, with all degrees remaining bounded independently of block length, that are simultaneously optimal for both channel coding and source coding with binary data when encoding and decoding are performed optimally. More precisely, in the context of lossy compression, we prove that finite-degree constructions can achieve any pair (R, D) on the rate-distortion curve of the binary symmetric source. In the context of channel coding, we prove that the same finite-degree codes can achieve any pair (C, p) on the capacity-noise curve of the binary symmetric channel (BSC). Next, we show that our compound construction has a nested structure that can be exploited to achieve the Wyner–Ziv bound for source coding with side information (SCSI), as well as the Gelfand–Pinsker bound for channel coding with side information (CCSI). Although the results described here are based on optimal encoding and decoding, the proposed graphical codes have a sparse structure and high girth that render them well suited to message passing and other efficient decoding procedures. Index Terms: channel coding, coding with side information, distributed source coding, Gelfand–Pinsker problem, graphical codes, information embedding, low-density generator-matrix code (LDGM), low-density parity-check code (LDPC), source coding, weight enumerator, Wyner–Ziv problem.
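Both curves named in the theorems are given by the binary entropy function: the rate-distortion function of the symmetric binary source is R(D) = 1 - h2(D), and the BSC capacity is C(p) = 1 - h2(p). A small sketch of the two curves:

```python
import math

def h2(x):
    """Binary entropy function, in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def rate_distortion_bss(d):
    """R(D) = 1 - h2(D) for the Bernoulli(1/2) source under Hamming
    distortion, valid for D in [0, 1/2]; zero beyond D = 1/2."""
    return max(0.0, 1.0 - h2(min(d, 0.5)))

def bsc_capacity(p):
    """C(p) = 1 - h2(p) for the binary symmetric channel."""
    return 1.0 - h2(p)

# The two curves coincide pointwise: e.g. D = p = 0.11 gives
# R(D) = C(p) ~ 0.5 bits per symbol.
```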
Tightened upper bounds on the ML decoding error probability of binary linear block codes
IEEE Trans. Information Theory, 2006
Cited by 14 (5 self)
The performance of maximum-likelihood (ML) decoded binary linear block codes is addressed via the derivation of tightened upper bounds on their decoding error probability. The upper bounds on the block and bit error probabilities are valid for any memoryless, binary-input output-symmetric communication channel, and their effectiveness is exemplified for various ensembles of turbo-like codes over the AWGN channel. An expurgation of the distance spectrum of binary linear block codes further tightens the resulting upper bounds.
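The baseline such tightened bounds improve on is the classic union bound over the code's distance spectrum. A sketch for BPSK over the AWGN channel, using the (7,4) Hamming code's weight enumerator as a toy example (standard textbook material, not the paper's tightened bound):

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound_awgn(weight_enum, n, k, ebno_db):
    """Union bound on ML block-error probability for BPSK over AWGN:
    P_B <= sum_d A_d * Q(sqrt(2 * d * R * Eb/N0))."""
    rate = k / n
    ebno = 10.0 ** (ebno_db / 10.0)
    return sum(a_d * q_func(math.sqrt(2.0 * d * rate * ebno))
               for d, a_d in weight_enum.items())

# Toy example: the (7,4) Hamming code, whose full weight enumerator is
# A_3 = 7, A_4 = 7, A_7 = 1.
hamming_enum = {3: 7, 4: 7, 7: 1}
pb = union_bound_awgn(hamming_enum, 7, 4, 6.0)
```

At low SNR the union bound is loose (it can even exceed 1), which is exactly the regime where expurgation and the tightened bounding techniques of this paper pay off.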
The price of certainty: “waterslide curves” and the gap to capacity
 IEEE Trans. Inform. Theory, Submitted 2007. [Online]. Available: http://arxiv.org/abs/0801.0352
Cited by 14 (9 self)
The classical problem of reliable point-to-point digital communication is to achieve a low probability of error while keeping the rate high and the total power consumption small. Traditional information-theoretic analysis uses explicit models for the communication channel to study the power spent in transmission. The resulting bounds are expressed using ‘waterfall’ curves that convey the revolutionary idea that unboundedly low probabilities of bit-error are attainable using only finite transmit power. However, practitioners have long observed that the decoder complexity, and hence the total power consumption, goes up when attempting to use sophisticated codes that operate close to the waterfall curve. This paper gives an explicit model for power consumption at an idealized decoder that allows for extreme parallelism in implementation. The decoder architecture is in the spirit of message passing and iterative decoding for sparse-graph codes, but is further idealized in that it allows for more computational power than is currently known to be implementable. Generalized sphere-packing arguments are used to derive lower bounds on the decoding power needed for any possible code given only the gap from the Shannon limit and the desired probability of error. As the gap goes to zero, the energy per bit spent in decoding is shown to go to infinity. This suggests that to optimize total power, the transmitter should operate at a power that is strictly above the minimum demanded by the Shannon capacity. The lower bound is plotted to show an unavoidable tradeoff between the average bit-error probability and the total power used in transmission and decoding. In the spirit of conventional waterfall curves, we call these ‘waterslide’ curves. The bound is shown to be order-optimal by showing the existence of codes that can achieve similarly shaped waterslide curves under the proposed idealized model of decoding.
Capacity-achieving LDPC codes through puncturing
In Proc. International Conf. on Wireless Networks, Commun., and Mobile Comp., Maui, 2005
Cited by 13 (3 self)
The performance of punctured LDPC codes under maximum-likelihood (ML) decoding is studied in this paper via deriving and analyzing their average weight distributions (AWDs) and the corresponding asymptotic growth rate of the AWDs. In particular, we prove that capacity-achieving codes of any rate and for any memoryless binary-input output-symmetric (MBIOS) channel under ML decoding can be constructed by puncturing some original LDPC code with a small enough rate. Moreover, we prove that the gap to capacity of all the punctured codes can be the same as that of the original code with a small enough rate. Conditions under which puncturing results in no rate loss with asymptotically high probability are also given in the process. These results show the high potential of puncturing for designing capacity-achieving codes and for rate-compatible coding under any MBIOS channel.
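The rate bookkeeping behind this construction is simple: puncturing a fraction gamma of the transmitted bits of a rate-R0 mother code yields design rate R0/(1 - gamma). A minimal sketch of that relation (illustrative only; the paper's results concern the gap to capacity of the punctured ensembles):

```python
def punctured_rate(r0, gamma):
    """Design rate after puncturing: the k = r0 * n information bits are
    unchanged, but only n * (1 - gamma) code bits are transmitted."""
    if not 0.0 <= gamma <= 1.0 - r0:
        raise ValueError("puncturing fraction would push the rate past 1")
    return r0 / (1.0 - gamma)

# A single rate-1/10 mother code covers a wide range of rates:
# gamma = 0.8 gives rate 1/2, gamma = 0.875 gives rate 4/5.
```

This is why a low-rate mother code is the natural starting point for rate-compatible families: one encoder and one Tanner graph serve every punctured rate.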