CiteSeerX
Good Error-Correcting Codes based on Very Sparse Matrices (1999)

by David J.C. MacKay
Results 1 - 10 of 751 citing documents

Design of capacity-approaching irregular low-density parity-check codes

by Thomas J. Richardson, M. Amin Shokrollahi, Rüdiger L. Urbanke - IEEE TRANS. INFORM. THEORY , 2001
Abstract - Cited by 588 (6 self)
We design low-density parity-check (LDPC) codes that perform at rates extremely close to the Shannon capacity. The codes are built from highly irregular bipartite graphs with carefully chosen degree patterns on both sides. Our theoretical analysis of the codes is based on [1]. Assuming that the underlying communication channel is symmetric, we prove that the probability densities at the message nodes of the graph possess a certain symmetry. Using this symmetry property we then show that, under the assumption of no cycles, the message densities always converge as the number of iterations tends to infinity. Furthermore, we prove a stability condition which implies an upper bound on the fraction of errors that a belief-propagation decoder can correct when applied to a code induced from a bipartite graph with a given degree distribution. Our codes are found by optimizing the degree structure of the underlying graphs. We develop several strategies to perform this optimization. We also present some simulation results for the codes found which show that the performance of the codes is very close to the asymptotic theoretical bounds.

Citation Context

...e best degree distribution pair with some a priori bound on the size of the degrees. In the case of maximum-likelihood decoding this was answered in the affirmative by Gallager and MacKay, see [13], [14]. We conjecture that a similar statement (and proof) can be given for continuous channels. In fact, a similar theorem holds also for the erasure channel [15, Theorem 1], and yet, there are capacit...

The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding

by Thomas J. Richardson, Rüdiger L. Urbanke , 2001
Abstract - Cited by 574 (9 self)
In this paper, we present a general method for determining the capacity of low-density parity-check (LDPC) codes under message-passing decoding when used over any binary-input memoryless channel with discrete or continuous output alphabets. Transmitting at rates below this capacity, a randomly chosen element of the given ensemble will achieve an arbitrarily small target probability of error with a probability that approaches one exponentially fast in the length of the code. (By concatenating with an appropriate outer code one can achieve a probability of error that approaches zero exponentially fast in the length of the code with arbitrarily small loss in rate.) Conversely, transmitting at rates above this capacity the probability of error is bounded away from zero by a strictly positive constant which is independent of the length of the code and of the number of iterations performed. Our results are based on the observation that the concentration of the performance of the decoder around its average performance, as observed by Luby et al. [1] in the case of a binary-symmetric channel and a binary message-passing algorithm, is a general phenomenon. For the particularly important case of belief-propagation decoders, we provide an effective algorithm to determine the corresponding capacity to any desired degree of accuracy. The ideas presented in this paper are broadly applicable and extensions of the general method to low-density parity-check codes over larger alphabets, turbo codes, and other concatenated coding schemes are outlined.
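The general method described in this abstract is easiest to see on the binary erasure channel, where message densities collapse to a single erasure probability and density evolution becomes a scalar recursion. A minimal sketch under that standard simplification (textbook recursion, not code from the paper), locating the decoding threshold of the (3,6)-regular ensemble by bisection:

```python
def converges(eps, dv=3, dc=6, iters=3000, tol=1e-9):
    """Density evolution for a (dv, dc)-regular LDPC ensemble on the BEC:
    x_{l+1} = eps * (1 - (1 - x_l)^(dc-1))^(dv-1)."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def bec_threshold(dv=3, dc=6):
    """Bisect for the largest erasure probability with vanishing erasure rate."""
    lo, hi = 0.0, 1.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if converges(mid, dv, dc):
            lo = mid
        else:
            hi = mid
    return lo

print(round(bec_threshold(), 4))  # close to the known (3,6) value of about 0.4294
```

Transmitting below this erasure probability, the recursion is driven to zero; above it, a positive fixed point appears, mirroring the capacity notion the paper defines for general channels.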

Citation Context

...ders, turbo codes, turbo decoding. I. INTRODUCTION. In the wake of the phenomenal success of turbo codes [2], another class of codes exhibiting similar characteristics and performance was rediscovered [3], [4]. This class of codes, called low-density parity-check (LDPC) codes, was first introduced by Gallager in his thesis in 1961 [5]. In the period between Gallager’s thesis and the invention of turbo...

Near Shannon limit performance of low density parity check codes

by D. J. C. MacKay, R. M. Neal - Electronics Letters , 1996
Cited by 500 (22 self). Abstract not found.

Citation Context

...aper we report the empirical performance of these codes on Gaussian channels. We have proved theoretical properties of GL codes (essentially, that the channel coding theorem holds for them) elsewhere [9]. GL codes can also be defined over GF(q). We are currently implementing this generalization. We created sparse random parity check matrices in the following ways. Construction 1A. An M by N matrix (...
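The construction cut off above can be sketched from its description: every column of the M by N parity-check matrix has a fixed small weight t, with ones placed so that row weights stay as uniform as possible. The column weight t = 3 and the greedy tie-breaking below are illustrative assumptions, not details taken from the truncated snippet:

```python
import random

def construction_1a(M, N, t=3, seed=0):
    """Random M x N binary parity-check matrix with every column of weight t,
    choosing rows greedily so that row weights stay as uniform as possible."""
    rng = random.Random(seed)
    H = [[0] * N for _ in range(M)]
    row_weight = [0] * M
    for col in range(N):
        # Prefer the currently lightest rows; break ties randomly.
        order = sorted(range(M), key=lambda r: (row_weight[r], rng.random()))
        for r in order[:t]:
            H[r][col] = 1
            row_weight[r] += 1
    return H

H = construction_1a(M=6, N=12)
print([sum(col) for col in zip(*H)])  # every column has weight 3
```

Greedily filling the lightest rows keeps the row-weight spread at most one throughout, which is the "as uniform as possible" property the snippet alludes to.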

Turbo decoding as an instance of Pearl’s belief propagation algorithm

by Robert J. Mceliece, David J. C. Mackay, Jung-fu Cheng - IEEE Journal on Selected Areas in Communications , 1998
Abstract - Cited by 404 (16 self)
Abstract—In this paper, we will describe the close connection between the now celebrated iterative turbo decoding algorithm of Berrou et al. and an algorithm that has been well known in the artificial intelligence community for a decade, but which is relatively unknown to information theorists: Pearl’s belief propagation algorithm. We shall see that if Pearl’s algorithm is applied to the “belief network” of a parallel concatenation of two or more codes, the turbo decoding algorithm immediately results. Unfortunately, however, this belief diagram has loops, and Pearl only proved that his algorithm works when there are no loops, so an explanation of the excellent experimental performance of turbo decoding is still lacking. However, we shall also show that Pearl’s algorithm can be used to routinely derive previously known iterative, but suboptimal, decoding algorithms for a number of other error-control systems, including Gallager’s

Citation Context

...paper that motivated this one, is that of MacKay and Neal [37]. See also [38] and [39].) In this paper, we will review the turbo decoding algorithm as originally expounded by Berrou et al. [10], but which was perhaps explained more lucidly in [3], [18], or [50]. We will then d...

On the design of low-density parity-check codes within 0.0045 dB of the Shannon limit

by Sae-young Chung, G. David Forney, Jr., Thomas J. Richardson, Rüdiger Urbanke - IEEE COMMUNICATIONS LETTERS , 2001
Abstract - Cited by 306 (6 self)
We develop improved algorithms to construct good low-density parity-check codes that approach the Shannon limit very closely. For rate 1/2, the best code found has a threshold within 0.0045 dB of the Shannon limit of the binary-input additive white Gaussian noise channel. Simulation results with a somewhat simpler code show that we can achieve within 0.04 dB of the Shannon limit at a bit error rate of 10^-6 using a block length of 10^7.

Citation Context

...region asymptotically as the block length tends to infinity. (Density evolution based on combinatorial or Monte Carlo approaches had been previously attempted by Gallager [2], Spielman [7], and MacKay [8].) In this letter, we develop an improved implementation of density evolution called discretized density evolution [9]. We show that this improved algorithm models the exact behavior of discretized su...

Analysis of sum-product decoding of low-density parity-check codes using a Gaussian approximation

by Sae-Young Chung, Thomas J. Richardson, Rüdiger L. Urbanke - IEEE TRANS. INFORM. THEORY , 2001
Abstract - Cited by 244 (2 self)
Density evolution is an algorithm for computing the capacity of low-density parity-check (LDPC) codes under message-passing decoding. For memoryless binary-input continuous-output additive white Gaussian noise (AWGN) channels and sum-product decoders, we use a Gaussian approximation for message densities under density evolution to simplify the analysis of the decoding algorithm. We convert the infinite-dimensional problem of iteratively calculating message densities, which is needed to find the exact threshold, to a one-dimensional problem of updating means of Gaussian densities. This simplification not only allows us to calculate the threshold quickly and to understand the behavior of the decoder better, but also makes it easier to design good irregular LDPC codes for AWGN channels. For various regular LDPC codes we have examined, thresholds can be estimated within 0.1 dB of the exact value. For rates between 0.5 and 0.9, codes designed using the Gaussian approximation perform within 0.02 dB of the best performing codes found so far by using density evolution when the maximum variable degree is 10. We show that by using the Gaussian approximation, we can visualize the sum-product decoding algorithm. We also show that the optimization of degree distributions can be understood and done graphically using the visualization.
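The reduction to a one-dimensional recursion on means can be sketched as follows, assuming the consistent-Gaussian function phi(m) = 1 - E[tanh(U/2)] with U ~ N(m, 2m) that this line of work uses; the integration grid, iteration cap, and divergence cutoff here are illustrative choices, not values from the paper:

```python
import math

def phi(m, n=401, width=10.0):
    """phi(m) = 1 - E[tanh(U/2)] for U ~ N(m, 2m), by trapezoidal integration."""
    if m <= 0:
        return 1.0
    sd = math.sqrt(2.0 * m)
    lo, hi = m - width * sd, m + width * sd
    h = (hi - lo) / (n - 1)
    acc = 0.0
    for i in range(n):
        u = lo + i * h
        f = math.tanh(u / 2.0) * math.exp(-(u - m) ** 2 / (4.0 * m))
        acc += 0.5 * f if i in (0, n - 1) else f
    return 1.0 - acc * h / math.sqrt(4.0 * math.pi * m)

def phi_inv(y):
    """Invert the monotone-decreasing phi by bisection."""
    lo, hi = 1e-12, 1e4
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if phi(mid) > y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ga_converges(sigma, dv=3, dc=6, iters=60):
    """One-dimensional mean recursion; True if the variable-node mean diverges,
    i.e. the decoder drives the error probability to zero."""
    m0 = 2.0 / sigma ** 2   # mean of the channel LLR on the BIAWGN channel
    mu = 0.0                # mean of check-to-variable messages
    for _ in range(iters):
        mv = m0 + (dv - 1) * mu                          # variable-node update
        if mv > 60.0:       # error probability is negligible past this point
            return True
        mu = phi_inv(1.0 - (1.0 - phi(mv)) ** (dc - 1))  # check-node update
    return False

print(ga_converges(0.8), ga_converges(1.0))
```

With these settings the recursion diverges for sigma = 0.8 and stalls at a finite fixed point for sigma = 1.0, consistent with a (3,6)-regular threshold somewhere near sigma of about 0.87.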

Citation Context

...DUCTION. For many channels and iterative decoders of interest, low-density parity-check (LDPC) codes—first discovered by Gallager [1], [2] and rediscovered by Spielman et al. [3] and MacKay et al. [4], [5]—exhibit a threshold phenomenon: as the block length tends to infinity, an arbitrarily small bit-error probability can be achieved if the noise level is smaller than a certain threshold. For a noise l...

On the Optimality of Solutions of the Max-Product Belief Propagation Algorithm in Arbitrary Graphs

by Yair Weiss, William T. Freeman , 2001
Abstract - Cited by 241 (13 self)
Graphical models, such as Bayesian networks and Markov random fields, represent statistical dependencies of variables by a graph. The max-product "belief propagation" algorithm is a local message-passing algorithm on this graph that is known to converge to a unique fixed point when the graph is a tree. Furthermore, when the graph is a tree, the assignment based on the fixed point yields the most probable a posteriori (MAP) values of the unobserved variables given the observed ones. Recently, good ...
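The tree-exactness property stated above can be checked mechanically on the smallest nontrivial tree, a three-variable chain with randomly chosen potentials (an illustrative model, not one from the article): the per-node argmaxes of the max-product beliefs should reproduce the brute-force MAP assignment.

```python
import itertools, random

rng = random.Random(1)
# Chain x1 - x2 - x3 of binary variables with random positive potentials.
psi = [[rng.uniform(0.1, 1.0) for _ in range(2)] for _ in range(3)]   # node potentials
pair = [[[rng.uniform(0.1, 1.0) for _ in range(2)] for _ in range(2)]
        for _ in range(2)]                                            # edges (1-2), (2-3)

def joint(x):
    p = psi[0][x[0]] * psi[1][x[1]] * psi[2][x[2]]
    return p * pair[0][x[0]][x[1]] * pair[1][x[1]][x[2]]

# Max-product messages along the chain, in both directions.
m12 = [max(psi[0][a] * pair[0][a][b] for a in range(2)) for b in range(2)]
m32 = [max(psi[2][c] * pair[1][b][c] for c in range(2)) for b in range(2)]
m21 = [max(psi[1][b] * m32[b] * pair[0][a][b] for b in range(2)) for a in range(2)]
m23 = [max(psi[1][b] * m12[b] * pair[1][b][c] for b in range(2)) for c in range(2)]

# Beliefs are max-marginals on a tree; their argmaxes give the MAP assignment.
beliefs = [
    [psi[0][a] * m21[a] for a in range(2)],
    [psi[1][b] * m12[b] * m32[b] for b in range(2)],
    [psi[2][c] * m23[c] for c in range(2)],
]
bp_map = tuple(b.index(max(b)) for b in beliefs)
exact_map = max(itertools.product(range(2), repeat=3), key=joint)
print(bp_map == exact_map)  # True: on a tree, max-product recovers the MAP
```

With generic random potentials the MAP assignment is unique, so the componentwise argmaxes of the max-marginals and the global maximizer coincide, exactly as the fixed-point result above asserts.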

Correctness of Local Probability Propagation in Graphical Models with Loops

by Yair Weiss , 2000
Abstract - Cited by 231 (8 self)
This article analyzes the behavior of local propagation rules in graphical models with a loop.

Distributed source coding for sensor networks

by Zixiang Xiong, Angelos D. Liveris, Samuel Cheng - In IEEE Signal Processing Magazine , 2004
Abstract - Cited by 224 (4 self)
In recent years, sensor research has been undergoing a quiet revolution, promising to have a significant impact throughout society that could quite possibly dwarf previous milestones in the information revolution. MIT Technology Review ranked wireless sensor networks that consist of many tiny, low-power and cheap wireless sensors as the number one emerging technology. Unlike PCs or the Internet, which are designed to support all types of applications, sensor networks are usually mission driven and application specific (be it detection of biological agents and toxic chemicals; environmental measurement of temperature, pressure and vibration; or real-time area video surveillance). Thus they must operate under a set of unique constraints and requirements. For example, in contrast to many other wireless devices (e.g., cellular phones, PDAs, and laptops), in which energy can be recharged from time to time, the energy provisioned for a wireless sensor node is not expected to be renewed throughout its mission. The limited amount of energy available to wireless sensors has a significant impact on all aspects of a wireless sensor network, from the amount of information that the node can process, to the volume of wireless communication it can carry across large distances. Realizing the great promise of sensor networks requires more than a mere advance in individual technologies; it relies on many components working together in an efficient, unattended, comprehensible, and trustworthy manner. One of the enabling technologies for sensor networks is distributed source coding (DSC), which refers to the compression of multiple correlated sensor outputs [1]–[4] that do not communicate with each other (hence distributed coding). These sensors send their compressed outputs to a central point [e.g., the base station (BS)] for joint decoding.
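One concrete realization of DSC is asymmetric Slepian-Wolf coding via syndromes, sketched below with a Hamming (7,4) parity-check matrix; this is a standard textbook construction chosen for illustration, not necessarily one of the schemes surveyed in [1]–[4]. The sensor transmits only the 3-bit syndrome of its 7-bit reading; the decoder recovers the reading exactly from that syndrome plus correlated side information that differs in at most one bit position.

```python
# Columns of the Hamming (7,4) parity-check matrix: binary representations of 1..7.
H_cols = [[(j >> k) & 1 for k in range(3)] for j in range(1, 8)]

def syndrome(x):
    """3-bit syndrome H*x over GF(2) of a 7-bit word x."""
    s = [0, 0, 0]
    for j, bit in enumerate(x):
        if bit:
            s = [a ^ b for a, b in zip(s, H_cols[j])]
    return s

def sw_decode(s, y):
    """Recover x from its syndrome s and side info y with Hamming distance <= 1."""
    d = [a ^ b for a, b in zip(s, syndrome(y))]   # syndrome of the error x ^ y
    if d == [0, 0, 0]:
        return list(y)
    pos = d[0] + 2 * d[1] + 4 * d[2] - 1          # distinct columns locate the flip
    x = list(y)
    x[pos] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]          # sensor reading (7 bits)
y = [1, 0, 1, 0, 0, 0, 1]          # correlated side info, one bit flipped
print(sw_decode(syndrome(x), y) == x)  # True: 3 bits sent instead of 7
```

The compression comes from binning: the syndrome identifies x only up to a coset of the code, and the correlation with y selects the right member of that coset, which is exactly the role the abstract assigns to joint decoding at the base station.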

Improved low-density parity-check codes using irregular graphs

by Michael G. Luby, Michael Mitzenmacher, M. Amin Shokrollahi, Daniel A. Spielman - IEEE Trans. Inform. Theory , 2001
Abstract - Cited by 223 (15 self)
Abstract—We construct new families of error-correcting codes based on Gallager’s low-density parity-check codes. We improve on Gallager’s results by introducing irregular parity-check matrices and a new rigorous analysis of hard-decision decoding of these codes. We also provide efficient methods for finding good irregular structures for such decoding algorithms. Our rigorous analysis based on martingales, our methodology for constructing good irregular codes, and the demonstration that irregular structure improves performance constitute key points of our contribution. We also consider irregular codes under belief propagation. We report the results of experiments testing the efficacy of irregular codes on both binary-symmetric and Gaussian channels. For example, using belief propagation, for rate 1/4 codes on 16,000 bits over a binary-symmetric channel, previous low-density parity-check codes can correct up to approximately 16% errors, while our codes correct over 17%. In some cases our results come very close to reported results for turbo codes, suggesting that variations of irregular low-density parity-check codes may be able to match or beat turbo code performance. Index Terms—Belief propagation, concentration theorem, Gallager codes, irregular codes, low-density parity-check codes.
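Irregular ensembles of this kind are usually specified by edge-perspective degree distributions lambda(x) = sum_i lambda_i x^(i-1) and rho(x) = sum_j rho_j x^(j-1), from which the design rate follows as R = 1 - (sum_j rho_j/j) / (sum_i lambda_i/i). A minimal sketch of that standard formula (the notation follows this line of work; the example distributions are illustrative):

```python
def design_rate(lam, rho):
    """Design rate 1 - (sum_j rho_j/j) / (sum_i lam_i/i) for edge-perspective
    degree fractions given as {degree: fraction-of-edges} dicts."""
    num = sum(f / d for d, f in rho.items())
    den = sum(f / d for d, f in lam.items())
    return 1.0 - num / den

# Sanity check on the (3,6)-regular ensemble: lambda(x) = x^2, rho(x) = x^5.
print(design_rate({3: 1.0}, {6: 1.0}))  # rate 1/2 for the (3,6)-regular ensemble
```

Optimizing an irregular ensemble then means searching over the fractions lam and rho for the best decoding threshold subject to this rate constraint.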

Citation Context

...mation it can transmit to its neighbors. These two competing requirements must be appropriately balanced. Previous work has shown that for regular graphs, low degree graphs yield the best performance [1]. If one allows irregular graphs, however, there is significantly more flexibility in balancing these competing requirements. Message nodes with high degree will tend to their correct value quickly. Th...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University