Results 1 - 10 of 14
Efficient compressive sensing with deterministic guarantees using expander graphs
- IEEE Information Theory Workshop
, 2007
"... Compressive sensing is an emerging technol-ogy which can recover a sparse signal vector of dimension n via a much smaller number of measurements than n. However, the existing compressive sensing methods may still suffer from relatively high recovery complexity, such as O(n3), or can only work effic ..."
Abstract
-
Cited by 80 (9 self)
- Add to MetaCart
(Show Context)
Compressive sensing is an emerging technology which can recover a sparse signal vector of dimension n via a much smaller number of measurements than n. However, the existing compressive sensing methods may still suffer from relatively high recovery complexity, such as O(n^3), or can only work efficiently when the signal is super sparse, sometimes without deterministic performance guarantees. In this paper, we propose a compressive sensing scheme with deterministic performance guarantees using expander-graph-based measurement matrices and show that the signal recovery can be achieved with complexity O(n) even if the number of nonzero elements k grows linearly with n. We also investigate compressive sensing for approximately sparse signals using this new method. Moreover, explicit constructions of the considered expander graphs exist. Simulation results are given to show the performance and complexity of the new method.
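To make the measurement setup concrete, here is a minimal sketch in which a random left-regular bipartite graph stands in for the paper's explicit expander constructions; the dimensions n and m, the left degree d, and the sparsity k are arbitrary illustrative values, and the O(n) recovery algorithm itself is not reproduced here.

```python
import numpy as np

# Minimal sketch (not the paper's construction): a random left-d-regular
# bipartite graph in place of an explicit expander. Each signal coordinate
# connects to d measurement nodes, giving a sparse 0/1 measurement matrix A
# with exactly d ones per column.

rng = np.random.default_rng(0)
n, m, d, k = 1000, 300, 8, 20      # signal length, measurements, left degree, sparsity

A = np.zeros((m, n), dtype=np.int8)
for j in range(n):
    A[rng.choice(m, size=d, replace=False), j] = 1   # d neighbors per signal node

# A k-sparse test signal and its measurements y = A x.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x

print(A.sum(axis=0).min(), A.sum(axis=0).max())   # every column has exactly d ones
print(y.shape)                                    # m measurements, with m << n
```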
Asymptotic enumeration methods for analyzing LDPC codes
- IEEE Trans. Inform. Theory
, 2004
"... We show how asymptotic estimates of powers of polynomials with non-negative coefficients can be used in the analysis of low-density parity-check (LDPC) codes. In particular we show how these estimates can be used to derive the asymptotic distance spectrum of both regular and irregular LDPC code ense ..."
Abstract
-
Cited by 59 (2 self)
- Add to MetaCart
We show how asymptotic estimates of powers of polynomials with non-negative coefficients can be used in the analysis of low-density parity-check (LDPC) codes. In particular, we show how these estimates can be used to derive the asymptotic distance spectrum of both regular and irregular LDPC code ensembles. We then consider the binary erasure channel (BEC). Using these estimates, we derive lower bounds on the error exponent, under iterative decoding, of LDPC codes used over the BEC. Both regular and irregular code structures are considered. These bounds are compared to the corresponding bounds when optimal (maximum likelihood) decoding is applied.
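To make the enumeration technique concrete, the standard Chernoff-type estimate for coefficients of powers of a polynomial with non-negative coefficients (the general tool, not a bound specific to this paper) is

\[
\bigl[x^{\alpha n}\bigr]\, g(x)^n \;\le\; \inf_{x>0} \frac{g(x)^n}{x^{\alpha n}},
\qquad\text{so}\qquad
\frac{1}{n}\,\ln \bigl[x^{\alpha n}\bigr]\, g(x)^n \;\le\; \inf_{x>0}\bigl(\ln g(x) - \alpha \ln x\bigr),
\]

and saddle-point arguments show that this exponent is asymptotically tight under mild conditions; applying such estimates to ensemble weight enumerators yields the asymptotic distance spectrum.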
Upper Bounds on the Rate of LDPC Codes
- IEEE Trans. Inform. Theory
, 2002
"... We derive upper bounds on the rate of low density parity check (LDPC) codes for which reliable communication is achievable. We rst generalize Gallager's bound to a general binaryinput symmetric-output channel. We then proceed to derive tighter bounds. We also derive upper bounds on the rate as ..."
Abstract
-
Cited by 27 (2 self)
- Add to MetaCart
(Show Context)
We derive upper bounds on the rate of low density parity check (LDPC) codes for which reliable communication is achievable. We first generalize Gallager's bound to a general binary-input symmetric-output channel. We then proceed to derive tighter bounds. We also derive upper bounds on the rate as a function of the minimum distance of the code. We consider both individual codes and ensembles of codes. Index Terms - Low density parity check (LDPC) codes, iterative decoding, maximum-likelihood decoding, error probability, minimum distance.
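For reference, the binary-symmetric-channel special case of Gallager's bound that the paper generalizes can be stated as follows (a commonly quoted form; k denotes the number of code bits involved in each parity check and p the crossover probability):

\[
R \;\le\; 1 - \frac{h(p)}{h\!\left(\frac{1-(1-2p)^{k}}{2}\right)},
\qquad h(x) = -x\log_2 x - (1-x)\log_2(1-x),
\]

which is strictly below the channel capacity 1 - h(p) for every finite k.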
Bounds on the Performance of Belief Propagation Decoding
- IEEE Trans. Inform. Theory
, 2002
"... We consider Gallager’s soft decoding (belief propagation) algorithm for decoding low den-sity parity check (LDPC) codes, when applied to an arbitrary binary-input symmetric-output channel. By considering the expected values of the messages, we derive both lower and upper bounds on the performance of ..."
Abstract
-
Cited by 17 (5 self)
- Add to MetaCart
(Show Context)
We consider Gallager’s soft decoding (belief propagation) algorithm for decoding low density parity check (LDPC) codes, when applied to an arbitrary binary-input symmetric-output channel. By considering the expected values of the messages, we derive both lower and upper bounds on the performance of the algorithm. We also derive various properties of the decoding algorithm, such as a certain robustness to the details of the channel noise. Our results apply both to regular and irregular LDPC codes. Index Terms - Belief propagation, iterative decoding, low density parity check (LDPC) codes, sum-product algorithm.
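For context, the message updates analyzed here are the standard sum-product (belief propagation) rules for LDPC decoding, which in log-likelihood-ratio form read

\[
m_{v\to c} = \ell_v + \sum_{c'\in N(v)\setminus\{c\}} m_{c'\to v},
\qquad
\tanh\!\left(\frac{m_{c\to v}}{2}\right) = \prod_{v'\in N(c)\setminus\{v\}} \tanh\!\left(\frac{m_{v'\to c}}{2}\right),
\]

where \(\ell_v\) is the channel log-likelihood ratio at variable node v and \(N(\cdot)\) denotes the neighborhood in the code's bipartite graph.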
Sparse graph codes for compression, sensing, and secrecy
, 2010
"... Sparse graph codes were first introduced by Gallager over 40 years ago. Over the last two decades, such codes have been the subject of intense research, and capacity-approaching sparse graph codes with low complexity encoding and decoding algo-rithms have been designed for many channels. Motivated ..."
Abstract
-
Cited by 8 (2 self)
- Add to MetaCart
Sparse graph codes were first introduced by Gallager over 40 years ago. Over the last two decades, such codes have been the subject of intense research, and capacity-approaching sparse graph codes with low complexity encoding and decoding algorithms have been designed for many channels. Motivated by the success of sparse graph codes for channel coding, we explore the use of sparse graph codes for four other problems related to compression, sensing, and security. First, we construct locally encodable and decodable source codes for a simple class of sources. Local encodability refers to the property that when the original source data changes slightly, the compression produced by the source code can be updated easily. Local decodability refers to the property that a single source symbol can be recovered without having to decode the entire source block.
Compressive Sensing for Sparse Approximations: Constructions, Algorithms, and Analysis
, 2010
"... ..."
(Show Context)
Two-Bit Message Passing Decoders for LDPC Codes Over the Binary Symmetric Channel
, 2009
"... In this paper, we consider quantized decoding of LDPC codes on the binary symmetric channel. The binary message passing algorithms, while allowing extremely fast hardware implementation, are not very attractive from the perspective of performance. More complex decoders such as the ones based on beli ..."
Abstract
-
Cited by 2 (2 self)
- Add to MetaCart
In this paper, we consider quantized decoding of LDPC codes on the binary symmetric channel. The binary message passing algorithms, while allowing extremely fast hardware implementation, are not very attractive from the perspective of performance. More complex decoders, such as the ones based on belief propagation, exhibit superior performance but lead to slower decoders. The approach in this paper is to consider message passing decoders that have a larger message alphabet (thereby providing a performance improvement) as well as low complexity (thereby ensuring fast decoding). We propose a class of message-passing decoders whose messages are represented by two bits. The thresholds for various decoders in this class are derived using density evolution. The problem of correcting a fixed number of errors assumes significance in the error floor region. For a specific decoder, sufficient conditions for correcting all patterns with up to three errors are derived. By comparing these conditions and thresholds with the corresponding ones for the Gallager B decoder, we emphasize the advantage of decoding on a higher number of bits, even if the channel observation is still one bit.
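For illustration, the sketch below implements a generic four-level ("two-bit") quantized min-sum decoder over the BSC; it is not the specific decoder class or update rules proposed in the paper, and the parity-check matrix, quantization levels, and parameters are arbitrary choices for demonstration.

```python
import numpy as np

# Generic two-bit (four-level) quantized min-sum decoder for a parity-check
# code over the BSC. Illustrative only; not the paper's decoders.

LEVELS = np.array([-3, -1, 1, 3])        # the two-bit message alphabet

def quantize(x):
    """Map real values to the nearest level in the two-bit alphabet."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    return LEVELS[np.argmin(np.abs(x[:, None] - LEVELS[None, :]), axis=1)]

def decode(H, y, max_iters=30):
    """Quantized min-sum decoding of a BSC output y (0/1 array) for parity-check matrix H."""
    m, n = H.shape
    ch = 1 - 2 * y.astype(int)           # channel message: +1 for a received 0, -1 for a received 1
    v2c = (H * ch).astype(float)         # initial variable-to-check messages
    c2v = np.zeros_like(v2c)
    hard = y.astype(int)
    for _ in range(max_iters):
        # Check-node update: product of signs times minimum magnitude of the other incoming messages.
        for i in range(m):
            idx = np.where(H[i])[0]
            msgs = v2c[i, idx]
            for t, j in enumerate(idx):
                others = np.delete(msgs, t)
                c2v[i, j] = np.prod(np.sign(others)) * np.min(np.abs(others))
        # Tentative hard decision and parity check.
        totals = ch + c2v.sum(axis=0)
        hard = (totals < 0).astype(int)
        if not np.any(H @ hard % 2):
            return hard
        # Variable-node update: channel value plus extrinsic sum, re-quantized to two bits.
        for j in range(n):
            for i in np.where(H[:, j])[0]:
                v2c[i, j] = quantize(ch[j] + c2v[:, j].sum() - c2v[i, j])[0]
    return hard

# Toy example: the (7,4) Hamming code's parity-check matrix, all-zero codeword, one flipped bit.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
y = np.array([0, 0, 0, 0, 0, 0, 1])
print(decode(H, y))                      # expected to recover the all-zero codeword
```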
On the Practicality of Low-Density Parity-Check Codes
, 2001
"... Recent advances in coding theory have produced two classes of codes, turbo codes and low-density parity-check (LDPC) codes, which approach the Shannon limit of channel capacity while admitting efficient implementations of message encoding and decoding in software. Theoretic results about the latter ..."
Abstract
- Add to MetaCart
(Show Context)
Recent advances in coding theory have produced two classes of codes, turbo codes and low-density parity-check (LDPC) codes, which approach the Shannon limit of channel capacity while admitting efficient implementations of message encoding and decoding in software. Theoretical results about the latter have been shown to apply to the former; hence we examine the evolution of LDPC codes from their origin in Gallager's 1963 thesis to their current incarnation as tornado codes developed by Luby, Mitzenmacher, Shokrollahi, and Spielman. After considering several analytic approaches to quantifying their performance, we discuss the practicality of LDPC codes, particularly those designed for the erasure channel, when applied to a number of current problems in networking.