Results 1-10 of 77,291
Near Shannon limit error-correcting coding and decoding
, 1993
"... Abstract: This paper deals with a new class of convolutional codes called Turbo-codes, whose performances in terms of Bit Error Rate (BER) are close to the SHANNON limit. The Turbo-Code encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes and the associated ..."
Cited by 1771 (6 self)
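The constituent encoders of the parallel concatenation above are Recursive Systematic Convolutional (RSC) codes. As an illustrative sketch only (the memory-2 generator pair (1, 5/7) used here is a common textbook choice, not necessarily the exact code of the paper), a rate-1/2 RSC encoder can be written as:

```python
def rsc_encode(bits):
    """Rate-1/2 recursive systematic convolutional encoder.

    Memory-2 example with feedback polynomial 1 + D + D^2 (octal 7)
    and feedforward polynomial 1 + D^2 (octal 5). Returns the
    (systematic, parity) bit streams.
    """
    s1 = s2 = 0  # shift-register state
    systematic, parity = [], []
    for u in bits:
        a = u ^ s1 ^ s2          # recursive feedback into the register
        p = a ^ s2               # feedforward taps 1 + D^2
        systematic.append(u)     # systematic output is the input bit itself
        parity.append(p)
        s2, s1 = s1, a           # shift the register
    return systematic, parity

# A turbo encoder feeds the same data block to a second RSC encoder
# through an interleaver and transmits (systematic, parity1, parity2).
```

Because the code is systematic, the first output stream is the data itself; only the parity stream depends on the recursive state.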
Physics of the Shannon Limits
, 2009
"... Abstract: We provide a simple physical interpretation, in the context of the second law of thermodynamics, to the information inequality (a.k.a. the Gibbs inequality, which is also equivalent to the log-sum inequality), asserting that the relative entropy between two probability distributions cannot be negative. Since this inequality stands at the basis of the data processing theorem (DPT), and the DPT in turn is at the heart of most, if not all, proofs of converse theorems in Shannon theory, it is observed that conceptually, the roots of the fundamental limits of Information Theory can actually ..."
On the design of low-density parity-check codes within 0.0045 dB of the Shannon limit
 IEEE COMMUNICATIONS LETTERS
, 2001
"... We develop improved algorithms to construct good low-density parity-check codes that approach the Shannon limit very closely. For rate 1/2, the best code found has a threshold within 0.0045 dB of the Shannon limit of the binary-input additive white Gaussian noise channel. Simulation results with a ..."
Cited by 307 (6 self)
Near-Shannon-limit Linear-time-encodable Nonbinary Irregular LDPC Codes
 in Proceedings of the 28th IEEE GLOBECOM, Piscataway
, 2009
"... Abstract: In this paper, we present a novel method to construct nonbinary irregular LDPC codes whose parity-check matrix has only column weights of 2 and t, where t ≥ 3. The constructed codes can be encoded in linear time and in a parallel fashion. Also, they can achieve near-Shannon-limit performance ..."
Cited by 2 (2 self)
Near Shannon Limit Performance of Low Density Parity Check Codes
, 1996
"... Indexing terms: Error-correction codes, probabilistic decoding. Abstract: We report the empirical performance of Gallager's low density parity check codes on Gaussian channels. We show that performance substantially better than that of standard convolutional and concatenated codes can be achieved; indeed the performance is almost as close to the Shannon limit as that of Turbo codes. ..."
Nonlinear communication channels with capacity above the linear Shannon limit
, 2012
"... We prove that, under certain conditions, the capacity of an optical communication channel with in-line, nonlinear filtering (regeneration) elements can be higher than the Shannon capacity for the corresponding linear Gaussian white noise channel. © 2012 Optical Society of America. OCIS codes: 060.233 ..."
"... The information capacity of a linear channel with additive white Gaussian noise (AWGN) is often called the Shannon limit, stressing the fact that it is the maximum error-free data rate achievable in such a channel [1] and in any channel suboptimal to AWGN. The optical fiber channel, characterized by additive ..."
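The Shannon limit invoked here is the AWGN channel capacity C = B log2(1 + S/N). A minimal sketch of the computation, with illustrative parameter values not taken from any of the papers above:

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + SNR) of a linear AWGN channel,
    in bits per second. SNR is the linear (not dB) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative example: 1 MHz of bandwidth at 20 dB SNR (linear SNR = 100)
snr = 10 ** (20 / 10)
c = awgn_capacity(1e6, snr)   # roughly 6.66 Mbit/s

# Per-channel-use form for a discrete-time real AWGN channel:
# C = 0.5 * log2(1 + SNR) bits per channel use.
```

In the low-rate limit this capacity formula yields the well-known minimum Eb/N0 of ln 2, about -1.59 dB, below which error-free communication over the AWGN channel is impossible.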
Implementation of near Shannon Limit error-correcting codes using reconfigurable hardware
 Proc. IEEE Symp. on Field-Prog. Cust. Comput. Mach.
, 2000
"... Abstract: Error-correcting codes (ECCs) are widely used in digital communications. Recently, new types of ECCs have been proposed which permit error-free data transmission over noisy channels at rates which approach the Shannon capacity. For wireless communication, these new codes allow more data to ..."
Cited by 9 (0 self)
"... Low-density parity-check codes (LDPCs), another ECC, also provide near-Shannon-limit error-correction ability. However, LDPCs use a decoding scheme which is much more amenable to hardware implementation. This paper will first present an overview of these coding schemes, then discuss the issues involved in building an LDPC decoder using ..."
Good Error-Correcting Codes based on Very Sparse Matrices
, 1999
"... We study two families of error-correcting codes defined in terms of very sparse matrices. "MN" (MacKay-Neal) codes are recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. ..."
Cited by 754 (23 self)
"... The decoding of both codes can be tackled with a practical sum-product algorithm. We prove that these codes are "very good," in that sequences of codes exist which, when optimally decoded, achieve information rates up to the Shannon limit. This result holds not only for the binary-symmetric channel ..."
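Sum-product decoding aside, membership in a Gallager or MN code reduces to a sparse parity check: a vector x is a codeword iff Hx = 0 over GF(2). A minimal sketch with a toy parity-check matrix (illustrative only, not an actual MacKay-Neal construction):

```python
# Toy sparse parity-check matrix H over GF(2); rows are checks,
# columns are code bits. Each check constrains three bits.
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, x):
    """Syndrome s = H x over GF(2); all-zero iff x satisfies every check."""
    return [sum(h * b for h, b in zip(row, x)) % 2 for row in H]

def is_codeword(H, x):
    return all(s == 0 for s in syndrome(H, x))

# The all-zero word is always a codeword of a linear code:
assert is_codeword(H, [0, 0, 0, 0, 0, 0])
```

The sum-product decoder exploits exactly this sparsity: each check row touches only a few bits, so belief-propagation messages between bit and check nodes are cheap to compute.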
On the Low-Rate Shannon Limit for Binary Intersymbol Interference Channels
 Transactions Letters
"... with finite intersymbol interference, we prove that reliable communication can be achieved if and only if E_b/N_0 > (log 2)/α_opt, for some constant α_opt that depends on the channel. To determine this constant, we consider the finite-state machine which represents the output sequences of the channel filter when driven by binary inputs. We then define α_opt as the maximum output power achieved by a simple cycle in this graph, and show that no other cycle or asymptotically long sequence can achieve an output power greater than this. We provide examples where the binary input constraint leads to a suboptimality, and other cases where binary signaling is just as effective as real signaling at very low signal-to-noise ratios. Index Terms: Information rates, intersymbol interference (ISI), magnetic recording, modulation coding. ..."