Results 11–20 of 20
"Linear codes based lossless joint source-channel coding for multiple-access channels," submitted to IEEE Trans. Inf. Theory, draft available at http://arxiv.org/abs/cs.IT/0611146
"... ..."
(Show Context)
Invariance Properties of Binary Linear Codes Over a Memoryless Channel With Discrete Input
"... Abstract—This work studies certain properties of the probability density function (pdf) of the bit loglikelihood ratio (LLR) for binary linear block codes over a memoryless channel with discrete input and discrete or continuous output. We prove that under a set of mild conditions, the pdf of the bi ..."
Abstract

Cited by 1 (0 self)
Abstract—This work studies certain properties of the probability density function (pdf) of the bit log-likelihood ratio (LLR) for binary linear block codes over a memoryless channel with discrete input and discrete or continuous output. We prove that under a set of mild conditions, the pdf of the bit LLR of a specific bit position is independent of the transmitted codeword. It is also shown that the pdfs of a given bit LLR when the corresponding bit takes the values of zero and one are symmetric with respect to each other (reflections of one another with respect to the vertical axis). For the case of channels with binary input, a sufficient condition for two bit positions to have the same pdf is presented. Index Terms—Bit decoding, block codes, geometrically uniform, log-likelihood ratio (LLR), probability density function (pdf), regular channel, symmetric channel.
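The reflection symmetry claimed in this abstract can be illustrated on a toy channel. The sketch below is not from the paper: it uses a binary symmetric channel with an arbitrarily chosen crossover probability p and shows that the LLR distribution under a transmitted one is the mirror image (about the vertical axis) of the distribution under a transmitted zero.

```python
import math

# Binary symmetric channel with crossover probability p (illustrative choice).
p = 0.2

def llr_pmf(transmitted_bit):
    """Probability mass function of the bit LLR, L(y) = log P(y|x=0)/P(y|x=1),
    when `transmitted_bit` is sent over a BSC(p)."""
    l = math.log((1 - p) / p)       # LLR value observed when y == 0
    if transmitted_bit == 0:
        return {+l: 1 - p, -l: p}   # y == x with probability 1 - p
    else:
        return {+l: p, -l: 1 - p}

pmf0, pmf1 = llr_pmf(0), llr_pmf(1)

# The pmf under x=1 is the reflection of the pmf under x=0 about the origin.
for value, prob in pmf0.items():
    assert math.isclose(pmf1[-value], prob)
```

The same mirrored relationship is what the paper establishes, under its stated conditions, for bit LLRs of general binary linear block codes rather than a single uncoded bit.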
Good Error-Correcting Codes Based on Very Sparse Matrices
"... Abstract We study two families of errorcorrecting codes defined in terms of very sparse matrices. `MN ' (MacKayNeal) codes are recently invented, and `Gallager codes ' were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The ..."
Abstract
 Add to MetaCart
Abstract—We study two families of error-correcting codes defined in terms of very sparse matrices. 'MN' (MacKay–Neal) codes are recently invented, and 'Gallager codes' were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The decoding of both codes can be tackled with a practical sum-product algorithm. We prove that these codes are 'very good', in that sequences of codes exist which, when optimally decoded, achieve information rates up to the Shannon limit. This result holds not only for the binary symmetric channel but also for any channel with symmetric stationary ergodic noise. We give experimental results for binary symmetric channels and Gaussian channels demonstrating that practical performance substantially better than that of standard convolutional and concatenated codes can be achieved; indeed, the performance of Gallager codes is almost as close to the Shannon limit as that of Turbo codes. Keywords—Error-correction codes, iterative probabilistic decoding, Shannon limit, low-complexity decoding.
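The sum-product decoding mentioned in this abstract can be sketched on a toy code. The following is our own minimal illustration, not MacKay and Neal's implementation: a log-domain sum-product ("belief propagation") decoder run on the parity-check matrix of the (7,4) Hamming code, a small stand-in for the large sparse matrices Gallager and MN codes actually use, correcting a single bit error from a binary symmetric channel.

```python
import math

# Parity-check matrix of the (7,4) Hamming code.
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def sum_product_decode(llr, H, iters=10):
    """Log-domain sum-product decoding; llr[i] = log P(y_i|x_i=0)/P(y_i|x_i=1)."""
    m, n = len(H), len(H[0])
    # Messages from variable i to check j, initialized to channel LLRs.
    v2c = {(j, i): llr[i] for j in range(m) for i in range(n) if H[j][i]}
    for _ in range(iters):
        # Check-to-variable update (tanh rule).
        c2v = {}
        for j in range(m):
            vars_j = [i for i in range(n) if H[j][i]]
            for i in vars_j:
                prod = 1.0
                for k in vars_j:
                    if k != i:
                        prod *= math.tanh(v2c[(j, k)] / 2)
                c2v[(j, i)] = 2 * math.atanh(prod)
        # Posterior LLRs and tentative hard decision.
        total = [llr[i] + sum(c2v[(j, i)] for j in range(m) if H[j][i])
                 for i in range(n)]
        bits = [0 if t >= 0 else 1 for t in total]
        if all(sum(H[j][i] * bits[i] for i in range(n)) % 2 == 0 for j in range(m)):
            return bits  # all parity checks satisfied
        # Variable-to-check update: total belief minus the incoming message.
        for (j, i) in v2c:
            v2c[(j, i)] = total[i] - c2v[(j, i)]
    return bits

# All-zeros codeword sent over a BSC with crossover 0.1; bit 0 arrives flipped.
p = 0.1
received = [1, 0, 0, 0, 0, 0, 0]
channel_llr = [math.log((1 - p) / p) * (1 - 2 * y) for y in received]
print(sum_product_decode(channel_llr, H))  # the single error is corrected
```

On sparse matrices the per-iteration cost grows only with the number of ones in H, which is what makes this algorithm practical for the long Gallager and MN codes the paper studies.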
An Exposition of a Result in "Conjugate Codes for Secure and Reliable Information Transmission", 2010
"... ar ..."
(Show Context)
Secure Multiplex Coding with Dependent and Non-Uniform Multiple Messages
"... ar ..."
(Show Context)