### Citations

534 |
Limit Distributions for Sums of Independent Random Variables (Translated from the Russian by Kai Lai Chung)
- Gnedenko, Kolmogorov
- 1968
Citation Context ...ever, the theorem is true regardless of the minimal character of the exponent. Although it is not necessary in the proof, it can be shown, using "tilted" probabilities [4] and a central limit theorem [7], that asymptotically for large n, N1[(n/k)μ′(s)] ∼ [2π s² n μ″(s)]^(−1/2) exp{(n/k)[μ(s) − s μ′(s)] + (k − 1) ln 2} (2.15). Theorem 2.3 can now be used to find the probability P(ℓ) of the set o... |

137 |
On a class of error correcting binary group codes.
- Bose, Ray-Chaudhuri
- 1960
Citation Context ... codes [3] with sequential decoding as developed by Wozencraft [17], Fano [5], and Reiffen [14]; second, convolutional codes with Massey's threshold decoding [10]; and third, the Bose-Chaudhuri codes [2] with the decoding schemes developed by Peterson [12] and Zierler and Gorenstein [18]. It has been shown by Fano [5] that for arbitrary discrete memoryless channels, sequential decoding has a probabil... |

99 |
A heuristic discussion of probabilistic decoding
- Fano
- 1963
Citation Context ...promising for achieving low error probabilities and high data rates at reasonable cost are the following: first, convolutional codes [3] with sequential decoding as developed by Wozencraft [17], Fano [5], and Reiffen [14]; second, convolutional codes with Massey's threshold decoding [10]; and third, the Bose-Chaudhuri codes [2] with the decoding schemes developed by Peterson [12] and Zierler and Gore... |

88 |
A comparison of signalling alphabets
- Gilbert
- 1952
Citation Context ...ep function with the step at that δ0 < 1/2 for which H(δ0) = (1 − R) ln 2. Figure 2.4 plots δ0 as a function of rate. This result is closely related to the Gilbert bound on minimum distance [6]. The asymptotic form of the Gilbert bound for large n states that there exists a code for which D ≥ nδ0. Theorem 2.2 states that for any ε > 0, the probability of the set of parity-check codes fo... |
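The relation H(δ0) = (1 − R) ln 2 quoted above pins down the Gilbert distance ratio δ0 at each rate. A minimal numerical sketch (the function names are ours, not from the cited text), using the binary entropy function in nats and bisection on [0, 1/2]:

```python
import math

def binary_entropy_nats(d):
    # H(d) = -d ln d - (1 - d) ln(1 - d), in nats; H(0) = H(1) = 0 by convention
    if d in (0.0, 1.0):
        return 0.0
    return -d * math.log(d) - (1 - d) * math.log(1 - d)

def gilbert_delta0(R, tol=1e-12):
    # Solve H(delta0) = (1 - R) ln 2 for delta0 in (0, 1/2) by bisection;
    # H is strictly increasing on [0, 1/2], so the root is unique.
    target = (1 - R) * math.log(2)
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binary_entropy_nats(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# At rate R = 1/2 the step occurs near delta0 ≈ 0.110
print(round(gilbert_delta0(0.5), 3))
```

Bisection is enough here because only monotonicity of H on [0, 1/2] is needed, not derivatives.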

77 |
Threshold decoding
- Massey
- 1963
Citation Context ...st are the following: first, convolutional codes [3] with sequential decoding as developed by Wozencraft [17], Fano [5], and Reiffen [14]; second, convolutional codes with Massey's threshold decoding [10]; and third, the Bose-Chaudhuri codes [2] with the decoding schemes developed by Peterson [12] and Zierler and Gorenstein [18]. It has been shown by Fano [5] that for arbitrary discrete memoryless cha... |

51 |
Coding for two noisy channels
- Elias
- 1955
Citation Context ...coder stores a waveform or codeword for each possible block of λ binits, then the storage requirement must be proportional to 2^λ, which is obviously impractical when λ is large. Fortunately, Elias [3] and Reiffen [14] have proved that for a wide variety of channel models, the results of the Noisy Channel Coding theorem can be achieved with little equipment complexity at the encoder by the use of p... |

48 |
Certain results in coding theory for noisy channels
- Shannon
- 1957
Citation Context ...igh error probabilities. A more fundamental approach to the problems of efficiency and reliability in communication systems is contained in the Noisy Channel Coding theorem developed by C. E. Shannon [15, 4] in 1948. In order to understand the meaning of this theorem, consider Figure 1.1. The source produces binary digits, or binits, at some fixed time rate Rt. The encoder is a device that performs data ... |

18 |
Error-Correcting Codes (M.I.T. Press)
- Peterson
- 1961
Citation Context ...y, then the probability of decoding error will be almost as small as that for the best possible code of that rate and block length. For a more detailed discussion of parity-check codes, see Peterson [12]. (The modulo-2 sum is 1 if the ordinary sum is odd and 0 if the ordinary sum is even.) Unfortunately, the decoding of parity-check codes is not inherently simple to implement; thus we must look for ... |
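The footnote's definition of the modulo-2 sum (1 if the ordinary sum is odd, 0 if it is even) is exactly XOR accumulation, the elementary operation behind every parity check. A toy sketch (the bit values and positions are hypothetical, not from Peterson's text):

```python
def mod2_sum(bits):
    # Modulo-2 sum over GF(2): XOR-accumulating gives 1 exactly when
    # the ordinary integer sum of the bits is odd.
    total = 0
    for b in bits:
        total ^= b
    return total

codeword = [1, 0, 1, 1, 0, 1]   # hypothetical received word
# Parity check on positions 0, 2, 3: ordinary sum is 3 (odd), so the
# modulo-2 sum is 1 and this check is violated.
print(mod2_sum(codeword[i] for i in (0, 2, 3)))
```

A parity-check code simply requires a collection of such sums, one per check equation, to all equal 0.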

15 |
Coding for two-way channels
- Wozencraft, Horstein
- 1961
Citation Context ...ursts, it is often impractical to correct errors in these noisy periods. In such cases it is advantageous to use a combination of error correction and error detection with feedback and retransmission [16]. All of the coding and decoding schemes being considered here fit naturally into such a system, but in cases where little or no error correction is attempted, low-density codes appear at a disadvanta... |

10 |
Transmission of Information (M.I.T. Press)
- Fano
- 1961
Citation Context ...exponent in Equation (2.14) equal to 0, we get ℓ = (n/k)μ′(s), and when we substitute this value of ℓ in Equation (2.14), Equation (2.8) results, thereby proving the theorem. It is shown in Reference [4] that setting ℓ = (n/k)μ′(s) actually minimizes the exponent, thereby yielding the best bound; however, the theorem is true regardless of the minimal character of the exponent. Although it is not nece... |

8 |
A class of cyclic linear error-correcting codes in p^m symbols
- Gorenstein, Zierler
- 1961
Citation Context ...eiffen [14]; second, convolutional codes with Massey's threshold decoding [10]; and third, the Bose-Chaudhuri codes [2] with the decoding schemes developed by Peterson [12] and Zierler and Gorenstein [18]. It has been shown by Fano [5] that for arbitrary discrete memoryless channels, sequential decoding has a probability of decoding error that is upper bounded by a function of the form e^(−αn). He... |

4 |
The resolution of signals in white, Gaussian noise
- Helstrom
- 1955
Citation Context ... a sample of white Gaussian noise of power density N0 per unit bandwidth that is added to the signal at the receiver. Then the log-likelihood ratio y computed by an ideal receiver can easily be shown [8] to be y = (2/N0) ∫0^T [x0(t) − x1(t)] r(t) dt, where r(t) is the received waveform. When x = 0 is the transmitted digit, then [figure: P(e) versus E/N0 in dB] ... |
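The receiver statistic quoted above, y = (2/N0) ∫0^T [x0(t) − x1(t)] r(t) dt, is a correlation that can be sanity-checked on sampled waveforms. A sketch under assumed antipodal signalling (the sampling grid, signal shape, and parameter values are our illustration, not from Helstrom):

```python
import math

def log_likelihood_ratio(x0, x1, r, N0, dt):
    # Discretized y = (2/N0) * integral of [x0(t) - x1(t)] * r(t) dt,
    # with the waveforms given as equally spaced samples of spacing dt.
    return (2.0 / N0) * sum((a - b) * rv for a, b, rv in zip(x0, x1, r)) * dt

# Hypothetical antipodal pair x1 = -x0, and noiseless reception of x0:
n, dt, N0 = 100, 0.01, 1.0
x0 = [math.sin(2 * math.pi * 5 * k * dt) for k in range(n)]
x1 = [-v for v in x0]
y = log_likelihood_ratio(x0, x1, x0, N0, dt)
print(y > 0)   # positive y favours the hypothesis that x0 was sent
```

With noise added to r, y stays Gaussian about the same mean, which is what makes this statistic sufficient for the binary decision.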

4 |
Sequential Decoding (M.I.T. Press)
- Wozencraft, Reiffen
- 1961
Citation Context ... extremely promising for achieving low error probabilities and high data rates at reasonable cost are the following: first, convolutional codes [3] with sequential decoding as developed by Wozencraft [17], Fano [5], and Reiffen [14]; second, convolutional codes with Massey's threshold decoding [10]; and third, the Bose-Chaudhuri codes [2] with the decoding schemes developed by Peterson [12] and Zierle... |

2 |
Improvement of binary transmission by null-zone reception
- Bloom, et al.
- 1957
Citation Context ...words to correct the errors. This intermediate decision, however, destroys a considerable amount of information about the transmitted message, as discussed in detail for several channels in Reference [1]. The decoding scheme to be described here avoids this intermediate decision and operates directly with the a posteriori probabilities of the input symbols conditional on the corresponding received sy... |

2 |
Application of Sequential Decoding to High Rate Data Communication on a Telephone Line
- Lebow, McHugh, et al.
- 1963
Citation Context ..., Lexington, Massachusetts [11]. By using this decoder in a system with a feedback link and an appropriately designed modulator and demodulator, reliable transmission has been achieved experimentally [9] over a telephone circuit at about 7500 bits per second rather than the 1200 or 2400 bits per second possible without coding. The two principal weaknesses of sequential decoding are as follows: First,... |

2 |
SECO: A Self Regulating Error Correcting Coder-Decoder
- Perry, Wozencraft
- 1962
Citation Context ...mount of computation in decoding a digit is bounded by a quantity independent of constraint length. An experimental sequential decoder has been built at Lincoln Laboratories, Lexington, Massachusetts [11]. By using this decoder in a system with a feedback link and an appropriately designed modulator and demodulator, reliable transmission has been achieved experimentally [9] over a telephone circuit at... |

2 |
Sequential Decoding for Discrete Input Memoryless Channels
- Reiffen
- 1962
Citation Context ...aveform or codeword for each possible block of λ binits, then the storage requirement must be proportional to 2^λ, which is obviously impractical when λ is large. Fortunately, Elias [3] and Reiffen [14] have proved that for a wide variety of channel models, the results of the Noisy Channel Coding theorem can be achieved with little equipment complexity at the encoder by the use of parity-check codin... |

1 |
Theoretical diversity improvement in frequency shift keying
- Pierce
- 1958
Citation Context ...r and likelihood receiver; n = 504, R = 1/2. Matched to x0(t) and x1(t), z0 = |∫0^T x0*(t) r(t) dt| Ec^(−1/2) and z1 = |∫0^T x1*(t) r(t) dt| Ec^(−1/2). Pierce [13] shows that z0 and z1 are positive Rayleigh-distributed random variables with variance N0 + Ec and N0, depending on whether x0(t) or x1(t) was transmitted: Pr(zi) = [zi/(N0 + Ec)] exp[−zi²/(2(N0 + Ec))]... |
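The density quoted above, p(z) = [z/(N0 + Ec)] exp[−z²/(2(N0 + Ec))], is a Rayleigh law with scale parameter N0 + Ec. A quick numerical check that it is a proper density (the values of N0 and Ec below are hypothetical, chosen only for the check):

```python
import math

def rayleigh_pdf(z, sigma2):
    # p(z) = (z / sigma2) * exp(-z^2 / (2 * sigma2)) for z >= 0;
    # sigma2 plays the role of N0 + Ec (matched case) or N0 (mismatched case)
    return (z / sigma2) * math.exp(-z * z / (2.0 * sigma2))

def integrate(f, a, b, n=200000):
    # Simple trapezoidal rule; ample accuracy for checking normalization.
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

sigma2 = 1.0 + 2.0   # hypothetical N0 = 1, Ec = 2
mass = integrate(lambda z: rayleigh_pdf(z, sigma2), 0.0, 40.0)
print(abs(mass - 1.0) < 1e-6)   # the density integrates to 1
```

The same helper can evaluate error probabilities for the incoherent receiver by integrating one density against the tail of the other.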