Error bounds for convolutional codes and an asymptotically optimum decoding algorithm
- 1 April 1967
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 13 (2), 260-269
- https://doi.org/10.1109/tit.1967.1054010
Abstract
The probability of error in decoding an optimal convolutional code transmitted over a memoryless channel is bounded from above and below as a function of the constraint length of the code. For all but pathological channels the bounds are asymptotically (exponentially) tight for rates above R_0, the computational cutoff rate of sequential decoding. As a function of constraint length the performance of optimal convolutional codes is shown to be superior to that of block codes of the same length, the relative improvement increasing with rate. The upper bound is obtained for a specific probabilistic nonsequential decoding algorithm which is shown to be asymptotically optimum for rates above R_0 and whose performance bears certain similarities to that of sequential decoding algorithms.
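The nonsequential decoding algorithm introduced in this paper is what is now known as the Viterbi algorithm: a maximum-likelihood search over the code trellis that keeps only the best-metric survivor path into each encoder state. The sketch below is a minimal hard-decision illustration, not the paper's presentation; the rate-1/2, constraint-length-3 code with octal generators (7, 5) is a standard textbook example chosen for brevity.

```python
# Minimal Viterbi decoder sketch for a rate-1/2 convolutional code.
# Assumptions (not from the paper): generators (7, 5) octal, constraint
# length K = 3, binary symmetric channel with Hamming-distance metric.

G = [0b111, 0b101]        # generator polynomials
K = 3                     # constraint length
N_STATES = 1 << (K - 1)   # 4 encoder states

def encode_bit(state, bit):
    """Return (output symbol pair, next state) for one input bit."""
    reg = (bit << (K - 1)) | state            # newest bit at the top of the register
    out = tuple(bin(reg & g).count("1") % 2 for g in G)
    return out, reg >> 1                      # next state drops the oldest bit

def encode(bits):
    """Encode an input bit sequence; returns the channel symbol sequence."""
    state, out = 0, []
    for b in bits:
        sym, state = encode_bit(state, b)
        out.extend(sym)
    return out

def viterbi_decode(received):
    """Hard-decision Viterbi decoding: keep one survivor path per state."""
    INF = float("inf")
    metric = [0] + [INF] * (N_STATES - 1)     # encoder starts in the all-zero state
    paths = [[] for _ in range(N_STATES)]
    for i in range(0, len(received), 2):
        sym = received[i:i + 2]
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for state in range(N_STATES):
            if metric[state] == INF:
                continue
            for bit in (0, 1):
                out, nxt = encode_bit(state, bit)
                # branch metric: Hamming distance to the received symbol pair
                m = metric[state] + sum(a != b for a, b in zip(out, sym))
                if m < new_metric[nxt]:       # keep only the best path into each state
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[state] + [bit]
        metric, paths = new_metric, new_paths
    best = min(range(N_STATES), key=lambda s: metric[s])
    return paths[best]
```

Because this code has free distance 5, the decoder corrects any single channel error; storing full paths per state (rather than the usual traceback over a truncated window) keeps the sketch short at the cost of memory.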