Asymptotically catastrophic convolutional codes

Abstract
The minimum distance growth rate of unmerged codewords in a convolutional code is shown to depend upon the minimum average weight per branch, w_0, in the encoder state diagram. An upper bound on w_0 is obtained for a large class of rate 1/2 codes which includes many of the best known classes of rate 1/2 codes. The bound is shown to be tight for short constraint length codes. A class of codes is defined to be asymptotically catastrophic if w_0 approaches zero for large constraint lengths. Several classes of rate 1/2 codes are shown to be asymptotically catastrophic. These include classes containing codes known to have large free distance. It is argued that the free distance alone is not a sufficient criterion to determine a code's performance with either Viterbi or sequential decoding. A code with a low distance growth rate will yield a high bit error probability and will not perform well with truncated Viterbi decoding.
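The quantity w_0 described above can be computed directly for short constraint lengths by enumerating the cycles of the encoder state diagram and taking the minimum average branch weight, excluding the zero-weight self-loop at the all-zero state. The sketch below (not from the paper; the generator polynomials and the brute-force cycle enumeration are illustrative choices that are only practical for small memory) does this for a rate 1/2 feedforward encoder:

```python
from fractions import Fraction

def state_diagram(g1, g2):
    """Branches of a rate-1/2 feedforward encoder.

    g1, g2 are tap lists over [u, s1, ..., sm], where u is the current
    input and s1 is the most recent stored bit.  Returns a dict mapping
    each state to its two outgoing branches (next_state, branch_weight).
    """
    m = len(g1) - 1
    mask = (1 << m) - 1
    edges = {}
    for state in range(1 << m):
        outs = []
        for u in (0, 1):
            bits = [u] + [(state >> i) & 1 for i in range(m)]
            v1 = sum(a & b for a, b in zip(g1, bits)) % 2
            v2 = sum(a & b for a, b in zip(g2, bits)) % 2
            nxt = ((state << 1) | u) & mask  # shift register update
            outs.append((nxt, v1 + v2))
        edges[state] = outs
    return edges

def w0(edges):
    """Minimum average weight per branch over all simple cycles of the
    state diagram, excluding the zero-weight self-loop at state 0."""
    best = None

    def dfs(start, node, wsum, length, visited):
        nonlocal best
        for nxt, w in edges[node]:
            if node == 0 and nxt == 0 and w == 0:
                continue  # skip the all-zero self-loop
            if nxt == start:
                mean = Fraction(wsum + w, length + 1)
                if best is None or mean < best:
                    best = mean
            elif nxt > start and nxt not in visited:
                dfs(start, nxt, wsum + w, length + 1, visited | {nxt})

    for s in edges:
        dfs(s, s, 0, 0, {s})
    return best

# Constraint length 3, generators (7, 5) octal: non-catastrophic
print(w0(state_diagram([1, 1, 1], [1, 0, 1])))  # → 1/2

# Generators (6, 3) octal share the factor 1+D: catastrophic, so w0 = 0
print(w0(state_diagram([1, 1, 0], [0, 1, 1])))  # → 0
```

A catastrophic encoder has a zero-weight cycle off the all-zero state, so w0 returns 0, matching the defining condition in the abstract; an asymptotically catastrophic class is one where this minimum tends to zero as the constraint length grows.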
