Shannon’s coding theorem for noisy channels states that it is possible
to communicate information, with arbitrarily small error, at any
rate of transmission less than the channel capacity. The attainable
probability of error has previously been bounded as a function of capacity,
transmission rate, and delay. This investigation considers the
behavior of a new parameter, the average number of decoding computations.
A convolutional encoding and sequential decoding procedure
is proposed for the particular case of the binary symmetric channel.
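For reference, the capacity referred to above has a standard closed form for the binary symmetric channel with crossover probability $p$:

$$C \;=\; 1 - H(p) \;=\; 1 + p\log_2 p + (1-p)\log_2(1-p) \quad \text{bits per channel digit},$$

so that, for example, $p = 0.05$ gives $C \approx 0.714$.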
With this procedure, the average number of decoding computations per
information digit can be constrained to grow less rapidly than the
square of the delay. The decoding process converges for constant
rates of transmission that are not too close to capacity. Although it
has not been possible to prove this rigorously, it appears that the
probability of error decreases exponentially with delay, and is essentially
optimum for transmission rates near the limit of convergence.
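In later standard notation (not used in the abstract itself), such an exponential claim would be written schematically as

$$P_e \;\lesssim\; e^{-n\,E(R)},$$

where $n$ is the decoding delay and the exponent $E(R)$ is positive for transmission rates $R$ below the limit of convergence.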
It also appears that the sequential decoding technique can be extended
to more general channels.
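As an illustration of the general technique (not the specific procedure proposed in the paper), the following Python sketch encodes with a short rate-1/2 convolutional code, passes the channel digits through a binary symmetric channel, and decodes by a best-first sequential search over the code tree. The generator taps, the crossover probability, the stack-style search order, and the Fano-style metric bias are all assumptions made for the example.

import heapq
import math
import random

G = (0b111, 0b101)   # assumed rate-1/2 generator taps (constraint length 3)
K = 3                # constraint length; the encoder keeps K - 1 memory digits
P = 0.05             # assumed crossover probability of the channel
RATE = 1 / len(G)    # information digits per channel digit

def encode(bits):
    """Shift-register convolutional encoder: len(G) channel digits per input digit."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        out.extend(bin(state & g).count("1") % 2 for g in G)
    return out

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each digit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

# Fano-style per-digit branch metric: log-likelihood relative to the
# unconditional digit probability (1/2), minus a bias equal to the rate.
MATCH = math.log2((1 - P) / 0.5) - RATE
MISMATCH = math.log2(P / 0.5) - RATE

def decode(received, n_info):
    """Best-first (stack) sequential search over the code tree."""
    stack = [(0.0, [])]            # entries are (-metric, hypothesized digits)
    while stack:
        neg_metric, bits = heapq.heappop(stack)   # extend the best path so far
        if len(bits) == n_info:
            return bits
        for b in (0, 1):
            cand = bits + [b]
            new = encode(cand)[len(G) * len(bits):]           # new channel digits
            rx = received[len(G) * len(bits):len(G) * len(cand)]
            gain = sum(MATCH if s == r else MISMATCH for s, r in zip(new, rx))
            heapq.heappush(stack, (neg_metric - gain, cand))
    return []

rng = random.Random(1)
msg = [rng.randint(0, 1) for _ in range(32)] + [0] * (K - 1)  # zero tail digits
received = bsc(encode(msg), P, rng)
print("decoded correctly:", decode(received, len(msg)) == msg)

The point of the sketch is only that the decoding effort per information digit is governed by a search that extends one promising path at a time, rather than by an exhaustive comparison against every codeword.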