The Maximum Data Rate of a Channel

In 1924, an AT&T engineer, Henry Nyquist, realized that even a perfect channel has a finite transmission capacity. He derived an equation expressing the maximum data rate for a finite-bandwidth noiseless channel. Nyquist proved that if an arbitrary signal has been run through a low-pass filter of bandwidth B, the filtered signal can be completely reconstructed by taking only 2B (exact) samples per second. This result, known as the sampling theorem, underlies the conversion of an analog signal (a continuous function) into a digital signal (a discrete function). Sampling the line faster than 2B times per second is pointless because the higher-frequency components that such sampling could recover have already been filtered out. If the signal consists of V discrete levels, Nyquist's theorem states:
            maximum data rate = 2B log2 V  bits/sec
in which 2B is the symbol rate (baud rate) and log2 V is the number of bits carried by each symbol.
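As a quick check of the arithmetic, here is a minimal Python sketch of Nyquist's formula; the 3000-Hz bandwidth and two-level signal in the example are illustrative values assumed here, not figures from the text above.

```python
import math

def nyquist_max_rate(bandwidth_hz: float, levels: int) -> float:
    """Maximum data rate of a noiseless channel (Nyquist):
    2B log2(V) bits/sec, where B is the bandwidth in Hz and
    V is the number of discrete signal levels."""
    return 2 * bandwidth_hz * math.log2(levels)

# Assumed example: a 3000-Hz channel carrying a binary (two-level)
# signal cannot transmit faster than 2 * 3000 * log2(2) = 6000 bits/sec.
print(nyquist_max_rate(3000, 2))  # 6000.0
```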

      So far we have considered only noiseless channels. If random noise is present, the situation deteriorates rapidly, and there is always some random (thermal) noise present due to the motion of the molecules in the system. The amount of thermal noise is measured by the ratio of the signal power to the noise power, denoted S/N. Shannon's major result (presented in what is arguably the most important paper in all of information theory) is that the maximum data rate, or capacity, of a noisy channel whose bandwidth is B Hz is given by:

            maximum data rate = B log2 (1 + S/N)  bits/sec
This places an upper bound on the data rate that any real channel can achieve. Shannon's result was derived from information-theory arguments and applies to any channel subject to thermal noise. Counterexamples, that is, claims of exceeding this bound, should be treated in the same category as perpetual motion machines.
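For concreteness, here is a small Python sketch of Shannon's formula. It assumes the common convention that S/N is quoted in decibels and must first be converted to a linear ratio; the 3000-Hz, 30-dB example values are assumptions for illustration, not figures from the text above.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Capacity of a noisy channel (Shannon): B log2(1 + S/N) bits/sec,
    where S/N is the linear (not decibel) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def db_to_linear(db: float) -> float:
    """Convert a signal-to-noise ratio from decibels to a linear ratio."""
    return 10 ** (db / 10)

# Assumed example: a 3000-Hz channel with a 30-dB signal-to-noise ratio
# (S/N = 1000) can never carry more than about 29,902 bits/sec, no matter
# how many signal levels are used or how often samples are taken.
print(shannon_capacity(3000, db_to_linear(30)))  # ~29901.7
```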
