
2.5.6 Channel Capacity and Shannon theory


How fast can we transmit information over a communication channel?

         Suppose a source sends r messages per second, and the entropy of a message is H bits per message. The information rate is R = rH bits/second.
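
As a quick numerical illustration (the message probabilities and the rate r below are invented for the example), H can be computed directly from the message distribution and multiplied by r:

    import math

    def entropy_bits(probs):
        # Shannon entropy in bits per message: H = -sum(p * log2(p))
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical source: four messages with these probabilities,
    # sent at r = 100 messages per second (illustrative values only).
    probs = [0.5, 0.25, 0.125, 0.125]
    r = 100
    H = entropy_bits(probs)   # 1.75 bits/message for this distribution
    R = r * H                 # information rate in bits/second
    print(f"H = {H:.2f} bits/message, R = {R:.1f} bits/second")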

One can intuitively reason that, for a given communication system, the number of errors per second will increase as the information rate increases. Surprisingly, however, this is not the case.

2.5.6.1 Shannon’s theorem:


·        A given communication system has a maximum rate of information C, known as the channel capacity.

·        If the information rate R is less than C, then one can achieve arbitrarily small error probabilities by using intelligent coding techniques (illustrated in the sketch below).

·        To achieve lower error probabilities, the encoder has to work on longer blocks of signal data. This entails longer delays and higher computational requirements.

Thus, if R ≤ C, then transmission may be accomplished without error in the presence of noise.

Unfortunately, Shannon’s theorem is not a constructive proof; it merely states that such a coding method exists. The proof therefore cannot be used to develop a coding method that reaches the channel capacity. The converse of the theorem is also true: if R > C, then errors cannot be avoided regardless of the coding technique used.
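
The trade-off described above can be seen even with a crude, decidedly non-optimal code. The sketch below simulates a simple repetition code over a binary symmetric channel with an assumed crossover probability of 0.1; repetition coding does not approach capacity (its rate falls as 1/n), but it does show the error probability dropping as the block length grows:

    import random

    def bsc(bit, p):
        # Binary symmetric channel: flip the bit with probability p.
        return bit ^ (random.random() < p)

    def send_with_repetition(bit, n, p):
        # Encode one bit as n copies, transmit over the BSC, majority-decode.
        received = [bsc(bit, p) for _ in range(n)]
        return int(sum(received) > n / 2)

    # Crossover probability and block lengths are illustrative choices.
    p = 0.1
    trials = 100_000
    for n in (1, 3, 5, 9, 15):
        errors = sum(send_with_repetition(0, n, p) != 0 for _ in range(trials))
        print(f"n = {n:2d}: error rate ~ {errors / trials:.5f}")

Capacity-approaching codes (for example turbo or LDPC codes) achieve the same error reduction without letting the rate fall toward zero.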

2.5.6.2 Shannon-Hartley theorem:


Consider a bandlimited Gaussian channel operating in the presence of additive Gaussian noise.
The Shannon-Hartley theorem states that the channel capacity is given by

        C = B log₂(1 + S/N)

where C is the capacity in bits per second, B is the bandwidth of the channel in hertz, and S/N is the signal-to-noise ratio.
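
For instance (with assumed, illustrative numbers), a voice-grade channel of about 3000 Hz bandwidth and a 30 dB signal-to-noise ratio gives a capacity near 30 kbits/second:

    import math

    def shannon_hartley(B_hz, snr_linear):
        # Channel capacity C = B * log2(1 + S/N) in bits per second.
        return B_hz * math.log2(1 + snr_linear)

    # Illustrative numbers: ~3000 Hz bandwidth, 30 dB SNR (assumed values).
    B = 3000.0
    snr = 10 ** (30.0 / 10)            # 30 dB -> S/N = 1000
    print(f"C = {shannon_hartley(B, snr):.0f} bits/second")   # ~29902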

We cannot prove the theorem, but can partially justify it as follows. Suppose the received signal is accompanied by noise with an RMS voltage of σ, and that the signal has been quantized with levels separated by a = λσ. If λ is chosen sufficiently large, we may expect to be able to recognize the signal level with an acceptable probability of error. Suppose further that each message is to be represented by one voltage level. If there are to be M possible messages, then there must be M levels. The average signal power is then

        S = ((M² − 1)/12)(λσ)²

The number of levels for a given average signal power is therefore

        M = √(1 + 12S/(λ²N))

where N = σ² is the noise power. If each message is equally likely, then each carries an equal amount of information

        H = log₂ M = ½ log₂(1 + 12S/(λ²N)) bits.

Sending these messages at the Nyquist rate of 2B messages per second over a channel of bandwidth B then gives an information rate of R = 2BH = B log₂(1 + 12S/(λ²N)) bits/second, which has the same form as the Shannon-Hartley capacity.
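
The sketch below (all values assumed for illustration) evaluates the rate implied by this level-counting argument and compares it with the Shannon-Hartley formula; note that for λ² = 12 the two expressions coincide exactly:

    import math

    def capacity_from_levels(B, snr, lam):
        # M = sqrt(1 + 12*S/(lam^2 * N)) levels, H = log2(M) bits/message,
        # sent at the Nyquist rate of 2B messages per second.
        M = math.sqrt(1 + 12 * snr / lam**2)
        return 2 * B * math.log2(M)

    B, snr = 3000.0, 1000.0            # assumed bandwidth (Hz) and S/N
    for lam in (2.0, math.sqrt(12), 6.0):
        print(f"lambda = {lam:.2f}: R ~ {capacity_from_levels(B, snr, lam):.0f} bits/s")

    # For comparison, the Shannon-Hartley capacity:
    print(f"Shannon-Hartley: C = {B * math.log2(1 + snr):.0f} bits/s")

A larger λ means more reliable detection of each level but fewer distinguishable levels, and hence a lower rate.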
