
The bandwidth of an information transmission channel in the absence of noise

Lecture




A source of information transmits messages to a consumer over a communication channel (wired, atmospheric, radio, optical). The rate at which the source generates messages must be matched to the characteristics of the transmission channel. If messages are generated more slowly than the channel can pass them, the channel is underutilized. If they are generated faster, some of the information may be lost. The optimal matching between the message source and the communication channel is reached when the rate at which messages are created equals the channel capacity.

We introduce the quantity H = J/n, the amount of information created per sample. Dividing both sides by the quantization interval Δt gives the rate of information creation:

J' = H/Δt = J/(n·Δt) = J/T.

Since Δt = 1/(2·f_max), we obtain J' = H/Δt = 2·f_max·H.

If the channel transmits N(T) messages over the time T, and we assume all messages are equiprobable, then the amount of information is J_k = log2 N(T), and the transmission rate is J_k' = log2 N(T) / T [bit/s]. The maximum value of the information transmission rate over a given communication channel is called the capacity of that channel:

C = lim (T→∞) log2 N_max(T) / T.

The task of optimal coding is to make the rate at which the source creates information equal to the capacity of the communication channel; if J_k' equals the source's rate of information creation, the matching is optimal. Since J_k' = H/Δt = n·H/T, the average number of symbols transmitted per unit of time is n/T = J_k'/H. This symbol rate is optimal when

n/T = C/H, (*)

which is the optimal coding formula (the rate at which messages are created equals the transmission rate of the channel). However, (*) is not exactly realizable; in practice only an approximation to it is achieved, n/T = C/H − ε, where ε > 0.
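The relations above can be checked numerically. The sketch below uses hypothetical values (f_max = 250 Hz and H = 2 bits per sample are not from the lecture) to show that J' = 2·f_max·H follows from sampling at the Nyquist interval Δt = 1/(2·f_max):

```python
def source_rate(f_max, H):
    # J' = H / dt with dt = 1 / (2 * f_max), i.e. J' = 2 * f_max * H:
    # the rate of information creation, in bit/s.
    return 2 * f_max * H

# Hypothetical source: bandwidth f_max = 250 Hz, H = 2 bits per sample.
J_prime = source_rate(250, 2.0)  # 1000.0 bit/s
```

A channel of capacity C = J' bit/s would then be exactly matched to this source, which is the condition (*) that real codes can only approach to within ε.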

Example: messages h_1, h_2, h_3, h_4 with probabilities p_1 = 0.4, p_2 = 0.3, p_3 = 0.2, p_4 = 0.1.

H = −∑ (i = 1..4) p_i · log2 p_i ≈ 1.846 bits.

The source feeds a channel with a capacity of C = 1000 bit/s, so C/H ≈ 541 messages per second can be transmitted over the channel. Since each message is encoded with two binary symbols, such a code transmits messages at a rate of 1000/2 = 500 messages per second. Hence ε ≈ 541 − 500 = 41 messages per second.
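The example can be reproduced directly from the formulas above (entropy of the four message probabilities, then the optimal and the actual message rates for the C = 1000 bit/s channel):

```python
import math

probs = [0.4, 0.3, 0.2, 0.1]

# Source entropy H = -sum(p_i * log2 p_i), bits per message.
H = -sum(p * math.log2(p) for p in probs)   # ~1.846 bits

C = 1000                 # channel capacity, bit/s
optimal_rate = C / H     # optimal rate n/T = C/H, ~541 messages/s
fixed_rate = C / 2       # two binary symbols per message -> 500 messages/s
eps = optimal_rate - fixed_rate  # shortfall of the fixed-length code, ~41 messages/s
```

The fixed two-symbol code falls short of the optimum by ε ≈ 41 messages per second; a variable-length code (shorter codewords for the more probable messages) would narrow this gap.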



Information and Coding Theory
