
6.2 Shannon Formula


In Section 2.4 the famous Boltzmann formula was given, which relates the entropy to the statistical weight P of the system:

$$S = k \ln P, \qquad P = \frac{N!}{N_1!\, N_2! \cdots N_M!}, \qquad (1)$$

where N_1, N_2, ..., N_M are the numbers of particles in states 1, 2, ..., M, and N is the total number of particles,

$$N = \sum_{i=1}^{M} N_i.$$

The probability W_i of occurrence of state i is equal to

$$W_i = \frac{N_i}{N}, \qquad i = 1, 2, \ldots, M, \qquad \sum_{i=1}^{M} W_i = 1.$$
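As a quick numerical illustration, the statistical weight P and the probabilities W_i can be computed directly; the occupancies below are invented for the example, not taken from the lecture:

```python
import math

# Toy system (occupancies invented for the example): N = 10 particles
# distributed over M = 3 states as N_1 = 5, N_2 = 3, N_3 = 2.
occupancies = [5, 3, 2]
N = sum(occupancies)

# Statistical weight P = N! / (N_1! * N_2! * ... * N_M!)
P = math.factorial(N)
for n_i in occupancies:
    P //= math.factorial(n_i)

# Probabilities W_i = N_i / N; they sum to 1.
W = [n_i / N for n_i in occupancies]

print(P)           # 2520 microstates
print(W, sum(W))   # [0.5, 0.3, 0.2] 1.0
```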

As already noted, information theory was created in the middle of the 20th century (1948), and with its advent the function introduced by Boltzmann in (1) experienced a rebirth. The American communications engineer Claude Shannon proposed to measure the amount of information I using the statistical formula of entropy, taking I = 1 as the unit of information:

$$I = \log_2 P. \qquad (2)$$

$$\log_2 2 = 1 \text{ bit}.$$

Exactly this much information is obtained by tossing a coin (P = 2).

A bit is a binary unit of information (from "binary digit"); it operates with two possibilities, yes or no, and numbers in the binary system are written as a sequence of zeros and ones.
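A minimal sketch of formula (2) in Python, assuming only equally likely outcomes:

```python
import math

# Formula (2) for P equally likely outcomes: I = log2(P).
def information_bits(P):
    return math.log2(P)

print(information_bits(2))  # coin toss: 1.0 bit
print(information_bits(8))  # choosing one of 8 symbols: 3.0 bits
```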

Using formulas (1) and (2), let us calculate the information contained in a single message consisting of N letters of a language whose alphabet contains M letters (here N_i is the number of occurrences of letter i in the message):

$$I = \log_2 P = \log_2 \frac{N!}{N_1!\, N_2! \cdots N_M!}. \qquad (3)$$

We transform this expression using the approximate Stirling formula

$$\ln N! \approx N \ln \frac{N}{e}. \qquad (4)$$
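A small check of how accurate approximation (4) is, using math.lgamma to evaluate ln N! without computing the huge factorial:

```python
import math

# Accuracy check for approximation (4): ln(N!) ~ N*ln(N/e).
# math.lgamma(N + 1) returns ln(N!) directly in floating point.
for N in (10, 100, 1000):
    exact = math.lgamma(N + 1)
    approx = N * math.log(N / math.e)
    print(N, exact, approx, exact - approx)  # error grows only like 0.5*ln(2*pi*N)
```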

Representing the factors N! and N_i! in formula (3) in the form (4), we get

$$I = \frac{1}{\ln 2} \left[ N \ln \frac{N}{e} - \sum_{i=1}^{M} N_i \ln \frac{N_i}{e} \right].$$

Expressing N_i through W_i N, the last expression takes the form

$$I = \frac{1}{\ln 2} \left[ N \ln \frac{N}{e} - \sum_{i=1}^{M} W_i N \ln \frac{W_i N}{e} \right] = \frac{1}{\ln 2} \left[ N \ln \frac{N}{e} - N \ln \frac{N}{e} \sum_{i=1}^{M} W_i - N \sum_{i=1}^{M} W_i \ln W_i \right] = -\frac{N}{\ln 2} \sum_{i=1}^{M} W_i \ln W_i,$$

since $\sum_{i} W_i = 1$.

Passing to the logarithm with base two ($\ln W_i = \ln 2 \cdot \log_2 W_i$), we obtain

$$I = -N \sum_{i=1}^{M} W_i \log_2 W_i. \qquad (5)$$

The last formula allows one to quantify the information I contained in a message of N letters of a language with M letters; one letter carries the information I_1 = I / N, i.e.

$$I_1 = -\sum_{i=1}^{M} W_i \log_2 W_i. \qquad (6)$$

The quantity (6) is called the Shannon entropy.
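A sketch tying formulas (3), (5) and (6) together on an invented message (the string below is illustrative, not from the lecture): the per-letter Shannon entropy times N approximates the exact log₂ P, and the gap is the Stirling error, which shrinks for long messages:

```python
import math
from collections import Counter

# An invented example message, not taken from the lecture.
message = "abracadabra"
N = len(message)
counts = Counter(message)               # N_i: occurrences of each letter
W = [n / N for n in counts.values()]    # W_i = N_i / N

# Shannon entropy (6): bits per letter.
I1 = -sum(w * math.log2(w) for w in W)
I_total = N * I1                        # formula (5)

# Exact value from formula (3): log2( N! / (N_1! * ... * N_M!) ),
# computed via lgamma to stay in floating point.
log2_P = math.lgamma(N + 1) / math.log(2)
for n in counts.values():
    log2_P -= math.lgamma(n + 1) / math.log(2)

# N*I1 approximates log2_P; the discrepancy is the Stirling error,
# which becomes negligible for long messages.
print(I1, I_total, log2_P)
```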

Recall that expression (1) for the thermodynamic entropy, after similar transformations, can be represented in a form analogous to (5):

$$S = -kN \sum_{i=1}^{M} W_i \ln W_i.$$
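Comparing this with formula (5) shows that the two entropies differ only by a constant factor:

$$S = k \ln 2 \cdot I,$$

so one bit of information corresponds to an entropy of $k \ln 2 \approx 0.96 \times 10^{-23}$ J/K.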

More on the relationship between thermodynamic and informational entropy is treated separately.

