
18.7. Entropy and information for systems with a continuous set of states




So far we have considered physical systems whose distinct states $x_1, x_2, \ldots, x_n$ could all be enumerated, the probabilities of these states being certain nonzero quantities $p_1, p_2, \ldots, p_n$. Such systems are analogous to discontinuous (discrete) random variables taking the values $x_1, \ldots, x_n$ with probabilities $p_1, \ldots, p_n$. In practice one often encounters physical systems of a different type, analogous to continuous random variables. The states of such systems cannot be enumerated: they pass continuously into one another, each individual state has probability zero, and the probability distribution is characterized by a density. Such systems, by analogy with continuous random variables, we shall call "continuous", in contrast to the previously considered systems, which we shall call "discrete". The simplest example of a continuous system is a system whose state is described by a single continuous random variable $X$ with distribution density $f(x)$. In more complex cases the state of the system is described by several random variables $X_1, X_2, \ldots, X_n$ with distribution density $f(x_1, x_2, \ldots, x_n)$; it can then be regarded as a union of $n$ simple systems $X_1, X_2, \ldots, X_n$.

Consider a simple system $X$ described by a single continuous random variable $X$ with distribution density $f(x)$ (Fig. 18.7.1). We shall try to extend to this system the concept of entropy introduced in §18.1.


Fig. 18.7.1.

First of all we note that the concept of a "continuous system", like that of a "continuous random variable", is a kind of idealization. For example, when we consider the height $X$ of a randomly chosen person as a continuous random variable, we ignore the fact that in practice nobody measures height more precisely than to 1 cm, and that it is practically impossible to distinguish two height values differing by, say, 1 mm. Nevertheless it is natural to describe this random variable as continuous, although it could also be described as discrete by regarding values that differ by less than 1 cm as coincident.

Similarly, having fixed a limit of measurement accuracy, i.e. some interval $\Delta x$ within which the states of the system $X$ are practically indistinguishable, we can approximately reduce the continuous system $X$ to a discrete one. This amounts to replacing the smooth curve $f(x)$ by a step-type histogram (Fig. 18.7.2); each interval (bin) of length $\Delta x$ is then replaced by a single representative point.


Fig. 18.7.2.

The areas of the rectangles represent the probabilities of falling into the corresponding bins: $p_i \approx f(x_i)\,\Delta x$. If we agree to regard as indistinguishable the states of the system belonging to one bin, merging them all into a single state, we can approximately determine the entropy of the system $X$, considered to within $\Delta x$:

$$H_{\Delta x}(X) = -\sum_i f(x_i)\,\Delta x\,\log\bigl(f(x_i)\,\Delta x\bigr)$$

$$= -\sum_i f(x_i)\log f(x_i)\,\Delta x - \log \Delta x \sum_i f(x_i)\,\Delta x. \quad (18.7.1)$$

For sufficiently small $\Delta x$:

$$\sum_i f(x_i)\log f(x_i)\,\Delta x \approx \int_{-\infty}^{\infty} f(x)\log f(x)\,dx,$$

$$\sum_i f(x_i)\,\Delta x \approx \int_{-\infty}^{\infty} f(x)\,dx = 1,$$

and the formula (18.7.1) takes the form:

$$H_{\Delta x}(X) = -\int_{-\infty}^{\infty} f(x)\log f(x)\,dx - \log \Delta x. \quad (18.7.2)$$

Note that in expression (18.7.2) the first term turns out to be entirely independent of $\Delta x$, the degree of accuracy with which the states of the system are determined. Only the second term, $-\log\Delta x$, depends on $\Delta x$, and it tends to infinity as $\Delta x \to 0$. This is natural: the more precisely we wish to specify the state of the system $X$, the greater the amount of uncertainty we must remove, and as $\Delta x$ decreases without bound this uncertainty grows without bound as well.
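The decomposition (18.7.2) is easy to observe numerically. The following sketch (a standard normal density is an assumption chosen purely for illustration) bins a continuous density with step $\Delta x$ and compares the entropy of the resulting discrete variable with $H^*(X) - \log_2 \Delta x$; all entropies are in binary units:

```python
import numpy as np

def binned_entropy(pdf, lo, hi, dx):
    """Entropy (binary units) of the discrete variable 'which bin',
    with bin probabilities p_i ~ f(x_i) * dx."""
    x = np.arange(lo, hi, dx) + dx / 2        # bin representative points
    p = pdf(x) * dx
    p = p[p > 0]
    p /= p.sum()                              # renormalize the truncated tails
    return -np.sum(p * np.log2(p))

sigma = 1.0
pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
h_star = np.log2(sigma * np.sqrt(2 * np.pi * np.e))  # reduced entropy H*(X)

for dx in (0.1, 0.01):
    h = binned_entropy(pdf, -8.0, 8.0, dx)
    # the first term of (18.7.2) stays fixed; -log2(dx) grows as dx -> 0
    print(dx, h, h_star - np.log2(dx))
```

Shrinking $\Delta x$ tenfold adds $\log_2 10 \approx 3.32$ binary units to the entropy while the reduced part stays the same, exactly as (18.7.2) predicts.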

Thus, by specifying an arbitrarily small "insensitivity interval" $\Delta x$ of the measuring instruments that determine the state of the physical system $X$, we can find the entropy $H_{\Delta x}(X)$ by formula (18.7.2), in which the second term grows without bound as $\Delta x$ decreases. The entropy $H_{\Delta x}(X)$ itself differs from this unboundedly growing term by a quantity independent of $\Delta x$:

$$H^*(X) = -\int_{-\infty}^{\infty} f(x)\log f(x)\,dx. \quad (18.7.3)$$

This quantity may be called the "reduced entropy" of the continuous system $X$. The entropy $H_{\Delta x}(X)$ is expressed through the reduced entropy $H^*(X)$ by the formula

$$H_{\Delta x}(X) = H^*(X) - \log \Delta x. \quad (18.7.4)$$

Relation (18.7.4) can be interpreted as follows: the measurement accuracy $\Delta x$ affects only the origin from which the entropy is reckoned.

In what follows, to simplify the notation, we shall omit the index $\Delta x$ in the designation of entropy and write simply $H(X)$; the presence of $\Delta x$ on the right-hand side will always indicate to what accuracy the entropy is taken.

Formula (18.7.2) for the entropy can be given a more compact form if, as we did for discontinuous quantities, we write it as the mathematical expectation of a function. First of all we rewrite (18.7.2) as

$$H(X) = -\int_{-\infty}^{\infty} f(x)\log\bigl(f(x)\,\Delta x\bigr)\,dx. \quad (18.7.5)$$

This is nothing other than the expectation of the function $-\log\bigl(f(X)\,\Delta x\bigr)$ of the random variable $X$ with density $f(x)$:

$$H(X) = M\bigl[-\log\bigl(f(X)\,\Delta x\bigr)\bigr]. \quad (18.7.6)$$

A similar form can be given to the reduced entropy $H^*(X)$:

$$H^*(X) = M\bigl[-\log f(X)\bigr]. \quad (18.7.7)$$

Let us proceed to the definition of conditional entropy. Suppose there are two continuous systems, $X$ and $Y$; in general they are dependent. Denote by $f(x, y)$ the distribution density of the states of the combined system $(X, Y)$; by $f_1(x)$ the distribution density of the system $X$; by $f_2(y)$ the distribution density of the system $Y$; and by $f(y \mid x)$ the conditional distribution density.

First we define the partial conditional entropy $H(Y \mid x)$, i.e. the entropy of the system $Y$ given that the system $X$ has taken a definite state $x$. The formula for it is analogous to (18.4.2), except that the conditional probabilities are replaced by the conditional distribution law $f(y \mid x)$ and a term $-\log\Delta y$ appears:

$$H(Y \mid x) = -\int_{-\infty}^{\infty} f(y \mid x)\log f(y \mid x)\,dy - \log\Delta y. \quad (18.7.8)$$

We now pass to the full (average) conditional entropy $H(Y \mid X)$; to obtain it we must average the partial conditional entropy $H(Y \mid x)$ over all states $x$, taking into account their probabilities, characterized by the density $f_1(x)$:

$$H(Y \mid X) = -\int_{-\infty}^{\infty} f_1(x)\left[\int_{-\infty}^{\infty} f(y \mid x)\log f(y \mid x)\,dy\right]dx - \log\Delta y \quad (18.7.9)$$

or, considering that

$$f(x, y) = f_1(x)\,f(y \mid x),$$

$$H(Y \mid X) = -\iint f(x, y)\log f(y \mid x)\,dx\,dy - \log\Delta y. \quad (18.7.10)$$

This formula can also be written in the form

$$H(Y \mid X) = M\bigl[-\log f(Y \mid X)\bigr] - \log\Delta y \quad (18.7.11)$$

or

$$H(Y \mid X) = M\bigl[-\log\bigl(f(Y \mid X)\,\Delta y\bigr)\bigr]. \quad (18.7.12)$$

Having thus determined the conditional entropy, let us show how it is used in determining the entropy of the combined system.

We first find the entropy of the combined system directly. If the "insensitivity intervals" for the systems $X$ and $Y$ are $\Delta x$ and $\Delta y$, then for the combined system $(X, Y)$ their role is played by the elementary rectangle $\Delta x\,\Delta y$. The entropy of the system $(X, Y)$ will be

$$H(X, Y) = M\bigl[-\log\bigl(f(X, Y)\,\Delta x\,\Delta y\bigr)\bigr]. \quad (18.7.13)$$

Since

$$f(x, y) = f_1(x)\,f(y \mid x),$$

we also have

$$f(X, Y) = f_1(X)\,f(Y \mid X). \quad (18.7.14)$$

Substituting (18.7.14) into (18.7.13), we get:

$$H(X, Y) = M\bigl[-\log\bigl(f_1(X)\,f(Y \mid X)\,\Delta x\,\Delta y\bigr)\bigr]$$

$$= M\bigl[-\log\bigl(f_1(X)\,\Delta x\bigr)\bigr] + M\bigl[-\log\bigl(f(Y \mid X)\,\Delta y\bigr)\bigr],$$

or, according to formulas (18.7.6) and (18.7.12),

$$H(X, Y) = H(X) + H(Y \mid X), \quad (18.7.15)$$

that is, the theorem on the entropy of a complex system also holds for continuous systems.
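Formula (18.7.15) can be checked numerically on any binned joint distribution; here is a minimal sketch with an arbitrary $2 \times 3$ probability matrix chosen purely for illustration (rows are states of $X$, columns states of $Y$):

```python
import numpy as np

# Check of (18.7.15) on a binned (discrete) joint distribution:
# H(X,Y) = H(X) + H(Y|X).
p = np.array([[0.10, 0.25, 0.15],
              [0.20, 0.05, 0.25]])

def entropy(q):
    """Shannon entropy in binary units of a probability vector."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

px = p.sum(axis=1)                       # marginal distribution of X
H_joint = entropy(p.ravel())
# full conditional entropy: row entropies averaged with the weights px
H_y_given_x = sum(px[i] * entropy(p[i] / px[i]) for i in range(len(px)))
print(H_joint, entropy(px) + H_y_given_x)
```

The two printed numbers coincide, whatever joint matrix is used, because the identity rests only on $f(x, y) = f_1(x)\,f(y \mid x)$.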

If $X$ and $Y$ are independent, then the entropy of the combined system equals the sum of the entropies of its parts:

$$H(X, Y) = H(X) + H(Y). \quad (18.7.16)$$

Example 1. Find the entropy of a continuous system $X$ all of whose states on some interval $(\alpha, \beta)$ are equally probable:

$$f(x) = \begin{cases} \dfrac{1}{\beta - \alpha}, & \alpha < x < \beta, \\ 0, & x < \alpha \ \text{or}\ x > \beta. \end{cases}$$

Solution.

$$H^*(X) = -\int_{\alpha}^{\beta} \frac{1}{\beta - \alpha}\log\frac{1}{\beta - \alpha}\,dx = \log(\beta - \alpha);$$

$$H(X) = \log(\beta - \alpha) - \log\Delta x$$

or

$$H(X) = \log\frac{\beta - \alpha}{\Delta x}. \quad (18.7.17)$$
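A quick numerical illustration of (18.7.17): if an interval of length $\beta - \alpha = 1$ is divided into bins of width $\Delta x = 1/256$, the system has 256 equally probable states, so the entropy is exactly $\log_2 256 = 8$ binary units:

```python
import numpy as np

# (18.7.17) for a uniform density: H = log2((beta - alpha)/dx).
# Splitting (alpha, beta) into equal bins of width dx gives
# n = (beta - alpha)/dx equiprobable states, hence H = log2(n).
alpha, beta, dx = 0.0, 1.0, 1 / 256
n = int((beta - alpha) / dx)
p = np.full(n, 1 / n)                  # each bin has probability f*dx = 1/n
H = -np.sum(p * np.log2(p))
print(H)                               # prints 8.0
```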

Example 2. Find the entropy of a system $X$ whose states are distributed according to the normal law:

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{x^2}{2\sigma^2}}.$$

Solution.

$$H^*(X) = M\bigl[-\log f(X)\bigr] = M\left[\log\bigl(\sigma\sqrt{2\pi}\bigr) + \frac{X^2}{2\sigma^2}\log e\right]$$

$$= \log\bigl(\sigma\sqrt{2\pi}\bigr) + \frac{\log e}{2\sigma^2}\,M\bigl[X^2\bigr].$$

But

$$M\bigl[X^2\bigr] = \sigma^2,$$

$$H^*(X) = \log\bigl(\sigma\sqrt{2\pi}\bigr) + \frac{\log e}{2} = \log\bigl(\sigma\sqrt{2\pi}\bigr) + \log\sqrt{e} = \log\bigl(\sigma\sqrt{2\pi e}\bigr)$$

and

$$H(X) = \log\frac{\sigma\sqrt{2\pi e}}{\Delta x}. \quad (18.7.18)$$
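The closed form (18.7.18) can be verified by direct numerical integration of $-\int f \log_2 f\,dx$; the parameters and grid limits below are arbitrary illustrative choices, and the result is independent of the expectation $m$:

```python
import numpy as np

# Numerical check that the reduced entropy of N(m, sigma) equals
# log2(sigma * sqrt(2*pi*e)), independently of m.
m, sigma = 5.0, 2.0
x = np.linspace(m - 10 * sigma, m + 10 * sigma, 200_001)
dx = x[1] - x[0]
f = np.exp(-(x - m)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
h_star_num = np.sum(-f * np.log2(f)) * dx        # ~ -∫ f log2(f) dx
h_star_formula = np.log2(sigma * np.sqrt(2 * np.pi * np.e))
print(h_star_num, h_star_formula)
```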

Example 3. The state of an aircraft is characterized by three random variables: the altitude $H$, the modulus of the velocity $V$, and the angle $\Theta$ determining the direction of flight. The altitude is distributed with uniform density on the interval $(h_1, h_2)$; the velocity $V$ according to the normal law with expectation $m_v$ and standard deviation $\sigma_v$; the angle $\Theta$ with uniform density on the interval $(0, 2\pi)$. The variables $H$, $V$, $\Theta$ are independent. Find the entropy of the combined system.

Solution.

From example 1 (formula (18.7.17)) we have

$$H(H) = \log\frac{h_2 - h_1}{\Delta h},$$

where $\Delta h$ is the "insensitivity interval" in determining the altitude.

Since the entropy of a random variable does not depend on its mathematical expectation, to determine the entropy of the variable $V$ we use formula (18.7.18):

$$H(V) = \log\frac{\sigma_v\sqrt{2\pi e}}{\Delta v}.$$

The entropy of the variable $\Theta$:

$$H(\Theta) = \log\frac{2\pi}{\Delta\theta}.$$

Finally we have:

$$H(H, V, \Theta) = H(H) + H(V) + H(\Theta) = \log\frac{h_2 - h_1}{\Delta h} + \log\frac{\sigma_v\sqrt{2\pi e}}{\Delta v} + \log\frac{2\pi}{\Delta\theta}$$

or

$$H(H, V, \Theta) = \log\left\{\frac{(h_2 - h_1)\,\sigma_v\sqrt{2\pi e}\cdot 2\pi}{\Delta h\,\Delta v\,\Delta\theta}\right\}. \quad (18.7.19)$$

Note that each of the factors inside the braces has one and the same meaning: it shows how many "insensitivity intervals" fit into a certain interval characteristic of the given random variable. In the case of a uniform distribution this characteristic interval is simply the interval of possible values of the random variable; in the case of the normal distribution it equals $\sigma\sqrt{2\pi e}$, where $\sigma$ is the standard deviation.
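The additive structure of (18.7.19) is easy to check numerically; the parameter values below (heights in m, speeds in m/s, deadbands) are illustrative assumptions, not taken from the text:

```python
import numpy as np

# Example 3 numerically: for independent H, V, Theta the entropies add
# (18.7.16), so (18.7.19) is the sum of three one-dimensional terms.
h1, h2, dh = 0.0, 10_000.0, 10.0        # uniform altitude, deadband 10 m
sigma_v, dv = 20.0, 1.0                 # normal speed, deadband 1 m/s
dtheta = np.radians(1.0)                # uniform angle on (0, 2*pi)

H_height = np.log2((h2 - h1) / dh)
H_speed = np.log2(sigma_v * np.sqrt(2 * np.pi * np.e) / dv)
H_angle = np.log2(2 * np.pi / dtheta)
H_total = H_height + H_speed + H_angle

# the same result in the product form of (18.7.19)
H_prod = np.log2((h2 - h1) * sigma_v * np.sqrt(2 * np.pi * np.e) * 2 * np.pi
                 / (dh * dv * dtheta))
print(H_total, H_prod)
```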

Thus we have extended the concept of entropy to continuous systems. The concept of information can be extended similarly. In doing so, the indeterminacy connected with the unboundedly growing term in the expression for the entropy disappears: when information is computed as the difference of two entropies, these terms cancel. Therefore all forms of information connected with continuous quantities turn out to be independent of the "insensitivity interval" $\Delta x$.

The expression for the full mutual information contained in two continuous systems $X$ and $Y$ is analogous to expression (18.5.4), with probabilities replaced by distribution laws and sums by integrals:

$$I_{Y \leftrightarrow X} = \iint f(x, y)\log\frac{f(x, y)}{f_1(x)\,f_2(y)}\,dx\,dy \quad (18.7.20)$$

or, using the expectation sign,

$$I_{Y \leftrightarrow X} = M\left[\log\frac{f(X, Y)}{f_1(X)\,f_2(Y)}\right]. \quad (18.7.21)$$

The full mutual information $I_{Y \leftrightarrow X}$, as in the case of discrete systems, is a non-negative quantity, vanishing only when the systems $X$ and $Y$ are independent.

Example 4. On the interval $(0, 1)$ two points are chosen at random, independently of each other, each distributed on this interval with uniform density. As a result of the experiment one of the points lies to the right, the other to the left. How much information about the position of the right point is given by the value of the position of the left one?

Solution. Consider two random points $X_1$ and $X_2$ on the abscissa axis $Ox$ (Fig. 18.7.3).


Fig. 18.7.3.

Denote by $Y_1$ the abscissa of the point that turned out to be on the left, and by $Y_2$ the abscissa of the one on the right (in Fig. 18.7.3 the point $X_1$ is on the left, but it could be the other way round). The values $Y_1$ and $Y_2$ are expressed through $X_1$ and $X_2$ as follows:

$$Y_1 = \min(X_1, X_2); \qquad Y_2 = \max(X_1, X_2).$$

Let us find the distribution law of the system $(Y_1, Y_2)$. Since $Y_1 < Y_2$, it exists only in the region $D$ shaded in Fig. 18.7.4.


Fig. 18.7.4.

Denote by $g(y_1, y_2)$ the distribution density of the system $(Y_1, Y_2)$ and find the element of probability $g(y_1, y_2)\,dy_1\,dy_2$, that is, the probability that the random point $(Y_1, Y_2)$ falls into the elementary rectangle $(y_1, y_1 + dy_1)(y_2, y_2 + dy_2)$. This event can occur in two ways: either $X_1$ is on the left and $X_2$ on the right, or vice versa. Consequently,

$$g(y_1, y_2)\,dy_1\,dy_2 = f(y_1, y_2)\,dy_1\,dy_2 + f(y_2, y_1)\,dy_1\,dy_2,$$

where $f(x_1, x_2)$ denotes the distribution density of the system of values $(X_1, X_2)$.

In this case

$$f(x_1, x_2) = f_1(x_1)\,f_2(x_2) = 1 \qquad (0 < x_1 < 1,\ 0 < x_2 < 1),$$

Consequently,

$$g(y_1, y_2)\,dy_1\,dy_2 = 2\,dy_1\,dy_2;$$

$$g(y_1, y_2) = 2 \quad \text{inside the region } D$$

and

$$g(y_1, y_2) = 0 \quad \text{outside the region } D.$$

We now find the distribution laws of the individual variables entering into the system:

$$g_1(y_1) = \int_{y_1}^{1} g(y_1, y_2)\,dy_2 = 2(1 - y_1) \quad \text{for } 0 < y_1 < 1;$$

similarly

$$g_2(y_2) = \int_{0}^{y_2} g(y_1, y_2)\,dy_1 = 2y_2 \quad \text{for } 0 < y_2 < 1.$$

The graphs of the densities $g_1(y_1)$ and $g_2(y_2)$ are shown in Fig. 18.7.5.


Fig. 18.7.5.

Substituting $g(y_1, y_2)$, $g_1(y_1)$ and $g_2(y_2)$ into formula (18.7.20), we get

$$I_{Y_2 \leftrightarrow Y_1} = \iint_D 2\log\frac{2}{2(1 - y_1)\cdot 2y_2}\,dy_1\,dy_2$$

$$= -\log 2 - 2\iint_D \log(1 - y_1)\,dy_1\,dy_2 - 2\iint_D \log y_2\,dy_1\,dy_2.$$

By virtue of the symmetry of the problem the last two integrals are equal, and

$$I_{Y_2 \leftrightarrow Y_1} = -\log 2 - 4\int_0^1 y\log y\,dy = -\log 2 + \log e$$

$$I_{Y_2 \leftrightarrow Y_1} = \log\frac{e}{2} \approx 0.44 \ \text{(binary units).}$$
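This result can be verified by Monte-Carlo simulation, averaging the random variable under the expectation sign of (18.7.21); the sample size and seed are arbitrary choices:

```python
import numpy as np

# Monte-Carlo check of example 4: I = E[log2( g(Y1,Y2) / (g1(Y1)*g2(Y2)) )]
# with g = 2 on the triangle D, g1(y1) = 2*(1-y1), g2(y2) = 2*y2.
# Expected value: log2(e) - 1 ~ 0.44 binary units.
rng = np.random.default_rng(0)
t = rng.random((1_000_000, 2))          # two independent uniform points
y1, y2 = t.min(axis=1), t.max(axis=1)   # left and right points
I_mc = np.mean(np.log2(2.0 / (2 * (1 - y1) * 2 * y2)))
print(I_mc, np.log2(np.e) - 1)
```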

Example 5. There is a random variable $X$ distributed according to the normal law with parameters $m_x$, $\sigma_x$. The variable $X$ is measured with an error $Z$, also distributed according to the normal law, with parameters $m_z$, $\sigma_z$. The error $Z$ does not depend on $X$. At our disposal is the result of the measurement, i.e. the random variable

$$Y = X + Z.$$

Determine how much information about the variable $X$ is contained in the variable $Y$.

Solution. To compute the information we use formula (18.7.21), i.e. we find it as the mathematical expectation of the random variable

$$\log\frac{f(X, Y)}{f_1(X)\,f_2(Y)}. \quad (18.7.22)$$

To do this we first transform the expression

$$\frac{f(x, y)}{f_1(x)\,f_2(y)} = \frac{f_1(x)\,f(y \mid x)}{f_1(x)\,f_2(y)} = \frac{f(y \mid x)}{f_2(y)}.$$

In our case

$$f(y \mid x) = \frac{1}{\sigma_z\sqrt{2\pi}}\,e^{-\frac{(y - x - m_z)^2}{2\sigma_z^2}},$$

$$f_2(y) = \frac{1}{\sigma_y\sqrt{2\pi}}\,e^{-\frac{(y - m_x - m_z)^2}{2\sigma_y^2}}, \qquad \text{where } \sigma_y^2 = \sigma_x^2 + \sigma_z^2 \ \text{(see Chapter 9).}$$

Expression (18.7.22) equals:

$$\log\frac{f(Y \mid X)}{f_2(Y)} = \log\frac{\sigma_y}{\sigma_z} + \log e\left[\frac{(Y - m_x - m_z)^2}{2\sigma_y^2} - \frac{(Y - X - m_z)^2}{2\sigma_z^2}\right].$$

From here

$$I_{Y \leftrightarrow X} = \log\frac{\sigma_y}{\sigma_z} + \log e\left[\frac{M\bigl[(Y - m_x - m_z)^2\bigr]}{2\sigma_y^2} - \frac{M\bigl[(Y - X - m_z)^2\bigr]}{2\sigma_z^2}\right]. \quad (18.7.23)$$

But $M\bigl[(Y - m_x - m_z)^2\bigr] = \sigma_y^2$ and $M\bigl[(Y - X - m_z)^2\bigr] = M\bigl[(Z - m_z)^2\bigr] = \sigma_z^2$; consequently,

$$\log e\left[\frac{M\bigl[(Y - m_x - m_z)^2\bigr]}{2\sigma_y^2} - \frac{M\bigl[(Y - X - m_z)^2\bigr]}{2\sigma_z^2}\right] = \log e\left[\frac{1}{2} - \frac{1}{2}\right] = 0. \quad (18.7.24)$$

Substituting (18.7.24) into (18.7.23), we get

$$I_{Y \leftrightarrow X} = \log\frac{\sigma_y}{\sigma_z} = \log\sqrt{1 + \frac{\sigma_x^2}{\sigma_z^2}} \ \text{(binary units).}$$

For example, when $\sigma_x = \sigma_z$,

$$I_{Y \leftrightarrow X} = \log\sqrt{2} = 0.5 \ \text{(binary units).}$$

If $\sigma_z$ is small compared with $\sigma_x$, then $I_{Y \leftrightarrow X} \approx \log\dfrac{\sigma_x}{\sigma_z}$ (binary units).
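The answer of Example 5 can likewise be checked by Monte-Carlo averaging of expression (18.7.22); the parameter values below are illustrative assumptions (chosen so that $\sigma_y/\sigma_z = 5/4$ exactly):

```python
import numpy as np

# Monte-Carlo check of example 5: for Y = X + Z with independent normals,
# I(Y<->X) = log2(sigma_y / sigma_z) = log2(sqrt(1 + sx**2 / sz**2)).
# We average log2( f(Y|X) / f2(Y) ), the transformed expression (18.7.22).
rng = np.random.default_rng(1)
mx, sx = 10.0, 3.0                       # parameters of X (illustrative)
mz, sz = 0.0, 4.0                        # parameters of the error Z
n = 1_000_000
x = rng.normal(mx, sx, n)
z = rng.normal(mz, sz, n)
y = x + z
sy = np.hypot(sx, sz)                    # sigma_y**2 = sx**2 + sz**2

def norm_pdf(v, m, s):
    return np.exp(-(v - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

I_mc = np.mean(np.log2(norm_pdf(y, x + mz, sz) / norm_pdf(y, mx + mz, sy)))
print(I_mc, np.log2(np.sqrt(1 + sx**2 / sz**2)))
```

With $\sigma_x = 3$, $\sigma_z = 4$ the formula gives $\log_2(5/4) \approx 0.32$ binary units, and the simulated average agrees to within the Monte-Carlo error.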

