
18.1. Subject and tasks of information theory

Lecture



Information theory is the science that studies the quantitative laws governing the receipt, transmission, processing, and storage of information. Having grown out of the practical problems of communication theory in the 1940s, information theory has become an essential mathematical tool in the study of a wide range of control processes.

The element of randomness inherent in information-transfer processes forces us to use probabilistic methods when studying them. At the same time, the classical methods of probability theory are not sufficient, and new probabilistic categories have to be created. Information theory is therefore not merely an applied science that uses probabilistic methods; it should be regarded as a branch of probability theory in its own right.

Receiving, processing, transmitting, and storing information of various kinds is an indispensable condition for the operation of any control system. In this process, information is constantly exchanged between different parts of the system. The simplest case is the transfer of information from the control device to the executive body (the transmission of commands). A more complicated case is closed-loop control, in which information about the results of command execution is returned to the control device through so-called "feedback".

In order to be transmitted, any information must be suitably "encoded," that is, translated into a language of special symbols or signals. The signals that carry information can be electrical impulses, light or sound vibrations, mechanical movements, and so on.

One of the problems of information theory is to find the most economical coding methods, i.e. methods that allow given information to be transmitted using the minimum number of symbols. This problem is solved both in the absence and in the presence of distortions (noise) in the communication channel.
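As an illustration of economical coding in the noiseless case, the sketch below builds a variable-length prefix code (Huffman coding, one classical solution to this problem): frequent symbols receive short codewords, so the message takes fewer binary digits than with a fixed-length code. The sample message and helper names are illustrative, not taken from the lecture.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix code in which frequent symbols get shorter codewords."""
    freq = Counter(text)
    # Heap entries: (weight, unique tiebreaker, {symbol: codeword-so-far})
    heap = [(w, i, {ch: ""}) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:  # degenerate one-symbol alphabet
        (_, _, codes), = heap
        return {ch: "0" for ch in codes}
    while len(heap) > 1:
        # Merge the two lightest subtrees; prepend 0/1 to their codewords
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

message = "ABRACADABRA"
codes = huffman_codes(message)
encoded = "".join(codes[ch] for ch in message)
# 23 bits, versus 33 bits with a fixed 3-bit code for 5 distinct symbols
```

Note that the resulting code is prefix-free (no codeword is a prefix of another), so the encoded stream can be decoded without separators.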

Another typical problem of information theory is stated as follows: there is an information source (a transmitter) that continuously generates information, and a communication channel through which this information is transmitted to another party (a receiver). What bandwidth must the communication channel have in order to "cope" with its task, i.e. to transmit all the incoming information without delay or distortion?
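The question "will the channel cope?" reduces to comparing the rate at which the source produces information with the channel capacity. A minimal sketch, assuming the Shannon-Hartley formula C = B log2(1 + S/N) for a noisy analog channel; the telephone-line bandwidth, SNR, and source rate below are hypothetical figures, not from the lecture:

```python
from math import log2

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: maximum error-free rate of a noisy analog channel, bits/s."""
    return bandwidth_hz * log2(1 + snr_linear)

# A telephone-grade line: ~3.1 kHz bandwidth, 30 dB SNR (a factor of 1000)
capacity = channel_capacity(3100, 1000)   # roughly 31 kbit/s
source_rate = 20000                        # bits/s produced by the transmitter (assumed)
channel_copes = source_rate <= capacity    # True: the channel can carry the traffic
```

If the source rate exceeded the capacity, no coding scheme could transmit the information without delay or error; this is the sense in which capacity answers the question above.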

A number of problems in information theory concern determining the capacity of storage devices intended for storing information, and methods for writing information into these devices and reading it out for direct use.

To solve such problems, one must first learn to measure quantitatively the amount of information transmitted or stored, the bandwidth of communication channels, and their sensitivity to interference (distortion). The basic concepts of information theory presented in this chapter make it possible to describe information-transfer processes quantitatively and to outline some of the mathematical laws governing them.

Information theory is the science that studies the quantitative laws governing the receipt, transmission, and storage of information. Its foundations were laid in the works of C. Shannon, although earlier attempts to define information had been made: Hartley proposed a definition of information, though an unsatisfactory one, and Brillouin came very close to a quantitative measure. Consider a problem with a number of possible answers. If some information about the problem is obtained, the number of possible answers may be reduced, and sufficient information leaves a single answer. Thus the amount of information is determined by the number of possible answers before and after it is received. In this sense, information theory studies processes and relations in random systems. Information, along with matter and energy, is a primary concept of our world and therefore cannot be defined in a strict sense. However, its properties can be listed:
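The "number of possible answers before and after" idea can be made quantitative. A minimal sketch, assuming Hartley's logarithmic measure with all answers equally likely; the guessing-game numbers are illustrative:

```python
from math import log2

def information_gained(answers_before, answers_after):
    """Bits of information = log2 of the reduction in possible answers
    (Hartley's measure, assuming all answers are equally likely)."""
    return log2(answers_before / answers_after)

# Guessing a number from 1..64: a yes/no answer that halves the range gives 1 bit
bits = information_gained(64, 32)
# Learning the exact number removes all uncertainty: log2(64) = 6 bits
total = information_gained(64, 1)
```

The logarithm makes information additive: six successive halvings of the answer set yield 6 × 1 bit, the same as learning the answer outright.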

1) information is intangible, but it is presented on tangible media (as signs, signals, or functions of time);

2) information brings knowledge about the surrounding world that the recipient did not possess before receiving it;

3) information can be contained both in the signs and signals themselves and in their mutual arrangement. For example, the same four letters Т, Р, О, С form the Russian words ТРОС (cable), СОРТ (grade), ТОРС (torso), РОСТ (growth);

4) signs and signals carry information only for a recipient who is able to recognize them.

Recognition is the identification of signs and signals with objects and their relations in the real world. By signs we mean real material objects distinguishable by the recipient: letters, symbols, hieroglyphs.

The transmission, transformation, and acquisition of information are all associated with the concept of a signal.

created: 2014-09-15
updated: 2024-11-13



