Information theory

The basic concepts of this theory were introduced in the 1940s by Claude Shannon. Information theory is the science of the processes of transmitting, storing, and receiving data in natural, technical, and social systems. Its methods are used in many applied fields, such as computer science, linguistics, cryptography, control theory, image processing, genetics, psychology, economics, and production management.

Closely connected with information theory today are coding theory, which studies the general problem of correspondence between a signal and a message, and signal-processing theory, which studies the quantization and reconstruction of signals as well as their spectral and correlation analysis.

Information theory treats its basic concept, "information", mostly from a quantitative perspective, disregarding its value and often its meaning. Under this approach, a page of text contains roughly the same amount of information no matter what is actually printed on it: the amount is determined only by the number of characters, even if they form a completely meaningless and chaotic sequence.
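This purely quantitative view can be sketched in a few lines of Python (the function name and the 27-symbol alphabet are illustrative assumptions, not from the source): if every character is treated as an equally likely choice from the alphabet, a meaningful sentence and a random string of the same length carry exactly the same number of bits.

```python
import math
import random
import string

def raw_information_bits(text: str, alphabet_size: int) -> float:
    """Bits of information when each character is treated as an equally
    likely choice from the alphabet; value and meaning are ignored."""
    return len(text) * math.log2(alphabet_size)

# A meaningful sentence and a chaotic string of the same length.
meaningful = "information theory ignores meaning"
chaotic = "".join(random.choice(string.ascii_lowercase + " ")
                  for _ in range(len(meaningful)))

# With a 27-symbol alphabet (26 letters plus space), both strings
# carry the same number of bits under the quantitative view.
print(raw_information_bits(meaningful, 27) ==
      raw_information_bits(chaotic, 27))  # True: content plays no role
```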

This approach is valid for modeling communication systems, since such systems must transmit information accurately over communication channels regardless of what set of signs and symbols it contains. When the value and meaning of the data must be taken into account, however, a purely quantitative approach is inadequate. This circumstance places significant restrictions on the possible fields of application of the theory.

The basics of information theory involve a range of questions, including those directly related to the transmission and reception of data. The basic communication scheme considered in the theory is as follows. Data is produced by a message source: a word or a set of words written in the letters of some alphabet. The message source may be text in a natural or artificial language, human speech, a database, or a mathematical model that generates a sequence of letters. The transmitter converts the message into a signal matched to the physical nature of the communication channel, the medium over which the signal travels. As the signal passes through the channel, it may be affected by interference that distorts the values of its information-carrying parameters. The receiver reconstructs the original message from the distorted signal it receives. The reconstructed message then arrives at the addressee: a person or a technical device.
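The whole scheme (source, transmitter, channel with interference, receiver) can be simulated in a few lines of Python. This is a minimal sketch under assumed conventions: the "transmitter" encodes text as UTF-8 bits, the "channel" flips a single bit, and the "receiver" decodes the result.

```python
def to_bits(text: str) -> list[int]:
    """Transmitter: convert the message into a binary signal (UTF-8 bits)."""
    return [(byte >> i) & 1 for byte in text.encode("utf-8")
            for i in range(7, -1, -1)]

def from_bits(bits: list[int]) -> str:
    """Receiver: reconstruct the message from the received signal."""
    data = bytes(sum(b << (7 - i) for i, b in enumerate(bits[j:j + 8]))
                 for j in range(0, len(bits), 8))
    return data.decode("utf-8", errors="replace")

signal = to_bits("hello")   # message source -> transmitter -> signal
signal[6] ^= 1              # interference in the channel flips one bit
print(from_bits(signal))    # prints "jello": the receiver gets a distorted message
```

Without the bit flip the receiver recovers the message exactly; a single distorted information parameter is enough to change its content.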

The message source is statistical in nature: each message appears with some probability. In Shannon's theory, if a message appears with probability one, so that its appearance is certain and involves no uncertainty, the message is considered to carry no information.
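This is exactly what Shannon's self-information formula, I(p) = log2(1/p), expresses. A small Python sketch (the function name is illustrative):

```python
import math

def self_information(p: float) -> float:
    """Self-information of a message with probability p, in bits:
    I(p) = log2(1/p). A certain message (p = 1) carries zero information."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log2(1 / p)

print(self_information(1.0))   # 0.0: a certain message carries no information
print(self_information(0.5))   # 1.0 bit
print(self_information(0.25))  # 2.0 bits: rarer messages carry more information
```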

One of the important problems of information theory is matching the communication channel to the information properties of the message source. Channel capacity is measured in bits per second.
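As an illustration of capacity in bits per second, consider the standard binary symmetric channel model (my choice of example, not from the source): each transmitted bit is flipped with probability p, and the capacity per symbol is 1 - H(p), where H is the binary entropy. Multiplying by the symbol rate gives bits per second.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_prob: float, symbols_per_second: float) -> float:
    """Capacity of a binary symmetric channel in bits per second:
    (1 - H(p)) bits per symbol times the symbol rate."""
    return (1.0 - binary_entropy(error_prob)) * symbols_per_second

# A noiseless binary channel at 1000 symbols/s carries 1000 bit/s;
# noise lowers the rate at which a source can be matched to the channel.
print(bsc_capacity(0.0, 1000))   # 1000.0
print(bsc_capacity(0.11, 1000))  # roughly 500 bit/s
```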

One of the problems of communication systems is interference affecting the useful signal. Shannon's theorem, unfortunately, does not give a concrete method for combating it. The simplest remedy, repeating each message several times, is not very effective, since it greatly increases transmission time. Far greater efficiency comes from codes that can detect and correct errors during the transmission of information.
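The repetition method mentioned above is easy to sketch in Python: repeating each bit three times lets the receiver correct any single flipped bit by majority vote, but the transmission becomes three times longer, which is exactly why more efficient error-correcting codes are preferred.

```python
def encode_repetition(bits, n=3):
    """Repeat each bit n times (n odd), e.g. 1 -> 1 1 1."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Majority vote over each group of n repeated bits."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)   # 3x as long: the cost of repetition
corrupted = sent.copy()
corrupted[4] ^= 1                   # interference flips one bit
print(decode_repetition(corrupted) == message)  # True: the error is corrected
```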

Source: https://habr.com/ru/post/G16842/
