Measures and amount of information

The term “information” was first proposed by ancient philosophers as informatio - clarification, exposition, awareness. In academic circles, however, there is still debate about the most accurate and complete definition of the word. For example, Claude Shannon, who laid the foundations of information theory, held that information is the removed uncertainty in a subject’s knowledge about something. The simplest definition of “information” is the degree of awareness about an object.

To determine the amount of information, you should first become familiar with how measures of information are classified. There are three measures of information in total: syntactic, semantic and pragmatic. Let us consider each measure individually:

1. The syntactic measure works with data without regard to its meaning for the object. This measure operates with the type of medium, the method of presentation and encoding, and the speed of transmission and processing of information.

In this case, the measure is the information volume - the amount of memory needed to store the data about the object. The information volume equals the number of binary digits with which the message in question is encoded, and is therefore measured in bits.
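As a minimal sketch of this idea (the alphabet size and message below are invented for illustration), the information volume of a message under a fixed-length binary encoding can be computed like this:

```python
import math

def information_volume(message: str, alphabet_size: int) -> int:
    """Information volume in bits: the number of binary digits needed to
    encode every symbol of the message with a fixed-length binary code."""
    bits_per_symbol = math.ceil(math.log2(alphabet_size))
    return len(message) * bits_per_symbol

# A 4-symbol alphabet needs ceil(log2(4)) = 2 bits per symbol,
# so a 6-symbol message occupies 6 * 2 = 12 bits.
print(information_volume("ABCABC", alphabet_size=4))  # 12
```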

To determine the syntactic amount of information, we turn to the concept of entropy - a measure of the uncertainty of the state of a system, that is, of our knowledge about the state of its elements and of the system as a whole. The amount of information is then the change in this measure of uncertainty, i.e. the reduction in entropy brought about by receiving a message.
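To make this concrete, here is a minimal sketch (the probabilities are invented for illustration) that computes Shannon entropy and treats the amount of information as the difference between the entropy before and after a message is received:

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Before the message: four equally likely states of the system -> H = 2 bits.
h_before = entropy([0.25, 0.25, 0.25, 0.25])

# After the message: the state is known with certainty -> H = 0 bits.
h_after = entropy([1.0])

# The amount of information is the drop in uncertainty.
print(h_before - h_after)  # 2.0
```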

2. The semantic measure serves to determine the semantic content of the data and relates the corresponding information parameters to the user’s ability to process the message. This ability is described by the user’s thesaurus - the body of knowledge about an object that a system or user has at its disposal. From the semantic point of view, the amount of information is greatest when the entire body of data is understandable to the user or system, i.e. can be processed using the existing thesaurus; the semantic measure is therefore a relative concept, as the toy example below illustrates.
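The following sketch is a purely illustrative model (the function and the concept sets are invented for this example, not part of any standard definition): the same message yields a different amount of semantic information depending on the recipient’s thesaurus.

```python
def semantic_information(message_concepts: set, thesaurus: set) -> float:
    """Toy model: the share of a message's concepts that the recipient's
    thesaurus allows it to interpret (from 0.0 to 1.0)."""
    if not message_concepts:
        return 0.0
    return len(message_concepts & thesaurus) / len(message_concepts)

message = {"entropy", "bit", "encoding"}

# The same message carries a different semantic amount for different recipients.
expert = {"entropy", "bit", "encoding", "thesaurus"}
novice = {"bit"}

print(semantic_information(message, expert))  # 1.0   - fully understandable
print(semantic_information(message, novice))  # ~0.33 - mostly opaque
```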

3. The pragmatic measure evaluates the value of information for achieving a specific goal. This concept is also relative and is directly related to the ability of a system or user to apply a given body of data to a specific problem area. It is therefore convenient to measure information, from the pragmatic point of view, in the same units as the target function.
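As a toy illustration of “the same units as the target function” (the scenario and numbers are invented), the pragmatic value of a piece of information can be expressed as the gain it brings to the objective, here measured in units of profit:

```python
def pragmatic_value(objective_with_info: float, objective_without_info: float) -> float:
    """Toy model: value of information in the units of the target function,
    i.e. how much it improves the objective."""
    return objective_with_info - objective_without_info

# Invented numbers: expected profit with and without a demand forecast.
print(pragmatic_value(1200.0, 1000.0))  # 200.0 - the forecast is worth 200 profit units
```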

The qualitative characteristics of information include the following indicators:

- Representativeness - the correct selection and presentation of information so that it best reflects the characteristics of the object.

- Content - the ratio of the amount of semantic information in a message to the volume of data processed (a short numeric sketch follows this list).

- Completeness - the presence in the message of the minimum set of information necessary to achieve the goal.

- Availability - the ability of a user or system to carry out the procedures for obtaining and converting the data.

- Relevance - the degree to which information retains its value from the moment it is received to the moment it is used.

- Timeliness - the receipt of information no later than the required time.

- Accuracy - the degree to which information corresponds to the actual state of the object.

- Reliability - the ability of information to reflect real objects with a given accuracy.

- Stability - the ability of information to respond to changes in the source data over time while maintaining the specified accuracy.
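As a short numeric sketch of the content indicator mentioned above (the numbers are invented for illustration):

```python
# Toy numbers: a 1000-bit message of which 250 bits carry information
# that is meaningful (semantic) to the recipient.
semantic_information_bits = 250
data_volume_bits = 1000

content_coefficient = semantic_information_bits / data_volume_bits
print(content_coefficient)  # 0.25
```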

Remember: information matters more than ever today, so it is worth knowing as much about it as possible!

Source: https://habr.com/ru/post/C1818/

