Units of measure appear in every area of human activity as generally accepted standards for quantifying something. Volume, length, and weight are the most common of them. Across continents, languages, and cultures we rely on the same units. For people this is convenient, accessible, and understandable: shared units need no conversion, and conversion is what most often leads to errors.
With the arrival of the computer, a single unit for measuring the amount of information was needed as well: one that would be understood by a Chinese, an American, or any other computer in the world, so that programmers everywhere could use it as a common language. That is how the bit, short for binary digit, appeared.
A bit is the smallest unit of information; it can take only two values, "on" and "off", written as 1 and 0. Six bits are enough to encode the English alphabet on a computer, but more complex tasks call for larger units of information, whose names were borrowed from the metric prefixes used in mathematics. Physically, a bit in a computer is an electrical signal sent from a controlling unit to an executing one, together with a response signal reporting the work done; 0 and 1 are represented by different voltage ranges. All of this happens at a speed comparable to the transfer of signals between the brain and nerve endings in the human body.
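As a simple illustration (a minimal Python sketch added here, not part of the original text), this is how a single character corresponds to a pattern of bits:

```python
# Show the bit pattern behind a single character.
# 'A' has code 65, which is 01000001 in binary (8 bits = 1 byte).
ch = "A"
code = ord(ch)                 # numeric code of the character
bits = format(code, "08b")     # the same number as an 8-bit pattern
print(ch, code, bits)          # A 65 01000001
```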
All programming languages ultimately rest on binary code. A photo converted to digital form, games and any other software, the control of automated production lines: all of them use the same unit of information, the bit.
Grouped into chains of pulses, bits form larger units of information: bytes, kilobits, kilobytes, megabytes, gigabytes, terabytes, petabytes, and so on.
A byte is equal to eight bits and has a second name, the octet. The distinction matters because computing history includes machines with 6-bit bytes and other non-standard sizes; those were design choices of individual companies that did not bear fruit. The eight-bit byte became the standard, which is why it carries the unambiguous name "octet".
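A quick check of that arithmetic (an illustrative Python snippet, not from the original article):

```python
# An 8-bit byte can hold 2**8 = 256 distinct values, numbered 0..255.
values_per_byte = 2 ** 8
print(values_per_byte)         # 256
print(int("11111111", 2))      # 255, the largest 8-bit value
```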
To make working with information easier, characters and other data are encoded according to standardized tables, such as Unicode, along with other encodings used by Windows and other operating systems. Each table serves its own purpose. For example, 8 bits per pixel are typically used for black-and-white images, while 24 bits per pixel are used for color images. To write text in virtually any language of the world, 16-bit Unicode encoding is sufficient.
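To make those bit counts concrete, here is a small Python sketch (my own illustration, not from the article) showing 16-bit text encoding and a 24-bit color pixel:

```python
# Each character of the text takes 2 bytes (16 bits) in UTF-16 (ignoring the BOM
# and the rarer characters that need surrogate pairs, i.e. 4 bytes).
text = "Hi!"
encoded = text.encode("utf-16-le")
print(len(encoded), "bytes for", len(text), "characters")   # 6 bytes for 3 characters

# A 24-bit color pixel: 8 bits each for red, green, and blue.
r, g, b = 255, 128, 0                      # an orange pixel
pixel_bits = 3 * 8
print(pixel_bits, "bits:", format(r, "08b"), format(g, "08b"), format(b, "08b"))
```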
Today the largest unit an average user routinely deals with is the terabyte: 1024 gigabytes, each of which is in turn 1024 megabytes. Multiply 1024 by 1024 and you get the number of megabytes in one terabyte. Keep multiplying all the way down to the original unit, the bit, and you arrive at an enormous number: the count of binary digits needed to fill such a volume.
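Carrying out those multiplications explicitly (a short Python sketch added for illustration):

```python
# From terabytes down to bits, multiplying by 1024 at each step.
megabytes_in_tb = 1024 * 1024            # 1 048 576 megabytes in a terabyte
bytes_in_tb     = 1024 ** 4              # 1 099 511 627 776 bytes
bits_in_tb      = bytes_in_tb * 8        # 8 796 093 022 208 bits
print(megabytes_in_tb, bytes_in_tb, bits_in_tb)
```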
Incidentally, many manufacturers use the literal, decimal meaning of the kilo-, mega-, and giga- prefixes when labeling their components. This leads to noticeable discrepancies and what can feel like veiled deception: a hard drive sold as 1 terabyte shows up on the computer as only about 931 gigabytes of usable space. The computer community proposed a way to avoid this long ago.
The prefixes kilo-, mega-, and so on keep their usual meanings: kilo is a thousand, mega is a million. For computing, the proper factor is not 1000 but 1024, and a separate set of names was introduced for it, starting with kibi. The correct binary prefixes and factors are therefore kibi-, mebi-, gibi-, tebi-, and so on.
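The difference between the two conventions is easy to see in a few lines of Python (an illustrative sketch, not from the original article):

```python
# Decimal (manufacturer) vs binary (operating system) interpretation of "1 TB".
decimal_tb = 10 ** 12                 # 1 TB as sold: 1 000 000 000 000 bytes
binary_gib = 2 ** 30                  # 1 GiB (gibibyte): 1 073 741 824 bytes
print(decimal_tb / binary_gib)        # ~931.3 GiB reported by the computer

# The binary prefixes and their factors.
for name, power in [("kibi (Ki)", 10), ("mebi (Mi)", 20), ("gibi (Gi)", 30), ("tebi (Ti)", 40)]:
    print(name, 2 ** power)
```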