Natural Numbers

Numbers are an abstract concept: a quantitative characteristic of objects. Numbers can be real, rational, negative, integer, or fractional, as well as natural.

Natural numbers are the ones typically used in counting, where the notion of quantity arises naturally. Acquaintance with counting begins in early childhood. What child has escaped the silly counting rhymes built on natural numbers? "One, two, three, four, five... A bunny came out for a walk!" or "1, 2, 3, 4, 5, 6, 7, 8, 9, 10, the king decided to hang me..."

For any natural number, one can always find a greater one, so the set of natural numbers, usually denoted by the letter N, is infinite in the increasing direction. The set does have a beginning, though: the number one. (In the French convention, the natural numbers also include zero.) Either way, the main distinguishing feature of the set is that it contains no fractional or negative numbers.

The need to count a variety of objects arose in prehistoric times, and the concept of "natural number" presumably emerged then. It continued to take shape throughout the evolution of the human worldview and the development of science and technology.

However, primitive people could not yet think abstractly. It was difficult for them to grasp what the concepts of "three hunters" and "three trees" have in common. So when indicating a number of people they used one word, and when indicating the same number of objects of another kind, an entirely different one.

Moreover, the number series was extremely short: it contained only the numbers 1 and 2, and the count ended with concepts such as "many," "herd," "crowd," and "heap."

Later a more advanced, broader counting system formed. Interestingly, it still had only two number words, for 1 and 2; the following numbers were obtained by addition.

An example is the information that has come down to us about counting in an Australian tribe on the Murray River. There, 1 was denoted by the word "enza" and 2 by the word "petcheva"; the number 3 therefore sounded like "petcheva-enza," and 4 like "petcheva-petcheva."
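The additive naming scheme described above is simple enough to sketch in a few lines of Python. The word spellings here are transliterations that vary by source, so treat "enza" and "petcheva" as illustrative forms, not authoritative ones:

```python
def murray_name(n: int, one: str = "enza", two: str = "petcheva") -> str:
    """Name the natural number n using only the words for 1 and 2,
    joined additively (3 = 2 + 1, 4 = 2 + 2, and so on)."""
    if n < 1:
        raise ValueError("only natural numbers have names in this scheme")
    # As many "twos" as fit, plus a trailing "one" if n is odd.
    parts = [two] * (n // 2) + [one] * (n % 2)
    return "-".join(parts)

print(murray_name(3))  # petcheva-enza
print(murray_name(4))  # petcheva-petcheva
```

In practice such systems rarely went beyond 4 or 5, since the names quickly become unwieldy, which is exactly why larger bases and positional notation were needed.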

Most peoples used fingers as the standard for counting. The abstract concept of "natural number" then developed further through notches cut into a stick. Eventually a need arose to denote tens with a separate sign, and ancient people found a way out: they took a second stick, on which the notches represented tens.

The ability to record numbers expanded greatly with the advent of writing. At first, numbers were represented by strokes on clay tablets or papyrus, but gradually other symbols came into use for recording large numbers. This is how Roman numerals appeared.

Much later, Arabic numerals appeared, making it possible to write numbers with a relatively small set of symbols. Today it is easy to write down numbers as huge as the distances between planets or the number of stars; one only needs to learn to use powers.
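The point about powers can be illustrated in a few lines of Python. The figures below are rough illustrative values (1 astronomical unit is about 1.496 × 10^11 meters), not numbers taken from the article:

```python
# Powers of ten make huge numbers compact to write and compare.
earth_sun_distance_m = 1.496e11   # same as 1.496 * 10**11
stars_estimate = 10 ** 24         # a rough order-of-magnitude figure

# Scientific notation: the exponent does the heavy lifting.
print(f"{earth_sun_distance_m:.3e}")   # 1.496e+11

# Written out in full, 10**24 needs 25 digits.
print(len(str(stars_estimate)))        # 25
```

Three characters of exponent replace a dozen-plus zeros, which is the whole advantage of positional notation combined with powers.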

In the 3rd century BC, Euclid established in his "Elements" that the set of prime numbers is infinite, and Archimedes, in "The Sand Reckoner" (Psammites), laid out principles for constructing names for arbitrarily large numbers. Until almost the middle of the 19th century, however, people felt no need to articulate the concept of "natural number" precisely. A definition became necessary with the advent of the axiomatic method in mathematics.

In the 1870s, Georg Cantor formulated a precise definition of the natural numbers based on the concept of a set. Today we know the natural numbers as the integers from 1 to infinity. These are the numbers young children study when taking their first steps toward acquaintance with the queen of all sciences, mathematics.
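The essence of the modern view can be sketched in a few lines: start at 1 and generate every further natural number by repeatedly "adding one." This is only an informal illustration, not Cantor's actual set-theoretic construction:

```python
def successor(n: int) -> int:
    """The next natural number after n."""
    return n + 1

def first_naturals(count: int) -> list[int]:
    """List the first `count` natural numbers, starting from 1."""
    n, result = 1, []
    for _ in range(count):
        result.append(n)
        n = successor(n)
    return result

print(first_naturals(5))  # [1, 2, 3, 4, 5]
```

The two properties from the article are visible here: the set has a definite starting point (the unit), and for any member the successor yields a greater one, so the sequence never ends.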

Source: https://habr.com/ru/post/G11379/
