The word "computer" literally means "calculator," and it was precisely for carrying out complex calculations that the first computers were created.
The history of computers, paradoxical as it may seem, goes back to ancient times. Of course, the earliest computing tools cannot be compared with modern ones, yet it is with them that the story of computer technology begins. The first tool that helped people count was the well-known abacus.
The ancient Greek abacus was a wooden board sprinkled with sand. In Russia, bones arranged in piles served as the main tool for simple counting and calculation. Many more centuries would pass after that era, however, before humanity came close to creating a real computer.
The first generation of computers is associated with the names of John von Neumann, Claude Shannon, and Alan Turing, scientists directly involved in the development of computer technology. Most of the first computers were built between 1945 and 1954 as experiments to test theoretical ideas. In the same period a new science directly related to information technology was taking shape: cybernetics. Until the 1960s, cybernetics was the science concerned with the development of computer technology, including its most promising areas, robotics and artificial intelligence. The first generation of computers was built on vacuum tubes (similar to those used in old televisions); these mid-20th-century machines reached enormous sizes and required a separate room to house them.
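Although those machines were programmed in raw machine code on vacuum-tube hardware, the core idea behind von Neumann's design is easy to illustrate today: instructions and data share one memory, and the processor runs an endless fetch-decode-execute loop. Below is a minimal sketch in Python; the instruction names and memory layout are invented for illustration and are not taken from any historical machine.

```python
# A toy von Neumann machine: program and data live in the SAME memory,
# and the processor repeatedly fetches, decodes, and executes instructions.
# (Hypothetical instruction set, for illustration only.)

MEMORY = [
    ("LOAD", 7),   # 0: load the value at address 7 into the accumulator
    ("ADD", 8),    # 1: add the value at address 8
    ("STORE", 9),  # 2: write the accumulator back to address 9
    ("HALT", 0),   # 3: stop
    0, 0, 0,       # 4-6: unused cells
    2, 3,          # 7-8: data, sitting in the same memory as the code
    0,             # 9: the result will go here
]

def run(memory):
    acc, pc = 0, 0             # accumulator and program counter
    while True:
        op, addr = memory[pc]  # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

print(run(MEMORY)[9])  # prints 5: the machine has computed 2 + 3
```

Because the program is just data in memory, it can be loaded, replaced, or even modified like any other data, and that is exactly what distinguished the stored-program computers of the first generation from earlier fixed-purpose calculating machines.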
The evolution of computers includes five main stages. Why does the period of the first generation deserve closer attention? The computing inventions of our distant ancestors are left aside here, since they have essentially nothing to do with the birth of electronic computing. The creation of the first computer, the prototype of the modern one, is of tremendous importance for computer science as a whole.
Stages of computer development:
- 1945-1954 - the creation of the first generation of computers, built on the basis of John von Neumann's designs (see the sketch above);
- 1955-1964 - the creation of the second generation of computers, based on transistors: the introduction of the first operating systems, the development of the programming languages Cobol, Algol, and Fortran, and the first computers offered for sale;
- 1965-1974 - the invention of integrated circuits and the beginning of the use of semiconductor memory (RAM) in computers, along with the start of large-scale computer production.
The period from 1975 to 1985 is considered a lull in the development of electronic computing.
Since 1985, the fourth generation of computers has been developing: computer technology has improved through greater power and smaller size, and computers have become widely available and mass-produced.
The fifth generation of computers, as envisioned by Japanese scientists, involves improving existing models and creating machines capable of understanding user commands at the level of thought. Whether or not these grandiose plans come true, the first generation of computers played an enormous role in the development of the information technologies of the future. The first computers, occupying vast spaces and looking like real monsters, were one way or another the prototype of the familiar personal computer. Without them, the invention of laptops, netbooks, tablets, and so on would have been impossible.