Software and hardware: concept, purpose, levels, characteristics and settings

A computer is a complex device that combines software and hardware. It is a machine that solves problems by executing commands such as: add two numbers, check whether a number differs from zero, copy data from one memory location to another, and so on.

Simple commands make up a language called machine language, in which a person can tell the computer what needs to be done. Each computer, depending on its purpose, is equipped with a certain set of commands. The commands are kept primitive to simplify computer manufacturing.

However, machine language creates big problems for people, because writing in it is tiring and extremely difficult. Therefore, engineers invented several levels of abstraction, each built on a lower one, down to machine language and computer logic, with interaction with the user located at the top level. This principle is called the multi-level structure of a computer, and both the hardware and the software of computer systems obey it.

Multilevel structure of computers

As already mentioned, software and hardware are built on the principle of levels of abstraction, each resting on the previous one. Simply put, to make it easier for a person to write programs, a new language is created (or rather, built) on top of machine language, one that is more understandable to a person but that the machine cannot execute directly. How, then, does a computer execute programs in the new language?

There are two main approaches: translation and interpretation. In the first case, each command of the new language corresponds to a set of machine language commands, so a program in the new language is converted in its entirety into a program in machine language. In the second case, a program written in machine language accepts commands in the new language as input, recognizes them one at a time, and executes them.
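As an illustration only, the two approaches can be sketched in Python for a hypothetical toy language (the opcodes SET/ADD/MUL are invented for this sketch, not any real machine code):

```python
# Toy "new language": a list of (opcode, argument) pairs.

def interpret(program, x=0):
    """Interpretation: examine and execute one command at a time."""
    for op, arg in program:
        if op == "SET":
            x = arg
        elif op == "ADD":
            x += arg
        elif op == "MUL":
            x *= arg
    return x

def translate(program):
    """Translation: convert the whole program into host code up front,
    then run the result as a native function."""
    lines = ["def f(x=0):"]
    for op, arg in program:
        if op == "SET":
            lines.append(f"    x = {arg}")
        elif op == "ADD":
            lines.append(f"    x += {arg}")
        elif op == "MUL":
            lines.append(f"    x *= {arg}")
    lines.append("    return x")
    namespace = {}
    exec("\n".join(lines), namespace)  # "compile" into the host language
    return namespace["f"]

prog = [("SET", 2), ("ADD", 3), ("MUL", 4)]
print(interpret(prog))    # 20
print(translate(prog)())  # 20: the same result, obtained differently
```

The interpreter pays the cost of decoding each command on every run, while the translated program is decoded once and then runs as native code, which is the same trade-off that applies between real interpreters and compilers.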

Multilevel organization of computers

Computer hardware and software can contain many levels, from the very first, or basic, up to one that is understandable to humans. The concept of a virtual machine is a good way to illustrate this. We can imagine that when a computer executes a program in some language (C++, for example), a virtual machine runs inside it that executes the commands of that language. Below the C++ virtual machine is another one with a more primitive language, say assembler; at that level, the assembler virtual machine is running. Between them, either translation or interpretation of the program code takes place. In this way, many levels are linked into a single chain down to the very first, the machine level. A virtual machine is simply a concept that makes it easier to picture this multi-level process.
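This chain of virtual machines can be sketched as a stack of translators, each understanding only its own mini-language (every language and opcode here is invented for illustration):

```python
# Hypothetical sketch of stacked levels: each level translates its commands
# into the language of the level below it, down to the "machine" at level 0.

def machine(cmds):
    """Level 0: the 'hardware' understands only INC and DBL."""
    x = 0
    for c in cmds:
        x = x + 1 if c == "INC" else x * 2  # "DBL"
    return x

def assembler(cmds):
    """Level 1: translates ADD1/TIMES2 into level-0 commands."""
    table = {"ADD1": "INC", "TIMES2": "DBL"}
    return machine([table[c] for c in cmds])

def high_level(expr):
    """Level 2: a more human-friendly notation, translated into level 1."""
    table = {"+1": "ADD1", "*2": "TIMES2"}
    return assembler([table[t] for t in expr.split()])

print(high_level("+1 +1 *2"))  # 4: (0 + 1 + 1) * 2, after two translations
```

A programmer working at level 2 never needs to know that INC and DBL exist, which is exactly the point of the multi-level structure.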

This raises an obvious question: why not build a computer that works directly in a language like C++?

The fact is that such technology would require enormous investment in the computer's hardware and software. It is most likely possible, but it would be so expensive that it would no longer be practical.

Modern computers

Today, most computers consist of two to six levels. The zero level is the basic level, that is, the machine or hardware level: only machine code runs there, on the computer's electrical circuits. On top of it a first-level language is built, and so on. It should also be clarified that things do not end at level zero. Below it lies the device level, the transistors and resistors themselves, that is, solid-state physics; it is called the physical level. The zero level is called the base level because it is there that hardware and software meet.

Finally, let us list the hierarchical chain of levels found in a typical computer, starting from zero:

  • Level 0 - digital logic, or hardware - gates and registers operate here; they can store the values 0 and 1 and perform simple functions such as AND and OR.
  • Level 1 - microarchitecture - the computer's arithmetic logic unit (ALU) works at this level. Here data, hardware, and software begin to work together.
  • Level 2 - instruction set architecture.
  • Level 3 - hybrid, or operating system - this level is more flexible, although very similar to level 2. For example, here programs can run in parallel.
  • Level 4 - assembler - the level at which digital machine languages begin to give way to human ones.
  • Level 5 - high-level languages (C++, Pascal, PHP, etc.).
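The level-0 primitives in the list above, bits plus the functions AND and OR, can be modeled in a few lines of Python (a conceptual sketch, not a hardware description):

```python
# Gates as functions on single bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return 1 - a

def half_adder(a, b):
    """Sum two bits the way level 0 would: XOR gives the sum bit,
    AND gives the carry bit."""
    s = OR(AND(a, NOT(b)), AND(NOT(a), b))  # XOR built from AND/OR/NOT
    carry = AND(a, b)
    return s, carry

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

From such half adders, real hardware composes full adders and then whole arithmetic units, which is how level 1 (the ALU) is built out of level 0.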

So, each level is a superstructure over the previous one, connected to it by translation or interpretation, with its own abstract objects and operations. To work at one level, you do not, in principle, need to know what happens at the levels below. This approach has made computer technology easier to understand.

Indeed, each brand of computer has its own architecture, where architecture refers to the data types, operations, and characteristics of each level. For example, the technology by which memory cells are manufactured is not part of the architecture.

Computer development

With the development of technology, new levels appeared and some disappeared. The first computers in the 1940s had only two levels: the digital logic level, where the program was executed, and the instruction set level, in which the code was written. The boundary between hardware and software was therefore obvious, but as the number of levels grew, it began to blur.

Today, hardware and software can be considered logically equivalent: any operation implemented in software can be performed directly in hardware, and vice versa. There are no hard rules dictating why one operation should be done in hardware and another in software; the split is driven by factors such as production cost, speed, and reliability. Today's software may tomorrow become part of the hardware, and conversely, something done in hardware may become a program.

Generations of computers

Mechanical computers represent the zero generation. In the 1640s Pascal built a hand-cranked calculating machine that could add and subtract. In the 1670s Leibniz created a machine that could also multiply and divide. In the 1830s Babbage, having spent all his savings, designed the Analytical Engine, which resembled a modern computer in consisting of an input device, a memory, a computing unit, and an output device. The machine was so advanced that it could store up to 1000 words of 50 decimal digits and execute different algorithms. The Analytical Engine was programmed in a rudimentary assembly-like language, and Babbage engaged Ada Lovelace to write the first programs. However, he lacked both the funds and the technology to get his brainchild working.

A little later, in America, Atanasoff built a remarkably powerful machine that used binary arithmetic and had refreshable memory based on capacitors, the same principle that DRAM uses to this day. Like Babbage, Atanasoff never managed to get his creation fully working. Finally, in 1944, Aiken completed the first general-purpose computer, the Mark I, which could store 72 words of 23 decimal digits each. By the time the Mark II was being designed, relay computers were already obsolete, and electronic computers replaced them.

Pascal's calculating machine

The first computer in the world

The Second World War stimulated work on computers and led to the first generation (1945-1955). The first computer to use vacuum tubes was COLOSSUS, built with Alan Turing's participation to break the ENIGMA ciphers. And although it was completed late, with the war ending, and its secrecy kept it from influencing the wider computer world, it was nevertheless the first.

Then, for the U.S. Army, the scientist Mauchly began developing ENIAC. This machine weighed about thirty tons, contained 18,000 vacuum tubes and 1,500 relays, was programmed with 6,000 switches, and consumed an enormous amount of energy. Configuring the software and hardware of such a monster was extremely difficult.

The ENIAC machine

Therefore, like COLOSSUS, ENIAC was not ready by its deadline and was no longer needed by the army. However, Mauchly was allowed to found a school and, building on the ENIAC work, to spread the knowledge widely, which gave rise to many different computers (EDSAC, ILLIAC, WEIZAC, EDVAC, etc.).

Among this whole range of computers, the IAS machine, or von Neumann machine, stood out; it influences computer design to this day. It consisted of a memory, a control unit, and an input-output module, and could store 4096 words of 40 bits each.

And although the IAS did not become a market leader, it had a powerful influence on the development of computers. For example, Whirlwind I, a computer for serious scientific calculations, was built on its basis. Ultimately, all this research led IBM, then a small manufacturer of punched cards, to release the 701 computer in 1953 and begin displacing Mauchly and his UNIVAC from the market's leading positions.

Transistors and the first computer game

Bell Labs researchers received the 1956 Nobel Prize for inventing the transistor, which instantly changed all of computer technology and gave rise to the second generation (1955-1965) of computers. The first transistorized computer was the TX-0 (followed by the TX-2). It carried little weight by itself, but one of its creators, Olsen, founded DEC, which launched the PDP-1 computer in 1961.

And although it was seriously inferior to IBM's models in its specifications, it was cheaper: the PDP-1 hardware and software package cost $120,000, not millions like the IBM 7090.

The PDP-1 was a commercially successful product and is believed to have laid the foundation of the computer industry. The first computer game, Spacewar!, was also created on it. Later came the PDP-8 with its breakthrough unified data bus, the Omnibus. In 1964, CDC and the engineer Seymour Cray released the 6600, a machine an order of magnitude faster than any rival thanks to parallel computing inside the CPU.

PDP-1 machine

IBM first steps

The invention of the silicon integrated circuit, which made it possible to place dozens of transistors on one chip, marked the beginning of the third generation (1965-1980) of computers, which were smaller and faster. Here IBM deserves mention: it was the first company to address compatibility between different computers, producing a whole series called the 360. The models of the 360 series differed in hardware and software parameters but shared a common instruction set, so they were compatible. The 360 machines could also emulate other computers, a major breakthrough, since it made it possible to run programs written for other machines. Meanwhile, DEC remained the market leader in small computers.

IBM System/360 Model 65

PC era

The fourth generation (1980 to today) brought VLSI, very large scale integration. ICs advanced sharply, and new technologies made it possible to place thousands of transistors on a silicon die rather than tens. The time of personal computers had come.

This era saw the first operating systems, such as CP/M; Apple's appearance on the market; and Intel's creation of the 386 processor, the ancestor of the Pentium line.

And here IBM again made a market breakthrough by assembling personal computers from components made by different companies instead of producing everything itself. This is how the IBM PC, the best-selling computer in history, appeared.

The new IBM PC approach ushered in the era of personal computers but at the same time hurt the computer industry as a whole. Intel, for example, became the undisputed leader in CPU production, with no one able to compete; only narrowly specialized companies survived. The Apple Lisa appeared, the first computer with a graphical operating system. Compaq created the first portable computers, carved out a market niche, and eventually bought out DEC, the former leader of that segment.

If Intel dealt the first blow to IBM, the second came from a small company called Microsoft, which produced operating systems for IBM. The first was MS-DOS; later Microsoft built the OS/2 system for IBM while developing Windows in the background. OS/2 failed in the market.

Thus Intel and Microsoft dethroned IBM. IBM tried to survive and produced another revolutionary idea, creating a processor with two cores. PC hardware and software continued to improve through all kinds of optimizations.

Fifth generation

But development does not stand still. A paradigm shift took place, creating the preconditions for the fifth generation of computers. It all started with the Japanese government, which in the 1980s allocated enormous funds to national companies and tasked them with inventing the next generation of computers. Of course, the idea failed.

But the impact of this effort was great. Japanese technology began to spread around the world and took the leading position in many areas of the market: cameras, audio equipment, and so on. The West was not about to give up and joined the race for the fifth generation.

Grid Systems launched the first tablet, and Apple created the pocket-sized Newton. Thus appeared PDAs (personal digital assistants), also known as handheld computers.

And here IBM's engineers made another breakthrough with a new idea: combining the growing popularity of mobile phones with the PDA that users adored. Thus, in 1993, the first smartphone, called Simon, was born.

In part, the fifth generation can be seen as the shrinking of software and hardware, and in the fact that today miniature computers are embedded in all kinds of equipment, from smartphones and electric kettles to cars and railway tracks, extending their functionality. Also worth noting are inconspicuous embedded devices with hardware-based software protection, designed to perform their own narrow functions.

The IBM Simon smartphone

Types of computers

Hardware and software are not limited to PCs. Today there are many types of computers:

  • disposable computers: greeting cards, RFID tags;
  • microcontrollers: watches, toys, medical equipment, and other devices;
  • mobile phones and laptops;
  • personal computers;
  • servers;
  • clusters (multiple servers combined into a single whole);
  • mainframes: computers for batch processing of large amounts of data;
  • "cloud technology": second-order mainframes;
  • supercomputers (although this class is being displaced by clusters, which can also perform serious calculations).

Given this variety, hardware and software can be tailored to a wide range of needs.

Computer families

The hardware and software of a personal computer (and not only of PCs) vary by family, where a family means an instruction set architecture. The most popular families are x86, ARM, and AVR. The first, x86, includes almost all personal computers and servers (running Windows, Linux, and even macOS).

The second, ARM, covers mobile systems. Finally, the third, AVR, covers most microcontrollers, those very inconspicuous computers embedded everywhere: in cars, household appliances, TVs, and so on.

x86 is developed by Intel. Its processors, from the 8080 (1974) to the Pentium 4 (2000), are backward compatible: a new processor can execute programs written for an older one.

This inheritance of hardware and software across generations of processors is what makes Intel's platform so versatile.

Acorn Computers stood at the origin of the ARM project, which later split off and became independent. The ARM architecture has long been successful in market segments where low power consumption is required.

Atmel hired two students who had an interesting idea. Continuing its development, they created the AVR processor, notable for being an excellent fit for systems that do not require high performance. AVR processors work under the harshest conditions, with strict restrictions on size and power consumption.

Source: https://habr.com/ru/post/C38238/

