The principle of complementarity is a methodological postulate originally formulated by the great Danish physicist and philosopher Niels Bohr for the field of quantum mechanics. Bohr's principle of complementarity, most likely, owes its appearance to the fact that the Austrian logician Kurt Gödel had worked out and formulated his famous incompleteness theorem on the properties of formal deductive systems, a result belonging to the field of formal logic. Niels Bohr extended Gödel's logical conclusions to the subject area of quantum mechanics and formulated the principle roughly as follows: to know an object of the microworld reliably and adequately, it must be investigated in mutually exclusive, that is, complementary, systems of description. This formulation went down in history as the principle of complementarity in quantum mechanics.
An example of such an approach to the problems of the microworld was the treatment of light within two theories at once, the wave theory and the corpuscular theory, which yielded a remarkably productive scientific result and revealed to man the physical nature of light.
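The two pictures are linked quantitatively by the Planck and de Broglie relations, standard textbook formulas cited here only to make the complementarity concrete: $$E = h\nu, \qquad p = \frac{h}{\lambda},$$ where $h$ is Planck's constant, $\nu$ and $\lambda$ are the frequency and wavelength of the light wave, and $E$ and $p$ are the energy and momentum of the corresponding particle, the photon. Each equation sets a corpuscular quantity on the left equal to an expression in purely wave quantities on the right, so neither the wave description nor the particle description is dispensable on its own.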
Niels Bohr went even further in his understanding of the conclusion he had drawn. He attempted to interpret the principle of complementarity through the prism of philosophical knowledge, and it is here that the principle acquired universal scientific significance. The formulation now read: in order to reproduce any phenomenon in a sign (symbolic) system for the purpose of cognizing it, one must resort to additional concepts and categories. In simpler terms, the principle of complementarity holds that in cognition it is not only possible but in some cases necessary to employ several methodological systems, which make it possible to obtain objective data about the subject of study. In this sense the principle of complementarity proved to be an acknowledgment of the metaphorical nature of the logical systems of methodology: they can manifest themselves in more than one way. Thus, with the advent and comprehension of this principle it was in effect recognized that logic alone is insufficient for cognition, and reasoning that goes beyond formal logic was therefore admitted into the research process. Ultimately, the application of Bohr's principle contributed to a significant change in the scientific picture of the world.
Later, Yu. M. Lotman broadened the methodological significance of Bohr's principle and carried its regularities over to the sphere of culture, in particular applying it to the description of the semiotics of culture. Lotman formulated the so-called "paradox of the amount of information", the essence of which is that human existence proceeds mainly under conditions of informational insufficiency, and that as humanity develops this deficiency only grows. By the principle of complementarity, the lack of information can be compensated by translating it into another semiotic (sign) system. This device, in effect, led to the emergence of computer science and cybernetics, and later of the Internet. The operation of the principle was subsequently confirmed by the physiological adaptation of the human brain to this type of thinking, owing to the functional asymmetry of its hemispheres.
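Lotman's idea of compensating a deficit through a second sign system can be loosely illustrated in the language of Shannon's information theory, which belongs to the computer science mentioned above rather than to Lotman's own apparatus; treating the two descriptions as discrete sources $X$ and $Y$, the joint entropy satisfies $$H(X, Y) = H(X) + H(Y \mid X) \;\ge\; H(X),$$ so adding a description of the object in a second sign system $Y$ can never reduce, and in general increases, the total information available beyond what the first system $X$ supplies alone. This is only an interpretive sketch, not a formalization Lotman himself gave.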
Another result mediated by the action of Bohr's principle is the uncertainty relation discovered by the German physicist Werner Heisenberg. Its content can be stated as the recognition that two complementary quantities of one and the same microobject, such as its position and its momentum, cannot be determined simultaneously with the same accuracy, since they belong to mutually exclusive systems of description. A philosophical analogue of this conclusion was given by Ludwig Wittgenstein, who argued in his work "On Certainty" that doubting anything is possible only if something else is held certain: certainty and doubt presuppose one another.
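In its standard quantitative form for position $x$ and momentum $p$, a textbook statement added here only for concreteness, the relation bounds the product of the two uncertainties from below: $$\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi},$$ where $h$ is Planck's constant. Making the description of one quantity sharper ($\Delta x \to 0$) necessarily makes the complementary one less definite ($\Delta p \to \infty$); the two descriptions cannot be made precise at once, which reproduces the structure of Bohr's complementarity.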
Thus, Bohr's principle has acquired enormous methodological significance in the most diverse fields of scientific knowledge.