
Formation of primary theoretical models and laws.

Let us now turn to the analysis of the second situation in the development of theoretical knowledge, which is associated with the formation of particular theoretical schemes and particular theoretical laws. At this stage, the explanation and prediction of empirical facts are no longer carried out directly on the basis of the picture of the world, but through the application of the created theoretical schemes and the expressions of theoretical laws associated with them, which serve as an intermediate link between the picture of the world and experience.

In developed science, theoretical schemes are first created as hypothetical models, and then substantiated by experience. Their construction is carried out through the use of abstract objects previously formed in the field of theoretical knowledge and used as a building material when creating a new model.

Only in the early stages of scientific research, when a transition takes place from a predominantly empirical study of objects to their theoretical assimilation, are the constructs of theoretical models created by direct schematization of experience. Once created, however, they serve as means for building new theoretical models, and this method comes to dominate science. The old method is retained only in a rudimentary form, and its scope is sharply narrowed. It is used mainly in those situations when science is faced with objects for whose theoretical development sufficient means have not yet been worked out. Such objects are then studied experimentally, and on this basis the necessary idealizations are gradually formed as means for constructing the first theoretical models in a new field of study. An example is the early stage of the formation of the theory of electricity, when physics introduced the initial concepts - "conductor", "insulator", "electric charge", etc. - and thereby created the conditions for constructing the first theoretical schemes explaining electrical phenomena.

Most of the theoretical schemes of science are constructed not by schematization of experience, but by transferring abstract objects that are borrowed from previously established areas of knowledge and connecting them within a new "network of relations". Traces of such operations are easy to detect in the theoretical models of classical physics. For example, the objects of Faraday's model of electromagnetic induction, "field lines" and "conducting substance", were not abstracted directly from the experiments that revealed the phenomenon of electromagnetic induction, but were borrowed from magnetostatics ("field line") and from the knowledge of conduction currents ("conducting substance"). Similarly, when the planetary model of the atom was created, the ideas of a center of potential repulsive forces inside the atom (the nucleus) and of electrons were drawn from the theoretical knowledge of mechanics and electrodynamics.

In this regard, the question arises as to the initial premises that guide the researcher in choosing and synthesizing the main components of the hypothesis being created. Although such a choice is a creative act, it has definite grounds. These grounds are provided by the picture of the world adopted by the researcher. The ideas about the structure of natural interactions introduced in it make it possible to discover common features in diverse subject areas studied by science.

Thus, the picture of the world "suggests" where abstract objects and a structure can be borrowed from, the combination of which leads to the construction of a hypothetical model of a new area of interactions.

The goal-directing function of the picture of the world in putting forward hypotheses can be traced using the example of the formation of the planetary model of the atom.

This model is usually associated with the name of E. Rutherford, and the history of its formation is often described as if it arose as a direct generalization of Rutherford's experiments on the scattering of α-particles by atoms. The actual history of science, however, is far from this legend. Rutherford carried out his experiments in 1912, whereas the planetary model of the atom had been put forward as a hypothesis much earlier, in 1904, by the Japanese physicist H. Nagaoka.

Here the logic of forming hypothetical variants of a theoretical model, which is created "from above" in relation to experience, manifests itself clearly. Schematically, this logic can be represented as follows for the case of the planetary model of the atom.

The first impulse to its construction, as well as to the advancement of a number of other hypothetical models (for example, the Thomson model), came from the changes in the physical picture of the world brought about by the discovery of the electron and the development of Lorentz's electron theory. Into the electrodynamic picture of the world, alongside the ether and the atoms of matter, a new element was introduced - "atoms of electricity". This, in turn, raised the question of their relation to the atoms of matter. The discussion of this question led to the formulation of the problem: are electrons part of the composition of the atom? Of course, the very posing of such a question was a bold step, since it entailed new changes in the picture of the world (it was necessary to recognize the complex structure of the atoms of matter). Therefore, the concretization of the problem of the relation between atoms and electrons involved entering the sphere of philosophical analysis, as always happens with radical shifts in the picture of the world (for example, J.J. Thomson, one of the initiators of posing the question of the relation between electrons and atoms of matter, sought support in the atomism of R. Boscovich in order to justify the need to reduce "atoms of matter" to "atoms of electricity" in the picture of the world).

The subsequent development of physics reinforced this idea with new experimental and theoretical discoveries. After the discovery of radioactivity and its explanation as a process of spontaneous decay of atoms, the idea of the complex structure of the atom became established in the picture of the world. The ether and the "atoms of electricity" now began to be regarded as forms of matter whose interaction forms all other objects and processes of nature. As a result, the task arose of building the "atom of matter" out of positively and negatively charged "atoms of electricity" interacting through the ether.

The formulation of this problem prompted the choice of the initial abstractions for constructing hypothetical models of the atom: these had to be the abstract objects of electrodynamics. As for the structure into which all these abstract objects were to be included, its choice was also, to some extent, suggested by the picture of the world. In this period (the end of the 19th and the beginning of the 20th century), the ether was regarded as a single basis of gravitational and electromagnetic forces, which made an analogy between the interaction of gravitating masses and the interaction of charges natural.

When Nagaoka proposed his model, he proceeded from the fact that the rotation of satellites and rings around Saturn can serve as an analogue of the structure of the atom: electrons must rotate around a positively charged nucleus, just as in celestial mechanics satellites rotate around a central body.

The use of the analog model was a means of transferring a structure from celestial mechanics, which was then connected with new elements (electric charges). Substituting charges for the gravitating masses in the analog model led to the construction of the planetary model of the atom.
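The structural basis of this transfer can be made explicit. As a sketch, juxtaposing the standard textbook forms of Newton's law of gravitation and Coulomb's law shows that both describe central forces with the same inverse-square dependence on distance, which is what allows the orbital structure to be carried over:

\[
F_{\text{grav}} = G\,\frac{m_1 m_2}{r^2}, \qquad F_{\text{Coulomb}} = \frac{1}{4\pi\varepsilon_0}\,\frac{|q_1 q_2|}{r^2}.
\]

In both cases the force acts along the line connecting the two bodies, so the mathematics of orbits around a central body applies to either pair of interacting objects.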

Thus, in the process of putting forward hypothetical models, the picture of the world plays the role of a research program that ensures the formulation of theoretical problems and the choice of means for solving them.

After the hypothetical model of the studied interactions is formed, the stage of its substantiation begins. It does not come down only to testing those empirical consequences that can be obtained from a law formulated with respect to a hypothetical model. The model itself must be justified.

It is important to pay attention to the following circumstance. When, in the course of forming a hypothetical model, abstract objects are included in new relations, this, as a rule, leads to endowing them with new attributes. For example, when the planetary model of the atom was constructed, the positive charge was defined as the atomic nucleus, and the electrons were endowed with the attribute of "moving stably in orbits around the nucleus".

In assuming that a hypothetical model created in this way expresses the essential features of a new subject area, the researcher thereby admits, firstly, that the new, hypothetical attributes of the abstract objects have their basis precisely in that area of empirically fixed phenomena which the model claims to explain, and, secondly, that these new attributes are compatible with the other defining attributes of the abstract objects, substantiated by the previous development of knowledge and practice.

It is clear that the legitimacy of such assumptions must be proved specifically. This proof is carried out by introducing the abstract objects as idealizations based on the new experience. The attributes of the abstract objects, hypothetically introduced "from above" in relation to the experiments of the new field of interactions, are now reconstructed "from below". They are obtained within the framework of thought experiments corresponding to the typical features of those real experimental situations that the theoretical model is intended to explain. After that, it is checked whether the new properties of the abstract objects are consistent with those justified by previous experience.

This entire complex of operations provides a substantiation of the attributes of the abstract objects of the hypothetical model and its transformation into a theoretical scheme of the new area of interactions. We will call these operations the constructive introduction of objects into the theory.

A theoretical scheme that satisfies the described procedures will be called constructively justified.

4. The role of analogies and the procedure for substantiating theoretical knowledge.
In the modern process of scientific research, the role of analogies becomes quite tangible. The transfer of abstract objects from one field of knowledge to another, which modern theoretical knowledge employs, presupposes as its basis the method of analogy, which points to a relation of similarity between things. This fairly widespread way of identifying the properties of objects, or the objects themselves, goes back to an ancient tradition, an echo of which is the Pythagoreans' reflections on the numerical structure of the universe, i.e., on the correspondence between numerical ratios and the cosmic harmony of the spheres.
"All things are numbers", "number owns things" - such are the conclusions of Pythagoras. The unified beginning in its unmanifested state is equal to zero; when it is embodied, it creates the manifested pole of the absolute, equal to one. The transformation of the one into the two symbolizes the division of a single reality into matter and spirit and testifies that knowledge of the one is knowledge of the other. The ontological basis of the method of analogy is the well-known principle of the unity of the world, which, according to the ancient tradition, is interpreted in two ways: the one is the many, and the many is the one.
Analogy played a great role in the metaphysics of Aristotle, who interpreted it as a form of manifestation of a single principle in single bodies. The meaning of analogy can be understood by turning to the reasoning of the medieval thinkers Augustine and Thomas Aquinas. Augustine spoke of the similarity between the Creator and his creation; Thomas Aquinas considered the "analogy of being", which testifies to the unequal and ambiguous distribution of perfection in the universe. The Creator possesses the fullness of being, while other entities possess it "by analogy", i.e., in a certain proportion.
Modern researchers distinguish: 1) the analogy of inequality, when different objects have the same name (heavenly body, earthly body); 2) the analogy of proportionality (physical health - mental health); 3) the analogy of attribution, when the same relationships or qualities are attributed to different objects (a healthy lifestyle - a healthy body - a healthy society, etc.).
Thus, reasoning by analogy allows us to liken a new single phenomenon to another, already known phenomenon. With a certain degree of probability, analogy makes it possible to expand knowledge by bringing new subject areas into its scope. It is noteworthy that Hegel valued the possibilities of the method of analogy highly, calling it "the instinct of reason".
Abstract objects transferred from one sphere to another must satisfy the connections and interactions of the emerging field of knowledge. The question of the reliability of an analogy is therefore always relevant. Since the history of science provides a significant number of examples of the use of analogies, analogy is recognized as an indispensable tool of scientific and philosophical thought.
There are analogies of objects and analogies of relations, as well as strict and non-strict analogy. A strict analogy provides a necessary connection between the transferred feature and the feature of similarity; a non-strict analogy is merely problematic. It is important to note that, in contrast to deductive reasoning, analogy involves the likening of individual objects to one another, not the subsumption of an individual case under a general proposition.
An important role in the formation of classical mechanics was played by the analogy between the motion of a thrown body and the motion of celestial bodies; the analogy between geometric and algebraic objects was realized by Descartes in analytic geometry; the analogy with selective breeding in animal husbandry was used by Darwin in his theory of natural selection; the analogy between light, electric and magnetic phenomena proved fruitful for Maxwell's theory of the electromagnetic field. An extensive class of analogies is used in modern scientific disciplines: in architecture and the theory of urban planning, bionics and cybernetics, pharmacology and medicine, logic and linguistics, etc.
Numerous examples of false analogies are also known. Such are the analogies between the movement of fluid and the spread of heat in the doctrine of "caloric" of the 17th-18th centuries, the biological analogies of social Darwinists in explaining social processes, etc.
It should be added that the method of analogy is widely used in the technical sciences. In the technical sciences it is customary to distinguish between invention (the creation of something new and original) and improvement (the transformation of something already existing). An invention is sometimes seen as an attempt to imitate nature, as simulation modeling, as an analogy between an artificially created object and a natural prototype.
Thus, the cylindrical shell is a common form used for various purposes in technology and everyday life, and it is also a universal structure found in numerous manifestations of the plant world; its perfect model is the stem. It is from living nature that analogues of solutions for enveloping structures are borrowed. Pneumatic structures have played a great role: they helped human beings to overcome the force of gravity for the first time and to open the era of aeronautics. Their idea, too, is taken from nature, for one of the most perfect examples of a pneumatic structure is the biological cell. Some fruits and seeds have adapted to dispersal in nature with the help of a kind of "parachute", "sail" or winged outgrowth. It is not difficult to see the analogy and similarity between such sophisticated means of natural adaptation and the later products of human civilization that exploit the model of the sail, the parachute, the wing, etc.
An imitative invention has more grounds to fit into nature, since in this case the scientist makes use of nature's own secrets, solutions and findings. But an invention can also be the creation of something new and without parallel.
While the role and significance of analogy in modern science still needs to be demonstrated, the procedure of explanation has always been considered an important component of scientific research; indeed, science itself has often been interpreted as a purely "explanatory enterprise". At the same time, it is necessary to distinguish strictly between explanation, description and justification. The most elementary definition of explanation is based on the procedure of reducing the unknown to the known, the unfamiliar to the familiar. However, the latest achievements of science show that many processes in the modern physical picture of the world are in principle unvisualizable and unimaginable. This indicates that explanation is deprived of its model character and visual clarity and must rely on purely conceptual techniques, in which the very procedure of reducing the unknown to the known is called into question.
There is another paradoxical circumstance: the objects that need to be explained turn out to be unobservable in principle. Thus scientific-theoretical knowledge acquires, alas, a character that is not given in experience.
With regard to the logic of scientific discovery, the position of rejecting any search for a rational substantiation of discovery is quite common. In the logic of discovery a large place is given to bold conjectures; reference is often made to insight and to analog modeling. Appeals to heuristics and to the intuition accompanying the process of scientific discovery are widespread.
The most general view of the mechanism of the development of scientific knowledge from the standpoint of rationalism indicates that knowledge, as already mentioned, can be dissecting (analytical) and generalizing (synthetic). Analytical knowledge makes it possible to clarify details and particulars, to reveal the full potential of the content present in the original basis. Synthetic knowledge leads not simply to generalization but to the creation of fundamentally new content, which is contained neither in the disparate elements nor in their summative integrity. The essence of the analytical approach is that the main essential aspects and regularities of the phenomenon under study are assumed to be contained in the given, taken as the starting point. Research work is carried out within an already outlined area and a set task, and is aimed at analyzing its internal potential. The synthetic approach orients the researcher toward finding dependencies outside the object itself, in the context of systemic relations coming from outside.
The rather traditional idea that the emergence of the new is associated only with synthesis cannot be left without qualification. Undoubtedly, it is the synthetic movement that presupposes the formation of new theoretical meanings, new types of mental content, new horizons, a new layer of reality. Synthetic is the new that leads to the discovery of something qualitatively different from the previously available basis. The analytical movement presupposes a logic aimed at revealing elements that were not yet known but were contained in the previous basis. A.F. Losev likewise emphasizes that the essence of analytical negation lies in the fact that it adds something to the motionless discreteness; the whole novelty of analytical negation lies in the fact that it points to some kind of shift, however small and close to zero, to some increment of the given quantity. The analytical form of obtaining new knowledge fixes new connections and relations of objects that have already fallen within the sphere of human practical activity. It is closely related to deduction and to the concept of "logical consequence".

In the logic of discovery, areas are singled out where development proceeds according to the analytical type, on the basis of unfolding the initial principles, and areas are also identified where there is a "break in gradualness", a going beyond the limits of existing knowledge. In the latter case the new theory overturns the existing logical canons and is built on a fundamentally different, constructive basis. A constructive modification of the observed conditions, the establishment of new idealizations, the creation of a different scientific objectivity not found in ready-made form, an integrative crossing of principles at the "junction of sciences" that previously seemed unrelated to each other - these are the features of the logic of discovery that yields new knowledge of a synthetic nature and of greater heuristic value than the old. The interaction of traditions and innovations, on the one hand, points to the need to maintain continuity and the existing set of methods, techniques and skills, and, on the other hand, demonstrates a potential that exceeds the method of reproducing accumulated experience and involves the creation of something new and unique.
The logic of discovery is also aimed at becoming aware of factors that easily escape the field of view: the by-products of interactions and the unintended consequences of goal-directed activity. (For example, Columbus wanted to open a new route to India, but discovered a previously unknown continent - America.) A discrepancy between goals and results is a fairly common occurrence. In the final result at least three layers are conjoined: the content of the originally set goal, the by-product of interactions, and the unintended consequences of purposeful activity. They testify to the multidimensionality of natural and social interactions. The recognition of non-linearity and alternativeness is a feature of the new strategies of scientific research.
A modern scientist must be ready to record and analyze results obtained outside of and in addition to his conscious goal-setting, including the possibility that these results may turn out to be much richer than the original goal. The fragment of being singled out as a subject of study is in fact not an isolated abstraction: it is connected with the infinite dynamics of the universe. The main and the secondary, the central and the peripheral, the leading and the dead-end directions of development, each having its own niche, coexist in constant non-equilibrium interaction. Situations are possible in which a developing phenomenon does not carry the forms of its future states in ready-made form, but receives them from outside, as a by-product of interactions occurring outside the phenomenon itself or, at least, on its periphery. And if earlier science could afford to cut off these lateral branches, which seemed insignificant, now this is an unaffordable luxury. It turns out that it is generally not easy to define what "not important" or "not interesting" means in science. A by-product arising on the periphery of connections and relations, including under the influence of factors that manifested themselves only insignificantly in the past, can become a source of new formations and prove even more significant than the originally set goal. It testifies to the indestructible striving of being to realize all its potentialities. Here a kind of equalization of opportunities takes place, when everything that exists asserts itself and demands a recognized existence.
The ambiguity of the logic of constructing scientific knowledge has been noted by many philosophers.
Thus, M.K. Mamardashvili, in his monograph Forms and Content of Thinking, emphasizes that in the logical apparatus of science two types of cognitive activity must be distinguished. The first includes the means that make it possible to obtain a great deal of new knowledge from existing knowledge, using proof and the logical derivation of all possible consequences. However, this does not bring out fundamentally new mental content in the objects and does not presuppose the formation of new abstractions. The second method involves obtaining new scientific knowledge "by acting with objects", which rests on involving content in the construction of the course of reasoning. Here we are talking about using content in some new way that does not follow from the logical form of the existing knowledge or from any recombination of it.
In the works "Criteria of Meaning" and "The Theoretician's Dilemma", the German-American philosopher of science Carl Gustav Hempel (1905-1997) pays special attention to the problem of clarifying the relationship between "theoretical terms" and "observation terms". Hempel shows that when the meaning of theoretical terms is reduced to the meaning of a set of observation terms, the theoretical concepts turn out to be redundant. They turn out to be redundant even if the introduction and substantiation of theoretical terms rely on intuition. Therefore, theoretical terms cannot be reduced to observation terms, and no combination of observation terms can exhaust the content of theoretical terms.
These provisions were of great importance for understanding the status of theoretical models in science. The "theoretician's dilemma", according to the researchers, can be presented in the form of the following statements:
1. Theoretical terms either perform or do not perform their function.
2. If theoretical terms do not fulfill their function, then they are not needed.
3. If theoretical terms perform their functions, then they establish connections between observed phenomena.
4. These connections can be established without theoretical terms.
5. If empirical connections can be established even without theoretical terms, then theoretical terms are not needed.
6. Therefore, theoretical terms are not needed either when they perform their functions or when they do not perform them.
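The structure of this argument can be made explicit in a short propositional sketch; here the letters are introduced only for illustration, with F abbreviating "theoretical terms fulfill their function" and N abbreviating "theoretical terms are needed":

\[
\begin{aligned}
&(1)\quad F \lor \neg F\\
&(2)\quad \neg F \rightarrow \neg N\\
&(3\text{--}5)\quad F \rightarrow \neg N \quad \text{(if they work, they only link observables, and such links can be established without them)}\\
&(6)\quad \therefore\ \neg N
\end{aligned}
\]

The conclusion follows from the premises by a constructive dilemma; the force of Hempel's analysis, as described above, lies in contesting the premises rather than the logical form.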
Hempel poses the problem of the difference between laws and explanations in natural science and in history. Scientific research in various fields seeks not only to generalize certain events in the world of our experience, but also to reveal the regularities in the course of these events and to establish general laws that can be used for prediction and explanation. According to the "covering law" model he substantiated, an event is explained when a statement describing it is deduced from general laws and from statements describing antecedent conditions; a general law, in turn, is explained when it is deduced from a more comprehensive law. Hempel was the first to link explanation clearly with deductive inference, and deductive inference with law; in addition, he formulated the conditions of adequacy of an explanation. In his view, general laws have similar functions in history and in the natural sciences; they form an integral tool of research and constitute the common foundation of various procedures that are often regarded as specific to the social sciences in contrast to the natural sciences.
Historical research often uses general laws established in physics, chemistry, and biology. For example, the defeat of an army is explained by a lack of food, changes in the weather, disease, etc. The dating of events in history by means of tree rings rests on the application of certain biological regularities. Various methods of empirically verifying the authenticity of documents, paintings, and coins use physical and chemical theories. In all cases, however, the historical past is never accessible to direct study and description.
In analyzing the entire historical arsenal of explanation, it is necessary to distinguish between metaphors that have no explanatory value, sketches of explanations - among which there are both scientifically acceptable ones and pseudo-explanations - and, finally, satisfactory explanations. Hempel envisaged the need for a supplementary procedure taking the form of a gradually increasing refinement of the formulations used, so that an explanation sketch could be confirmed, refuted, or at least indicate approximately the direction of further research. Also important is the procedure of reconstruction, aimed at uncovering the underlying explanatory hypotheses and assessing their significance and empirical base. From his point of view, the use of connectives such as "hence", "because", "therefore", etc., often shows that the proposed explanations are weakly justified or unacceptable; in many cases this procedure reveals an error in the assertion.
For example, the geographical or economic conditions of a group of people can be taken into account when explaining some common features, say, their art or moral code; but this does not mean that in this way we have explained in detail the artistic achievements of this group of people or the system of their moral code. From the description of geographical or economic conditions it is not possible to derive a detailed explanation of aspects of cultural life.
A sound explanation is aided by singling out one or more important groups of facts that must be specified in the initial conditions, together with the assertion that the event in question is "determined" by, and therefore must be explained in terms of, precisely this group of facts.

Scientific explanation includes the following elements:
empirical verification of the sentences stating the relevant conditions;
empirical testing of the universal hypotheses on which the explanation is based;
examining whether an explanation is logically persuasive.
A prediction, unlike an explanation, is a statement about some future event: here the initial conditions are given, while the consequences have not yet occurred and must be established. One can speak of the structural identity of the procedures of explanation and prediction. Very rarely, however, are explanations formulated so completely that they can also serve as predictions. Alongside "causal" explanations there are "probabilistic" explanations, based on probabilistic hypotheses rather than on general "deterministic" laws, i.e., laws in the form of universal conditionals.
In "The Logic of Explanation" Hempel argues that to explain the phenomena of the world of our experience means to answer the question "why?" rather than only the question "what?". Science has always sought to go beyond mere description and break through to explanation. An essential characteristic of explanation is its reliance on general laws.
For example, when to a person in a boat the part of an oar under water appears bent, the phenomenon is explained using the law of refraction together with the fact that water is an optically denser medium than air. The question "Why does this happen?" is thus understood in the sense of "according to what general laws does this happen?".
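As a sketch of the general law invoked here, refraction at the water-air boundary obeys Snell's law in its standard form:

\[
n_1 \sin\theta_1 = n_2 \sin\theta_2, \qquad n_{\text{air}} \approx 1.00, \quad n_{\text{water}} \approx 1.33,
\]

so a light ray leaving the water is bent away from the normal at the surface, and the submerged part of the oar is seen displaced from its actual position.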
However, the question "why?" can also arise with respect to the general laws themselves. Why does the propagation of light obey the law of refraction? In answering this question, representatives of classical physics appeal to the wave theory of light. Thus a regularity is explained by subsuming it under another, more general regularity. On this basis a structure of explanation is derived, consisting of two parts:
1. the description of the phenomenon;
2. the class of sentences adduced to explain this phenomenon, which, in turn, is divided into two subclasses: one of them describes the conditions, the other the general laws.
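This two-part structure corresponds to Hempel's deductive-nomological schema, which in its standard form can be written as follows (C stands for the antecedent conditions, L for the general laws, E for the description of the phenomenon to be explained):

\[
\frac{C_1, C_2, \ldots, C_k \qquad L_1, L_2, \ldots, L_r}{E}
\]

The statements above the line form the explanans, the statement below the line the explanandum; the explanation counts as adequate when E follows logically from the explanans and the laws and conditions have empirical content.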
The principle of causal explanation is used both in the natural and in the social sciences. The explanation of actions in terms of the agent's motives is seen as a special kind of teleological explanation; teleological explanation is indispensable in biology, since it accounts for the characteristics of an organism by referring to goals that are essential for the preservation of its life or of its species.

In the philosophical and methodological literature of recent decades, the fundamental ideas, concepts and representations that form the relatively stable foundations on which specific empirical knowledge and the theories explaining it develop are increasingly becoming a subject of study.

The identification and analysis of these foundations presupposes considering scientific knowledge as an integral developing system. In Western philosophy such a vision of science began to take shape relatively recently, mainly in the post-positivist period of its history. As for the stage at which the ideas about science developed within the framework of positivist philosophy dominated, their most striking expression was the so-called standard conception of the structure and growth of knowledge. In it, a single theory and its relationship with experience served as the unit of analysis. Scientific knowledge was presented as a set of theories plus empirical knowledge, regarded as the basis on which theories are developed. Gradually, however, it became clear that the empirical basis of a theory is not pure, theoretically neutral empiricism: it is not observational data but facts that constitute the empirical basis on which theories rest. And facts are theoretically loaded, since other theories take part in their formation. The problem of the interaction of a single theory with its empirical basis then also appears as the problem of its relationship with other, previously established theories that make up the theoretical knowledge of a given scientific discipline.

On the other hand, this problem of the interconnection of theories was revealed in the study of their dynamics. It turned out that the growth of theoretical knowledge is carried out not simply as a generalization of experimental facts, but involves the use of theoretical concepts and structures developed in previous theories and applied in the generalization of experience. Thus the theories of the corresponding science came to be seen as a kind of dynamic network, an integral system interacting with empirical facts. The systemic character of the knowledge of a scientific discipline posed the problem of the system-forming factors that determine the integrity of the corresponding system of knowledge. In this way the problem of the foundations of science began to emerge - the foundations thanks to which the various pieces of knowledge of a scientific discipline are organized into a systemic whole at each stage of its historical development.

Finally, consideration of the growth of knowledge in its historical dynamics revealed special states associated with critical epochs in the development of science, when a radical transformation of its most fundamental concepts and ideas takes place. These states have been called scientific revolutions, and they can be seen as a restructuring of the foundations of science.

Thus, the expansion of the field of methodological problems in the post-positivist philosophy of science put forward the analysis of the foundations of science as a real methodological problem.

These foundations and their individual components have been identified and described in terms of "paradigm" (T. Kuhn), the "hard core of the research program" (I. Lakatos), "ideals of natural order" (S. Toulmin), the "main themes of science" (G. Holton), and "research tradition" (L. Laudan).

In the course of discussions among the supporters of various conceptions, the problem of a differentiated analysis of the foundations of science became acute. The discussions around the key concept of Kuhn's account, the "paradigm", are indicative in this regard. Its extreme ambiguity and vagueness were noted by Kuhn's numerous opponents.

Under the influence of this criticism, Kuhn attempted to analyze the structure of the paradigm. He singled out the following components: "symbolic generalizations" (mathematical formulations of laws), exemplars of solving specific problems, the "metaphysical parts of the paradigm", and values (the values of science). This was a step forward compared with the first version of the conception, but at this stage the structure of the foundations of science remained unclear. Firstly, it is not shown in what connections the selected components of the paradigm stand to one another, which means that, strictly speaking, its structure has not been revealed. Secondly, according to Kuhn, the paradigm includes both components that belong to the deep foundations of scientific research and the forms of knowledge that grow on these foundations. For example, "symbolic generalizations" include the mathematical formulations of particular laws of science (such as the formulas expressing the Joule-Lenz law, the law of mechanical oscillation, etc.). But then it would turn out that the discovery of any new particular law should mean a change of paradigm, i.e., a scientific revolution. This blurs the distinction between "normal science" (the evolutionary stage in the growth of knowledge) and the scientific revolution. Thirdly, in singling out such components of science as the "metaphysical parts of the paradigm" and values, Kuhn fixes them "ostensively", through the description of relevant examples. From the examples Kuhn cites it can be seen that the "metaphysical parts of the paradigm" are understood by him either as philosophical ideas or as principles of a concrete-scientific character (such as the principle of action by contact in physics or the principle of evolution in biology). As for values, Kuhn's description of them likewise looks like only a first and very approximate sketch. In essence, what is meant here are the ideals of science, taken moreover in a very limited range - as the ideals of explanation, prediction and the application of knowledge.
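For illustration, a "symbolic generalization" in Kuhn's sense is simply a particular law written in mathematical form; the Joule-Lenz law mentioned above reads, in its standard form,

\[
Q = I^{2} R\, t,
\]

where Q is the heat released in a conductor of resistance R carrying a current I for a time t.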

In principle, it can be said that even in the most advanced studies of the foundations of science, to which T. Kuhn's works may be assigned, Western philosophy of science has not been analytical enough. It has not yet established what the main components of the foundations of science are or how they are connected. Nor have the connections between the foundations of science and the theories and empirical knowledge based on them been sufficiently clarified. This means that the problem of the structure of the foundations, of their place in the system of knowledge and of their functions in its development requires further and deeper discussion.

In an established and developed system of disciplinary scientific knowledge, the foundations of science are revealed, firstly, in the analysis of the systemic relationships between theories of varying degrees of generality and their relation to various forms of empirical knowledge within a particular discipline (physics, chemistry, biology, etc.), and, secondly, in the study of the interdisciplinary relations and interactions of various sciences.

As the most important components that form the foundations of science, we can single out: 1) the scientific picture of the world; 2) ideals and norms of scientific knowledge; 3) philosophical foundations of science.

The listed components express general ideas about the specifics of the subject of scientific research, about the features of cognitive activity that masters one or another type of objects, and about the nature of the links between science and the culture of the corresponding historical era.

Theoretical laws are directly formulated in relation to the abstract objects of the theoretical model.

Let us consider the process of formation of theoretical models (schemes).

In advanced science, theoretical schemes are first constructed as hypothetical models (the formation of a theoretical model as a hypothesis) through the use of abstract objects previously formed in the field of theoretical knowledge and used as building material when creating a new model.

The researcher's choice of the main components of the hypothesis being created is a creative act, but it also has definite grounds, which are provided by the picture of the world adopted by the researcher. The ideas about the structure of natural interactions introduced in it make it possible to discover common features in various subject areas studied by science. Thus, the picture of the world "suggests" where abstract objects and a structure can be borrowed from, the combination of which leads to the construction of a hypothetical model of a new area of interactions.

After the hypothetical model of the interactions under study has been formed, the stage of its justification begins. This does not come down only to testing those empirical consequences that can be obtained from a law formulated with respect to the hypothetical model. The model itself must be justified. When a hypothetical model is formed, abstract objects are immersed in new relations, and this usually leads to endowing them with new attributes. In doing so, the researcher assumes that:

  • 1) the new, hypothetical attributes of the abstract objects have their basis precisely in the field of empirically fixed phenomena which the model claims to explain;
  • 2) these new attributes are compatible with the other defining attributes of the abstract objects, which were substantiated by the previous development of knowledge and practice.

The attributes of the abstract objects, hypothetically introduced "from above" in relation to the experiments of the new field of interactions, are then reconstructed "from below". They are obtained within the framework of thought experiments corresponding to the typical features of those real experimental situations that the theoretical model is intended to explain. After that, it is checked whether the new properties of the abstract objects are consistent with those justified by previous experience.

Hypothetical models acquire the status of theoretical ideas about a certain area of interactions only when they pass through the procedures of empirical justification. This is a special stage in the construction of a theoretical scheme, at which it is proved that its initial hypothetical version can serve as an idealized image of the structure of precisely those experimental and measuring situations in which the features of the interactions studied by the theory are revealed.

Let us first consider how theoretical models are arranged. Their elements are abstract objects (theoretical constructs) that stand in strictly defined connections and relations to one another. For example, when the mechanical vibrations of bodies are studied, the concept of a material point is introduced, which periodically deviates from the equilibrium position and returns to it again. This representation itself makes sense only when a frame of reference is fixed, and this is the second theoretical construct appearing in the theory of oscillations; it corresponds to an idealized representation of a physical laboratory equipped with clocks and rulers. Finally, to describe oscillations one more abstract object is needed - a quasi-elastic force, introduced as that which sets the material point in motion, returning it to the equilibrium position. The system of these abstract objects (material point, frame of reference, quasi-elastic force) forms the model of small oscillations (called an oscillator in physics).

A law is an essential, recurring, stable connection between various kinds of material and ideal objects (states of objects). Theoretical laws are formulated directly with respect to the abstract objects of the theoretical model. They can be applied to the description of real situations of experience only if the model is justified as an expression of the essential connections of reality that appear in such situations. By studying the properties of the oscillator model and expressing the relations of the objects that form it in the language of mathematics, one obtains the formula that expresses the law of small oscillations.

Theoretical models are not something external to the theory; they are part of it. To emphasize the special status of theoretical models, with respect to which laws are formulated and which necessarily enter into the theory, Stepin introduced the concept of a theoretical scheme. Theoretical schemes really are schemes of the objects and processes studied in the theory, expressing their essential connections. In introducing this concept, Stepin wants to emphasize that a theoretical scheme is correlated with quite specific theoretical objects. Thus, particular scientific theories describe different theoretical objects, and these objects, moreover, differ from the objects of more general theories. For example, in Newtonian mechanics the basic laws are formulated with respect to a system of abstract objects: "material point", "force", "inertial space-time frame of reference". The connections and relations of these objects form a theoretical model of mechanical motion, which depicts mechanical processes as the movement of a material point along a continuum of points of the space of an inertial frame of reference over time and as a change in the state of motion of the material point under the action of a force. In addition, mechanics also contains theoretical schemes and laws of oscillation, of the rotation of bodies, of the collision of elastic bodies, of the motion of a body in a field of central forces, etc.
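As a sketch of that formula in its standard textbook form, assuming a material point of mass m acted on by a quasi-elastic force F = -kx, the law of small oscillations reads:

\[
m\ddot{x} = -kx \quad\Longleftrightarrow\quad \ddot{x} + \omega_0^{2}\, x = 0, \qquad \omega_0 = \sqrt{k/m},
\]

whose solutions are harmonic oscillations x(t) = A cos(ω₀t + φ) with period T = 2π/ω₀, independent of the amplitude A.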



Now consider the process of formation of theoretical schemes. In developed science, theoretical schemes are first built as hypothetical models (i.e., a theoretical model is formed as a hypothesis). Such a construction is carried out through the use of abstract objects previously formed in the field of theoretical knowledge and used as building material when creating the new model. Only in the early stages of scientific research, when a transition takes place from a predominantly empirical study of objects to their theoretical assimilation, are the constructs of theoretical models created by direct schematization of experience. The method of schematization is used mainly in situations where science encounters objects for whose theoretical development sufficient means have not yet been worked out. Such objects are then studied experimentally, and on this basis the necessary idealizations are gradually formed as means for constructing the first theoretical models in a new field of study. An example of such a situation is the early stage of the formation of the theory of electricity, when physics introduced the initial concepts - "conductor", "insulator", "electric charge", etc. - and thereby created the conditions for constructing the first theoretical schemes explaining electrical phenomena.

Most of the theoretical schemes of science are constructed by transferring already created abstract objects, which are borrowed from previously established areas of knowledge and connected within a new "network of relations". In this regard, the question arises as to the initial premises that guide the researcher in choosing and synthesizing the main components of the hypothesis being created. Although such a choice is a creative act, it has definite grounds, which are provided by the picture of the world adopted by the researcher. The ideas about the structure of natural interactions introduced in it make it possible to discover common features in various subject areas studied by science. Thus, the picture of the world "suggests" where abstract objects and a structure can be borrowed from, the combination of which leads to the construction of a hypothetical model of a new area of interactions. (When Nagaoka proposed his model, he proceeded from the idea that the rotation of the satellites and rings around Saturn could serve as an analogue of the structure of the atom: electrons must revolve around a positively charged nucleus, just as in celestial mechanics satellites revolve around a central body. The use of the analog model was a means of transferring a structure from celestial mechanics, which was then connected with new elements - charges. Substituting charges for the gravitating masses in the analog model led to the construction of the planetary model of the atom.)

After the hypothetical model of the interactions under study has been formed, the stage of its substantiation begins. This does not come down only to testing those empirical consequences that can be obtained from a law formulated with respect to the hypothetical model. The model itself must be justified. It is important to pay attention to the following circumstance. When, in the course of forming a hypothetical model, abstract objects are immersed in new relations, this, as a rule, leads to endowing them with new attributes. For example, when the planetary model of the atom was constructed, the positive charge was defined as the atomic nucleus, and the electrons were endowed with the attribute of "moving stably in orbits around the nucleus".
In assuming that the hypothetical model created in this way expresses the essential features of the new subject area, the researcher thereby admits, firstly, that the new, hypothetical attributes of the abstract objects have their basis precisely in that area of empirically fixed phenomena which the model claims to explain, and, secondly, that these new attributes are compatible with the other defining attributes of the abstract objects, substantiated by the previous development of knowledge and practice. It is clear that the legitimacy of such assumptions must be proved specifically. This proof is carried out by introducing the abstract objects as idealizations based on the new experience. The attributes of the abstract objects, hypothetically introduced "from above" in relation to the experiments of the new field of interactions, are then reconstructed "from below". They are obtained within the framework of thought experiments corresponding to the typical features of those real experimental situations that the theoretical model is intended to explain. After that, it is checked whether the new properties of the abstract objects are consistent with those justified by previous experience.

To consider this question more concretely, let us return to Nagaoka's planetary model of the atom, in which the question of the constructiveness of the idea of the atomic nucleus remained open. The constructive justification of this abstract object - the atomic nucleus - was obtained in Rutherford's experiments on the scattering of α-particles. Having discovered in the experiment scattering precisely at large angles, Rutherford interpreted it as evidence of the existence of a positively charged nucleus inside the atom. The nucleus was defined as a center of potential repulsive forces capable of scattering heavy, positively charged particles through large angles. Characteristically, this definition can be found even in modern physics textbooks. It is easy to see that it is a concise description of a thought experiment on the scattering of heavy particles by an atom, which, in turn, is an idealization of Rutherford's real experiments. The attributes of the construct "atomic nucleus", introduced hypothetically, "from above" in relation to experience, were now obtained "from below", as an idealization of real experiments in the atomic field. Thus the hypothetical object "atomic nucleus" received a constructive justification and could be given ontological status.

Thus, the generation of new theoretical knowledge is carried out as the result of a cognitive cycle, which consists in the movement of research thought from the foundations of science - and first of all from the representations of the picture of the world substantiated by experience - to hypothetical variants of theoretical schemes. These schemes are then adapted to the empirical material they claim to explain. In the course of such adaptation the theoretical schemes are rebuilt, saturated with new content, and then once again compared with the picture of the world, exerting an active feedback influence on it (a movement from the foundations of science to a hypothetical model, to its constructive justification, and then back again to the analysis and development of the foundations of science).

Note: (Hypothetical models acquire the status of theoretical ideas about a certain area of interactions only when they go through the procedures of empirical justification. This is a special stage in constructing a theoretical scheme, at which it is proved that its initial hypothetical version can appear as an idealized image of the structure of precisely those experimental and measuring situations within which the features of the interactions studied in the theory are revealed. It is possible to formulate in general terms the basic requirements that the substantiation of a hypothetical model must satisfy. In assuming that it is applicable to a new subject area that has not yet been mastered theoretically, the researcher thereby admits: firstly, that the hypothetical attributes of the abstract objects of the model can be compared with certain relations of the objects of the experimental situations of precisely the area that the model claims to explain; secondly, that such attributes are compatible with the other defining characteristics of the abstract objects that were substantiated by the previous development of knowledge and practice. The correctness of such assumptions should be proved specifically. This proof is made by introducing the abstract objects as idealizations based on new experience. The hypothetically introduced features of the abstract objects are obtained within the framework of thought experiments corresponding to the features of those real experimental and measuring situations that the introduced theoretical model is intended to explain. After that, it is checked whether the new properties of the abstract objects are consistent with those justified by previous experience.)

13. Interpretation - in the broad sense, it is characterized as explanation, construal, the decoding of one system (a text, events, facts) in another that is more concrete, understandable, visual or generally accepted. In the special, strict sense, interpretation is defined as the establishment of a system of objects constituting the subject area of the meanings of the basic terms of the theory under study and satisfying the requirements of the truth of its propositions. In this perspective, interpretation acts as a procedure inverse to formalization.
Strict interpretation has two varieties: theoretical interpretation, which consists in finding such values of the variables in the formulas of the theory under study for which they turn into true propositions; and empirical interpretation, associated with establishing the correspondence of concepts to empirical objects and searching for the empirical meanings of theoretical terms. In the latter case, operational definitions are of great importance, that is, ways of concretizing concepts through experimental situations with the help of which the features of the objects reflected by these concepts are fixed. For example, temperature can be determined through the readings of a thermometer, and distance through the motion of a body and time. The role of operational definitions is essential in sociology, in particular in solving the problem of translating concepts into indicators. The very specificity of sociological knowledge is such that its variables must allow of empirical interpretation. To the extent that the analysis of sociological data involves the use of theoretical models of the objects under study, theoretical interpretation is also used in sociology. Such, for example, are the situations of interpreting graphs defined on the links between members of small groups as sociograms, or the cases of interpreting projective tests in the context of certain theoretical models. Most widespread in sociology is interpretation in the broad sense, i.e., the process of interpretation needed, for example, to clarify the sociological meaning of statistical dependencies. In general, interpretation contributes to the concretization of theoretical systems and propositions and to the translation of theoretical statements into factual ones. Interpretation enhances the cognitive value of theoretical concepts and, by reducing abstract terms to concrete ones, opens the way to verifying the theoretical constructions under study.

Interpretation of the basic concepts is one of the important procedures in developing the program of a sociological study. It includes the theoretical and empirical clarification of concepts. The interpretation of the basic concepts makes it possible to establish in which directions of analysis the collection of sociological data should be carried out.
The theoretical interpretation of the basic concepts means:
a) clarification of the concept from the point of view of the theory in which it is included, clarification of its place in the structure of this theory and its connection with its other concepts;
b) clarification of the relationship of the concept to its use in other theories, fields of knowledge, including journalism.
The theoretical interpretation of the basic concepts is indispensable for any sociological study, especially in cases where the concepts are not clearly defined. It makes it possible to reveal the richness of the content contained in them and thus creates the basis for constructing the conceptual scheme of the study, formulating its goals, objectives and hypotheses, and selecting material.
However, a theoretical interpretation of the basic concepts alone is not sufficient for conducting a sociological study. The point is that, while having a good idea of the problem at the theoretical level, the researcher as a rule does not have a clear idea of how the theoretical description of the subject area it covers, and of the contradiction inherent in it, relates to its manifestation in specific social facts. In order, on the one hand, to obtain such an idea and, on the other, to implement and test the proposed tasks and hypotheses, formulated in the terms of a certain sociological theory, with the help of an appropriate system of social facts (empirical indicators), it is necessary to carry out an empirical interpretation of the basic concepts, that is, to define these concepts operationally: to correlate them with phenomena (elements) of reality in such a way that the latter are covered by their content and thus become the empirical indicators of each concept. Being "representatives" of empirically interpreted concepts and terms, these elements of reality are at the same time indicators of the object under study. Thus, through certain facts of social reality recorded in the study, sociological concepts are correlated with their objective analogues, which act as empirical characteristics (features, indicators) of the object under study. At the same time the concepts are meaningfully narrowed and delimited, while the manifested properties of the object are empirically fixed and identified.
In the most general terms, the empirical interpretation of the basic concepts means identifying the groups of facts of social reality whose fixation makes it possible to determine that the phenomenon under study is present. For example, indicators of the presence of a new type of economic thinking in an employee can be: readiness for changes in technology and the mastery of advanced experience; the ability to combine professions; participation in the management of the collective and in rationalization and inventive activity; the desire to master economic knowledge, etc.
The researcher should strive for the most complete representation of the concept in the system of indicators. However, a complete reduction of the meaning of a concept to empirical features is fundamentally unattainable, because a finite number of manifestations of the essence of the phenomenon under study is not identical to that essence itself, which is reflected in the theoretical concept. Only a certain part of the content of a concept stands in a more or less direct and unambiguous relation to the empirical base, and for some concepts this part is much larger than for others. Therefore some concepts of sociological theory are practically not amenable to direct empirical interpretation; it can be carried out only indirectly, through other concepts logically connected with them. In the empirical interpretation of the basic concepts, the researcher's attention is directed mainly to the choice of those empirical indicators that reflect the most significant aspects of the phenomenon under study, are relatively easy to identify and observe, and allow relatively simple and reliable measurement.
In the specialized literature (see, for example, Yadov V. A. Sociological Research: Methodology, Program, Methods. Moscow, 1987), the following sequence is proposed for clarifying the basic concepts and interpreting their meaning through observable empirical indicators:
1. Determining the scope of the content of the concept. Initially it is necessary to obtain the most general idea of the social phenomenon denoted by the concept, to single out the most general components of the content and interconnection both of this concept and of the phenomenon it reflects, as well as the area of empirical reality with which the sociologist will have to deal.
2. Determining the continuum of properties of the phenomenon under study. At this stage all possible components of the phenomenon are identified, with whose help a correspondence can be established between it and the system of concepts that describe it and are used in the study. The selection of these possible properties is a complicated and time-consuming procedure. Here a multi-stage analysis of the concept under study is required: after the main groups of facts of reality covered by its content have been identified, subgroups are distinguished within them until the researcher reaches an empirically fixable and verifiable indicator (or group of indicators). In a multi-stage analysis of an interpreted concept the following requirements must be observed: the system of concepts and terms adopted to describe the objective content of the interpreted concept must have the same degree of generality at each stage of the analysis; these concepts and terms must be exhaustive and mutually exclusive; and the multi-stage analysis of the concept must itself be based on a general scheme of the phenomenon (process) represented by this concept. This scheme should contain its main elements.
3. The choice of the empirical indicators of the concept being interpreted, based on the principle of their significance and accessibility. From among the fixed indicators it is necessary to select the group that will form the basis for further empirical work (in particular, for the measurement of the empirical indicators).
4. Building indices. The results of the measurements of the selected empirical indicators are grouped into indices, which are quantitative expressions of the qualitative content of the selected concepts (a minimal sketch follows).
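As an illustration only, a Python sketch of such an index for the "new type of economic thinking" mentioned above. The indicator names, the weights, and the assumption that every indicator has already been measured on a common 0..1 scale are hypothetical, not taken from the text:

    INDICATOR_WEIGHTS = {
        "readiness_for_technological_change": 0.25,
        "combines_professions": 0.15,
        "participates_in_management": 0.20,
        "rationalization_activity": 0.20,
        "strives_for_economic_knowledge": 0.20,
    }

    def economic_thinking_index(measurements):
        """Weighted sum of indicator values measured on a common 0..1 scale."""
        return sum(weight * measurements.get(name, 0.0)
                   for name, weight in INDICATOR_WEIGHTS.items())

    respondent = {
        "readiness_for_technological_change": 1.0,
        "combines_professions": 0.0,
        "participates_in_management": 0.5,
        "rationalization_activity": 1.0,
        "strives_for_economic_knowledge": 0.5,
    }
    print(round(economic_thinking_index(respondent), 2))   # 0.65

The weighted sum is only one possible aggregation rule; the point is that the index turns a set of measured indicators into a single quantitative expression of the concept.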
The next stage of work with the interpreted concepts is the description of the phenomenon under study in their system. As a result of such a description the phenomenon appears as a more or less accurately outlined subject of research, and only under this condition can it be studied through the search for ways of solving the problem whose expression is the subject of study. The anticipation of these ways of solving the problem takes the form of hypotheses. The interpretation of the basic concepts is an integral part of the procedure of operationalization of concepts.
Operationalization of concepts is a specific scientific procedure for establishing a connection between the conceptual apparatus of a study and its methodological tools. It combines into a single whole the problems of concept formation, measurement techniques, and the search for social indicators (see Measurement; Social indicator; Interpretation of basic concepts). Operationalization makes it possible to determine what the sociological data to be collected should be about.
The procedure consists of:
1. Translation of the original concept into indicators.
2. Translation of the indicators into variables.
3. Translation of the variables into indices.
4. Determination of the methods for collecting the required data.
The empirical indicator makes it possible to:
- establish how and in what form data collection should be approached;
- correctly formulate questions in various types of instruments;
- determine the structure of the answers to the questions (scales, tests).
Thus, work with concepts is a procedure for establishing a connection between the conceptual apparatus and methodological research tools.

Operationalization and interpretation

As mentioned in the previous paragraph, operationalization is associated with the reformulation of abstract theoretical concepts into concrete empirical ones, i.e., with reaching aspects that are directly observable within social interaction. It is naive, for example, to ask a respondent directly about national distance (an abstract concept): such concepts may simply be incomprehensible to the respondent. If instead the researcher asks how close the respondent is prepared to admit representatives of a given nationality (as family members, close friends, neighbours, work colleagues, residents of their country, etc.), then he is working at the operational level, which is equally clear to him and to the respondent.
Therefore, high-quality operationalization is the key to the correct preparation of the survey instrument.
If we consider the problem of operationalization holistically (that is, without taking it out of the context of the entire empirical study), then its solution begins at the stage of determining the social phenomenon to be studied. The naming and description of social phenomena is associated with the use of such theoretical tools as concepts and constructs. Without going into details, I will only outline possible ways of correlating them.
First, concepts can act as categories that correspond to the phenomena and processes of the surrounding reality and that can be combined into theoretical constructs of a hypothetical nature subject to empirical verification; in this case concepts are more specific than the more abstract constructs. Second, concepts and constructs can be distinguished by the criterion of how well established they are: concepts are readily interpretable, well-grounded, and commonly used categories of scientific practice, while constructs are hypothetical constructions that have not yet reached that status and are still subject to research and justification. Third, concepts and constructs can be correlated as reflections of two kinds of reality, the existent and the possible. This view is especially acceptable in the social sciences: for example, the existence of society (a concept) is not questioned, but the idea of its essence and features is constructed in different ways from different theoretical perspectives. This last way of correlating them is adopted below as the main one.
Thus, operationalization consists of the following steps:
1) the name of the social phenomenon (concept);
2) description of the concept in the most general theoretical terms (construct);
3) empirical interpretation of the construct, i.e. highlighting aspects of the phenomenon under study that are understandable to the respondent (indicators);
4) the formulation of relevant variables that are easily translated into questionnaire questions.
Consider the following example:

Phenomenon/concept: Social activity of students.
Theoretical construct: Social activity of students as a component of the types of activity inherent in the life of an individual in the corresponding age period and social conditions, namely: academic activity, scientific activity, labor activity, social activity, interpersonal activity.
Empirical indicators: 1. Academic activity: class attendance, activity in lectures, activity in practical classes. 2. Scientific activity: (…) 3. Labor activity: (…) 4. Social activity: (…) 5. Interpersonal activity: (…)
Variables: Academic activity: A) Class attendance: number of absences per week. B) Activity in lectures: clarifying questions asked in lectures. C) Activity in practical classes: frequency of preparation (…)
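Purely as an illustration, the chain concept - construct - indicators - variables from the example above can be represented as a simple data structure from which questionnaire items are generated. The following Python sketch uses hypothetical keys and wordings, not formulations from the source:

    operationalization = {
        "concept": "Social activity of students",
        "construct": ("Types of activity inherent in student life: academic, "
                      "scientific, labor, social, interpersonal"),
        "indicators": {
            "academic_activity": ["class attendance", "activity in lectures",
                                  "activity in practical classes"],
            # scientific, labor, social and interpersonal activity are elided here
        },
        "variables": {
            "class_attendance": "number of absences per week",
            "activity_in_lectures": "number of clarifying questions asked per lecture",
        },
    }

    # Each variable translates directly into a questionnaire item:
    for name, measure in operationalization["variables"].items():
        print(f"{name}: ask the respondent to report the {measure}")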

Conducting operationalization is typical, first of all, of quantitative research, in which the researcher starts from theory and only then moves on to the measurement of social indicators.
In qualitative research the situation is often exactly the opposite: the researcher seeks to observe social reality reflectively in order to formulate a theory on the basis of such observation. In this case the problem of interpreting the empirical material comes to the fore. I should note at once that what follows is the author's own understanding of interpretation. Interpretation acts, in a sense, as operationalization in reverse: in the course of interpretation the researcher seeks to express directly observable aspects of empirical reality in the most appropriate theoretical terms. Unlike the operationalization procedure, the interpretation procedure is not unambiguous; it can take different forms depending on the approach used and on the experience and preferences of the researcher. The main qualitative approaches include grounded theory, case studies, ethnography, narrative research, phenomenology, and discourse analysis, and in each of them the problem of interpretation is solved in its own way. When preparing the relevant chapters of the site, I will dwell on this problem in more detail.

14. Verification (from Latin verificatio - proof, confirmation) - a concept used in the logic and methodology of scientific knowledge to denote the process of establishing the truth of scientific statements through their empirical testing.
Verification consists in correlating a statement with the real state of affairs by means of observation, measurement, or experiment.
A distinction is made between direct and indirect verification. In direct verification, the statement itself, which speaks of facts of reality or of experimental data, is subjected to empirical testing.
However, not every statement can be directly correlated with facts, because most scientific statements refer to ideal, or abstract, objects. Such statements are verified indirectly: from the statement we deduce a consequence relating to objects that can be observed or measured, and this consequence is verified directly.
The verification of the consequence is regarded as an indirect verification of the statement from which that consequence was obtained. For example, suppose we need to verify the statement "The temperature in the room is 20°C". It cannot be verified directly, because there are no objects in reality to which the terms "temperature" and "20°C" directly correspond. From this statement we can deduce the consequence that if a thermometer is brought into the room, the mercury column will stop at the "20" mark.
We bring in a thermometer and verify by direct observation the statement "The mercury column is at the '20' mark". This serves as an indirect verification of the original statement. Verifiability, i.e., empirical testability, of scientific statements and theories is considered one of the important features of scientificity. Statements and theories that cannot be verified in principle are generally not regarded as scientific.
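Schematically (an illustrative rendering only; the symbols S and O are not notation from the source), indirect verification of a statement S through an observable consequence O proceeds as follows:

\[ S \rightarrow O, \qquad O \ \text{is established by direct observation} \quad\Longrightarrow\quad S \ \text{is indirectly verified.} \]

In the example above, S is "the temperature in the room is 20°C" and O is "the mercury column stops at the '20' mark". Strictly speaking, establishing O confirms S rather than conclusively proves it, which is one of the motives behind the falsification procedure discussed next.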
FALSIFICATION (from Latin falsus - false and facio - I do) - a methodological procedure that makes it possible to establish the falsity of a hypothesis or theory in accordance with the modus tollens rule of classical logic. The concept of "falsification" should be distinguished from the principle of falsifiability, which Popper proposed as a criterion for demarcating science from metaphysics, as an alternative to the principle of verifiability adopted in neopositivism. Isolated empirical hypotheses can, as a rule, be subjected to direct falsification and rejected on the basis of the relevant experimental data, or because of their incompatibility with fundamental scientific theories. At the same time, abstract hypotheses and the systems they form, which constitute scientific theories, are not directly falsifiable. The point is that the empirical testing of theoretical systems of knowledge always involves the introduction of additional models and hypotheses, as well as the development of theoretical models of the experimental installations, etc. The discrepancies between theoretical predictions and experimental results that arise in the process of testing can, in principle, be resolved by making appropriate adjustments to individual fragments of the theoretical system being tested.
Therefore, for the final falsification of a theory, an alternative theory is necessary: only it, and not the results of experiments by themselves, is able to falsify the theory being tested. Thus, the rejection of a previous scientific theory is methodologically justified only when there is a new theory that really ensures progress in knowledge.
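The modus tollens schema mentioned above can be written out explicitly (an illustrative rendering; T, O, and the auxiliary assumptions A_i are not notation from the source):

\[ \frac{T \rightarrow O, \qquad \neg O}{\neg T} \qquad\text{and, more realistically,}\qquad \frac{(T \wedge A_1 \wedge \dots \wedge A_n) \rightarrow O, \qquad \neg O}{\neg\,(T \wedge A_1 \wedge \dots \wedge A_n)}. \]

In the second case a failed prediction refutes only the conjunction of the theory T with the auxiliary assumptions A_1, ..., A_n, which is exactly why discrepancies can be absorbed by adjusting individual fragments of the system rather than by rejecting T itself.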
The scientist tries to ensure that scientific concepts satisfy the principle of testability (the verification principle) or at least the principle of refutability (the falsification principle).
The verification principle states: only verifiable statements are scientifically meaningful [1].

Scientists scrutinize each other's discoveries as well as their own. In this they differ from people who are alien to science.
The "Carnap circle" helps to distinguish between what can be tested and what is in principle impossible to verify (it is usually considered in a philosophy course in connection with the topic "Neopositivism"). The statement "Natasha loves Petya" [2] is not verifiable (not scientifically meaningful). The statements "Natasha says she loves Petya" or "Natasha says that she is a frog princess" are verifiable (scientifically meaningful).
The falsification principle [3] does not recognize as scientific a statement that is confirmed by any other statements whatsoever (sometimes even mutually exclusive ones) and cannot in principle be refuted. There are people for whom any statement is yet another proof that they were right. If you tell such a person something, he will answer: "What did I say!" You tell him the direct opposite, and again: "You see, I was right!" [4]

Having formulated the falsification principle, Popper supplemented the verification principle as follows:
a) a concept is scientifically meaningful if it is consistent with the experimental facts and if there are conceivable facts that, were they discovered, would refute it; such a concept is true;
b) a concept is scientifically meaningful if it is refuted by the facts and if there are conceivable facts that, were they discovered, would confirm it; such a concept is false.
If the conditions for at least an indirect check are formulated, the asserted thesis becomes more reliable knowledge.
If it is impossible (or very difficult) to find confirmation, try to make sure that at least there are no refutations (a kind of "presumption of innocence").
Suppose we cannot test some assertion. Then we try to make sure at least that the statements opposite to it are not confirmed. In a similarly peculiar way, "by contraries," one frivolous person checked her feelings: "Honey! I meet other men to make sure that I truly love only you..."
A stricter analogy with what we are talking about exists in logic: the so-called apagogic proof (from the Greek apagōgos - leading away). The conclusion about the truth of a certain statement is made indirectly, namely, by refuting the statement that contradicts it.
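In the notation of classical logic (an illustrative rendering, not a formula from the source), apagogic proof, i.e. proof by contradiction, has the form:

\[ \frac{\neg A \rightarrow (B \wedge \neg B)}{A} \]

The assumption that A is false leads to a contradiction, so the negation of A is refuted and A itself is accepted indirectly.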
Developing the principle of falsification, Popper sought to implement a more effective demarcation between scientific and non-scientific knowledge.
According to Academician Migdal, professionals, unlike amateurs, are constantly striving to refute themselves...
The same idea was expressed by Louis Pasteur: a true researcher is one who tries to "destroy" his own discovery, stubbornly testing it for strength.
So, in science, great importance is attached to the reliability of facts, their representativeness, as well as the logical validity of the hypotheses and theories created on their basis.
At the same time, scientific ideas include elements of faith. But this is a special kind of faith, one that does not lead to a transcendent, other world. It is exemplified by axioms and basic principles "taken on faith".
I. S. Shklovsky, in his scientific bestseller The Universe, Life, Mind, introduced a fruitful principle called the "presumption of naturalness": any discovered phenomenon is automatically considered natural unless the contrary is reliably proven.
Closely interrelated within science are the orientations towards belief, trust, and rechecking.
More often than not, scientists believe only what can be verified. Not everything can be verified by oneself: someone rechecks, and someone else trusts the one who rechecked. Reputable professional experts are trusted most of all.
Often "what is a priori for the individual is a posteriori for the species" (on this thesis, see Topic 16 on CSE, as well as the question on "Evolutionary Epistemology").
[1] How would you react to my claim that I have invented a "standard of invisibility" but cannot show it to anyone - because it is invisible?
[2] This statement can be either true or false in a particular case. After all, not every Natasha loves every Petya. Some Natasha, perhaps, loves some Petya, while another Petya either does not know it or is indifferent to her. Besides, different people understand love in different ways. For some, "to love means to run into the depths of the courtyard and, till the rook-black night, forgetting about everything, chop wood, playing with one's strength" (V. Mayakovsky). For others it is a voluntary death ("The Case of Cornet Elagin" by I. A. Bunin).
One can check the truth of the statements "Natasha received a diploma" or "Petya lost his keys". But love is a deeply internal, subjective, intimate feeling, and no "lie detector" will help to "check" love from outside, given its unique intrinsic value for a person.
[3] Introduced by the well-known Austrian-British philosopher and sociologist of science K. Popper (1902-1994).
[4] A concrete everyday example. The husband, returning home, reports: "Kostya called me at work and said that he passed the exam with top marks!" Wife: "What did I say? He's our child prodigy!" Husband: "No, it wasn't our Kostya who passed with top marks, it was his friend, his namesake. Our son failed." Wife: "What did I say? He's a hopeless dunce..."

16. FOUNDATIONS OF SCIENCE - the fundamental ideas, concepts, and principles of science that determine the research strategy, organize the diversity of specific theoretical and empirical knowledge into an integral system, and ensure its inclusion in the culture of a particular historical era.

The problem of the foundations of science was actively developed in the philosophy of science of the 20th century. The growing interest in this issue was stimulated by the scientific revolutions of the 20th century (in physics, cosmology, biology), by the emergence of new areas and branches of science (cybernetics, information theory), and by the intensified processes of differentiation and integration of the sciences. In all these situations it became necessary to comprehend the fundamental concepts, ideas, and images that determine the strategies of scientific research and their historical variability.

A number of components and aspects of the foundations of science were identified and analyzed in Western philosophy of science in the second half of the 20th century. T. Kuhn designated them as a paradigm; S. Toulmin as "principles of natural order" and "ideals and standards of understanding"; in the conception of G. Holton they were presented as the fundamental themata of science; I. Lakatos described their functioning in terms of research programmes; L. Laudan analyzed them as a research tradition characterized by accepted methodological and ontological assumptions and prohibitions. In Russian philosophy of science, the problems of the foundations of science have been studied both in terms of the internal structure and dynamics of scientific knowledge and in the aspect of its socio-cultural conditioning, which has made it possible to present the structure and functions of the foundations of science more analytically. The structure of the foundations of science is determined by the connections of three main components: 1) the ideals and norms of research, 2) the scientific picture of the world, 3) the philosophical foundations of science (see Ideals and norms of science, Scientific picture of the world, Philosophical foundations of science).

The foundations of science perform the following functions: 1) they determine the formulation of problems and the search for means of solving them, acting as the fundamental research programme of science; 2) they serve as the system-forming basis of scientific knowledge, uniting the variety of theoretical and empirical knowledge of each scientific discipline into an integral system, and determine the strategy of interdisciplinary interactions and the interdisciplinary synthesis of knowledge; 3) they act as a mediating link between science and other areas of culture, determining the character of the impact of socio-cultural factors on the processes of formation of theoretical and empirical knowledge and the reverse influence of scientific achievements on the culture of a particular historical era. The transformation of the foundations of science takes place in epochs of scientific revolutions and constitutes the main content of revolutionary transformations in science. These transformations determine the formation of new types of scientific rationality. See also the article Science.

18. IDEALS AND NORMS OF SCIENCE - regulative ideas and principles that express notions of the values of scientific activity, its goals, and the ways of achieving them. Corresponding to the two aspects of the functioning of science - as a cognitive activity and as a social institution - a distinction is made between: a) cognitive ideals and norms, which regulate the process of reproducing an object in various forms of scientific knowledge; b) social norms, which fix the role of science and its value for social life at a certain stage of historical development and govern the process of communication among researchers, the relations of scientific communities and institutions with one another and with society as a whole, etc.

Cognitive ideals and norms are realized in the following main forms: ideals and norms of 1) explanation and description, 2) proof and substantiation of knowledge, 3) the construction and organization of knowledge. Taken together, they form a distinctive scheme of the method of research activity that ensures the mastering of objects of a certain type. On the basis of cognitive ideals and norms, the concrete methods of empirical and theoretical research specific to each science and its objects are formed. The ideals and norms of science develop historically. In their content three interrelated levels of meaning can be distinguished, which express: 1) general characteristics of scientific rationality; 2) their modification in the various historical types of science; 3) their concretization with respect to the specifics of the objects of a particular scientific discipline.

The first level is represented by the features that distinguish science from other forms of cognition (everyday cognition, art, philosophy, the religious and mythological exploration of the world, etc.). In different historical epochs the nature of scientific knowledge, the procedures for its substantiation, and the standards of proof were understood differently. Yet the fact that scientific knowledge is different from opinion, that it must be substantiated and proven, and that science cannot limit itself to directly stating phenomena but must reveal their essence - these normative requirements were met in ancient and medieval science, in the science of modern times, and in the science of the 20th century.

The second level of the content of the ideals and norms of research is represented by historically changeable attitudes that characterize the type of scientific rationality and the style of thinking dominant in science at a certain historical stage of its development. Thus, comparing ancient Greek mathematics with the mathematics of Ancient Babylon and Ancient Egypt, one can find differences in the ideals of the organization of knowledge. The ideal of presenting knowledge as a set of recipes for solving problems, adopted in the mathematics of Ancient Egypt and Babylon, was replaced in Greek mathematics by the ideal of organizing knowledge as an integral theoretical system in which theoretical consequences are derived from initial premises (postulates). The most striking realization of this ideal was Euclidean geometry.

When comparing the methods of substantiating knowledge that prevailed in medieval science with the research standards adopted in the science of modern times, a change in the ideals and norms of proof and substantiation of knowledge is revealed. In accordance with the general worldview principles, value orientations, and cognitive attitudes that had developed in the culture of his time, the scientist of the Middle Ages distinguished between correct knowledge, verified by observation and yielding a practical effect, and true knowledge, which reveals the symbolic meaning of things and makes it possible to see the macrocosm through the sensory things of the microcosm and, through earthly objects, to come into contact with the world of heavenly essences. Therefore, when knowledge was justified in medieval science, references to experience as evidence of the correspondence of knowledge to the properties of things meant, at best, revealing only one of the many meanings of a thing, and by no means the main one. In the process of the formation of natural science at the end of the 16th and in the 17th century, new ideals and norms of the substantiation of knowledge were established. In accordance with the new value orientations and worldview attitudes, the main goal of cognition was defined as the study and disclosure of the natural properties and relations of objects and the discovery of the natural causes and laws of nature. Hence the requirement of experimental verifiability was put forward as the main requirement for the validity of knowledge about nature. Experiment began to be regarded as the most important criterion of the truth of knowledge.

The historical development of natural science was associated with the formation of classical, then non-classical, and then post-non-classical rationality, each of which changed the previous characteristics of the ideals and norms of research (see Science). For example, a physicist of the classical era would not have accepted the ideals of the quantum-mechanical description, in which the theoretical characteristics of an object are given through references to the nature of the instruments and, instead of an integral picture of the physical world, two complementary pictures are offered, one giving the space-time description and the other the causal description of phenomena. Classical physics and quantum-relativistic physics are different types of scientific rationality, which find concrete expression in a different understanding of the ideals and norms of research.

Finally, a third level can be distinguished in the content of the ideals and norms of scientific research, at which the attitudes of the second level are concretized with respect to the specifics of the subject area of each science (mathematics, physics, biology, the social sciences, etc.). For example, mathematics has no ideal of the experimental verification of a theory, whereas for the experimental sciences it is obligatory. In physics there are special standards for the substantiation of developed mathematical theories, expressed in the principles of observability, correspondence, and invariance. These principles regulate physical research, but they are redundant for sciences that are only entering the stage of theorization and mathematization. Modern biology cannot do without the idea of evolution, and therefore the methods of historicism are organically included in the system of its cognitive attitudes. Physics, however, has so far not explicitly resorted to these methods. If in biology the idea of development extends to the laws of living nature (these laws arise together with the formation of life), then in physics, until recently, the problem of the origin of the physical laws operating in the Universe was not raised at all. Only in the modern era, thanks to the development of the theory of elementary particles in close connection with cosmology, as well as the achievements of the thermodynamics of non-equilibrium systems (I. Prigogine's conception) and of synergetics, have evolutionary ideas begun to penetrate physics, causing changes in previously established disciplinary ideals and norms.

A special system of cognition regulators is characteristic of the social sciences and the humanities. They take into account the specifics of social objects - their historical dynamics and the organic involvement of consciousness in the development and functioning of social processes.

The ideals and norms of science are determined in two ways: on the one hand, by the nature of the objects under study, and on the other, by the worldview structures that dominate the culture of a particular historical era. The first is most clearly manifested at the level of the disciplinary component of the content of the ideals and norms of cognition, the second at the level expressing the historical type of scientific rationality. Defining the general scheme of the method of activity, ideals and norms regulate the construction of various types of theories, the carrying out of observations, and the formation of empirical facts. The researcher may not be aware of all the normative structures used in the search, many of which he takes for granted; most often he assimilates them by orienting himself towards samples of research already carried out and towards their results. The processes of constructing and using scientific knowledge demonstrate the ideals and norms in accordance with which that knowledge was created, and within their system distinctive reference samples arise on which the researcher orients himself. For example, for Newton the ideals and norms of the organization of theoretical knowledge were expressed by Euclidean geometry, and he created his mechanics with this model in view. In turn, Newtonian mechanics served as a kind of standard for Ampère when he set himself the task of creating a generalizing theory of electricity and magnetism.

At the same time, the historical variability of ideals and norms and the need to develop new regulative principles of research create a need for their comprehension and rational explication. The result of such reflection is the methodological principles of science, in whose system the ideals and norms of research are described. The development of new methodological principles and the establishment of a new system of ideals and norms of science is one of the aspects of global scientific revolutions, in the course of which a new type of scientific rationality arises.

19. SCIENTIFIC PICTURE OF THE WORLD - a holistic image of the subject of scientific research in its main systemic and structural characteristics, formed by means of the fundamental concepts, ideas, and principles of science at each stage of its historical development.

The main varieties (forms) of the scientific picture of the world are: 1) the general scientific picture, as a generalized idea of the Universe, living nature, society, and man, formed on the basis of a synthesis of knowledge obtained in the various scientific disciplines; 2) the social and the natural-science pictures of the world, as ideas about society and nature that generalize the achievements of the social, humanitarian, and natural sciences respectively; 3) the special scientific pictures of the world (disciplinary ontologies) - ideas about the subjects of the individual sciences (the physical, chemical, biological, etc. pictures of the world). In the latter case the term "world" is used in a specific sense, denoting not the world as a whole but the subject area of a separate science (the physical world, the biological world, the world of chemical processes). To avoid terminological problems, the term "picture of the reality under study" is also used to designate disciplinary ontologies. Its most thoroughly studied example is the physical picture of the world, but such pictures exist in any science as soon as it is constituted as an independent branch of scientific knowledge. A generalized system-structural image of the subject of research is introduced into a special scientific picture of the world through notions 1) of the fundamental objects out of which all other objects studied by the corresponding science are assumed to be built; 2) of the typology of the objects studied; 3) of the general features of their interaction; 4) of the space-time structure of reality. All these notions can be described in a system of ontological principles that serve as the foundation for the scientific theories of the relevant discipline. For example, the principles that the world consists of indivisible corpuscles, that their interaction is strictly determined and carried out as an instantaneous transfer of forces in a straight line, and that the corpuscles and the bodies formed from them move in absolute space in the course of absolute time describe the picture of the physical world that took shape in the second half of the 17th century and was later called the mechanical picture of the world.

The transition from the mechanical to the electrodynamic (at the end of the 19th century) and then to the quantum-relativistic picture of physical reality (the first half of the 20th century) was accompanied by a change in the system of the ontological principles of physics. The change was most radical during the formation of quantum-relativistic physics (the revision of the principles of the indivisibility of atoms, of the existence of absolute space-time, and of the Laplacian determinism of physical processes).

By analogy with the physical picture of the world, pictures of the reality under study are distinguished in the other sciences (chemistry, astronomy, biology, etc.). Among them there are likewise historically successive types of pictures of the world. For example, in the history of biology there was a transition from pre-Darwinian ideas about the living to the picture of the biological world proposed by Darwin, then to the inclusion in the picture of living nature of ideas about genes as carriers of heredity, and then to modern ideas about the levels of the systemic organization of the living - populations, biogeocenoses, the biosphere - and their evolution.

Each of the concrete historical forms of a special scientific picture of the world can be realized in a number of modifications. Among them there are lines of succession (for example, the development of the Newtonian ideas about the physical world by Euler, or the development of the electrodynamic picture of the world by Faraday, Maxwell, Hertz, and Lorentz, each of whom introduced new elements into this picture). But situations are also possible in which one and the same type of picture of the world is realized in the form of competing, alternative ideas about the reality under study (for example, the struggle between the Newtonian and Cartesian conceptions of nature as alternative versions of the mechanical picture of the world, or the competition between the two main directions in the development of the electrodynamic picture of the world - the Ampère-Weber programme, on the one hand, and the Faraday-Maxwell programme, on the other).

The picture of the world is a special type of theoretical knowledge. It can be regarded as a certain theoretical model of the reality under study, distinct from the models (theoretical schemes) underlying specific theories. First, they differ in degree of generality: many theories, including fundamental ones, can be based on one and the same picture of the world. For example, the mechanics of Newton and Euler, thermodynamics, and the electrodynamics of Ampère and Weber were connected with the mechanical picture of the world; with the electrodynamic picture of the world are connected not only the foundations of Maxwellian electrodynamics but also the foundations of Hertzian mechanics. Second, a special picture of the world can be distinguished from theoretical schemes by analyzing the abstractions (ideal objects) that form them. Thus, in the mechanical picture of the world the processes of nature were characterized by means of the abstractions "indivisible corpuscle", "body", "interaction of bodies, transmitted instantaneously in a straight line and changing the state of motion of bodies", "absolute space", and "absolute time". As for the theoretical scheme underlying Newtonian mechanics (taken in its Eulerian presentation), the essence of mechanical processes is characterized in it by means of other abstractions - "material point", "force", "inertial space-time frame of reference".

The ideal objects that form a picture of the world, unlike the idealizations of specific theoretical models, always have an ontological status. Any physicist understands that a "material point" does not exist in nature itself, because in nature there are no bodies devoid of dimensions. But a follower of Newton who accepted the mechanical picture of the world considered indivisible atoms to be really existing "first bricks" of matter: he identified with nature the simplifying, schematizing abstractions in whose system the physical picture of the world is created. In which particular respects these abstractions do not correspond to reality the researcher most often finds out only when his science enters a period in which the old picture of the world breaks down and is replaced by a new one. Though distinct from the picture of the world, the theoretical schemes that form the core of a theory are always connected with it, and the establishment of this connection is one of the obligatory conditions for constructing the theory. The procedure of mapping theoretical models (schemes) onto the picture of the world provides the kind of interpretation of the equations expressing theoretical laws that in logic is called conceptual (or semantic) interpretation and that is obligatory for the construction of a theory. Outside the picture of the world, a theory cannot be built in complete form.

Scientific pictures of the world perform three main, interrelated functions in the research process: 1) they systematize scientific knowledge, uniting it into a complex whole; 2) they act as research programmes that determine the strategy of scientific cognition; 3) they ensure the objectification of scientific knowledge, its attribution to the object under study, and its inclusion in culture.

A special scientific picture of the world integrates knowledge within an individual scientific discipline. The natural-science and social pictures of the world, and then the general scientific picture of the world, set broader horizons for the systematization of knowledge. They integrate the achievements of different disciplines, singling out the stable, empirically and theoretically substantiated content in the disciplinary ontologies. For example, the ideas of the modern general scientific picture of the world about the non-stationary Universe and the Big Bang, about quarks and synergetic processes, about genes, ecosystems, and the biosphere, about society as an integral system, about formations and civilizations, etc. were developed within the framework of the corresponding disciplinary ontologies of physics, biology, and the social sciences and were then included in the general scientific picture of the world.

While carrying out its systematizing function, the scientific picture of the world at the same time plays the role of a research programme. Special scientific pictures of the world set the strategy of empirical and theoretical research within the relevant fields of science. In relation to empirical research, the goal-directing role of special pictures of the world is most clearly manifested when science begins to study objects for which no theory has yet been created and which are investigated by empirical methods (typical examples are the role of the electrodynamic picture of the world in the experimental study of cathode rays and X-rays). The notions of the reality under study introduced in the picture of the world provide hypotheses about the nature of the phenomena discovered in experiment. In accordance with these hypotheses, experimental tasks are formulated and plans of experiments are developed, through which new characteristics of the objects studied in experiment are discovered.

In theoretical research, the role of a special scientific picture of the world as a research programme is manifested in the fact that it determines the range of permissible problems and their formulation at the initial stage of theoretical search, as well as the choice of the theoretical means for solving them. For example, during the construction of generalizing theories of electromagnetism two physical pictures of the world and, accordingly, two research programmes competed: that of Ampère and Weber, on the one hand, and that of Faraday and Maxwell, on the other. They posed different problems and determined different means of constructing a general theory of electromagnetism. The Ampère-Weber programme proceeded from the principle of action at a distance and relied on the mathematical means of point mechanics, while the Faraday-Maxwell programme was based on the principle of short-range action and borrowed mathematical structures from continuum mechanics.

In interdisciplinary interactions based on the transfer of ideas from one field of knowledge to another, the role of the research program is played by the general scientific picture of the world. It reveals similar features of disciplinary ontologies, thereby forming the basis for the translation of ideas, concepts and methods from one science to another. The exchange processes between quantum physics and chemistry, biology and cybernetics, which gave rise to a number of discoveries in the 20th century, were purposefully directed and regulated by the general scientific picture of the world.

Facts and theories created under the goal-directing influence of a special scientific picture of the world are again correlated with it, which leads to two possible ways of changing it. If the notions of the picture of the world express the essential characteristics of the objects under study, these notions are refined and concretized. But if research encounters fundamentally new types of objects, a radical restructuring of the picture of the world takes place. Such restructuring is a necessary component of scientific revolutions. It involves the active use of philosophical ideas and the substantiation of the new notions by the accumulated empirical and theoretical material. Initially, a new picture of the reality under study is put forward as a hypothesis. Its empirical and theoretical substantiation may take a long period, during which it competes as a new research programme with the previously accepted special scientific picture of the world. The establishment of the new ideas about reality as a disciplinary ontology is ensured not only by their confirmation by experience and their serving as the basis of new fundamental theories, but also by their philosophical and worldview justification (see Philosophical foundations of science).

The ideas about the world introduced in the pictures of the reality under study always experience a certain influence of analogies and associations drawn from various spheres of cultural creativity, including everyday consciousness and the production experience of a particular historical era. For example, the concepts of electric fluid and caloric, included in the mechanical picture of the world in the 18th century, were formed largely under the influence of images of objects drawn from everyday experience and the technology of the corresponding era. It was easier for the common sense of the 18th century to accept the existence of non-mechanical forces by representing them in the image and likeness of mechanical ones, for example by representing the flow of heat as the flow of a weightless fluid, caloric, falling like a jet of water from one level to another and thereby performing work in the same way as water performs it in hydraulic devices. At the same time, the introduction into the mechanical picture of the world of ideas about various substances as carriers of forces also contained an element of objective knowledge. The idea of qualitatively different types of forces was the first step towards recognizing that the various types of interaction cannot all be reduced to the mechanical, and it contributed to the formation of special ideas, different from the mechanical ones, about the structure of each of these types of interaction.

The ontological status of scientific pictures of the world is a necessary condition for the objectification of the specific empirical and theoretical knowledge of a scientific discipline and for its inclusion in culture.

Through reference to the scientific picture of the world, the special achievements of science acquire a general cultural meaning and worldview significance. For example, the basic physical idea of the general theory of relativity, taken in its special theoretical form (the components of the fundamental metric tensor, which determines the metric of four-dimensional space-time, at the same time act as the potentials of the gravitational field), is little understood by those not engaged in theoretical physics. But when this idea is formulated in the language of the picture of the world (the character of the geometry of space-time and the character of the gravitational field are mutually determined), it acquires the status of a scientific truth that is understandable to non-specialists and has worldview meaning. This truth modifies the idea of homogeneous Euclidean space and quasi-Euclidean time, which, through the system of education and upbringing, had since the time of Galileo and Newton become a worldview postulate of everyday consciousness. The same is true of many discoveries of science that have been included in the scientific picture of the world and through it influence the worldview orientations of human life. The historical development of the scientific picture of the world is expressed not only in a change of its content; its forms are also historical. In the 17th century, in the era of the emergence of natural science, the mechanical picture of the world was at once the physical, the natural-science, and the general scientific picture of the world. With the advent of disciplinarily organized science (end of the 18th to first half of the 19th century), a spectrum of special scientific pictures of the world emerged. They became special, autonomous forms of knowledge, organizing the facts and theories of each scientific discipline into a system. The problem arose of constructing a general scientific picture of the world that would synthesize the achievements of the individual sciences, and the unity of scientific knowledge became a key philosophical problem of the science of the 19th and the first half of the 20th century. The strengthening of interdisciplinary interactions in 20th-century science has led to a decrease in the level of autonomy of the special scientific pictures of the world. They are being integrated into blocks of the natural-science and social pictures of the world, whose basic notions are included in the general scientific picture of the world. In the second half of the 20th century, the general scientific picture of the world began to develop on the basis of the ideas of universal (global) evolutionism, which combines the principles of evolution and the systems approach. Genetic connections between the inorganic world, living nature, and society are being revealed, as a result of which the sharp opposition between the natural-science and the social pictures of the world is being eliminated. Accordingly, the integrative connections of the disciplinary ontologies are being strengthened, and they increasingly act as fragments or aspects of a single general scientific picture of the world.

20. PHILOSOPHICAL FOUNDATIONS OF SCIENCE - the system of philosophical ideas and principles by means of which the notions of the scientific picture of the world and the ideals and norms of science are substantiated, and which serve as one of the conditions for the inclusion of scientific knowledge in the culture of the corresponding historical era.

In the fundamental areas of research, developed science deals, as a rule, with objects that have not yet been mastered either in production or in everyday experience (sometimes the practical mastering of such objects occurs not in the era in which they were discovered but in a later historical era). For ordinary common sense these objects may be unusual and incomprehensible, and knowledge about them, as well as the methods of obtaining such knowledge, may differ significantly from the standards and notions about the world of the everyday knowledge of the corresponding historical era. Therefore the scientific pictures of the world (the scheme of the object), as well as the ideals and normative structures of science (the scheme of the method), need, not only in the period of their formation but also in subsequent periods of restructuring, a kind of coordination with the dominant worldview of the particular historical era and with the dominant meanings of the universals of culture. Such coordination is provided by the philosophical foundations of science. Along with substantiating postulates, they also include ideas and principles that determine the heuristics of the search. These principles usually guide...

Models play an important role in scientific and theoretical knowledge. They make it possible to visualize objects and processes that are inaccessible to direct perception: for example, a model of the atom, a model of the Universe, a model of the human genome, etc. Theoretical models reflect the structure, properties, and behavior of real objects. The construction of a scientific model is the result of the interaction between the subject of scientific-cognitive activity and reality. There is a point of view according to which primary models can be assessed as metaphors that rest on observations and on conclusions drawn from observations, and that contribute to the visual representation and retention of information. The well-known Western philosopher of science I. Lakatos noted that the process of formation of primary theoretical models can rest on programmes of three kinds: first, the Euclidean system (the Euclidean programme); second, the empiricist programme; and third, the inductivist programme. All three programmes start from the organization of knowledge as a deductive system.

The Euclidean programme, which assumes that everything can be deduced from a finite set of trivially true propositions consisting only of terms with a trivial semantic load, is commonly called the programme of trivialization of knowledge. This programme contains only true judgments, but it does not operate with conjectures or refutations. Knowledge as truth is injected at the top of the theory and, without any deformation, "flows down" from the primitive terms to the defined terms.

Unlike the Euclidean programme, the empiricist programme is built on basic propositions of a well-known empirical character. Empiricists cannot admit any injection of meaning other than from the bottom of the theory. If these basic propositions turn out to be false, that falsity rises up the channels of deduction and floods the entire system; the empiricist theory is therefore conjectural and falsifiable. And if the Euclidean theory places truth at the top and illuminates it with the natural light of reason, the empiricist theory places it at the bottom and illuminates it with the light of experience. But both programmes rely on intuition.

The emergence of the inductivist programme was associated with the dark pre-Copernican times, when refutation was considered indecent and conjecture was despised. Inductive logic was later replaced by probabilistic logic. The final blow to inductivism was dealt by Popper, who showed that even a partial transmission of truth and meaning cannot travel upward from below.

According to Academician V. S. Stepin, “the main feature of theoretical schemes is that they are not the result of a purely deductive generalization of experience.” In advanced science, theoretical schemes are first constructed as hypothetical models using previously formulated abstract objects. At the early stages of scientific research, the constructs of theoretical models are created by direct schematization of experience.


Important characteristics of a theoretical model are its structure and the possibility of transferring abstract objects into it from other areas of knowledge. According to Lakatos, the main structural units to be considered are the hard core, the protective belt of hypotheses, and the positive and negative heuristics. The negative heuristic forbids directing refutations at the hard core of the programme.

Theoretical objects convey the meaning of such concepts as "ideal gas", "absolutely black body", "point", "force", "circle", "segment", etc. In reality there are no isolated systems that experience no external influence whatsoever; therefore the whole of classical mechanics, oriented towards closed systems, is built with the help of theoretical constructs.

How does the process of forming a law proceed?

The concept of "law" indicates the presence of internally necessary, stable and repetitive links between events and states of objects. The law reflects objectively existing interactions in nature and in this sense is understood as a natural regularity. The laws of science, aimed at reflecting natural patterns, are formulated using artificial languages ​​of their disciplinary field. The laws developed by the human community as the norms of human coexistence differ significantly from the laws of the natural sciences and, as a rule, have a conventional character. There are "probabilistic" (statistical) laws based on probabilistic hypotheses regarding the interaction of a large number of elements, and "dynamic" laws, i.e. laws in the form of universal conditions.

The laws of science reflect the most general and deepest natural and social interactions, they tend to adequately reflect the laws of nature. However, the very measure of adequacy and the fact that the laws of science are generalizations that are changeable and subject to refutation, brings to life a very acute philosophical and methodological problem about the nature of laws. It is no coincidence that Kepler and Copernicus understood the laws of science as hypotheses. Kant was generally convinced that laws are not derived from nature, but are prescribed to it.

The formation of a law presupposes that a hypothetical model, substantiated experimentally or empirically, acquires the capacity to turn into a scheme. Theoretical schemes are at first introduced as hypothetical constructions, but are then adapted to a certain set of experiments and in this process are justified as a generalization of experience. There follows the stage of applying the scheme to the qualitative diversity of things, i.e., its qualitative extension, and only after that the stage of quantitative mathematical formulation in the form of an equation or formula, which marks the phase of the emergence of a law. Thus model - scheme - qualitative and quantitative extension - mathematization - formulation of the law: such is the chain established in science.

At all stages of scientific research, without exception, both the abstract objects themselves and the theoretical schemes, as well as their quantitative mathematical formalizations, are in fact being corrected. Theoretical schemes can also be modified under the influence of the mathematical tools, but all these transformations remain within the limits of the hypothetical model that has been put forward. V. S. Stepin emphasizes that "in classical physics one can speak of two stages in the construction of particular theoretical schemes as hypotheses: the stage of their construction as content-physical models of a certain area of interactions, and the stage of the possible restructuring of the theoretical models in the process of connecting them with the mathematical apparatus." At the higher stages of development these two aspects of hypothesis construction merge, while at the early stages they are separate.

Scientific research in various fields seeks not only to generalize certain events in the world of our experience, but also to identify regularities in the course of these events, to establish general laws that can be used for prediction and explanation.