
Formation of primary theoretical models and laws and the formation of a developed theory

The theoretical model is a universal means of modern scientific knowledge, serving to reproduce and fix, in symbolic form, the structure, properties and behavior of real objects. Theoretical models make it possible to visually recreate objects and processes that are inaccessible to direct perception (for example, a model of the atom, a model of the Universe, a model of the human genome) in situations where there is no direct access to reality. Being constructions and idealizations aimed at reproducing the invariant relationships of the elements acting in a system, theoretical models are a kind of representation of the objective world. They allow us to consider reality from the point of view of the "observer system". The scientific community regards theoretical modeling as an important and necessary tool and, at the same time, as a stage in the research process. Theoretical modeling testifies to the rigor, orderliness and rationality of the process of scientific knowledge.

Primary theoretical models are tied most closely to empirically obtained data, whose generalization they propose in light of an explanatory hypothesis. In essence, they present researchers with a certain artifact (an artificially created object). In other words, primary theoretical models offer an accessible and consistent imitation of the action of the basic laws governing a particular process.

The important characteristics of the theoretical model are: (a) its structure; (b) the possibility of transferring abstract objects from other areas of knowledge. Primary theoretical models should take into account the physical, functional, geometric or dynamic characteristics of real processes. They claim to be "recognizable" and illustrative, on the one hand, and open to further refinement and transformation, on the other. It is important to note the "inconclusive" nature of primary theoretical models, which can be refined as a result of active experimentation, new observational data, the discovery of new facts, or the emergence of a new theory. The Russian philosopher of science V.S. Stepin believes that in the early stages of scientific research, theoretical models are created by direct schematization of experience.

In order for a primary theoretical model to be accepted, it must have "explanatory power" and be isomorphic to real processes. Informativeness and self-sufficiency are important characteristics of true theoretical models that help us understand the existing patterns of the world. In the history of science, it is not uncommon for primary theoretical models to turn out to be "inoperative". It is important to emphasize that although the quality of "similarity" is important for theoretical models, they reproduce reality in an ideal, extremely perfected form. But whereas idealization is the mental construction of objects that do not exist or are not realized within the parameters of the given world, a theoretical model is the construction of the deep interconnections of really existing processes. Theoretical models capture putatively true situations.

According to modern philosophers of science, for example I. Lakatos, the formation of primary theoretical models can be based on the following methodological programs: (a) Euclidean; (b) empiricist; (c) inductivist. The Euclidean program, in which axiomatic construction is considered exemplary, assumes that all knowledge can be deduced from an initial finite set of self-evident truths consisting of terms with a trivial semantic load. Knowledge as truth is introduced at the top of the theory and, without any deformation, "flows" from primitive terms to defined terms. This program is therefore called the program of the trivialization of knowledge. And if the Euclidean theory places truth at the top and illuminates it with the natural light of reason, the empiricist program places truth at the bottom and illuminates it with the light of experience. The empiricist program is built on basic propositions of a well-known empirical nature. It is important to emphasize that both programs include and recognize the moment of logical intuition. In the inductivist program, "expelled from the upper level, the mind seeks to find refuge and builds a channel through which the truth flows from the bottom up from the basic provisions. 'Power' is transferred to the facts and an additional logical principle is established: the relaying of truth" (Lakatos). We can agree with I. Lakatos's conclusion that the theoretical model that gains acceptance is the one with greater empirical content than its predecessor. To correlate a theoretical model with reality, a long chain of logical conclusions and consequences is often needed.

Theoretical models cannot be built without their key elements: abstract objects (from the Latin abstrahere, to extract or separate), which represent the abstraction of certain properties and characteristics from the composition of a holistic phenomenon and the reworking (or "finishing") of these extracted properties into an independent object. Examples of abstract objects: the "ideal gas", the "absolutely rigid body", the "point", "force", the "circle", the "segment", the "perfectly competitive market", etc. The choice of particular abstract objects involves a certain "intellectual risk". The enormous importance of abstract objects is already evident from the fact that abstracting the extension of bodies from their mass gave rise to geometry, while the opposite abstraction, of mass from extension, gave rise to mechanics. The choice of abstract objects is significantly influenced by the scientific picture of the world.
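The working of such an abstract object can be made concrete in code. The sketch below is our own minimal illustration, not part of the source text: the "ideal gas" is exhausted by the state equation PV = nRT, a law that no real gas satisfies exactly.

```python
# A minimal sketch of how an abstract object functions in a model:
# the "ideal gas" is defined entirely by the relation PV = nRT,
# ignoring the molecular size and intermolecular forces of any real gas.

R = 8.314  # universal gas constant, J/(mol*K)

def ideal_gas_pressure(n_moles: float, temperature_k: float, volume_m3: float) -> float:
    """Pressure of an ideal gas from the state equation P = nRT / V."""
    return n_moles * R * temperature_k / volume_m3

# One mole at room temperature in a 10-litre vessel:
p = ideal_gas_pressure(1.0, 293.15, 0.010)
print(f"{p:.0f} Pa")  # ~244 kPa; a real gas would deviate slightly
```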

Abstract objects, being idealizations of reality, are also called theoretical constructs or theoretical objects. They can contain both features that correspond to real objects and idealized (mentally constructed) properties that no real object possesses. Abstract objects replace certain connections of reality, but they cannot have the status of real physical objects, since they are idealizations. It is generally held that an abstract object is much simpler than a real one.

Since primary theoretical models are predominantly hypothetical, factual confirmation is important for them, and therefore the stage of their justification, during which they are adapted to a certain set of experiments, becomes a methodological norm. Otherwise, one risks arbitrariness on the part of scientists and pseudoscientific theorizing. The stage of creating a theoretical model is therefore followed by the stage of its application to the qualitative variety of things, i.e., its qualitative expansion, after which follows the stage of quantitative mathematical formulation in the form of an equation or formula. This marks the phase at which the law is formulated, although at every stage, without exception, correction of the abstract objects themselves, of their theoretical schemes, and of the quantitative mathematical formalizations actually takes place. V.S. Stepin emphasizes that "in classical physics, one can speak of two stages in the construction of particular theoretical schemes as hypotheses: the stage of their construction as content-physical models of a certain area of interactions and the stage of possible restructuring of theoretical models in the process of their connection with the mathematical apparatus". Laws reflect the most essential, necessary and recurring connections and interactions of the processes and phenomena of the universe. A law reflects objectively existing interactions in nature and in this sense is understood as a natural regularity.

There are a number of fundamental laws that reflect the fundamental interactions within our universe. Science resorts to artificial languages to formulate these laws of nature. The laws of human life, developed by the human community as norms of social coexistence, are, as a rule, conditional and conventional.

The sphere of scientific knowledge is divided into empirical and theoretical levels (see the previous chapter). Experience, experiment and observation are components of the empirical level of knowledge. Abstractions, idealized objects, concepts, formulas and principles are necessary components of the theoretical level. The theoretical and empirical levels of cognition cannot be reduced to the opposition of the sensory and the rational: at both levels there is an interaction and unity of the sensory and the rational.

The developed theory is not just a set of interrelated propositions; it contains a mechanism of conceptual movement and of the internal unfolding of content, and includes a program for building knowledge. In this regard one speaks of the integrity of the theory. The classical stage in the development of science was characterized by the ideal of deductively constructed theories.

Descriptive theories are focused on ordering and systematizing empirical material. Mathematical theories, which use mathematical formalism, involve formal operations with the signs of a mathematized language expressing the parameters of the object. A theory should not be regarded as a "closed", immutable system. It contains the mechanisms of its own development, both through sign-symbolic operations and through the introduction of various hypothetical assumptions. There is also the method of the thought experiment with idealized objects, which likewise provides an increment in the content of the theory.

The language of theory, built on top of natural language, is in turn subject to a certain hierarchy, reflecting the hierarchy of scientific knowledge itself. Different sciences have independent subject areas and therefore require specific languages. Scientific language is the specific conceptual apparatus of a scientific theory together with the means of proof acceptable within it. As a sign system, it is created to serve as an effective means of thinking. The very process of advancing towards a true theory is also, in a sense, a success of the "expressive possibilities of language". Many scientists believe that the development of science is directly related to the development of linguistic means of expression, the development of a more perfect language, and the transfer of knowledge from the old language to the new one. One can single out the languages of the empirical and theoretical sciences, the language of observations and descriptions, and so on. In science there is a clear tendency to move from the language of observation to the language of experiment. A convincing example is the language of modern physics, which contains terms denoting phenomena and properties whose very existence was established in the course of various experiments.

In the philosophy and methodology of science, special attention is paid to the logical ordering and concise description of facts. At the same time, it is obvious that ordering, logical concentration and concise description of factual material lead to a significant transformation of the semantic continuum. When descriptive languages go beyond description and point to patterns that unite the given facts, their status changes and a nomological language emerges.

The diverse specification of the various types of scientific language has given rise to the problem of their classification. One fruitful solution was the proposal to classify the languages of scientific theory on the basis of the theory's internal structure: languages came to be distinguished according to which subsystem of the theory they are predominantly used in. In this regard, the following classes of languages of science are distinguished: (a) assertoric, the language of assertion, with whose help the main assertions of a given theory are formulated; assertoric languages are divided into formalized and non-formalized, the former exemplified by formal logical languages, the latter by fragments of natural languages containing affirmative propositions supplemented by scientific terms; (b) model languages, which serve to build models and other elements of the model-representational subsystem of the theory and are likewise divided into formalized (based on mathematical symbolism) and non-formalized; (c) procedural, a language used to describe measurement and experimental procedures, as well as rules for transforming linguistic expressions and processes for posing and solving problems; a feature of procedural languages is their unambiguity; (d) axiological, a language that makes it possible to describe various assessments of the elements of the theory and has the means to compare processes and procedures within the structure of the scientific theory itself; (e) heuristic, a language that describes exploratory search under conditions of uncertainty; it is with the help of heuristic languages that such an important procedure as the formulation of the problem is carried out.

The central components of language are sign and meaning. In science, meaning is understood as the semantic content of a word, which ensures the relative constancy of the structure of speech activity and its attribution to one or another class of objects. A sign is defined as a material object (phenomenon, event) that acts as a representative of something else and is used to acquire, store, process and transmit information. A linguistic sign is qualified as a formation representing an object, property or relation of reality. The totality of such signs, organized into a sign system, forms a language.

The most common ways of creating the artificial languages of scientific theories are: (1) turning natural-language words into terms; (2) calquing terms of foreign origin; (3) formalizing the language. Language does not always have adequate means of reproducing alternative experience; certain symbolic fragments may be absent from its basic vocabulary. For the philosophy of science, it is fundamentally important to study the specifics of language as an effective means of representing and encoding the basic cognitive system, and to clarify the specifics of scientific discourse and the relationship between linguistic and non-linguistic mechanisms of theory construction. The acute problem of the correlation of formal language constructions with reality, and of the analyticity and syntheticity of statements, is present at the stage of constructing and developing a theory. The idea of the universal representativeness of formalized languages, of their ideality, is replete with paradoxical constructions, which brings to life an alternative concept of representation (the representation of objectivity), indicating that the relationship of language structures to the outside world is not limited to formal designation, indication and coding.

In the development of a mature scientific theory, the procedure of verification, i.e., confirmation, is of great importance. At the same time, K. Popper argued that a scientific theory must, in principle, be falsifiable, that is, open to the procedure of refutation. The principle of falsification is an alternative to the principle of verification, and one confirmed by the history of science. A theory is said to be empirical, or falsifiable, if it precisely divides the class of all possible basic statements into two subclasses: first, the class of all those basic statements with which it is incompatible, which it eliminates or forbids (the class of potential falsifiers of the theory); and second, the class of those basic statements that do not contradict it, which it "allows". In other words, as V.S. Stepin puts it, "a theory is falsifiable if the class of its potential falsifiers is not empty."
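Popper's partition of basic statements can be illustrated schematically. The following toy sketch is an interpretive illustration, not Popper's own formalism; the "theory" and the basic statements are invented for the example.

```python
# A toy illustration of falsifiability as a partition of basic statements:
# a theory is falsifiable iff the class of its potential falsifiers is non-empty.

def partition(theory_allows, basic_statements):
    """Split basic statements into potential falsifiers and allowed statements."""
    falsifiers = [s for s in basic_statements if not theory_allows(s)]
    allowed = [s for s in basic_statements if theory_allows(s)]
    return falsifiers, allowed

# Theory: "all swans are white" forbids any report of a non-white swan.
basic = [("swan", "white"), ("swan", "black"), ("raven", "black")]
allows = lambda s: not (s[0] == "swan" and s[1] != "white")

falsifiers, allowed = partition(allows, basic)
print(falsifiers)        # [('swan', 'black')] -- the class is non-empty,
print(bool(falsifiers))  # so the theory counts as falsifiable
```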

A developed scientific theory contains a tendency towards extrapolation, i.e., towards the transfer of its principles and models to all cases of theoretical research. However, extrapolation is largely limited and is not a universal procedure. In the developed theory, the invariant content and the conceptual model of its further growth are preserved. A significant place is occupied by the procedures of interpretation and mathematical formalization.

V.S. Stepin distinguishes three features of the construction of a developed scientific theory. The first indicates that "developed theories of a greater degree of generality are, in modern conditions, created by a team of researchers with a fairly clearly defined division of labor between them"; that is, we are dealing with a collective subject of scientific creativity. This is due to the increasing complexity of the object of study and the growth in the amount of necessary information. "The second feature of the modern epistemological situation is that fundamental theories are increasingly being created without a sufficiently developed layer of primary theoretical schemes and laws"; the "intermediate links necessary for constructing a theory are created in the course of theoretical synthesis". The third feature is the use of the method of mathematical hypothesis: "the construction of a theory begins with attempts to guess its mathematical apparatus".

The developed theory has a prognostic function, which manifests itself in the following types of forecasts: trivial and non-trivial, exploratory and normative. A trivial forecast is built within a system of cause-and-effect relationships and rests on the assumption of a determinacy given by the past state of the system. A non-trivial forecast forces one to take into account the potential influence of factors not included "in the model due to their very low significance in the past", as well as the variability and mobility of the system itself, especially if it is open. A non-trivial forecast uses a so-called preference filter, created on the basis of an image of the desired future. An exploratory forecast identifies the characteristics of objects and events by extrapolating trends found in the present. A normative forecast predicts the possible states of an object in accordance with given norms and goals. At the level of the developed theory it becomes possible to develop and actively use such prognostic methods as the "predictive graph" and the "goal tree". A graph is a geometric figure consisting of vertices (points) connected by edges (segments). Vertices represent goals, edges represent ways of achieving them. Predictable deviations from the supposed direct line of scientific search can occur along the entire length of an edge; the graph then acquires a branching structure reflecting the real course of scientific thought. Graphs may or may not contain cycles (loops), and may be connected or disconnected, directed or undirected. If a connected graph contains no loops and is directed, it is called a goal tree, or tree graph. The graphic image of the tree performs a largely illustrative function and can be replaced by a list of alternative solutions ranked so that progressively less significant levels and events are discarded. To assess their significance, each can be assigned a coefficient of relative importance.
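As an illustration of a goal tree with coefficients of relative importance, here is a hedged sketch; the goal names and coefficients are invented for the example.

```python
# A sketch of a "goal tree": a connected, directed, acyclic graph whose
# vertices are goals and whose edges are ways of achieving them.

from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    importance: float = 1.0                  # coefficient of relative importance
    subgoals: list["Goal"] = field(default_factory=list)

def rank(node: Goal, inherited: float = 1.0) -> list[tuple[str, float]]:
    """Propagate importance down the tree and return goals ranked by weight."""
    weight = inherited * node.importance
    result = [(node.name, weight)]
    for child in node.subgoals:
        result += rank(child, weight)
    return sorted(result, key=lambda pair: -pair[1])

tree = Goal("reliable forecast", 1.0, [
    Goal("collect past data", 0.6),
    Goal("model the system", 0.9, [Goal("estimate parameters", 0.7)]),
])
for name, w in rank(tree):
    print(f"{w:.2f}  {name}")  # less significant goals sink to the bottom
```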

The role of constructive methods in the deductive development of the theory. Deployment of the theory as a process of problem solving

Two sublevels are distinguished within theoretical knowledge: (1) particular theoretical models and laws, which act as theories relating to a fairly limited area of phenomena; (2) developed scientific theories, which include particular theoretical laws as consequences derived from the fundamental laws of the theory.

At each level, theoretical knowledge is organized around a construct: the theoretical model and the theoretical law formulated in relation to it. Their elements are abstract objects standing in strictly defined connections and relationships with one another. Theoretical laws are formulated directly in relation to the abstract objects of the theoretical model.

Theoretical models are not something external to the theory. They are part of it. They should be distinguished from analog models, which serve as a means of constructing a theory, its original scaffolding, but are not fully included in the created theory. Theoretical models are schemes of objects and processes studied in the theory, expressing their essential connections.

At the base of a developed theory lies the fundamental theoretical scheme, built from a small set of basic abstract objects that are structurally independent of each other and in relation to which the fundamental theoretical laws are formulated (in Newtonian mechanics, the basic laws are formulated in relation to a system of abstract objects: the "material point" and "force"; the connections and relations of these objects form the theoretical model of mechanical motion). In addition to the fundamental theoretical scheme and fundamental laws, the developed theory includes particular theoretical schemes and laws: in mechanics, the theoretical schemes and laws of oscillation, of the rotation of bodies, of the collision of elastic bodies. When particular theoretical schemes are included in the theory, they are subordinate to the fundamental one, but in relation to each other they can have independent status. The abstract objects that form them are specific: they can be constructed on the basis of the abstract objects of the fundamental theoretical scheme and act as their modification. The difference between the fundamental and particular theoretical schemes within a developed theory corresponds to the difference between its fundamental laws and their consequences. Thus, the structure of a developed scientific theory is a complex, hierarchically organized system of theoretical schemes and laws forming the inner skeleton of the theory.
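The relation between the fundamental scheme and a particular scheme can be sketched in code. The example below is our own illustration, not Stepin's formalism: the abstract objects "material point" and "force" obey Newton's second law, and the particular scheme of small oscillations specializes the force to the spring law F = -kx.

```python
# Fundamental scheme of mechanics: the abstract objects "material point"
# and "force", related by Newton's second law. A particular scheme (the
# harmonic oscillator) is a modification of these objects with F = -k*x.

from dataclasses import dataclass

@dataclass
class MaterialPoint:      # abstract object: mass without extension
    mass: float
    x: float = 0.0
    v: float = 0.0

def step(p: MaterialPoint, force: float, dt: float) -> None:
    """Fundamental law: Newton's second law, a = F / m (semi-implicit Euler)."""
    a = force / p.mass
    p.v += a * dt
    p.x += p.v * dt

# Particular scheme: small oscillations, force specialized to a spring.
k, dt = 4.0, 0.001
point = MaterialPoint(mass=1.0, x=1.0)
for _ in range(int(3.1416 / dt)):       # one period, T = 2*pi/sqrt(k/m) = pi here
    step(point, force=-k * point.x, dt=dt)
print(round(point.x, 2))                # ~1.0: back near the starting amplitude
```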

The functioning of theories presupposes their application to the explanation and prediction of experimental facts. In order to apply the fundamental laws of a developed theory to experiment, it is necessary to derive from them consequences comparable with the results of the experiment. The derivation of such consequences is characterized as the deployment of the theory. The hierarchy of interconnected abstract objects corresponds to a hierarchical structure of statements. The connections of these objects form theoretical schemes of various levels. The deployment of the theory then appears not only as an operation with statements, but also as thought experiments with the abstract objects of theoretical schemes.

In advanced disciplines, the laws of theory are formulated in the language of mathematics. The features of abstract objects that form a theoretical model are expressed in the form of physical quantities, and the relationships between these features are expressed in the form of relationships between the quantities included in the equations. Mathematical formalisms applied in the theory receive their interpretation due to their connections with theoretical models. By solving equations and analyzing the results, the researcher develops the content of the theoretical model and in this way receives more and more knowledge about the reality under study. The interpretation of the equations is provided by their connection with the theoretical model, in the objects of which the equations are satisfied, and by the connection of the equations with experience. The last aspect is called empirical interpretation.
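A small worked example may clarify empirical interpretation. The sketch below is illustrative and assumes the standard small-oscillation pendulum model; the "measured" value is invented. The symbols of the equation T = 2π√(L/g) acquire meaning both through the theoretical model and through comparison with a laboratory reading.

```python
# Empirical interpretation, schematically: a consequence of the equations
# (the period formula) is compared with a measurement. The measured value
# below is a hypothetical laboratory reading, invented for the example.

import math

def predicted_period(length_m: float, g: float = 9.81) -> float:
    """Consequence of the model for small oscillations: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

T_theory = predicted_period(1.0)          # ~2.006 s for a 1 m pendulum
T_measured = 2.01                         # hypothetical reading, in seconds
print(abs(T_theory - T_measured) < 0.05)  # True: model agrees with experiment
```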

The specificity of complex forms of theoretical knowledge, such as physical theory, lies in the fact that the operations of constructing particular theoretical schemes based on the constructs of the fundamental theoretical scheme are not explicitly described in the postulates and definitions of the theory. These operations are demonstrated on specific samples, which are included in the theory as reference situations, showing how the derivation of consequences from the basic equations of the theory is carried out. The informal nature of all these procedures, the need to each time refer to the object under study and take into account its features when constructing particular theoretical schemes, turn the derivation of each successive consequence from the basic equations of the theory into a special theoretical problem. The deployment of the theory is carried out in the form of solving such problems. The solution of some of them from the very beginning is offered as models, in accordance with which the rest of the problems should be solved.


    Theoretical models reflect the structure, properties and behavior of real objects. Models allow visualization of objects and processes that are inaccessible to direct perception: for example, a model of an atom, a model of the Universe. Important characteristics of a theoretical model are its structure, as well as the transfer of abstract objects from other fields of knowledge.

    The choice of abstract objects is significantly influenced by the scientific picture of the world, which stimulates the development of research practice, the definition of problems and ways to solve them.

Abstract objects are theoretical constructs, idealizations of reality. They may contain features that correspond to real objects (the ideal gas, the absolutely black body), or properties that no real object possesses.

Analogy is the transfer of abstract objects from one field of knowledge to another. Analogy is of great importance in the metaphysics of Aristotle, who interprets it as a form of manifestation of a single principle in single bodies. Two kinds are distinguished: (1) the analogy of inequality, when different objects bear the same name (heavenly body, earthly body); (2) the analogy of proportionality (physical health, mental health). One also distinguishes the analogy of objects from the analogy of relations, and strict analogy from non-strict analogy. Strict analogy provides the necessary connection of the transferred feature with the feature of similarity; non-strict analogy is merely problematic.

The analogy between geometric and algebraic objects was realized by Descartes in analytic geometry; the analogy with selective work in cattle breeding was used by Darwin in his theory of natural selection. An extensive class of analogies is used in modern scientific disciplines: in architecture and the theory of urban planning, cybernetics, pharmacology and medicine, logic and linguistics.

The analogy method is widely used in the technical sciences. Of great importance is the procedure of schematization, in which the real object is replaced by a diagram or model. Model -> scheme -> qualitative and quantitative extensions -> mathematization -> formulation of laws: at each of these stages the abstract objects themselves are corrected.

It is necessary to formulate laws; they indicate the presence of internally necessary, stable and recurring links between events and the states of objects.

In the 17th century, T. Hobbes in "Leviathan" formulated a number of natural laws. They help us take the path of the social contract; without them no society can be built.

Knowledge can be analytical (dissecting) or synthetic (generalizing). Analytical knowledge allows one to clarify details and particulars. Synthetic knowledge leads not simply to generalization but to the creation of new content, new horizons, a new layer of reality. Analytical knowledge presupposes a logic aimed at revealing elements that were not yet known but were contained in the previous basis.

    Logic and laws are of great importance in scientific knowledge.

Historical research often uses general laws established in physics, chemistry and biology. For example, the defeat of an army may be explained by lack of food, by the weather, by disease, and so on. Dating by tree rings rests on the application of certain biological regularities, and establishing the authenticity of documents, paintings and coins draws on physical and chemical theories.

In the philosophical and methodological literature of recent decades, the subject of study is increasingly the fundamental ideas, concepts and notions that form the relatively stable foundations on which specific empirical knowledge and the theories explaining it develop.

The identification and analysis of these foundations presupposes considering scientific knowledge as an integral developing system. In Western philosophy, such a vision of science took shape relatively recently, mainly in the post-positivist period of its history. At the stage when ideas about science developed within the framework of positivist philosophy dominated, their most striking expression was the so-called standard concept of the structure and growth of knowledge. In it, a single theory and its relationship with experience acted as the unit of analysis. Scientific knowledge was presented as a set of theories plus empirical knowledge, the latter considered the basis on which theories are developed. Gradually, however, it became clear that the empirical basis of a theory is not pure, theoretically neutral empiricism: not observational data but facts constitute the empirical basis on which theories rest. And facts are theoretically loaded, since other theories take part in their formation. The problem of the interaction of a single theory with its empirical basis then also appears as the problem of its relationship with other, previously established theories forming the theoretical knowledge of the given scientific discipline.

On the other hand, this problem of the interconnection of theories was revealed in the study of their dynamics. It turned out that the growth of theoretical knowledge proceeds not simply as a generalization of experimental facts, but through the use, in this process, of theoretical concepts and structures developed in previous theories and applied in the generalization of experience. Thus, the theories of the corresponding science appeared as a kind of dynamic network, an integral system interacting with empirical facts. The systemic character of the knowledge of a scientific discipline posed the problem of the system-forming factors that determine the integrity of the corresponding knowledge system. In this way the problem of the foundations of science began to emerge: the foundations thanks to which the various pieces of knowledge of a scientific discipline are organized into a systemic whole at each stage of its historical development.

    Finally, consideration of the growth of knowledge in its historical dynamics revealed special states associated with critical epochs in the development of science, when a radical transformation of its most fundamental concepts and ideas takes place. These states have been called scientific revolutions, and they can be seen as a restructuring of the foundations of science.

    Thus, the expansion of the field of methodological problems in the post-positivist philosophy of science put forward the analysis of the foundations of science as a real methodological problem.

These foundations and their individual components have been fixed and described in terms of the "paradigm" (T. Kuhn), the "core of the research program" (I. Lakatos), "ideals of natural order" (S. Toulmin), the "main themes of science" (G. Holton), and the "research tradition" (L. Laudan).

In the course of discussions between supporters of the various concepts, the problem of a differentiated analysis of the foundations of science became acute. Indicative in this regard are the discussions around "paradigm", the key concept of Kuhn's theory, whose extreme ambiguity and vagueness were noted by Kuhn's numerous opponents.

Influenced by this criticism, Kuhn attempted to analyze the structure of the paradigm. He singled out the following components: "symbolic generalizations" (mathematical formulations of laws), examples of solving specific problems, the "metaphysical parts of the paradigm", and values (the value orientations of science). This was a step forward compared with the first version of the concept, but the structure of the foundations of science remained unclear. First, it is not shown in what connections the selected components of the paradigm stand, which means, strictly speaking, that its structure has not been revealed. Second, Kuhn's paradigm includes both components related to the deep foundations of scientific research and the forms of knowledge that grow on those foundations. For example, "symbolic generalizations" include mathematical formulations of particular laws of science (such as the formulas expressing the Joule-Lenz law or the law of mechanical vibration). But then the discovery of any new particular law would have to mean a change of paradigm, i.e., a scientific revolution, which blurs the distinction between "normal science" (the evolutionary stage in the growth of knowledge) and the scientific revolution. Third, in singling out such components of science as the "metaphysical parts of the paradigm" and values, Kuhn fixes them "ostensively", through the description of relevant examples. From the examples he cites, it can be seen that the "metaphysical parts of the paradigm" are understood either as philosophical ideas or as principles of a concrete scientific nature (such as the principle of short-range action in physics or the principle of evolution in biology). As for values, Kuhn's description of them likewise looks like only a first and very approximate sketch. In essence, what is meant here are the ideals of science, taken in a very limited range: as ideals of the explanation, prediction and application of knowledge.

In principle, one can say that even the most advanced studies of the foundations of science, among which the works of T. Kuhn can be counted, show that Western philosophy of science is insufficiently analytical here. It has not yet established what the main components of the foundations of science are or how they are connected. The connections between the foundations of science and the theories and empirical knowledge based on them have not been sufficiently clarified. This means that the problem of the structure of the foundations, of their place in the system of knowledge and of their functions in its development requires further and deeper discussion.

    In the established and developed system of disciplinary scientific knowledge, the foundations of science are found, firstly, in the analysis of systemic relationships between theories of varying degrees of generality and their relationship to various forms of empirical knowledge within a certain discipline (physics, chemistry, biology, etc.), secondly, in the study of interdisciplinary relations and interactions of various sciences.

    As the most important components that form the foundations of science, we can single out: 1) the scientific picture of the world; 2) ideals and norms of scientific knowledge; 3) philosophical foundations of science.

    The listed components express general ideas about the specifics of the subject of scientific research, about the features of cognitive activity that masters one or another type of objects, and about the nature of the links between science and the culture of the corresponding historical era.

    Formation of primary theoretical models and laws

Models make it possible to visualize objects and processes that are inaccessible to direct perception: for example, a model of the atom, a model of the Universe, a model of the human genome, etc. Theoretical models reflect the structure, properties and behavior of real objects.

The well-known Western philosopher of science Imre Lakatos noted that the formation of primary theoretical models can be based on programs of three kinds: first, the empiricist program; second, the inductivist program; and third, the system of Euclid (the Euclidean program). All three programs proceed from the organization of knowledge as a deductive system.

The Euclidean program, which assumes that everything can be deduced from a finite set of trivially true statements consisting only of terms with a trivial semantic load, is usually called the program of the trivialization of knowledge. This program admits only true judgments; it works with neither conjectures nor refutations. Knowledge as truth is introduced at the top of the theory and, without any deformation, "flows" from primitive terms to defined terms.

Unlike the Euclidean program, the empiricist program is built on basic propositions of a well-known empirical nature. Empiricists cannot admit any introduction of meaning other than from the bottom of the theory. If these propositions turn out to be false, the assessment penetrates upward through the channels of deduction and fills the entire system; the empiricist theory is therefore conjectural and falsifiable. And if the Euclidean theory places truth above and illuminates it with the natural light of reason, the empiricist theory places it below and illuminates it with the light of experience. But both programs rely on logical intuition.

Of the inductivist program, Lakatos says: "The mind expelled from above seeks refuge below. (...) The inductivist program arose as part of an effort to construct a conduit through which truth flows upward from basic propositions, and thus establish an additional logical principle, the principle of relaying truth." The emergence of the inductivist program was associated with the pre-Copernican Enlightenment, when refutation was considered obscene and conjecture despised. "The transfer of authority from Revelation to facts, of course, met with opposition from the Church. Scholastic logicians and 'humanists' did not tire of predicting the sad outcome of the inductivist enterprise." Inductive logic was replaced by probabilistic logic. The final blow to inductivism was dealt by Popper, who showed that even a partial transmission of truth and meaning cannot go upward from below.

In the fundamental work of Academician V. S. Stepin, "Theoretical Knowledge", it is shown that the main feature of theoretical schemes is that they are not the result of a purely deductive generalization of experience. In advanced science, theoretical schemes are first constructed as hypothetical models through the use of previously formulated abstract objects. At the early stages of scientific research, the constructs of theoretical models are created by direct schematization of experience.

Important characteristics of a theoretical model are its structure and the possibility of transferring abstract objects from other areas of knowledge. Lakatos holds that the main structural units of a research program are the hard core, the protective belt of hypotheses, and the positive and negative heuristics. The negative heuristic forbids directing refutations at the hard core of the program; the positive heuristic allows the further development and expansion of the theoretical model. Lakatos insisted that all of science be understood as a gigantic research program subject to K. Popper's basic rule: "Put forward hypotheses that have more empirical content than the previous ones." The construction of a scientific theory is conceived in two stages: the first is the advancement of a hypothesis, the second its substantiation.
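The structure Lakatos describes can be summarized in a schematic data structure. The sketch below is an interpretive illustration; the method names and the historical gloss are ours, not Lakatos's notation.

```python
# A schematic sketch of a research programme: a hard core shielded by a
# protective belt, with heuristics governing how anomalies are handled.

from dataclasses import dataclass, field

@dataclass
class ResearchProgramme:
    hard_core: list[str]                                 # untouchable assumptions
    protective_belt: list[str] = field(default_factory=list)

    def face_anomaly(self, anomaly: str) -> None:
        # Negative heuristic: never direct the refutation at the hard core.
        # Positive heuristic: adjust or extend the auxiliary hypotheses instead.
        self.protective_belt.append(f"auxiliary hypothesis absorbing: {anomaly}")

newton = ResearchProgramme(hard_core=["three laws of motion", "law of gravitation"])
newton.face_anomaly("irregularities in the orbit of Uranus")
print(newton.protective_belt)  # cf. the historical postulation of an unseen planet
```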

The choice of abstract objects is significantly influenced by the scientific picture of the world, which stimulates the development of research practice and the definition of tasks and ways to solve them. Abstract objects, sometimes called theoretical constructs and sometimes theoretical objects, are idealizations of reality. They may contain features that correspond to real objects, and they may have properties that no real object has. Theoretical objects convey the meaning of such concepts as "ideal gas", "absolutely black body", "point", "force", "circle", "segment", etc. Abstract objects are intended to replace certain connections of reality, but they cannot exist in the status of real objects, since they are idealizations.

The transfer of abstract objects from one field of knowledge to another presupposes a solid basis of analogies indicating relationships of similarity between things. This rather widespread way of identifying the properties of objects, or the objects themselves, goes back to the most ancient hermetic tradition, an echo of which is the Pythagoreans' reflections on the numerical structure of the universe, i.e., on the correspondence of numbers to the cosmic harmony of the spheres. "All things are numbers", "number owns things": such are the conclusions of Pythagoras. The unified beginning in its unmanifested state is equal to zero; when embodied, it creates the manifested pole of the absolute, equal to one. The transformation of the one into the two symbolizes the separation of a single reality into matter and spirit, and says that knowledge of the one is knowledge of the other.

    The ontological basis of the method of analogies is hidden in the well-known principle of the unity of the world, which, according to ancient tradition, is interpreted in two ways: One is many, and many are one. Analogy is of great importance in the metaphysics of Aristotle, who interprets it as a form of manifestation of a single principle in single bodies.

    Modern interpreters distinguish: 1) the analogy of inequality, when different objects have the same name (heavenly body, earthly body); 2) the analogy of proportionality (physical health - mental health); 3) the analogy of attribution, when the same relationship is attributed to an object in different ways (a healthy lifestyle - a healthy body - a healthy society, etc.).

Thus, inference by analogy allows us to liken a new single phenomenon to another, already known phenomenon. Analogy, with a certain degree of probability, allows existing knowledge to be expanded by including new subject areas within its scope. It is noteworthy that Hegel highly valued the possibilities of the method of analogies, calling analogy an "instinct of reason".

Abstract objects must satisfy the connections and interactions of the evolving field of knowledge. For this reason, the question of the reliability of an analogy is always relevant. Because the history of science provides a significant number of examples of the use of analogy, it is recognized as an essential tool of scientific and philosophical understanding. There are analogies of objects and analogies of relations, as well as strict and non-strict analogy. Strict analogy provides the necessary connection between the transferred feature and the feature of similarity; non-strict analogy is merely problematic. It is important to note that analogy differs from deductive reasoning in that it likens single objects to one another rather than subsuming an individual case under a general proposition, as deduction does.

As V. N. Porus notes, "an important role in the development of classical mechanics was played by the analogy between the motion of a thrown body and the motion of celestial bodies; the analogy between geometric and algebraic objects was realized by Descartes in analytic geometry; the analogy with selective work in cattle breeding was used by Darwin in his theory of natural selection; the analogy between light, electric and magnetic phenomena proved fruitful for Maxwell's theory of the electromagnetic field". An extensive class of analogies is used in modern scientific disciplines: in architecture and the theory of urban planning, bionics and cybernetics, pharmacology and medicine, logic and linguistics, etc.

Numerous examples of false analogies are also known. Such are the analogies between the movement of fluid and the spread of heat in the doctrine of "caloric" of the 17th-18th centuries, the biological analogies of the social Darwinists in explaining social processes, etc.

It should be added that the analogy method is widely used in the technical sciences. Of particular importance there is the information procedure in which, when creating objects similar to an invention, one group of knowledge and principles is reduced to another. Of great importance is the procedure of schematization, which replaces a real engineering object with an idealized representation (a diagram, a model). A necessary condition is mathematization. There are technical sciences of the classical type, formed on the basis of a single natural science (for example, electrical engineering), and non-classical, or complex, technical sciences, which rest on a number of natural sciences (radar, computer science, etc.).

    In technical sciences, it is customary to distinguish between invention as the creation of something new and original, and improvement as the transformation of an existing one. Sometimes an invention is seen as an attempt to imitate nature, simulation modeling, an analogy between an artificially created object and a natural pattern. So, a cylindrical shell - a common form used for various purposes in technology and everyday life - is a universal structure of numerous manifestations of the plant world.

An imitative invention has more reason to be inscribed in nature, since in it the scientist uses the secrets of the natural laboratory, its solutions and findings. But an invention can also be the creation of something new and unparalleled.

The formulation of laws presupposes that an experimentally or empirically substantiated hypothetical model can be converted into a scheme. Moreover, "theoretical schemes are introduced at first as hypothetical constructions, but then they are adapted to a certain set of experiments and in this process are justified as a generalization of experience". There then follows the stage of the model's application to the qualitative diversity of things, i.e., its qualitative expansion, and only after that the stage of quantitative mathematical design in the form of an equation or formula, which marks the phase of the emergence of the law.

So: model -> scheme -> qualitative and quantitative extensions -> mathematization -> formulation of the law. At every stage, without exception, both the correction of the abstract objects themselves and of their theoretical schemes and the correction of their quantitative mathematical formalizations actually took place. Theoretical schemes could also be modified under the influence of mathematical means, but all these transformations remained within the limits of the hypothetical model that had been put forward. V. S. Stepin emphasizes that "in classical physics one can speak of two stages of constructing particular theoretical schemes as hypotheses: the stage of constructing them as content-physical models of a certain area of interactions and the stage of possible restructuring of theoretical models in the process of their connection with the mathematical apparatus".

At the higher stages of development these two aspects of the hypothesis merge, while at the early stages they are separate. The concept of "law" indicates the presence of internally necessary, stable and recurring links between events and the states of objects. A law reflects objectively existing interactions in nature and in this sense is understood as a natural regularity. Science resorts to artificial languages to formulate its laws of nature. The laws developed by the human community as norms of human coexistence are, as a rule, conventional in nature.

It is noteworthy that in the 17th century the English materialist Thomas Hobbes, in his famous work "Leviathan", formulated a number of "natural laws". They help us take the path of the social contract; without them no society can be built.

    The laws of science tend to reflect the patterns of reality adequately. At the same time, the very measure of that adequacy, and the fact that the laws of science are generalizations that are changeable and subject to falsification, give rise to a very acute philosophical and methodological problem. It is no coincidence that Kepler and Copernicus understood the laws of science as hypotheses. Kant was convinced that laws are not derived from nature at all, but are prescribed to it.

    For this reason, the procedure of scientific substantiation of theoretical knowledge has always been considered one of the most important procedures in science, and science itself is often interpreted as a purely "explanatory enterprise". However, explanation has always faced the problem of counterfactuality and has been vulnerable in situations where a strict distinction must be drawn between justification and description. The most elementary definition of justification rests on the procedure of reducing the unknown to the known, the unfamiliar to the familiar. Yet the latest achievements of science show that Riemannian geometry forms the basis of modern relativistic physics, while human perception is organized within the framework of Euclidean geometry. Consequently, many processes of the modern physical picture of the world are fundamentally unrepresentable and unimaginable. This suggests that justification loses its model character and visual clarity and must rely on purely conceptual techniques, in which the very procedure of reducing the unknown to the known is called into question.

    Another paradoxical phenomenon arises: objects that it is essential to explain turn out to be unobservable in principle (the quark, for example, is an unobservable entity). Thus, scientific and theoretical knowledge acquires, alas, an extra-experimental character.
    Non-experiential reality admits non-experiential knowledge of itself. This conclusion, at which the modern philosophy of science has arrived, is not perceived as scientific by all scientists outside the above context, because the procedure of scientific justification comes to rest on what cannot itself be explained.

    With respect to the logic of scientific discovery, the position associated with the refusal to search for rational grounds of discovery has declared itself very loudly. In the logic of discovery a large place is given to bold conjectures; reference is often made to the switching of gestalts ("patterns") and to analogue modeling. Indications of heuristics and of the intuition that accompanies the process of scientific discovery are widespread.

    The most general view of the mechanism of development of scientific knowledge from the standpoint of rationalism suggests that knowledge must be both dissecting (analytic) and generalizing (synthetic). Analytic knowledge makes it possible to clarify details and particulars, to reveal the full potential of the content present in the original basis. Synthetic knowledge leads not simply to generalization but to the creation of fundamentally new content, contained neither in the disparate elements nor in their summative integrity. Kant's synthetic judgment attaches contemplation (intuition) to the concept, that is, it combines structures of different natures: conceptual and factual. The essence of the analytic approach is that the main essential aspects and regularities of the phenomenon under study are assumed to be contained in the given, taken as the source material. Research work proceeds within an already outlined area and a set task and is aimed at analyzing its internal potential. The synthetic approach orients the researcher toward finding dependencies outside the object itself, in the context of external systemic relations.

    The rather traditional notion that the emergence of the new is associated only with the synthetic movement requires clarification. Undoubtedly, it is the synthetic movement that presupposes the formation of new theoretical meanings, new types of mental content, new horizons, a new layer of reality. The synthetic is the new that leads to the discovery of a qualitatively different basis, distinct from the previously existing one.

    The analytic movement presupposes a logic aimed at revealing elements that were not yet known but were contained in the previous basis. "You yourself do not know that you already know this, but we will now drag your knowledge out and logically reformulate it": thus Galileo figuratively summarizes this process. A. F. Losev likewise emphasizes that the essence of analytic negation is that it adds something to motionless discreteness. This addition, however, is very small: at first it is close to zero. But it is by no means zero. The whole novelty of analytic negation lies in the fact that it points to some kind of shift, however small and close to zero, to some increment of the given quantity.

    The analytic form of obtaining new knowledge fixes new connections and relations of objects that have already fallen within the sphere of human practical activity. It is closely related to deduction and to the notion of "logical consequence". An example of such an analytic increment of new knowledge is the discovery of new chemical elements on the basis of Mendeleev's periodic table. In the logic of discovery, those areas are singled out where development occurs according to the analytic type, on the basis of unfolding the initial principles of an already established theory. Spheres where a "break in gradualness" occurs, a going beyond the limits of available knowledge, are also identified. The new theory in this case overturns the existing logical canons and is built on a fundamentally different, constructive basis.

    A constructive modification of the observed conditions, the establishment of new idealizations, the creation of a different scientific objectivity not found in ready-made form, the integrative crossing of principles at the "junction of sciences" that previously seemed unrelated to each other: these are the features of the logic of discovery, which yields new knowledge of a synthetic nature and of greater heuristic value than the old. The logic of traditions and innovations indicates, on the one hand, the necessity of maintaining continuity, the existing set of methods, techniques and skills. On the other hand, it demonstrates a potential surpassing the mode of reproducing accumulated experience, one that involves the creation of the new and unique.

    The logic of discovery also aims at awareness of factors that escape the field of view: by-products of interactions, unintended consequences of goal-directed activity. Columbus wanted to discover a new route to India and discovered a previously unknown continent, America. The discrepancy between goals and results is a frequent, indeed ubiquitous, phenomenon. The end result is heteronomous; it combines at least three layers: the content of the originally set goal, the by-product of interactions, and the unintended consequences of purposeful activity. These testify to the multidimensionality of natural and social interactions. Recognition of non-linearity, multifactoriality and alternativeness is the hallmark of the new strategy of scientific research.

    The modern scientist must be ready to record and analyze results born outside of and in addition to his conscious goal-setting, including the possibility that these may turn out to be much richer than the original goal. A fragment of being singled out as a subject of study is in fact not isolated. It is connected with the infinite dynamics of the universe by a network of interactions, by currents of multidirectional forces and influences. Main and secondary, central and peripheral, mainline and dead-end directions of development, each with its own niche, coexist in constant non-equilibrium interaction. Situations are possible in which a developing phenomenon does not carry the forms of its future states in ready-made form but receives them from outside, as a by-product of interactions occurring beyond the phenomenon itself or at least on its periphery. And if earlier science could afford to cut off these lateral branches as seemingly insignificant, now this is an unaffordable luxury.

    It turns out that it is not at all easy to define what "important" and "unimportant" mean in science. Arising on the periphery of connections and relations, including under the influence of factors that showed themselves only weakly in the past, a by-product can act as a source of the new and prove even more significant than the originally set goal. It testifies to the indestructible striving of being to realize all of its potentialities. Here a kind of equalization of opportunities takes place, when everything that exists declares itself and demands a recognized existence.

    The ambiguity of the logic of constructing scientific knowledge has been noted by many philosophers. Thus, M. K. Mamardashvili, in the monograph "Forms and Content of Thinking", emphasizes that in the logical apparatus of science it is necessary to distinguish two types of cognitive activity. The first includes means that allow one to obtain much new knowledge from existing knowledge, using proof and the logical derivation of all possible consequences. With this method of obtaining knowledge, however, no fundamentally new mental content is singled out in objects, and the formation of new abstractions is not expected. The second method involves obtaining new scientific knowledge "by acting with objects", a method based on involving content in the construction of a line of reasoning. Here we are speaking of the use of content in some new plane, which does not follow from the logical form of existing knowledge or from any recombination of it, namely, of the "introduction of objective activity into the given content".

    The Galilean principle of inertia was obtained by means of an ideal experiment. Galileo formulates a paradoxical image, motion along an infinitely large circle under the assumption that it is identical to an infinite straight line, and then carries out algebraic investigations. In all the interesting cases, either a contradiction or a discrepancy is recorded between theoretical idealization and everyday experience, between the theoretical construction and direct observation. For this reason, the essence of scientific-theoretical thinking comes to be associated with the search for a modification of the observed conditions, the assimilation of empirical material, and the creation of a different scientific objectivity not found in ready-made form. Theoretical idealization, the theoretical construct, becomes a permanent element in the arsenal of rigorous natural science.

    In "Criteria of Meaning" (1950), the German-American philosopher of science Carl Gustav Hempel (1905-1997) pays special attention to the problem of clarifying the relationship between "theoretical terms" and "observation terms". How, for example, does the term "electron" correspond to observable entities and qualities; does it have an observational meaning? To answer this question, the author introduces the concept of an "interpretative system". In the well-known Theoretician's Dilemma, Hempel showed that when the meaning of theoretical terms is reduced to the meaning of a set of observation terms, theoretical concepts turn out to be redundant. They turn out to be redundant even if the introduction and substantiation of theoretical terms rely on intuition. Thus, the Theoretician's Dilemma showed that theoretical terms cannot be reduced to observation terms, and no combination of observation terms can exhaust theoretical terms.

    These propositions were of great importance for understanding the status of theoretical models in science. The "Theoretician's Dilemma", according to researchers, can be presented in the form of the following statements (a schematic formalization is given after the list):

    Theoretical terms either do their job or they don't.

    If they do not fulfill their function, then they are not needed.

    If theoretical terms perform their functions, then they establish connections between the observed phenomena.

    But these connections are also established without theoretical terms.

    If empirical connections are established even without theoretical terms, then theoretical terms are not needed.

    Therefore, theoretical terms are not needed both when they perform their functions and when they do not perform these functions.
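    A minimal propositional sketch of this argument (the notation is ours, not Hempel's): write F for "theoretical terms fulfill their function", C for "they establish connections between observed phenomena", and D for "they are dispensable". The dilemma then runs:

\[
\begin{aligned}
&(1)\ F \lor \lnot F \\
&(2)\ \lnot F \to D \\
&(3)\ F \to C \\
&(4)\ C \to D \quad \text{(the connections obtain without theoretical terms)} \\
&\therefore\ D \quad \text{(constructive dilemma from (1), (2) and (3)-(4))}
\end{aligned}
\]

    Premise (4) carries the whole weight: it encodes the assumption of reducibility to observation terms, and it is precisely this premise that the conclusion of the preceding paragraph rejects.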

    To explicate the conditions of "acceptance of a hypothesis", Hempel proposed the concept of "epistemological utility". His well-known work "Reasons and 'Covering' Laws in Historical Explanation" poses the problem of the difference between the laws and explanations of natural science and those of history. Scientific research in the various fields of science seeks not simply to generalize particular events in the world of our experience, but to identify regularities in the course of those events and to establish general laws that can be used for prediction and explanation.

    According to the covering-law model, an event is explained when a statement describing the event is deduced from general laws and from statements describing antecedent conditions; a general law is itself explained when it is deduced from a more comprehensive law. Hempel was the first to connect explanation explicitly with deductive inference and with law, and he also formulated the conditions of adequacy of explanation. According to him, general laws have similar functions in history and in the natural sciences. They form an essential research tool and constitute the common ground of various procedures often regarded as specific to the social sciences as opposed to the natural ones.

    Historical research often uses general laws established in physics, chemistry and biology. For example, the defeat of an army is explained by lack of food, changes in the weather, disease, and so on. The dating of events in history by means of annual tree rings rests on the application of certain biological regularities. The various methods of empirically verifying the authenticity of documents, paintings and coins use physical and chemical theories. Moreover, in all these cases the historical past is never accessible to direct study and description.

    In analyzing the entire historical arsenal of explanation, it is necessary to distinguish metaphors that have no explanatory value; sketches of explanations, among which there are both scientifically acceptable explanations and pseudo-explanations; and, finally, satisfactory explanations. Hempel provided for the procedure of supplementation, which takes the form of a gradually increasing refinement of the formulations used, so that the explanation sketch can be confirmed, refuted, or at least indicate the direction of the research required.

    Also important is the procedure of reconstruction, aimed at uncovering the underlying explanatory hypotheses and assessing their significance and empirical base. From Hempel's point of view, the resurrection of assumptions buried under such tombstones as "therefore", "because", "in connection with this", etc., often shows that the explanations offered are weakly substantiated or unacceptable. In many cases this procedure detects an error of assertion. For example, the geographical or economic conditions of a group of people may be taken into account when explaining some common features of, say, their art or moral code. But this does not mean that we have thereby explained in detail the artistic achievements of this group or the system of its moral code: from a description of geographical or economic conditions no detailed explanation of aspects of cultural life can be derived.

    Hempel identifies the concepts of "general law" and "hypothesis of universal form". He defines the law itself as follows: in every case when an event of a certain type P (the cause) takes place at a certain place and a certain moment of time, an event of a certain type C (the consequence), connected in a definite way with the place and time of occurrence of the first event, will also take place.
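    Schematically (the symbols are ours, following the wording above), such a hypothesis of universal form can be written as

\[
\forall x\,\forall t\;\bigl(P(x,t)\rightarrow C(x',t')\bigr),
\]

    where the place x' and the time t' are connected in a definite way with the place x and the time t of the first event.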

    Proper justification is facilitated by isolating one or more important groups of facts that must be specified in the initial conditions, together with the assertion that the event in question is "determined", and therefore must be explained, by this group of facts alone.

    Scientific explanation includes the following elements:

    a) empirical verification of the sentences that state the determining conditions;

    b) empirical testing of universal hypotheses on which the explanation is based;

    c) an examination of whether the explanation is logically convincing.

    A prediction, in contrast to an explanation, consists in an assertion about some future event: here the initial conditions are given, but the consequences have not yet occurred and must be established. One can speak of the structural equality of the procedures of explanation and prediction. Very rarely, however, are explanations formulated so completely that they can show their predictive character; more often explanations are incomplete. There are "causal" and "probabilistic" explanations, the latter based on probabilistic hypotheses rather than on general "deterministic" laws, i.e., laws in the form of universal conditionals.

    In "The Logic of Explanation" Hempel argues that to explain the phenomena of the world of our experience means to answer the question "why?" rather than merely the question "what?". Science has always sought to go beyond description and break through to explanation. An important characteristic of such justification is reliance on general laws. For example, when the part of an oar under water appears bent to a person in a boat, the phenomenon is explained by means of the law of refraction and the law of the optical density of media: water is optically denser than air. In this way the question "Why does this happen?" receives its answer. At the same time, the question "why?" can also be raised about the general laws themselves: why does the propagation of light obey the law of refraction? In answering it, representatives of classical physics would appeal to the wave theory of light.
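    The law of refraction invoked here is Snell's law; in its standard formulation (not quoted in the source) it reads

\[
n_1 \sin\theta_1 = n_2 \sin\theta_2,
\]

    where n_1 and n_2 are the refractive indices (optical densities) of the two media and \theta_1, \theta_2 are the angles of incidence and refraction. Since for water n is about 1.33 while for air it is about 1.00, rays coming from the submerged part of the oar change direction at the surface, and the oar appears bent.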

    Thus, the explanation of a regularity is carried out by subsuming it under another, more general regularity. From this a two-part structure of explanation is derived: the explanandum, the description of the phenomenon, and the explanans, the class of sentences adduced to explain the given phenomenon. The explanans, in turn, divides into two subclasses: one describes antecedent conditions; the other, general laws.

    The explanandum must be logically deducible from the explanans: such is the logical condition of adequacy. The explanans must be confirmed by all available empirical material and must be true: this is the empirical condition of adequacy.
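    These conditions are usually summarized in the deductive-nomological schema (the textbook layout, not a quotation from this text):

\[
\begin{array}{ll}
L_1, L_2, \ldots, L_k & \text{general laws} \\
C_1, C_2, \ldots, C_m & \text{antecedent conditions} \\
\hline
E & \text{explanandum}
\end{array}
\]

    Everything above the line constitutes the explanans; adequacy requires that the inference to E be deductively valid and that the sentences of the explanans be true.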

    Incomplete explanations omit part of the explanans as obvious. Causal, or deterministic, laws differ from statistical ones in that the latter establish only that, in the long run, a certain percentage of all cases satisfying a given set of conditions will be accompanied by a phenomenon of a certain type.
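    The contrast can be made vivid by the inductive-statistical schema that Hempel later gave for such cases (a standard reconstruction, with our notation):

\[
\begin{array}{ll}
P(G \mid F) = r & r \text{ close to } 1 \\
F(a) \\
\hline\hline
G(a) & [r]
\end{array}
\]

    Here the double line signals that the premises do not entail the conclusion but only confer on it the probability r.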

    The principle of causal justification operates in both the natural and the social sciences. The explanation of actions in terms of the agent's motives is regarded as a special kind of teleological explanation, which is especially important in biology, since there it consists in explaining the characteristics of an organism by reference to goals essential for the preservation of the life of the organism or the species.
