
Probability theory

Some programmers, after working on the development of conventional commercial applications, think about mastering machine learning and becoming data analysts. Often they do not understand why certain methods work, and most machine learning methods seem like magic. In fact, machine learning is based on mathematical statistics, which in turn is based on probability theory. Therefore, in this article we will pay attention to the basic concepts of probability theory: we will touch on the definitions of probability and distribution and analyze a few simple examples.

You may know that probability theory is conventionally divided into two parts. Discrete probability theory studies phenomena that can be described by a distribution with a finite (or countable) number of possible outcomes (throws of dice, coin tosses). Continuous probability theory studies phenomena distributed on some dense set, for example, on a segment or in a circle.

The subject of probability theory can be introduced with a simple example. Imagine you are developing a shooter game. An integral part of the development of games in this genre is the shooting mechanics. A shooter in which all weapons shoot with perfect accuracy would be of little interest to players, so spread must be added to the weapons. But simply randomizing the point of impact does not allow fine-tuning, so adjusting the game balance would be difficult. Using random variables and their distributions, you can analyze how a weapon will behave with a given spread and make the necessary adjustments.
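As an illustration (not from the original article), here is a minimal Python sketch of this idea: weapon spread is modelled as a two-dimensional Gaussian offset around the aim point, and the hit rate against a circular target is estimated by simulation. The spread values and target radius are illustrative assumptions.

```python
# Minimal sketch: weapon spread as a 2D Gaussian offset, hit rate against a
# circular target estimated by simulation. All numbers are illustrative.
import math
import random

def simulate_hit_rate(spread_sigma: float, target_radius: float, shots: int = 100_000) -> float:
    """Fraction of shots that land within target_radius of the aim point."""
    hits = 0
    for _ in range(shots):
        dx = random.gauss(0.0, spread_sigma)  # horizontal deviation
        dy = random.gauss(0.0, spread_sigma)  # vertical deviation
        if math.hypot(dx, dy) <= target_radius:
            hits += 1
    return hits / shots

if __name__ == "__main__":
    for sigma in (0.5, 1.0, 2.0):
        print(f"spread sigma={sigma}: hit rate ~ {simulate_hit_rate(sigma, 1.0):.3f}")
```

Tuning `spread_sigma` against the resulting hit rate is exactly the kind of balance adjustment mentioned above.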

Space of elementary outcomes

Suppose, from some random experiment that we can repeat many times (for example, tossing a coin), we can extract some formalizable information (heads or tails). This information is called an elementary outcome, and it is useful to consider the set of all elementary outcomes, often denoted by the letter Ω (Omega).

The structure of this space depends entirely on the nature of the experiment. For example, if we consider shooting at a sufficiently large circular target, the space of elementary outcomes will be a circle, for convenience, placed with the center at zero, and the outcome will be a point in this circle.

In addition, we consider sets of elementary outcomes, called events (for example, hitting the "ten" corresponds to a concentric circle of small radius inside the target). In the discrete case everything is quite simple: we can build any event by including or excluding elementary outcomes in finite time. In the continuous case everything is much more complicated: we need a sufficiently good family of sets to work with, called an algebra, by analogy with ordinary real numbers, which can be added, subtracted, divided and multiplied. Sets in an algebra can be intersected and united, and the result of the operation stays in the algebra. This is a very important property for the mathematics behind all these concepts. The minimal such family consists of only two sets: the empty set and the space of elementary outcomes.

Measure and Probability

Probability is a way of making inferences about the behavior of very complex objects without understanding how they work. Thus, probability is defined as a function of an event (from that good family of sets) that returns a number: a characteristic of how often such an event can occur in reality. For definiteness, mathematicians agreed that this number should lie between zero and one. In addition, requirements are imposed on this function: the probability of an impossible event is zero, the probability of the entire set of outcomes is one, and the probability of the union of two incompatible events (disjoint sets) is equal to the sum of their probabilities. Another name for probability is a probability measure. The most commonly used measure is the Lebesgue measure, which generalizes the concepts of length, area and volume to any dimension (n-dimensional volume), and is therefore applicable to a wide class of sets.

Together, the set of elementary outcomes, the family of sets, and the probability measure are called a probability space. Let's look at how we can construct a probability space for the target shooting example.

Consider shooting at a large round target of radius R that cannot be missed. As the set of elementary outcomes we take a circle of radius R centered at the origin. Since we are going to use area (the Lebesgue measure for two-dimensional sets) to describe the probability of an event, we will use the family of measurable sets (those for which this measure exists).

Note. Actually, this is a technical point, and in simple problems the process of determining the measure and the family of sets does not play a special role. But it is necessary to understand that these two objects exist, because in many books on probability theory, theorems begin with the words: "Let (Ω, Σ, P) be a probability space…".

As mentioned above, the probability of the entire space of elementary outcomes must be equal to one. The area (the two-dimensional Lebesgue measure, which we denote by λ₂(A), where A is the event) of the circle, according to the well-known school formula, is πR². Then we can introduce the probability P(A) = λ₂(A) / (πR²), and this value will lie between 0 and 1 for any event A.

If we assume that hitting any point of the target is equally probable, then the search for the probability of the shooter hitting some area of the target is reduced to finding the area of this set (hence we can conclude that the probability of hitting a specific point is zero, because the area of a point is zero).

For example, we want to know the probability that the shooter hits the "ten" (event A: the shooter hit the right set). In our model, the "ten" is represented by a circle centered at zero with radius r. Then the probability of hitting this circle is P(A) = λ₂(A) / (πR²) = πr² / (πR²) = (r/R)².

This is one of the simplest varieties of "geometric probability" problems - most of these problems require finding an area.
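A quick way to sanity-check this result is a Monte Carlo simulation. The sketch below (an illustration, not part of the original text) samples points uniformly in the big circle and compares the empirical hit frequency with (r/R)²; the values of R, r and the sample size are arbitrary assumptions.

```python
# Sketch: checking the geometric probability P(hit the "ten") = (r/R)^2
# by sampling points uniformly in the big circle (rejection sampling).
import math
import random

def sample_point_in_disk(R):
    """Uniform point in a disk of radius R via rejection from the bounding square."""
    while True:
        x = random.uniform(-R, R)
        y = random.uniform(-R, R)
        if x * x + y * y <= R * R:
            return x, y

R, r, n = 10.0, 1.0, 200_000  # illustrative target radius, "ten" radius, sample size
hits = sum(1 for _ in range(n) if math.hypot(*sample_point_in_disk(R)) < r)
print("empirical:", hits / n, " theoretical (r/R)^2:", (r / R) ** 2)
```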

Random variables

A random variable is a function that converts elementary outcomes into real numbers. For example, in the problem considered above, we can introduce a random variable ρ(ω), the distance from the point of impact to the center of the target. The simplicity of our model allows us to specify the space of elementary outcomes explicitly: Ω = {ω = (x, y) : x² + y² ≤ R²}. Then the random variable is ρ(ω) = ρ(x, y) = √(x² + y²).

Abstracting from the probability space. Distribution function and density

It is good when the structure of the space is well known, but in reality this is not always the case. Even if the structure of the space is known, it can be complex. To describe random variables when their explicit expression is unknown, there is the concept of a distribution function, denoted F ξ (x) = P(ξ < x) (the subscript ξ here denotes the random variable). That is, it is the probability of the set of all elementary outcomes for which the value of the random variable ξ is less than the given parameter x.

The distribution function has several properties:

  1. First, it takes values between 0 and 1.
  2. Second, it does not decrease as its argument x increases.
  3. Third, when x is a very large negative number, the distribution function is close to 0, and when x is very large and positive, the distribution function is close to 1.

Probably, the meaning of this construction is not very clear on first reading. One useful property is that the distribution function allows you to find the probability that the variable takes a value from a half-open interval [a, b): P(random variable ξ takes values from [a, b)) = F ξ (b) − F ξ (a). Based on this equality, we can investigate how this value changes as the boundaries a and b of the interval approach each other.

Let d = b − a, so b = a + d. Therefore, F ξ (b) − F ξ (a) = F ξ (a + d) − F ξ (a). For small values of d, this difference is also small (if the distribution is continuous). It makes sense to consider the ratio p ξ (a, d) = (F ξ (a + d) − F ξ (a)) / d. If for sufficiently small values of d this ratio differs little from some constant p ξ (a) independent of d, then at this point the random variable has a density equal to p ξ (a).

Note Readers who have previously encountered the concept of a derivative may notice that p ξ (a) is the derivative of the function F ξ (x) at the point a . In any case, you can study the concept of a derivative in an article dedicated to this topic on the Mathprofi website.

Now the meaning of the distribution function can be defined as follows: its derivative (density p ξ , which we defined above) at point a describes how often a random variable will fall into a small interval centered at point a (neighborhood of point a) compared to neighborhoods of other points . In other words, the faster the distribution function grows, the more likely it is that such a value will appear in a random experiment.

Let's go back to the example. We can calculate the distribution function for the random variable ρ(ω) = ρ(x, y) = √(x² + y²), which denotes the distance from the center to the point of a random hit on the target. By definition, F ρ (t) = P(ρ(x, y) < t). That is, the set {ρ(x, y) < t} consists of the points (x, y) whose distance from the origin is less than t. We already calculated the probability of such an event when we computed the probability of hitting the "ten": it equals t²/R². Thus, F ρ (t) = P(ρ(x, y) < t) = t²/R², for 0 ≤ t ≤ R.

We can find the density p ρ of this random variable. We note right away that it is zero outside the interval [0, R], since the distribution function is constant there. At the ends of this interval the density is not defined. Inside the interval, it can be found using a table of derivatives (for example, from the Mathprofi website) and elementary differentiation rules. The derivative of t²/R² is 2t/R². This means that we have found the density on the entire axis of real numbers.
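To make the formulas F ρ (t) = t²/R² and p ρ (t) = 2t/R² more tangible, here is a small simulation sketch (illustrative, with an assumed R and sample size) that compares the theoretical distribution function with an empirical estimate.

```python
# Sketch: the distribution function t^2/R^2 and density 2t/R^2 of the distance
# rho from a uniformly random hit to the centre, checked empirically.
import math
import random

R = 10.0       # assumed target radius
n = 100_000    # assumed sample size
samples = []
while len(samples) < n:
    x, y = random.uniform(-R, R), random.uniform(-R, R)
    if x * x + y * y <= R * R:               # keep only points inside the target
        samples.append(math.hypot(x, y))     # distance to the centre

for t in (2.0, 5.0, 8.0):
    empirical_cdf = sum(s < t for s in samples) / n
    print(f"t={t}: F_rho(t) theory={t**2 / R**2:.3f}, empirical={empirical_cdf:.3f},"
          f" density 2t/R^2={2 * t / R**2:.3f}")
```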

Another useful property of the density: the probability that the random variable takes a value from an interval is calculated as the integral of the density over this interval (you can get acquainted with what this means in the articles about definite, improper, and indefinite integrals on the Mathprofi website).

On first reading, the integral of a function f(x) over an interval [a, b] can be thought of as the area of a curvilinear trapezoid. Its sides are the segment [a, b] of the Ox axis, the vertical segments connecting the points (a, f(a)) and (b, f(b)) on the curve with the points (a, 0) and (b, 0) on the x-axis, and the fragment of the graph of the function f from (a, f(a)) to (b, f(b)). One can also speak of the integral over the interval (−∞, b]: for sufficiently large negative values of a, the value of the integral over [a, b] changes negligibly as a decreases further. Integrals over other unbounded intervals are defined similarly.


Probability theory is a branch of mathematics that studies the patterns of random phenomena: random events, random variables, their properties and operations on them.

For a long time, probability theory did not have a clear definition; one was formulated only in 1929. The emergence of probability theory as a science dates back to the Middle Ages and the first attempts at the mathematical analysis of gambling (coin toss, dice, roulette). The French mathematicians of the 17th century, Blaise Pascal and Pierre de Fermat, discovered the first probabilistic patterns that arise when throwing dice while studying the prediction of winnings in gambling.

The theory of probability arose as a science from the belief that certain regularities underlie massive random events. Probability theory studies these patterns.

Probability theory deals with the study of events, the occurrence of which is not known for certain. It allows you to judge the degree of probability of the occurrence of some events compared to others.

For example: it is impossible to unambiguously determine the result of tossing a coin (heads or tails), but with repeated tossing approximately the same number of heads and tails comes up, which means that the probability of getting heads or tails is equal to 50%.
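This stabilization of the frequency is easy to observe in a simulation; the sketch below (illustrative numbers of tosses) prints the relative frequency of heads for increasing numbers of tosses.

```python
# Sketch: the relative frequency of heads approaches 0.5 as the number
# of tosses grows. The toss counts are arbitrary.
import random

for n in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n} tosses: frequency of heads = {heads / n:.4f}")
```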

A trial in this case is the realization of a certain set of conditions, that is, in this case, the tossing of a coin. A trial can be repeated an unlimited number of times. The set of conditions includes random factors.

The result of a trial is an event. An event can be:

  1. Certain (always occurs as a result of the trial).
  2. Impossible (never occurs).
  3. Random (may or may not occur as a result of the trial).

For example, when tossing a coin: an impossible event is that the coin lands on its edge, and a random event is getting "heads" or "tails". A specific result of a trial is called an elementary event. As a result of a trial, only elementary events occur. The set of all possible, distinct, specific outcomes of a trial is called the space of elementary events.

Basic concepts of the theory

Probability is the degree of possibility of the occurrence of an event. When the reasons for some possible event to actually occur outweigh the opposite reasons, that event is called probable; otherwise it is called unlikely or improbable.

A random variable is a quantity that, as a result of a trial, can take one value or another, and it is not known in advance which one. For example: the number of calls to a fire station per day, the number of hits with 10 shots, etc.

Random variables can be divided into two categories.

  1. A discrete random variable is a quantity that, as a result of a trial, can take certain values with a certain probability, forming a countable set (a set whose elements can be numbered). This set can be either finite or infinite. For example, the number of shots before the first hit on the target is a discrete random variable, because this quantity can take an infinite, though countable, number of values.
  2. A continuous random variable is a quantity that can take any value from some finite or infinite interval. Obviously, the number of possible values of a continuous random variable is infinite.

Probability space- the concept introduced by A.N. Kolmogorov in the 1930s to formalize the concept of probability, which gave rise to the rapid development of probability theory as a rigorous mathematical discipline.

The probability space is a triple (Ω, Σ, P) (sometimes written in angle brackets: ⟨Ω, Σ, P⟩), where

Ω is an arbitrary set, the elements of which are called elementary events, outcomes, or points;
Σ is a sigma-algebra of subsets of Ω, called (random) events;
P is a probability measure, or probability, i.e. a sigma-additive finite measure such that P(Ω) = 1.

The de Moivre-Laplace theorem is one of the limit theorems of probability theory, established by Laplace in 1812. It states that the number of successes in repeated identical random experiments with two possible outcomes is approximately normally distributed. It allows one to find an approximate value of a probability.

If, for each of n independent trials, the probability of occurrence of some random event A is equal to p (0 < p < 1), and m is the number of trials in which it actually occurs, then the probability of the inequality a < (m − np)/√(np(1 − p)) < b is close (for large n) to the value of the Laplace integral.

The distribution function in probability theory is a function characterizing the distribution of a random variable or a random vector: the probability that a random variable X takes a value less than or equal to x, where x is an arbitrary real number. Under certain conditions, it completely determines the random variable.

The expected value is the average value of a random variable (with respect to the probability distribution of the random variable considered in probability theory). In English-language literature it is denoted E[X], in Russian-language literature M[X]. In statistics, the notation μ is often used.

Let a probability space (Ω, Σ, P) and a random variable X defined on it be given; that is, by definition, X: Ω → R is a measurable function. Then, if there exists a Lebesgue integral of X over the space Ω with respect to the measure P, it is called the mathematical expectation, or mean value, and is denoted M[X] (or E[X]).

The variance of a random variable is a measure of the spread of the given random variable, i.e. its deviation from the mathematical expectation. It is denoted D[X] in Russian-language literature and Var(X) in foreign literature. In statistics, the notation σ² or s² is often used. The square root of the variance is called the standard deviation, the standard, or the standard spread.

Let X be a random variable defined on some probability space. Then

D[X] = M[(X − M[X])²],

where the symbol M denotes the mathematical expectation.

In probability theory, two random events are called independent if the occurrence of one of them does not change the probability of the occurrence of the other. Similarly, two random variables are called dependent if the value of one of them affects the probability of the values ​​of the other.

The simplest form of the law of large numbers is Bernoulli's theorem, which states that if the probability of an event is the same in all trials, then as the number of trials increases, the frequency of the event tends to the probability of the event and ceases to be random.

The law of large numbers in probability theory states that the arithmetic mean of a finite sample from a fixed distribution is close to the theoretical mean of that distribution. Depending on the type of convergence, a weak law of large numbers is distinguished, when convergence in probability takes place, and a strong law of large numbers, when convergence almost certainly takes place.

The general meaning of the law of large numbers is that the joint action of a large number of identical and independent random factors leads to a result that, in the limit, does not depend on chance.

Methods for estimating probability based on the analysis of a finite sample are based on this property. A good example is the prediction of election results based on a survey of a sample of voters.
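A toy version of such a poll can be sketched as follows (the true support level and the sample sizes are invented for illustration): the larger the sample, the closer the estimate gets to the assumed true proportion.

```python
# Sketch of the law of large numbers behind poll-based estimation:
# the sample proportion approaches the true proportion as the sample grows.
import random

true_support = 0.42  # assumed "real" share of voters for a candidate

for sample_size in (100, 1_000, 100_000):
    poll = sum(random.random() < true_support for _ in range(sample_size))
    print(f"sample of {sample_size}: estimated support = {poll / sample_size:.3f}")
```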

Central limit theorems- a class of theorems in probability theory stating that the sum of a sufficiently large number of weakly dependent random variables that have approximately the same scale (none of the terms dominates, does not make a decisive contribution to the sum) has a distribution close to normal.

Since many random variables in applications are formed under the influence of several weakly dependent random factors, their distribution is considered normal. In this case, the condition must be observed that none of the factors is dominant. Central limit theorems in these cases justify the application of the normal distribution.
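The following sketch illustrates this effect with sums of independent uniform random variables (the number of summands and observations are arbitrary choices): the sample mean and spread of the sums match the normal parameters predicted by theory.

```python
# Sketch: sums of many independent Uniform(0,1) variables look approximately
# normal (central limit theorem); compare sample mean/std with theory.
import random
import statistics

k = 50           # number of summands per observation (assumed)
trials = 20_000  # number of observations (assumed)

sums = [sum(random.random() for _ in range(k)) for _ in range(trials)]
mean_theory = k * 0.5
std_theory = (k / 12) ** 0.5   # variance of Uniform(0,1) is 1/12
print("sample mean:", round(statistics.mean(sums), 3), " theory:", mean_theory)
print("sample std :", round(statistics.stdev(sums), 3), " theory:", round(std_theory, 3))
```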

"Randomness is not accidental"... It sounds like a philosopher said, but in fact, the study of accidents is the lot of the great science of mathematics. In mathematics, chance is the theory of probability. Formulas and examples of tasks, as well as the main definitions of this science will be presented in the article.

What is Probability Theory?

Probability theory is one of the mathematical disciplines that studies random events.

To make it a little clearer, let's give a small example: if you toss a coin, it can land heads or tails. While the coin is in the air, both of these outcomes are possible; that is, the probabilities of the possible outcomes relate as 1:1. If one card is drawn from a deck of 36 cards, its probability is 1/36. It would seem that there is nothing to explore or predict here, especially with the help of mathematical formulas. Nevertheless, if you repeat a certain action many times, you can identify a certain pattern and, on its basis, predict the outcome of events under other conditions.

To summarize all of the above, the theory of probability in the classical sense studies the possibility of the occurrence of one of the possible events in a numerical sense.

From the pages of history

The theory of probability, formulas and examples of the first tasks appeared in the distant Middle Ages, when attempts to predict the outcome of card games first arose.

Initially, the theory of probability had nothing to do with mathematics. It was justified by empirical facts or properties of an event that could be reproduced in practice. The first works in this area as a mathematical discipline appeared in the 17th century. The founders were Blaise Pascal and Pierre Fermat. For a long time they studied gambling and saw certain patterns, which they decided to tell the public about.

Christian Huygens arrived at the same technique, although he was not familiar with the results of the research of Pascal and Fermat. He introduced the concepts and formulas of "probability theory" that are considered the first in the history of the discipline.

Of no small importance are the works of Jacob Bernoulli, Laplace's and Poisson's theorems. They made probability theory more like a mathematical discipline. Probability theory, formulas and examples of basic tasks got their current form thanks to Kolmogorov's axioms. As a result of all the changes, the theory of probability has become one of the mathematical branches.

Basic concepts of probability theory. Events

The main concept of this discipline is "event". Events are of three types:

  • Certain. Those that will happen in any case (the coin will fall).
  • Impossible. Events that will not happen under any scenario (the coin will remain hanging in the air).
  • Random. Those that may or may not happen. They can be influenced by various factors that are very difficult to predict. If we talk about a coin, these are random factors that can affect the result: the physical characteristics of the coin, its shape, its initial position, the strength of the throw, etc.

All events in the examples are denoted by capital Latin letters, with the exception of P, which is reserved for probability. For example:

  • A = "students came to the lecture."
  • Ā = "students didn't come to the lecture".

In practical tasks, events are usually recorded in words.

One of the most important characteristics of events is their equal possibility. That is, if you flip a coin, both outcomes are equally possible until it lands. But events can also be not equally possible. This happens when someone deliberately influences the outcome, for example with "marked" playing cards or loaded dice, in which the center of gravity is shifted.

Events are also compatible and incompatible. Compatible events do not exclude each other's occurrence. For example:

  • A = "one student came to the lecture."
  • B = "another student came to the lecture."

These events are independent of each other, and the appearance of one of them does not affect the appearance of the other. Incompatible events are defined by the fact that the occurrence of one precludes the occurrence of the other. If we talk about the same coin, then the loss of "tails" makes it impossible for the appearance of "heads" in the same experiment.

Actions on events

Events can be multiplied and added, respectively, logical connectives "AND" and "OR" are introduced in the discipline.

The sum of events means that either event A, or event B, or both occur at the same time. In the case when they are incompatible, the last option is impossible: either A or B will occur.

The product of events consists in the occurrence of both A and B at the same time.

Now let's look at a few examples to better remember the basics of probability theory and its formulas. Examples of problem solving are given below.

Exercise 1: The firm is bidding for contracts for three types of work. Possible events that may occur:

  • A = "the firm will receive the first contract."
  • A 1 = "the firm will not receive the first contract."
  • B = "the firm will receive a second contract."
  • B 1 = "the firm will not receive a second contract"
  • C = "the firm will receive a third contract."
  • C 1 = "the firm will not receive a third contract."

Let's try to express the following situations using actions on events:

  • K = "the firm will receive all contracts."

In mathematical form, the equation will look like this: K = ABC.

  • M = "the firm will not receive a single contract."

M = A 1 B 1 C 1.

Let's complicate the task: H = "the firm will receive exactly one contract." Since it is not known which contract the firm will receive (the first, second, or third), it is necessary to record the entire range of possible events:

H = A 1 BC 1 ∪ AB 1 C 1 ∪ A 1 B 1 C.

A 1 BC 1 is the event in which the firm does not receive the first and third contracts but receives the second. The other possible outcomes are recorded in the same way. The symbol ∪ denotes the connective "OR" (union). If we translate the above into plain language, the firm will receive exactly one contract: either the first, or the second, or the third. Similarly, you can write down other conditions in the discipline "Probability Theory". The formulas and examples of solving problems presented above will help you do it yourself.
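For this contracts example, the events K, M and H can also be written out explicitly as sets of elementary outcomes. The sketch below is only an illustration of that bookkeeping, not a required part of the solution.

```python
# Sketch: enumerating the 8 elementary outcomes for the three contracts and
# expressing K (all contracts), M (none) and H (exactly one) as sets.
from itertools import product

outcomes = list(product([True, False], repeat=3))  # (A, B, C): contract received or not

K = {o for o in outcomes if all(o)}       # K = ABC
M = {o for o in outcomes if not any(o)}   # M = A1 B1 C1
H = {o for o in outcomes if sum(o) == 1}  # exactly one: A B1 C1, A1 B C1, A1 B1 C

print("K:", K)
print("M:", M)
print("H:", H)
```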

Actually, the probability

Perhaps, in this mathematical discipline, the probability of an event is a central concept. There are 3 definitions of probability:

  • classical;
  • statistical;
  • geometric.

Each has its place in the study of probabilities. Probability theory, formulas and examples (Grade 9) mostly use the classic definition, which sounds like this:

  • The probability of situation A is equal to the ratio of the number of outcomes that favor its occurrence to the number of all possible outcomes.

The formula looks like this: P(A) = m / n.

Here A is the event itself. If an event opposite to A occurs, it can be written as Ā or A 1.

m is the number of possible favorable cases.

n is the number of all outcomes that can happen.

For example, A = "draw a card of the heart suit." There are 36 cards in a standard deck, 9 of them are hearts. Accordingly, the formula for solving the problem will look like:

P(A)=9/36=0.25.

As a result, the probability that a heart-suited card will be drawn from the deck will be 0.25.
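The same calculation in code, together with a simulation check (the simulation is only an illustration of the classical result):

```python
# Sketch: classical probability P(A) = m / n for drawing a heart from a
# 36-card deck, plus a quick simulation check.
import random

n_cards = 36
hearts = 9
print("classical:", hearts / n_cards)  # 0.25

deck = ["hearts"] * hearts + ["other"] * (n_cards - hearts)
trials = 100_000
draws = sum(random.choice(deck) == "hearts" for _ in range(trials))
print("simulated:", draws / trials)
```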

Moving on to higher mathematics

By now it has become a little clearer what probability theory is, along with the formulas and examples of solving problems that come up in the school curriculum. However, probability theory is also found in higher mathematics, which is taught in universities. There, one mostly operates with the geometric and statistical definitions of probability and with more complex formulas.

Probability theory is very interesting. Formulas and examples (higher mathematics) are better studied starting small: with the statistical (or frequency) definition of probability.

The statistical approach does not contradict the classical one but slightly expands it. If in the first case it was necessary to determine with what degree of probability an event will occur, then in this method it is necessary to indicate how often it actually occurs. Here a new concept of "relative frequency" is introduced, denoted W n (A). The formula is no different from the classical one: W n (A) = m / n, where m is the number of trials in which event A occurred and n is the total number of trials.

If the classical formula is calculated for forecasting, then the statistical one is calculated according to the results of the experiment. Take, for example, a small task.

The department of technological control checks products for quality. Among 100 products, 3 were found to be of poor quality. How to find the frequency probability of a quality product?

A = "the appearance of a quality product."

W n (A)=97/100=0.97

Thus, the frequency of a quality product is 0.97. Where did you get 97 from? Of the 100 products that were checked, 3 turned out to be of poor quality. We subtract 3 from 100, we get 97, this is the quantity of a quality product.

A bit about combinatorics

Another tool of probability theory is combinatorics. Its basic principle is that if a certain choice A can be made in m different ways, and a choice B in n different ways, then the choice of both A and B can be made in m × n ways.

For example, there are 5 roads from city A to city B. There are 4 routes from city B to city C. How many ways are there to get from city A to city C?

It's simple: 5x4 = 20, that is, there are twenty different ways to get from point A to point C.

Let's make the task harder. How many ways are there to lay out the cards in solitaire? The starting point is a deck of 36 cards. To find the number of ways, you need to set aside one card from the remaining deck at each step and multiply the counts.

That is, 36x35x34x33x32…x2x1= the result does not fit on the calculator screen, so it can simply be denoted as 36!. Sign "!" next to the number indicates that the entire series of numbers is multiplied among themselves.

In combinatorics, there are such concepts as permutation, placement and combination. Each of them has its own formula.

An ordered selection of elements of a set is called an arrangement (placement). Placements can be with repetition, meaning one element can be used several times, or without repetition, when the elements are not repeated. Here n is the total number of elements and m is the number of elements that participate in the placement. The formula for placements without repetition looks like:

A n m =n!/(n-m)!

Arrangements of n elements that differ only in the order of the elements are called permutations. In mathematics this looks like: P n = n!

Combinations of n elements taken m at a time are selections in which it matters only which elements are taken and how many of them there are, not their order. The formula looks like:

C n m = n! / (m! (n-m)!)
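Python's standard library already provides these three quantities, so the formulas can be checked directly (the values n = 5, m = 3 are arbitrary):

```python
# Sketch: the three combinatorial formulas via the standard library.
# math.perm(n, m) = n!/(n-m)!, math.factorial(n) = n!, math.comb(n, m) = n!/(m!(n-m)!)
import math

n, m = 5, 3
print("arrangements A(n, m):", math.perm(n, m))    # 60
print("permutations P(n)   :", math.factorial(n))  # 120
print("combinations C(n, m):", math.comb(n, m))    # 10
```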

Bernoulli formula

In the theory of probability, as well as in every discipline, there are works of outstanding researchers in their field who have taken it to a new level. One of these works is the Bernoulli formula, which allows you to determine the probability of a certain event occurring under independent conditions. This suggests that the appearance of A in an experiment does not depend on the appearance or non-occurrence of the same event in previous or subsequent tests.

Bernoulli equation:

P n (m) = C n m ×p m ×q n-m .

The probability (p) of the occurrence of the event (A) is unchanged for each trial. The probability that the situation will happen exactly m times in n number of experiments will be calculated by the formula that is presented above. Accordingly, the question arises of how to find out the number q.

If event A occurs with probability p, then correspondingly it may also not occur. The probabilities of all outcomes together make up one. Therefore, q is the number that indicates the probability that the event does not occur, and q = 1 − p.

Now you know the Bernoulli formula (probability theory). Examples of problem solving (the first level) will be considered below.

Task 2: A store visitor will make a purchase with a probability of 0.2. 6 visitors entered the store independently. What is the probability that a visitor will make a purchase?

Solution: Since it is not known how many visitors should make a purchase, one or all six, it is necessary to calculate all possible probabilities using the Bernoulli formula.

A = "the visitor will make a purchase."

In this case: p = 0.2 (as indicated in the task). Accordingly, q=1-0.2 = 0.8.

n = 6 (because there are 6 customers in the store). The number m will change from 0 (no customer will make a purchase) to 6 (all store visitors will purchase something). As a result, we get the solution:

P 6 (0) = C 6 0 × p 0 × q 6 = q 6 = (0.8) 6 = 0.2621.

The probability that none of the visitors makes a purchase is 0.2621.

How else is the Bernoulli formula (probability theory) used? Examples of problem solving (second level) below.

After the above example, questions may arise about where C and p have gone. With respect to p, a number raised to the power of 0 is equal to one. As for C, it can be found by the formula:

C n m = n! /m!(n-m)!

Since in the first example m = 0, respectively, C=1, which in principle does not affect the result. Using the new formula, let's try to find out what is the probability of buying goods by two visitors.

P 6 (2) = C 6 2 ×p 2 ×q 4 = (6×5×4×3×2×1) / (2×1×4×3×2×1) × (0.2) 2 × (0.8) 4 = 15 × 0.04 × 0.4096 = 0.246.

The theory of probability is not so complicated. The Bernoulli formula, examples of which are presented above, is a direct proof of this.
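A small sketch of Bernoulli's formula applied to this store example (same assumptions as above: n = 6 visitors, p = 0.2) computes P 6 (m) for every m and reproduces the two values obtained by hand:

```python
# Sketch: Bernoulli's formula P_n(m) = C(n, m) * p^m * q^(n-m)
# for the store example (n = 6 visitors, p = 0.2).
import math

def bernoulli(n: int, m: int, p: float) -> float:
    q = 1.0 - p
    return math.comb(n, m) * p**m * q**(n - m)

n, p = 6, 0.2
for m in range(n + 1):
    print(f"P_6({m}) = {bernoulli(n, m, p):.4f}")
# P_6(0) ~ 0.2621 and P_6(2) ~ 0.2458, matching the values computed above.
```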

Poisson formula

The Poisson equation is used to calculate unlikely random situations.

Basic formula:

P n (m)=λ m /m! × e (-λ) .

In this case, λ = n x p. Here is such a simple Poisson formula (probability theory). Examples of problem solving will be considered below.

Task 3: A factory produced 100,000 parts. The probability that a part is defective is 0.0001. What is the probability that there will be 5 defective parts in the batch?

As you can see, a defect is an unlikely event, and therefore the Poisson formula (probability theory) is used for the calculation. Examples of solving problems of this kind are no different from the other tasks of the discipline; we substitute the necessary data into the above formula:

A = "a randomly selected part will be defective."

p = 0.0001 (according to the assignment condition).

n = 100000 (number of parts).

m = 5 (defective parts). We substitute the data in the formula and get:

P 100000 (5) = 10 5 / 5! × e -10 ≈ 0.0378.

Just like the Bernoulli formula (probability theory), examples of solutions using which are given above, the Poisson equation contains the number e (the base of the natural logarithm, approximately 2.71828). The factor e -λ can be found from the limit:

e -λ = lim n→∞ (1 - λ/n) n .

However, there are special tables that contain almost all the values of e -λ .
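For this defective-parts example, the Poisson value can also be compared with the exact binomial probability; the sketch below does both, simply re-using the task's numbers:

```python
# Sketch: Poisson approximation P_n(m) ~ lambda^m / m! * e^(-lambda), lambda = n*p,
# for n = 100000, p = 0.0001, m = 5, compared with the exact binomial probability.
import math

n, p, m = 100_000, 0.0001, 5
lam = n * p  # lambda = 10

poisson = lam**m / math.factorial(m) * math.exp(-lam)
binomial = math.comb(n, m) * p**m * (1 - p)**(n - m)
print(f"Poisson : {poisson:.4f}")   # ~ 0.0378
print(f"Binomial: {binomial:.4f}")  # very close for small p and large n
```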

De Moivre-Laplace theorem

If in the Bernoulli scheme the number of trials is large enough, and the probability of occurrence of event A in all schemes is the same, then the probability of occurrence of event A a certain number of times in a series of trials can be found by the Laplace formula:

P n (m) = 1/√(npq) × ϕ(x m),

x m = (m - np)/√(npq).

To better remember the Laplace formula (probability theory), an example is worked through below.

First we find x m. In this example the data are n = 800 trials, p = 1/3, q = 2/3 and m = 267 (they appear in the calculation below); substituting them into the formula gives 0.025. Using the tables, we find the number ϕ(0.025), whose value is 0.3988. Now we can substitute all the data into the formula:

P 800 (267) = 1/√(800 × 1/3 × 2/3) × 0.3988 = (3/40) × 0.3988 ≈ 0.03.

So the probability that the flyer will hit exactly 267 times is 0.03.
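A sketch of the same calculation (using the parameters n = 800, p = 1/3, m = 267 inferred from the worked solution above, with the standard normal density written out by hand):

```python
# Sketch: local de Moivre-Laplace approximation
# P_n(m) ~ phi(x_m) / sqrt(npq), where x_m = (m - np) / sqrt(npq).
import math

def phi(x: float) -> float:
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

n, p, m = 800, 1 / 3, 267
q = 1 - p
sigma = math.sqrt(n * p * q)
x_m = (m - n * p) / sigma
approx = phi(x_m) / sigma
exact = math.comb(n, m) * p**m * q**(n - m)
print(f"x_m = {x_m:.3f}, approximation = {approx:.4f}, exact binomial = {exact:.4f}")
```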

Bayes formula

The Bayes formula (probability theory), examples of solving tasks using which will be given below, is an equation that describes the probability of an event based on the circumstances that could be associated with it. The main formula is as follows:

P (A|B) = P (B|A) x P (A) / P (B).

A and B are definite events.

P(A|B) - conditional probability, that is, event A can occur, provided that event B is true.

P(B|A) is the conditional probability of event B given that event A has occurred.

So, the final part of the short course "Theory of Probability" is the Bayes formula, examples of solving problems with which are below.

Task 5: Phones from three companies were brought to the warehouse. At the same time, part of the phones that are manufactured at the first plant is 25%, at the second - 60%, at the third - 15%. It is also known that the average percentage of defective products at the first factory is 2%, at the second - 4%, and at the third - 1%. It is necessary to find the probability that a randomly selected phone will be defective.

A = "randomly taken phone."

B 1 is the event that the phone was made by the first factory. Accordingly, hypotheses B 2 and B 3 appear (for the second and third factories).

As a result, we get:

P(B 1) = 25% / 100% = 0.25; P(B 2) = 0.6; P(B 3) = 0.15 - so we found the probability of each option.

Now you need to find the conditional probabilities of the desired event, that is, the probability of defective products in firms:

P(A|B 1) = 2% / 100% = 0.02;

P(A|B 2) = 0.04;

P(A|B 3) = 0.01.

Now we substitute the data into the total probability formula (the denominator of the Bayes formula) and get:

P(A) = 0.25 × 0.02 + 0.6 × 0.04 + 0.15 × 0.01 = 0.0305.
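The same total-probability computation in code, plus, as an aside, the Bayes posteriors P(B i | A) that the Bayes formula itself gives (the factory data are those of Task 5):

```python
# Sketch: total probability of a defect across the three factories (Task 5)
# and the Bayes posterior probability of each factory given a defect.
factories = {
    "factory 1": (0.25, 0.02),  # (share of phones, defect rate)
    "factory 2": (0.60, 0.04),
    "factory 3": (0.15, 0.01),
}

p_defective = sum(share * rate for share, rate in factories.values())
print("P(defective) =", round(p_defective, 4))  # 0.0305

for name, (share, rate) in factories.items():
    posterior = share * rate / p_defective       # Bayes formula
    print(f"P({name} | defective) = {posterior:.3f}")
```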

The article has presented probability theory, formulas, and examples of problem solving, but this is only the tip of the iceberg of a vast discipline. And after all that has been written, it is logical to ask whether probability theory is needed in life. It is difficult for an ordinary person to answer; it is better to ask someone who has hit the jackpot more than once with its help.

The earliest probabilistic notions referred to the properties of real events and were formulated in visual, intuitive terms. The earliest works of scientists in the field of probability theory date back to the 17th century. While researching the prediction of winnings in gambling, Blaise Pascal and Pierre Fermat discovered the first probabilistic patterns that occur when rolling dice. Under the influence of the questions they raised and considered, Christian Huygens took up the same problems. He was not familiar with the correspondence between Pascal and Fermat, so he invented the solution technique on his own. His work, which introduces the basic concepts of probability theory (the concept of probability as the magnitude of a chance; mathematical expectation for discrete cases, in the form of the price of a chance), and also uses the theorems of addition and multiplication of probabilities (not explicitly formulated), appeared in print (1657) about twenty years before the publication of the letters of Pascal and Fermat (1679).

An important contribution to the theory of probability was made by Jacob Bernoulli: he gave a proof of the law of large numbers in the simplest case of independent trials. In the first half of the 19th century, probability theory began to be applied to the analysis of observational errors; Laplace and Poisson proved the first limit theorems. In the second half of the 19th century, the main contribution was made by Russian scientists P. L. Chebyshev, A. A. Markov and A. M. Lyapunov. During this time, the law of large numbers, the central limit theorem, and the theory of Markov chains were developed. The theory of probability received its modern form thanks to the axiomatization proposed by Andrey Nikolaevich Kolmogorov. As a result, the theory of probability acquired a rigorous mathematical form and finally began to be perceived as one of the branches of mathematics.

