
Why is the Poisson formula called the formula of rare phenomena? The Poisson distribution and the Poisson formula

Suppose each of 50 people comes to a meeting, independently of the others, with probability p = 0.7. Find the most probable number m_0 of people who will come to the meeting and the corresponding probability P_50(m_0).

Solution. Since P_50(m_0) = C(50, m_0)·(0.7)^{m_0}·(0.3)^{50−m_0}, the problem is to find the non-negative integer m_0 ≤ 50 that maximizes the function P_50(m_0). We saw above that such a number is given by formula (6.4); in this case it gives m_0 = 35, and

P_50(35) = C(50, 35)·(0.7)^{35}·(0.3)^{15} ≈ 0.123.
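The result can also be checked numerically; the following Python sketch (illustrative, not part of the original text) searches for the maximizing value of the binomial pmf directly.

```python
# Sketch: locate the most probable value of a Binomial(50, 0.7) count directly.
from scipy.stats import binom

n, p = 50, 0.7
pmf = [binom.pmf(m, n, p) for m in range(n + 1)]
m0 = max(range(n + 1), key=lambda m: pmf[m])   # argmax of the pmf

print(m0, pmf[m0])   # m0 = 35; the probability is roughly 0.12, consistent with the text
```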

6.4. Poisson formula

Formulas (6.1) and (6.3) give the exact probabilities associated with the scheme of independent Bernoulli trials. However, calculations using these formulas, especially for large values of n and m, are very laborious. It is therefore of great practical interest to obtain reasonably simple approximate formulas for the corresponding probabilities. The first such formula was derived in 1837 by the French mathematician and physicist Siméon Denis Poisson (1781–1840). Poisson's result is formulated below.

Consider a Bernoulli scheme of independent trials in which the number of trials n is "relatively large", the probability of "success" p is "relatively small", and the product λ = np is "neither small nor large"⁴¹. Under these conditions, the following approximate formula holds:

b(k; n, p) ≈ (λ^k / k!)·e^{−λ},   k = 0, 1, 2, …   (6.6)

This is the famous Poisson approximation for the binomial distribution . The proof of formula (6.6) will be given in the appendix to this section.

41 The exact meaning of the quoted terms will be explained below, in particular in § 6e.

The function on the right side of formula (6.6) is called the Poisson distribution:

p(k, λ) = (λ^k / k!)·e^{−λ},   k = 0, 1, 2, …

With this notation, p(k, λ) is an approximate expression for the probability b(k; n, λ/n) when n is "large enough".

Before discussing formula (6.6), let us give a few illustrative examples of its use.

The values of the binomial distribution and of the Poisson distribution for n = 100, p = 0.01, λ = 1 are presented in Table 6.2. As we can see, the accuracy of the approximate formula is quite high.

The larger n, the higher the accuracy of Poisson's formula. This is illustrated by the following example. Let us calculate the probability p_k that in a group of 500 people exactly k people were born on one specific day of the year. If these 500 people are chosen at random, then the Bernoulli scheme with n = 500 trials and probability of "success" p = 1/365 applies. Calculations by the exact formula (6.1) and by the approximate formula (6.6) with λ = 500/365 ≈ 1.3699 are presented in Table 6.3. As we can see, the error appears only in the fourth decimal place, which is quite acceptable in practice.

Table 6.2. Comparison of the binomial probabilities b(k; 100, 1/100) with the Poisson probabilities p(k; 1).

Table 6.3. Comparison of the binomial probabilities b(k; 500, 1/365) with the Poisson probabilities p(k, λ), λ = 500/365.
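The comparisons behind Tables 6.2 and 6.3 can be reproduced with a short Python sketch (illustrative; the helper function names are ours):

```python
# Sketch: binomial probabilities b(k; n, p) versus the Poisson approximation p(k, lambda).
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

# Table 6.2: n = 100, p = 0.01, lambda = 1
for k in range(6):
    print(k, round(binom_pmf(k, 100, 0.01), 5), round(poisson_pmf(k, 1.0), 5))

# Table 6.3: n = 500, p = 1/365, lambda = 500/365
lam = 500 / 365
for k in range(6):
    print(k, round(binom_pmf(k, 500, 1 / 365), 5), round(poisson_pmf(k, lam), 5))
```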

Consider the following typical example of applying Poisson's formula.

Suppose it is known that the probability of a "failure" in the operation of a telephone exchange for each call is 0.002, and that 1000 calls are received. Determine the probability that 7 "failures" occur in this case.

Solution. It is natural to assume that under normal conditions the calls arriving at the telephone exchange are independent of one another. We take a "success" in a trial (a call) to mean a failure of the telephone exchange. The probability of a failure (p = 0.002) can be considered "sufficiently small", and the number of calls (n = 1000) "sufficiently large". Thus we are within the conditions of Poisson's theorem. For the parameter λ we obtain λ = np = 1000 · 0.002 = 2, and the desired probability is p(7, 2) = (2⁷/7!)·e^{−2} ≈ 0.0034.

Let us now discuss the limits of applicability of the Poisson formula.

When using any approximate formula, the question of the limits of its applicability naturally arises. Here we encounter two aspects of the problem. Firstly, it is natural to ask under what real conditions Poisson's law is applicable. Experience shows that the Poisson distribution has a relatively universal applicability. In general, from the point of view of applications, mathematical theorems can be good or bad in the following sense: good theorems continue to work even if their conditions are violated, while bad ones immediately cease to be true once the conditions of their derivation are violated. Poisson's theorem (6.6) is good, and even excellent, in this sense. Namely, Poisson's law continues to work even when the conditions of the Bernoulli scheme are violated (one can allow a variable probability of success and even a not too strong dependence between the results of individual trials)⁴². One could even argue that the Poisson distribution is universally applicable in the following sense: if the experimental data show that Poisson's law does not hold, while, according to common sense, it should, then it is more natural to question the statistical stability of our data than to look for some other distribution law. In other words, the Poisson distribution is a very successful mathematical formulation of one of the universal (within the applicability of probability theory) laws of nature.

Secondly, the question arises about the orders of magnitude of the parameters that enter the Poisson formula, for which we used above the vague terms "relatively large", "relatively small", and "neither small nor large". Again, the practice of applying formula (6.6) provides clarifying answers. It turns out that Poisson's formula is accurate enough for practical use if the number of trials n is of the order of

42 Naturally, these features of the Poisson distribution should not be abused. For example, Poisson's law is obviously violated in situations where the results of individual tests are highly dependent.

several tens (preferably hundreds), and the value of the parameter λ = np lies in the range from 0 to 10.

To illustrate the application of Poisson's formula, consider another example.

Suppose that 10,000 raisins are allotted for baking 1000 sweet raisin buns. It is required to find the distribution of the number of raisins in a randomly selected bun.

Solution. We form the sequence of independent trials as follows. There will be n = 10,000 trials in total (one per raisin): trial number k consists in determining whether raisin number k got into our randomly selected bun⁴³. Since there are 1000 buns in total, the probability that the k-th raisin got into our bun is p = 1/1000 (assuming the dough is well mixed when the buns are prepared). We now apply the Poisson distribution with parameter λ = np = 10,000 · (1/1000) = 10. We get:

P_10000(k) ≈ p(k, 10) = (10^k / k!)·e^{−10}.

In particular, the probability that we get a bun without any raisins at all (k = 0) is e^{−10} ≈ 0.5·10^{−4}. The most probable number of raisins, according to formula (6.4), is 10. The corresponding probability is

P_10000(10) ≈ (10^{10} / 10!)·e^{−10} ≈ 0.125.

The buns-and-raisins example, despite its mundane wording, is very general. Instead of raisins in buns one can speak, for example, of the number of bacteria in a drop of water taken from a well-mixed bucket. Another example: assume that the atoms of a radioactive substance decay independently of each other, and that during a given time interval the decay of a given atom occurs with a certain small probability.

43 Note that the purchase of a bun in a store can be viewed as a random choice.

Let repeated trials be carried out according to the Bernoulli scheme, let the number of trials n be large and the probability p of the observed event in a single trial be small, and let the parameter λ = np be constant. Then for the probability P_n(m) that the event appears m times in n trials, the following relation holds:

lim_{n→∞} P_n(m) = (λ^m / m!)·e^{−λ}.  (3.1)

When calculating the probability P_n(m) in such a random experiment, one can use the approximate formula

P_n(m) ≈ (λ^m / m!)·e^{−λ},  (3.2)

which is called the Poisson formula; the number λ = np is called the Poisson parameter.

Task 3.1. The probability of producing a defective item in the manufacture of a certain product is 0.008. Find the probability that an inspection of 500 items reveals no more than two defective items.

Solution: since the probability is small and the number of trials is large, we can apply the Poisson formula with parameter λ = np = 500 · 0.008 = 4. The desired probability is the probability of the sum of three events: two defective items, one defective item, or none. So P(X ≤ 2) = e^{−4}·(1 + 4 + 4²/2!) = 13·e^{−4} ≈ 0.238.
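A short Python sketch of this calculation (illustrative only; λ = 4 as above):

```python
# Sketch for Task 3.1: P(X <= 2) for a Poisson parameter lambda = 500 * 0.008 = 4.
from math import exp, factorial

lam = 500 * 0.008
prob = sum(lam**k * exp(-lam) / factorial(k) for k in range(3))
print(round(prob, 4))   # ~0.2381
```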

Definition 3.1

A flow of events is a sequence of events occurring at random moments of time.

For example, flows of events include calls arriving at a telephone exchange (PBX), signals during a radio session, messages arriving at a server, etc.

Definition 3.2

A flow of events is called Poisson (or simplest) if it has the following properties:

1. Stationarity: the flow intensity λ is constant.

2. Ordinariness: the occurrence of two or more events in a small time interval is practically impossible.

3. Absence of aftereffect: the probability that a given number of events occurs in a time interval does not depend on how many events appeared in any other non-overlapping interval.

If we denote by P_t(m) the probability that m events of a Poisson flow with intensity λ occur during time t, then the following formula holds:

P_t(m) = ((λt)^m / m!)·e^{−λt}.  (3.3)

Task 3.2. An insurance company serves 10,000 clients. The probability that a customer will contact the company within one day is 0.0003. What is the probability that 4 customers will contact it within two days?



Solution. The intensity of the flow of customers during one day is λ = np = 10,000 · 0.0003 = 3.

Hence, for two days λt = 3 · 2 = 6, and the desired probability is P_t(4) = (6⁴/4!)·e^{−6} ≈ 0.134.
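The same value can be checked with a small Python sketch (assuming, as above, a Poisson flow with intensity 3 clients per day observed for two days):

```python
# Sketch for Task 3.2: Poisson flow, intensity 3 clients/day, interval t = 2 days.
from math import exp, factorial

lam = 10_000 * 0.0003      # 3 clients per day on average
a = lam * 2                # expected number of clients in two days
print(round(a**4 * exp(-a) / factorial(4), 4))   # ~0.1339
```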

The solutions of Tasks 3.1 and 3.2 in the Mathcad environment are shown in Fig. 3.

Task 3.3. The probability of a failure of the subway turnstile reader within an hour is small. Find this probability, given that the probability of at least one failure in 8 hours is 0.98 and that on average 1000 people pass through the turnstile per hour.

Solution. According to formulas (1.3) and (3.3) with t = 8, the probability that there will be at least one failure within 8 hours is: P(m ≥ 1) = 1 − e^{−8λ} = 0.98.

Solving this equation for λ (for example, with Mathcad's symbolic commands) then gives the desired probability.
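Since the Mathcad worksheet is not reproduced here, the following Python sketch shows one plausible reading of the computation: λ denotes the hourly failure intensity, and the per-passage probability is obtained by dividing by 1000 passages per hour (both names are our assumptions, not from the text).

```python
# Sketch for Task 3.3 (one plausible reading): failures form a Poisson flow and
# 1 - exp(-8 * lam) = 0.98, where lam is the hourly failure intensity.
from math import log

lam = log(1 / 0.02) / 8        # ~0.489 failures per hour
p_per_passage = lam / 1000     # ~4.9e-4, assuming 1000 passages per hour
print(lam, p_per_passage)
```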

Consider the wave equation

∂²u/∂t² = a²·Δu,

where the function u = u(x, t) is defined for x ∈ ℝⁿ and t > 0.

This equation describes the propagation of a traveling wave in an n-dimensional homogeneous medium with speed a at times t > 0.

For the solution to be unique, initial conditions must be specified. The initial conditions describe the state of space (the "initial perturbation", as it is called) at the moment t = 0:

u|_{t=0} = φ(x),  ∂u/∂t|_{t=0} = ψ(x).

Then the generalized Kirchhoff formula gives a solution to this problem.

Kirchhoff himself considered only the three-dimensional case.

The idea of obtaining the solution

A simple derivation of the solution to the main problem uses the Fourier transform. The generalized Kirchhoff formula has the following form:

.

If the wave equation has a right-hand side f, a corresponding term appears on the right-hand side of the formula:

Physical Consequences

The leading and trailing wave fronts from a disturbance localized in space act on the observer for a limited period of time

Suppose that at the initial time t = 0 there is a local perturbation (in φ and/or ψ) on some compact set M. If the observer is at a point x₀, then, as can be seen from the formula (from the domain of integration), the perturbation will be felt only after a time equal to the distance from x₀ to M divided by a.

Outside a bounded time interval, whose endpoints are determined by the smallest and largest distances from x₀ to M divided by a, the function u(x₀, t) is equal to zero.

Thus, an initial perturbation localized in space produces at each point of space an action localized in time; that is, the perturbation propagates as a wave with leading and trailing fronts, which expresses Huygens' principle. In the plane this principle is violated: the reason is that a perturbation carrier which is compact in the plane, under the method of descent to three dimensions, is no longer compact but forms an infinite cylinder, and consequently the perturbation is unbounded in time (cylindrical waves have no trailing edge).

Poisson-Parseval formula

The solution of the membrane oscillation equation

∂²u/∂t² = a²·(∂²u/∂x² + ∂²u/∂y²) + f(x, y, t)

(the function f(x, y, t) corresponds to the driving force) with initial conditions

u|_{t=0} = φ(x, y),  ∂u/∂t|_{t=0} = ψ(x, y)

is given by the formula

u(x, y, t) = (1/(2πa)) ∫₀ᵗ ∬_{r<a(t−τ)} f(ξ, η, τ) dξ dη dτ / √(a²(t−τ)² − r²)
  + (1/(2πa)) ∬_{r<at} ψ(ξ, η) dξ dη / √(a²t² − r²)
  + ∂/∂t [ (1/(2πa)) ∬_{r<at} φ(ξ, η) dξ dη / √(a²t² − r²) ],

where r² = (ξ − x)² + (η − y)².

d'Alembert's formula

The solution of the one-dimensional wave equation

∂²u/∂t² = a²·∂²u/∂x² + f(x, t)

(the function f(x, t) corresponds to the driving force) with initial conditions

u|_{t=0} = φ(x),  ∂u/∂t|_{t=0} = ψ(x)

has the form

u(x, t) = (φ(x + at) + φ(x − at))/2 + (1/(2a)) ∫_{x−at}^{x+at} ψ(s) ds + (1/(2a)) ∫₀ᵗ ∫_{x−a(t−τ)}^{x+a(t−τ)} f(s, τ) ds dτ.

Figure: only one family of characteristics reaches region II.

When using d'Alembert's formula, it should be kept in mind that the solution is sometimes not unique in the entire region under consideration. The solution of the wave equation is represented as the sum of two functions, u(x, t) = f(x + at) + g(x − at), that is, it is determined by two families of characteristics, x + at = const and x − at = const. The example shown in the figure on the right illustrates the wave equation for a semi-infinite string, with the initial conditions given only on the green half-line x ≥ 0. It can be seen that both ξ-characteristics and η-characteristics reach region I, while only ξ-characteristics reach region II. That is, d'Alembert's formula does not work in region II.

Application of formulas

In general, the Kirchhoff formula is rather cumbersome, and therefore solving problems of mathematical physics with its help is usually difficult. However, one can use the linearity of the wave equation with initial conditions and look for a solution in the form of a sum of three functions: u(x,t) = A(x,t) + B(x,t) + C(x,t) , which satisfy the following conditions:

By itself, such an operation does not simplify the use of the Kirchhoff formula, but for some problems it is possible to select a solution, or reduce a multidimensional problem to a one-dimensional one by changing variables. For example, let . Then, making the substitution ξ = x + 3y − 2z , the equation for problem "C" will take the form:

Thus, we arrive at a one-dimensional equation, which means that we can use d'Alembert's formula:

Since the initial condition is even, the solution retains this form in the entire region t > 0.



P_n(m) = (λ^m / m!)·e^{−λ},

where λ is the average number of occurrences of the event in n independent trials, i.e. λ = n·p, p is the probability of the event in a single trial, and e = 2.71828…

The distribution series of Poisson's law has the form:

X:  0        1         2               …    m                 …
P:  e^{−λ}   λ·e^{−λ}   (λ²/2!)·e^{−λ}   …    (λ^m/m!)·e^{−λ}   …
In the case when n is large and λ = np > 10, the Poisson formula gives a very rough approximation, and the local and integral de Moivre–Laplace theorems are used to calculate P_n(m).

Numerical characteristics of a random variable X

The mathematical expectation of the Poisson distribution
M[X] = λ

Poisson distribution variance
D[X] = λ

Example #1. The seeds contain 0.1% weeds. What is the probability of finding 5 weed seeds in a random selection of 2000 seeds?
Solution.
The probability p is small and the number n is large: λ = np = 2000 · 0.001 = 2. P(5) = (λ⁵/5!)·e^{−λ} = (2⁵/5!)·e^{−2} ≈ 0.03609.
Expected value: M[X] = λ = 2
Variance: D[X] = λ = 2
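A one-line Python check of this example (illustrative):

```python
# Sketch for Example #1: lambda = 2000 * 0.001 = 2, probability of exactly 5 weed seeds.
from math import exp, factorial

lam = 2000 * 0.001
print(round(lam**5 * exp(-lam) / factorial(5), 5))   # ~0.03609
```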

Example #2. There are 0.4% weed seeds among rye seeds. Draw up the distribution law of the number of weed seeds in a random selection of 5000 seeds. Find the mathematical expectation and variance of this random variable.
Solution. Expectation: M[X] = λ = 0.004 · 5000 = 20. Variance: D[X] = λ = 20
Distribution law:

X:  0        1          2           …    m
P:  e^{−20}  20·e^{−20}  200·e^{−20}  …    (20^m/m!)·e^{−20}

Example #3. At the telephone exchange, an incorrect connection occurs with a probability of 1/200. Find the probability that among 200 connections there will be:
a) exactly one wrong connection;
b) less than three incorrect connections;
c) more than two incorrect connections.
Solution. According to the conditions of the problem, the probability of the event is small, so we use the Poisson formula.
a) Given: n = 200, p = 1/200, k = 1. Find P_200(1).
We get λ = np = 200 · (1/200) = 1. Then P_200(1) ≈ λ·e^{−λ} = e^{−1} ≈ 0.3679.
b) Given: n = 200, p = 1/200, k < 3. Find P_200(k < 3).
We have λ = 1, so P_200(k < 3) ≈ P(0) + P(1) + P(2) = e^{−1}·(1 + 1 + 1/2) ≈ 0.9197.

c) Given: n = 200, p = 1/200, k > 2. Find P_200(k > 2).
This problem can be solved more simply by finding the probability of the opposite event, since in this case fewer terms need to be calculated. Taking into account the previous case, we have P_200(k > 2) = 1 − P_200(k < 3) ≈ 1 − 0.9197 = 0.0803.
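The three parts of Example #3 can be verified with a short Python sketch (the helper pmf is ours):

```python
# Sketch for Example #3: lambda = 200 * (1/200) = 1.
from math import exp, factorial

lam = 1.0
pmf = lambda k: lam**k * exp(-lam) / factorial(k)

print(round(pmf(1), 4))                                # a) exactly one: ~0.3679
print(round(sum(pmf(k) for k in range(3)), 4))         # b) fewer than three: ~0.9197
print(round(1 - sum(pmf(k) for k in range(3)), 4))     # c) more than two: ~0.0803
```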

Consider the case where n is large enough and p is small enough; put np = a, where a is some number. In this case the desired probability is determined by the Poisson formula:

P_n(k) ≈ (a^k / k!)·e^{−a}.

The probability of occurrence of k events during a time interval of length t can also be found using the Poisson formula:

P_t(k) = ((λt)^k / k!)·e^{−λt},

where λ is the intensity of the flow of events, that is, the average number of events that appear per unit time.

Example #4. The probability that a part is defective is 0.005, and 400 parts are checked. Specify the formula for calculating the probability that more than 3 parts are defective: with λ = np = 400 · 0.005 = 2, the required probability is P(X > 3) = 1 − e^{−2}·(1 + 2 + 2²/2! + 2³/3!).

Example #5. The probability of a defective part appearing in mass production is p. Determine the probability that a batch of N parts contains a) exactly three defective parts; b) no more than three defective parts.
p=0.001; N=4500
Solution.
The probability p is small and the number n is large; np = 4.5 < 10, so the random variable X follows the Poisson distribution. Let us write down the distribution law.
The random variable X has possible values 0, 1, 2, …, m, … . The probabilities of these values can be found by the formula

P(k) = (λ^k / k!)·e^{−λ}.
Let us find the distribution series of X. Here λ = np = 4500 · 0.001 = 4.5.
P(0) = e^{−λ} = e^{−4.5} = 0.01111
P(1) = λ·e^{−λ} = 4.5·e^{−4.5} = 0.04999

Then the probability that a batch of N parts contains exactly three defective parts is: P(3) = (4.5³/3!)·e^{−4.5} ≈ 0.1687.

Then the probability that a batch of N parts contains no more than three defective parts is:
P(X ≤ 3) = P(0) + P(1) + P(2) + P(3) = 0.01111 + 0.04999 + 0.11248 + 0.16872 ≈ 0.3423
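A Python sketch of both parts (illustrative; λ = 4.5):

```python
# Sketch for Example #5: lambda = 4500 * 0.001 = 4.5.
from math import exp, factorial

lam = 4500 * 0.001
pmf = lambda k: lam**k * exp(-lam) / factorial(k)

print(round(pmf(3), 4))                            # a) exactly three defective: ~0.1687
print(round(sum(pmf(k) for k in range(4)), 4))     # b) no more than three: ~0.3423
```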

Example #6. An automatic telephone exchange receives, on average, N calls per hour. Determine the probability that in a given minute it receives: a) exactly two calls; b) more than two calls.
N = 18
Solution.
In one minute the exchange receives on average λ = 18/60 = 0.3 calls.
Assuming that the random number X of calls received at the exchange in one minute obeys Poisson's law, we find the required probabilities by the formula P(k) = (λ^k/k!)·e^{−λ}.

Let us find the distribution series of X. Here λ = 0.3.
P(0) = e^{−λ} = e^{−0.3} = 0.7408
P(1) = λ·e^{−λ} = 0.3·e^{−0.3} = 0.2222

The probability that it receives exactly two calls in a given minute is:
P(2) = (0.3²/2!)·e^{−0.3} = 0.03334
The probability that it receives more than two calls in a given minute is:
P(X > 2) = 1 − 0.7408 − 0.2222 − 0.03334 = 0.00366
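A Python sketch of both parts (illustrative; λ = 0.3 calls per minute):

```python
# Sketch for Example #6: 18 calls per hour => lambda = 0.3 calls per minute.
from math import exp, factorial

lam = 18 / 60
pmf = lambda k: lam**k * exp(-lam) / factorial(k)

print(round(pmf(2), 5))                              # a) exactly two calls: ~0.03334
print(round(1 - sum(pmf(k) for k in range(3)), 5))   # b) more than two calls: ~0.0036
```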

Example #7. Consider two elements that operate independently of each other. The duration of failure-free operation has an exponential distribution with parameter λ_1 = 0.02 for the first element and λ_2 = 0.05 for the second. Find the probability that in 10 hours: a) both elements operate flawlessly; b) only one element fails.
Solution.
The probability that element #1 will not fail in 10 hours is:
P_1(0) = e^{−λ_1·t} = e^{−0.02·10} = 0.8187

The probability that element #2 will not fail in 10 hours is:
P_2(0) = e^{−λ_2·t} = e^{−0.05·10} = 0.6065

a) Both elements operate flawlessly:
P(2) = P_1(0)·P_2(0) = 0.8187·0.6065 = 0.4966
b) Only one element fails:
P(1) = P_1(0)·(1 − P_2(0)) + (1 − P_1(0))·P_2(0) = 0.8187·(1 − 0.6065) + (1 − 0.8187)·0.6065 = 0.4321
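Both parts follow from the two survival probabilities; a Python sketch (illustrative):

```python
# Sketch for Example #7 (reliability): exponential lifetimes, observation time t = 10 h.
from math import exp

t = 10
p1 = exp(-0.02 * t)    # element 1 survives 10 h: ~0.8187
p2 = exp(-0.05 * t)    # element 2 survives 10 h: ~0.6065

both_ok = p1 * p2                                     # a) ~0.4966
exactly_one_fails = p1 * (1 - p2) + (1 - p1) * p2     # b) ~0.4321
print(round(both_ok, 4), round(exactly_one_fails, 4))
```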

Example #8. The production process yields 1% defective items. What is the probability that, out of 1100 items taken for inspection, no more than 17 are defective?
Note: since here np = 1100 · 0.01 = 11 > 10, one should use the integral de Moivre–Laplace theorem (the normal approximation) rather than the Poisson formula.
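For orientation, here is a hedged Python sketch of the normal (de Moivre–Laplace) approximation for this example, together with the exact binomial value for comparison (the use of scipy here is our choice, not part of the original note):

```python
# Sketch for Example #8: np = 11 > 10, so the normal approximation is preferable
# to the Poisson formula; the exact binomial value is printed for comparison.
from math import sqrt
from scipy.stats import norm, binom

n, p = 1100, 0.01
mu, sigma = n * p, sqrt(n * p * (1 - p))

approx = norm.cdf((17 - mu) / sigma) - norm.cdf((0 - mu) / sigma)
exact = binom.cdf(17, n, p)
print(round(approx, 4), round(exact, 4))   # roughly 0.96-0.97 for both
```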

In many practical problems, one has to deal with random variables distributed according to a peculiar law, which is called Poisson's law.

Consider a discrete random variable X, which can take only non-negative integer values: 0, 1, 2, …, m, …,

and the sequence of these values ​​is theoretically unlimited.

A random variable X is said to be distributed according to Poisson's law if the probability that it takes the value m is expressed by the formula

P_m = (a^m / m!)·e^{−a}   (m = 0, 1, 2, …),   (5.9.1)

where a is some positive quantity called the parameter of Poisson's law.

The distribution series of a random variable X distributed according to Poisson's law has the form:

X:  0        1         2               …    m                 …
P:  e^{−a}   a·e^{−a}   (a²/2!)·e^{−a}   …    (a^m/m!)·e^{−a}   …

Let us first of all make sure that the sequence of probabilities given by formula (5.9.1) can be a distribution series, i.e. that the sum of all probabilities is equal to one. We have:

∑_{m=0}^{∞} P_m = ∑_{m=0}^{∞} (a^m/m!)·e^{−a} = e^{−a}·∑_{m=0}^{∞} (a^m/m!) = e^{−a}·e^{a} = 1.

Fig. 5.9.1 shows the distribution polygons of a random variable distributed according to Poisson's law for different values of the parameter a. Table 8 of the appendix lists the values of P_m for various a.

Let's define the main characteristics - mathematical expectation and variance - of a random variable distributed according to the Poisson law. By definition of mathematical expectation

M[X] = ∑_{m=0}^{∞} m·(a^m/m!)·e^{−a}.

The first term of the sum (corresponding to m = 0) is equal to zero, so the summation can be started from m = 1:

M[X] = ∑_{m=1}^{∞} m·(a^m/m!)·e^{−a} = a·e^{−a}·∑_{m=1}^{∞} a^{m−1}/(m−1)!.

Let us denote m − 1 = k; then

M[X] = a·e^{−a}·∑_{k=0}^{∞} (a^k/k!) = a·e^{−a}·e^{a} = a.  (5.9.2)

Thus, the parameter a is nothing more than the mathematical expectation of the random variable X.

To determine the variance, we first find the second initial moment of the quantity X:

α_2[X] = ∑_{m=0}^{∞} m²·(a^m/m!)·e^{−a} = a·∑_{k=0}^{∞} (k + 1)·(a^k/k!)·e^{−a}.

According to what was proved previously, ∑_{k=0}^{∞} k·(a^k/k!)·e^{−a} = a; besides, ∑_{k=0}^{∞} (a^k/k!)·e^{−a} = 1, so α_2[X] = a·(a + 1) = a² + a, and

D[X] = α_2[X] − (M[X])² = a² + a − a² = a.

Thus, the variance of a random variable distributed according to Poisson's law is equal to its mathematical expectation a.

This property of the Poisson distribution is often used in practice to decide whether the hypothesis that a random variable is distributed according to Poisson's law is plausible. To do this, determine from experience the statistical characteristics - the mathematical expectation and variance - of a random variable. If their values ​​are close, then this can serve as an argument in favor of the Poisson distribution hypothesis; a sharp difference in these characteristics, on the contrary, testifies against the hypothesis.
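This diagnostic is easy to see on simulated data; the Python sketch below (illustrative, with an arbitrary parameter 3.5) shows that for genuinely Poisson data the sample mean and sample variance come out close.

```python
# Sketch: for Poisson-distributed data the sample mean and variance nearly coincide.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.poisson(lam=3.5, size=10_000)

print(sample.mean(), sample.var())   # both ~3.5
```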

For a random variable X distributed according to Poisson's law, let us determine the probability that it takes a value not less than a given k. Let us denote this probability

R_k = P(X ≥ k).

Obviously, the probability R_k can be calculated as the sum

R_k = P_k + P_{k+1} + P_{k+2} + …

However, it is much easier to determine it from the probability of the opposite event:

R_k = 1 − (P_0 + P_1 + … + P_{k−1}).  (5.9.4)

In particular, the probability that the random variable takes a positive value is expressed by the formula

R_1 = P(X > 0) = 1 − e^{−a}.  (5.9.5)

We have already mentioned that many practical tasks lead to a Poisson distribution. Consider one of the typical problems of this kind.

Let points be randomly distributed on the x-axis Ox (Fig. 5.9.2). Assume that the random distribution of points satisfies the following conditions:

1. The probability of a given number of points falling on a segment depends only on the length of this segment and does not depend on its position on the abscissa axis. In other words, the points are distributed on the axis with the same average density. Let us denote this density (i.e. the mathematical expectation of the number of points per unit length) by λ.

2. The points are distributed on the x-axis independently of each other, i.e. the probability of hitting one or another number of points on a given segment does not depend on how many of them fell on any other segment that does not overlap with it.

3. The probability of two or more points falling on a small interval is negligible compared with the probability of one point falling on it (this condition means the practical impossibility of two or more points coinciding).

Let us single out a segment of length l on the abscissa axis and consider the discrete random variable X equal to the number of points falling on this segment. Its possible values will be

0, 1, 2, …, m, …  (5.9.6)

Since the points fall on the segment independently of each other, it is theoretically possible that there will be arbitrarily many of them, i.e. the series (5.9.6) continues indefinitely.

Let us prove that the random variable X has the Poisson distribution law. To do this, we calculate the probability P_m that exactly m points fall on the segment.

Let us first solve a simpler problem. Consider a small interval Δx on the axis Ox and calculate the probability that at least one point falls on it. We argue as follows. The mathematical expectation of the number of points falling on this interval is obviously λ·Δx (since on average λ points fall per unit length). According to condition 3, for a small interval Δx we may neglect the possibility of two or more points falling on it. Therefore, the mathematical expectation λ·Δx of the number of points falling on the interval Δx will be approximately equal to the probability of one point (or, what is equivalent under our conditions, of at least one point) falling on it.

Thus, up to infinitesimals of higher order as Δx → 0, we may assume that the probability that one (at least one) point falls on the interval Δx is λ·Δx, and the probability that none falls is 1 − λ·Δx.

Let us use this to calculate the probability P_m that exactly m points fall on the segment of length l. Divide the segment into n equal parts of length Δx = l/n. Let us agree to call an elementary interval "empty" if it contains no points, and "occupied" if at least one point has fallen into it. According to the above, the probability that an interval is "occupied" is approximately λ·l/n, and the probability that it is "empty" is 1 − λ·l/n. Since, according to condition 2, the points falling into non-overlapping intervals are independent, our n intervals can be regarded as n independent "trials", in each of which an interval can become "occupied" with probability λ·l/n. Let us find the probability that among the n intervals exactly m are "occupied". According to the repetition theorem, this probability is

P_m ≈ C_n^m·(λl/n)^m·(1 − λl/n)^{n−m},

or, denoting λl = a,

P_m ≈ C_n^m·(a/n)^m·(1 − a/n)^{n−m}.  (5.9.7)

For sufficiently large n this probability is approximately equal to the probability that exactly m points fall on the segment, since the falling of two or more points into one elementary interval has negligible probability. To find the exact value of P_m, we must pass to the limit n → ∞ in expression (5.9.7):

P_m = lim_{n→∞} C_n^m·(a/n)^m·(1 − a/n)^{n−m}.  (5.9.8)

Let us transform the expression under the limit sign:

C_n^m·(a/n)^m·(1 − a/n)^{n−m} = [n(n−1)…(n−m+1)/n^m]·(a^m/m!)·[(1 − a/n)^n / (1 − a/n)^m].  (5.9.9)

The first fraction and the denominator of the last fraction in expression (5.9.9) obviously tend to unity as n → ∞. The expression a^m/m! does not depend on n. The numerator of the last fraction can be transformed as follows:

(1 − a/n)^n = [(1 − a/n)^{−n/a}]^{−a}.  (5.9.10)

As n → ∞, the expression in square brackets in (5.9.10) tends to e, so (5.9.10) tends to e^{−a}. Thus, it has been proved that the probability that exactly m points fall on the segment is expressed by the formula

P_m = (a^m/m!)·e^{−a},

where a = λl, i.e. the quantity X is distributed according to Poisson's law with parameter a = λl.

Note that the value a = λl has the meaning of the average number of points falling on the segment l.

The value R_1 = 1 − e^{−a} (the probability that X takes a positive value) in this case expresses the probability that at least one point falls on the segment.

Thus, we made sure that the Poisson distribution occurs where some points (or other elements) occupy a random position independently of each other, and the number of these points that fall into some area is counted. In our case, such an "area" was a segment on the x-axis. However, our conclusion can be easily extended to the case of distribution of points in the plane (random flat field of points) and in space (random spatial field of points). It is easy to prove that if the following conditions are met:

1) the points are distributed statistically uniformly in the field with an average density λ;

2) points fall into non-overlapping regions independently;

3) points appear singly, not in pairs, triples, etc.,

then the number of points falling into any region (flat or spatial) is distributed according to Poisson's law:

P_m = (a^m/m!)·e^{−a},

where a is the average number of points falling into the region in question.

For the flat case,

a = λ·S,

where S is the area of the region; for the spatial case,

a = λ·V,

where V is the volume of the region.
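The point-field picture can also be checked by simulation; the Python sketch below (parameter values are arbitrary) scatters points uniformly and independently over a long interval and counts how many land in a fixed segment, so that the counts should follow Poisson's law with a equal to density times length.

```python
# Sketch: empirical counts of uniformly scattered points in a fixed segment
# behave like a Poisson variable with parameter a = density * segment length.
import numpy as np

rng = np.random.default_rng(1)
density, total_len, seg_len = 2.0, 1000.0, 3.0   # points per unit length, etc.
n_points = int(density * total_len)

counts = []
for _ in range(5000):
    xs = rng.uniform(0.0, total_len, n_points)        # independent uniform points
    counts.append(np.count_nonzero(xs < seg_len))     # how many fall in [0, seg_len)

counts = np.array(counts)
print(counts.mean(), counts.var())   # both ~ density * seg_len = 6
```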

Note that for the Poisson distribution of the number of points falling on a segment or into a region, the condition of constant density (λ = const) is not essential. If the other two conditions are satisfied, then Poisson's law still holds, only the parameter a acquires a different expression: it is obtained not by simply multiplying the density λ by the length, area, or volume of the region, but by integrating the variable density over the segment, area, or volume. (For more on this, see n° 19.4.)

The presence of random points scattered on a line, in a plane, or in a volume is not the only situation in which the Poisson distribution arises. One can, for example, prove that Poisson's law is the limiting law for the binomial distribution

P_{m,n} = C_n^m·p^m·(1 − p)^{n−m},  (5.9.12)

if we simultaneously let the number of experiments n tend to infinity and the probability p tend to zero, while their product remains constant:

n·p = a = const.  (5.9.13)

Indeed, this limiting property of the binomial distribution can be written as

lim_{n→∞, p→0} C_n^m·p^m·(1 − p)^{n−m} = (a^m/m!)·e^{−a}.  (5.9.14)

But from condition (5.9.13) it follows that

p = a/n.  (5.9.15)

Substituting (5.9.15) into (5.9.14), we obtain the equality

lim_{n→∞} C_n^m·(a/n)^m·(1 − a/n)^{n−m} = (a^m/m!)·e^{−a},  (5.9.16)

which has just been proved by us on another occasion.

This limiting property of the binomial law is often used in practice. Suppose a large number n of independent experiments is performed, in each of which an event has a very small probability p. Then, to calculate the probability P_{m,n} that the event occurs exactly m times, one can use the approximate formula

P_{m,n} ≈ (a^m/m!)·e^{−a},  (5.9.17)

where a = np is the parameter of the Poisson law that approximately replaces the binomial distribution.

From this property of Poisson's law - to express the binomial distribution with a large number of experiments and a small probability of an event - comes its name, often used in statistics textbooks: the law of rare phenomena.

Let's look at a few examples related to the Poisson distribution from various fields of practice.

Example 1: An automatic telephone exchange receives calls with an average density of calls per hour. Assuming that the number of calls in any period of time is distributed according to the Poisson law, find the probability that exactly three calls will arrive at the station in two minutes.

Solution. The average number of calls in two minutes is:

… per sq. m. To hit the target, it is enough for at least one fragment to hit it. Find the probability of hitting the target for the given position of the burst point.

Solution. Using formula (5.9.4), we find the probability that at least one fragment hits the target:

(To calculate the value of the exponential function, we use Table 2 of the Appendix).

Example 7. The average density of pathogenic microbes in one cubic meter of air is 100. A sample of 2 cubic decimeters of air is taken. Find the probability that at least one microbe is found in it.

Solution. Accepting the hypothesis that the number of microbes in the sampled volume follows Poisson's law, we find: a = 100 · 0.002 = 0.2, and by formula (5.9.5) R_1 = 1 − e^{−0.2} ≈ 0.181.

Example 8. 50 independent shots are fired at a target. The probability of hitting the target with one shot is 0.04. Using the limiting property of the binomial distribution (formula (5.9.17)), find approximately the probability that the target is hit by: no projectile, one projectile, two projectiles.

Solution. We have a = np = 50 · 0.04 = 2. Using Table 8 of the appendix, we find the probabilities: P_0 = e^{−2} ≈ 0.135; P_1 = 2·e^{−2} ≈ 0.271; P_2 = 2·e^{−2} ≈ 0.271.