
Mathematical expectation

The mathematical expectation is a numerical characteristic of the probability distribution of a random variable.

Mathematical expectation: definition, the expectation of discrete and continuous random variables, sample and conditional expectation, calculation, properties, problems, estimation of the expectation, variance, distribution function, formulas, and worked examples


The mathematical expectation: the definition

One of the most important concepts in mathematical statistics and probability theory, characterizing the distribution of values of a random variable. It is usually expressed as the weighted average of all possible values of the random variable. It is widely used in technical analysis, in the study of number series, and in the study of continuous and long-running processes. It is important for assessing risks and predicting prices when trading in financial markets, and it is used in developing strategies and tactics in the theory of gambling.

The mathematical expectation is the mean value of a random variable when its probability distribution is considered in probability theory.

The mathematical expectation is a measure of the mean value of a random variable in probability theory. The mathematical expectation of a random variable X is denoted M(X).


The mathematical expectation is, in probability theory, the weighted average of all possible values that a random variable can take.

The mathematical expectation is the sum of the products of all possible values ​​of a random variable by the probabilities of these values.

The mathematical expectation is the average benefit from a particular decision, provided that such a decision can be considered within the framework of the law of large numbers and over a long run.


The mathematical expectation is, in gambling theory, the amount a player can expect to win or lose, on average, on each bet. In the language of gamblers this is sometimes called the "player's edge" (if it is positive for the player) or the "house edge" (if it is negative for the player).

The mathematical expectation is the probability of a win multiplied by the average profit, minus the probability of a loss multiplied by the average loss.


Mathematical expectation of a random variable in mathematical theory

One of the important numerical characteristics of a random variable is the mathematical expectation. Let us introduce the concept of a system of random variables: consider a set of random variables that are the results of the same random experiment. To each possible value of the system there corresponds a certain probability, and these probabilities satisfy the Kolmogorov axioms. A function defined for all possible values of the random variables is called a joint distribution law; it allows one to calculate the probabilities of any events involving these variables. In particular, the joint distribution law of two random variables that take values from finite sets is given by the probabilities of all pairs of their values.


The term "expectation" was introduced by Pierre Simon Marquis de Laplace (1795) and originated from the concept of "expected value of payoff", which first appeared in the 17th century in the theory of gambling in the works of Blaise Pascal and Christian Huygens. However, the first complete theoretical understanding and evaluation of this concept was given by Pafnuty Lvovich Chebyshev (mid-19th century).


The distribution law of random numerical variables (the distribution function and the distribution series or probability density) completely describe the behavior of a random variable. But in a number of problems it is enough to know some numerical characteristics of the quantity under study (for example, its average value and possible deviation from it) in order to answer the question posed. The main numerical characteristics of random variables are the mathematical expectation, variance, mode and median.

The mathematical expectation of a discrete random variable is the sum of the products of its possible values ​​and their corresponding probabilities. Sometimes the mathematical expectation is called the weighted average, since it is approximately equal to the arithmetic mean of the observed values ​​of a random variable over a large number of experiments. From the definition of mathematical expectation, it follows that its value is not less than the smallest possible value of a random variable and not more than the largest. The mathematical expectation of a random variable is a non-random (constant) variable.


The mathematical expectation has a simple physical meaning: if a unit mass is placed on a straight line, either concentrating some mass at certain points (for a discrete distribution) or "smearing" it with a certain density (for an absolutely continuous distribution), then the point corresponding to the mathematical expectation will be the coordinate of the "center of gravity" of the line.


The average value of a random variable is a certain number which is, as it were, its "representative" and replaces it in rough approximate calculations. When we say "the average lamp operating time is 100 hours" or "the average point of impact is shifted relative to the target by 2 m to the right", we indicate a certain numerical characteristic of the random variable that describes its location on the numerical axis, i.e., a description of its position.

Of the characteristics of a position in probability theory, the most important role is played by the mathematical expectation of a random variable, which is sometimes called simply the average value of a random variable.


Consider a random variable X that has possible values x1, x2, …, xn with probabilities p1, p2, …, pn. We need to characterize by some number the position of the values of the random variable on the x-axis, taking into account the fact that these values have different probabilities. For this purpose it is natural to use the so-called "weighted average" of the values xi, where each value xi is taken into account with a "weight" proportional to its probability. Thus we calculate the mean of the random variable X, which we denote M[X]:

M[X] = (x1·p1 + x2·p2 + … + xn·pn) / (p1 + p2 + … + pn) = Σ xi·pi, since p1 + p2 + … + pn = 1.


This weighted average is called the mathematical expectation of the random variable. Thus, we introduced in consideration one of the most important concepts of probability theory - the concept of mathematical expectation. The mathematical expectation of a random variable is the sum of the products of all possible values ​​of a random variable and the probabilities of these values.
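To make the weighted average concrete, here is a minimal Python sketch; the values and probabilities are those of a fair die, used purely as an illustration:

```python
# Minimal sketch: M(X) = sum(x_i * p_i) for a discrete random variable.
values = [1, 2, 3, 4, 5, 6]   # possible values x_i
probs = [1 / 6] * 6           # their probabilities p_i (must sum to 1)

expectation = sum(x * p for x, p in zip(values, probs))
print(expectation)  # 3.5
```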

The mathematical expectation of a random variable X is closely connected with the arithmetic mean of its observed values over a large number of experiments. This connection is of the same kind as that between frequency and probability: with a large number of experiments, the arithmetic mean of the observed values of a random variable approaches (converges in probability to) its mathematical expectation. From the relationship between frequency and probability one can deduce, as a consequence, a similar relationship between the arithmetic mean and the mathematical expectation. Indeed, consider a random variable X characterized by a distribution series:


Suppose N independent experiments are performed, in each of which the variable X takes a certain value. Suppose the value x1 appeared m1 times, the value x2 appeared m2 times, and, in general, the value xi appeared mi times. Let us calculate the arithmetic mean of the observed values of X, which, in contrast to the mathematical expectation M[X], we denote M*[X]:

M*[X] = (x1·m1 + x2·m2 + … + xn·mn) / N = Σ xi·(mi/N), where mi/N is the frequency of the value xi.

With an increase in the number of experiments N, the frequencies mi/N will approach (converge in probability to) the corresponding probabilities pi. Therefore, the arithmetic mean of the observed values M*[X] will approach (converge in probability to) the mathematical expectation M[X] as the number of experiments increases. The connection between the arithmetic mean and the mathematical expectation formulated above constitutes the content of one of the forms of the law of large numbers.

We already know that all forms of the law of large numbers state the fact that certain averages are stable over a large number of experiments. Here we are talking about the stability of the arithmetic mean from a series of observations of the same value. With a small number of experiments, the arithmetic mean of their results is random; with a sufficient increase in the number of experiments, it becomes "almost not random" and, stabilizing, approaches a constant value - the mathematical expectation.


The property of stability of averages for a large number of experiments is easy to verify experimentally. For example, weighing any body in the laboratory on accurate scales, as a result of weighing we get a new value each time; to reduce the error of observation, we weigh the body several times and use the arithmetic mean of the obtained values. It is easy to see that with a further increase in the number of experiments (weighings), the arithmetic mean reacts to this increase less and less, and with a sufficiently large number of experiments it practically ceases to change.

It should be noted that the most important characteristic of the position of a random variable - the mathematical expectation - does not exist for all random variables. It is possible to make examples of such random variables for which the mathematical expectation does not exist, since the corresponding sum or integral diverges. However, for practice, such cases are not of significant interest. Usually, the random variables we are dealing with have a limited range of possible values ​​and, of course, have an expectation.


In addition to the most important of the characteristics of the position of a random variable - the mathematical expectation, other position characteristics are sometimes used in practice, in particular, the mode and median of the random variable.


The mode of a random variable is its most probable value. The term "most probable value", strictly speaking, applies only to discrete quantities; for a continuous quantity, the mode is the value at which the probability density is at its maximum. The figures show the mode for discrete and continuous random variables, respectively.


If the distribution polygon (distribution curve) has more than one maximum, the distribution is said to be "polymodal".



Sometimes there are distributions that have in the middle not a maximum, but a minimum. Such distributions are called "antimodal".


In the general case, the mode and the mathematical expectation of a random variable do not coincide. In a particular case, when the distribution is symmetric and modal (i.e. has a mode) and there is a mathematical expectation, then it coincides with the mode and the center of symmetry of the distribution.

Another characteristic of position is often used, the so-called median of a random variable. This characteristic is usually used only for continuous random variables, although it can be formally defined for a discrete variable as well. Geometrically, the median is the abscissa of the point at which the area bounded by the distribution curve is bisected.


In the case of a symmetric modal distribution, the median coincides with the mean and the mode.

Mathematical expectation is the average value of a random variable, a numerical characteristic of the probability distribution of a random variable. In the most general way, the mathematical expectation of a random variable X(ω) is defined as the Lebesgue integral with respect to the probability measure P in the original probability space:

M(X) = ∫Ω X(ω) dP(ω).


The mathematical expectation can also be calculated as the Lebesgue integral of x with respect to the probability distribution PX of the variable X:

M(X) = ∫ x dPX(x).


In a natural way, one can define the concept of a random variable with infinite mathematical expectation. A typical example is the return times in some random walks.

With the help of mathematical expectation, many numerical and functional characteristics of the distribution are determined (as the mathematical expectation of the corresponding functions of a random variable), for example, generating function, characteristic function, moments of any order, in particular, variance, covariance.

Mathematical expectation is a characteristic of the location of the values of a random variable (the average value of its distribution). In this capacity, the mathematical expectation serves as a "typical" distribution parameter, and its role is similar to that of the static moment, the coordinate of the center of gravity of a mass distribution, in mechanics. Among the other characteristics of location used to describe a distribution in general terms (the median, the mode), the mathematical expectation is distinguished by the greater importance that it, together with the corresponding scattering characteristic, the variance, has in the limit theorems of probability theory. The meaning of mathematical expectation is revealed most fully by the law of large numbers (Chebyshev's inequality) and the strong law of large numbers.

Mathematical expectation of a discrete random variable

Let there be some random variable that can take one of several numerical values ​​(for example, the number of points in a die roll can be 1, 2, 3, 4, 5, or 6). Often in practice, for such a value, the question arises: what value does it take "on average" with a large number of tests? What will be our average return (or loss) from each of the risky operations?


Let's say there is some kind of lottery, and we want to understand whether it is profitable to participate in it (perhaps repeatedly, regularly). Say every fourth ticket wins, the prize is 300 rubles, and the price of a ticket is 100 rubles. With an infinitely large number of participations, this is what happens: in three quarters of the cases we lose, and every three losses cost 300 rubles; in every fourth case we win 200 rubles (the prize minus the ticket price). So over four participations we lose on average 100 rubles, that is, 25 rubles per participation. In total, the average rate of our ruin is 25 rubles per ticket.

We throw a die. If it is a fair die (no shifted center of gravity, etc.), how many points will we get on average per throw? Since each outcome is equally likely, we take the simple arithmetic mean and get 3.5. Since this is an AVERAGE, there is no need to be indignant that no particular throw gives 3.5 points; this die simply has no face with such a number!

Now let's summarize our examples:


Let's take a look at the picture just above. On the left is a table of the distribution of a random variable. The value of X can take one of n possible values (given in the top row); there can be no other values. Under each possible value, its probability is listed below. On the right is the formula M(X) = x1·p1 + x2·p2 + … + xn·pn, where M(X) is called the mathematical expectation. The meaning of this value is that with a large number of trials (a large sample), the average value will tend to this very mathematical expectation.

Let's go back to the same die. The mathematical expectation of the number of points in a throw is 3.5 (calculate it yourself using the formula if you don't believe it). Let's say you threw it a couple of times: 4 and 6 came up. The average is 5, which is far from 3.5. You threw it once more and got 3, so the average is (4 + 6 + 3) / 3 = 4.333..., still far from the mathematical expectation. Now do a crazy experiment: roll the die 1000 times! Even if the average is not exactly 3.5, it will be close to it.
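A small simulation sketch (standard random module only, with an arbitrary fixed seed) shows how the running average of die rolls drifts toward 3.5 as the number of throws grows:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def average_of_rolls(n):
    """Roll a fair die n times and return the arithmetic mean of the results."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (3, 30, 1000, 100_000):
    print(n, round(average_of_rolls(n), 3))
# The printed averages approach the mathematical expectation 3.5 as n grows.
```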

Let's calculate the mathematical expectation for the above described lottery. The table will look like this:


Then the mathematical expectation will be, as we established above:

M(X) = 0.75·(−100) + 0.25·200 = −25 rubles per ticket.


Another thing is that doing this "on one's fingers", without a formula, would be difficult if there were more options. Say, for example, there were 75% losing tickets, 20% tickets with one prize, and 5% tickets with a larger prize.

Now some properties of mathematical expectation.

The expectation is linear: for any constants a and b, M(aX + b) = a·M(X) + b. This is easy to prove directly from the definition.


A constant multiplier can be taken out of the expectation sign, that is, M(cX) = c·M(X).


This is a special case of the linearity property of the mathematical expectation.

Another consequence of the linearity of the mathematical expectation: M(X + Y) = M(X) + M(Y), that is, the mathematical expectation of a sum of random variables is equal to the sum of their mathematical expectations.

Let X and Y be independent random variables. Then M(XY) = M(X)·M(Y).

This is also easy to prove. XY is itself a random variable, and if the original variables could take n and m values respectively, then XY can take up to nm values. The probability of each value is computed using the fact that the probabilities of independent events multiply. As a result we get M(XY) = Σi Σj xi·yj·pi·qj = (Σi xi·pi)·(Σj yj·qj) = M(X)·M(Y).


Mathematical expectation of a continuous random variable

Continuous random variables have a characteristic called the distribution density (probability density). It describes the fact that a random variable takes some values from the set of real numbers more often and others less often. For example, consider this chart:


Here X is the random variable and f(x) is its distribution density. Judging by this graph, in experiments the value of X will most often be a number close to zero; the chances of it exceeding 3 or being less than −3 are purely theoretical.


Let there be, for example, a uniform distribution on the segment [0, 1], with density f(x) = 1 for x in [0, 1] and f(x) = 0 otherwise. For a continuous random variable the mathematical expectation is M(X) = ∫ x·f(x) dx, which here gives M(X) = ∫01 x dx = 0.5.

This is quite consistent with intuition: if we generate many random real numbers uniformly distributed on the segment [0, 1], their arithmetic mean should be about 0.5.
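A quick simulation sketch (standard library only) shows the same thing for uniform numbers on [0, 1):

```python
import random

random.seed(0)
n = 1_000_000
# random.random() returns uniform numbers on [0, 1)
sample_mean = sum(random.random() for _ in range(n)) / n
print(round(sample_mean, 4))  # close to the theoretical expectation 0.5
```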

The properties of mathematical expectation - linearity, etc., applicable for discrete random variables, are applicable here as well.

The relationship of mathematical expectation with other statistical indicators

In statistical analysis, along with the mathematical expectation, there is a system of interdependent indicators that reflect the homogeneity of phenomena and the stability of processes. Often, variation indicators have no independent meaning and are used for further data analysis. The exception is the coefficient of variation, which characterizes the homogeneity of the data and is a valuable statistical characteristic in its own right.


The degree of variability or stability of processes in statistical science can be measured using several indicators.

The most important indicator characterizing the variability of a random variable is the variance (dispersion), which is most closely and directly related to the mathematical expectation. This parameter is actively used in other types of statistical analysis (hypothesis testing, analysis of cause-and-effect relationships, etc.). Like the mean linear deviation, the variance reflects the extent to which the data spread around the mean.


It is useful to translate the language of symbols into the language of words. It turns out that the variance is the mean square of the deviations. That is, first the mean value is calculated, then the difference between each original value and the mean is taken and squared, the squares are added up, and the sum is divided by the number of values in the population. The difference between an individual value and the mean reflects the measure of deviation. It is squared so that all deviations become exclusively positive numbers and so that positive and negative deviations do not cancel each other when summed. Then, given the squared deviations, we simply calculate their arithmetic mean. Mean, squared, deviations: the deviations are squared and the mean is taken. The magic word "variance" unravels into just three words.

However, unlike, for example, the arithmetic mean or an index, the variance is not used in its pure form. It is rather an auxiliary, intermediate indicator used in other types of statistical analysis. It does not even have a natural unit of measurement: judging by the formula, it is expressed in the square of the units of the original data.
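As a sketch, the "average of squared deviations" recipe looks like this in Python; the data values are made up purely for illustration:

```python
# Variance "in words": take the mean, square each deviation from it,
# and average the squares (population variance, dividing by N).
data = [2, 4, 4, 4, 5, 5, 7, 9]   # hypothetical observations

mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / len(data)
print(mean, variance)             # 5.0 4.0
```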

Let's measure a random variable N times, for example, we measure the wind speed ten times and want to find the average value. How is the mean value related to the distribution function?

Or we will roll a die a large number of times. The number of points that comes up on each throw is a random variable that can take any natural value from 1 to 6. As the number of throws N grows, the arithmetic mean of the points rolled tends to a very specific number, the mathematical expectation Mx. In this case, Mx = 3.5.

How did this value come about? Suppose that in N trials one point came up n1 times, two points came up n2 times, and so on. Then the fraction of outcomes in which one point came up is n1/N, and for large N this fraction approaches the probability 1/6; the same holds for the outcomes in which 2, 3, 4, 5 and 6 points came up. The arithmetic mean of the points rolled is (1·n1 + 2·n2 + … + 6·n6)/N, which therefore approaches (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5.


Let us now assume that we know the distribution law of the random variable x, that is, we know that the random variable x can take the values ​​x1, x2, ..., xk with probabilities p1, p2, ..., pk.

The mathematical expectation Mx of the random variable x is:

Mx = x1·p1 + x2·p2 + … + xk·pk.


The mathematical expectation is not always a reasonable estimate of a random variable. For example, to estimate the average wage it is more reasonable to use the concept of the median, that is, a value such that the number of people who receive less than the median salary and the number who receive more are the same.

The median x1/2 is the value for which the probability p1 that the random variable x is less than x1/2 and the probability p2 that it is greater than x1/2 are equal, both being 1/2. The median is not uniquely determined for all distributions.


The standard deviation in statistics is the degree to which observations or data sets deviate from the average value. It is denoted by σ (sigma) or s. A small standard deviation indicates that the data are grouped around the mean; a large one indicates that the initial data lie far from it. The standard deviation is equal to the square root of a quantity called the variance, which is the average of the squared deviations of the initial data from their mean. The standard deviation of a random variable is the square root of its variance:

σ(X) = √D(X).


Example. Under test conditions when shooting at a target, calculate the variance and standard deviation of a random variable:


Variation is the fluctuation, the variability of the value of an attribute among the units of a population. The distinct numerical values of an attribute occurring in the studied population are called variants of values. The insufficiency of an average value for a complete characterization of the population makes it necessary to supplement averages with indicators that allow one to assess how typical these averages are by measuring the variability (variation) of the trait under study. The coefficient of variation is calculated by the formula:

V = (σ / x̄) · 100%, where σ is the standard deviation and x̄ is the arithmetic mean.


The range of variation (R) is the difference between the maximum and minimum values of the trait in the studied population: R = Xmax − Xmin. This indicator gives only the most general idea of the variability of the trait under study, since it shows the difference only between the extreme values of the variants. This dependence on the extreme values gives the range of variation an unstable, random character.


The average linear deviation is the arithmetic mean of the absolute (modulo) deviations of all values of the analyzed population from their average value:

d = Σ|xi − x̄| / n.


Mathematical expectation in gambling theory

The mathematical expectation is the average amount of money a gambler can win or lose on a given bet. This is a very significant concept for a player, because it is fundamental to the assessment of most game situations. Mathematical expectation is also the best tool for analyzing basic card layouts and game situations.

Let's say you're flipping a coin with a friend, making an even $1 bet each time, no matter what comes up. Tails, you win; heads, you lose. The odds of it coming up tails are one to one, and you are betting $1 against $1. Thus, your mathematical expectation is zero, because, mathematically speaking, you cannot expect to be ahead or behind after two flips or after 200.


Your hourly gain is zero. The hourly payout is the amount of money you expect to win in an hour. You can flip a coin 500 times within an hour, but you will neither win nor lose, because your expectation is neither positive nor negative. From the point of view of a serious player, such a betting system is not bad; it is simply a waste of time.

But suppose someone wants to bet $2 against your $1 in the same game. Then you immediately have a positive expectation of 50 cents from each bet. Why 50 cents? On average, you win one bet and lose the second. Bet the first dollar and lose $1, bet the second and win $2. You've bet $1 twice and are ahead by $1. So each of your one dollar bets gave you 50 cents.


If the coin is flipped 500 times in an hour, your hourly gain will already be $250, because on average you lose $1 on 250 flips and win $2 on the other 250. $500 minus $250 equals $250, which is the total win. Note that the expected value, the amount you win on average on a single bet, is 50 cents: you won $250 by betting a dollar 500 times, which works out to 50 cents per bet.
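A short sketch of the same arithmetic, with the win/loss amounts and probabilities of this coin game written out explicitly:

```python
# Expected value of the 2-to-1 coin bet described above:
# win $2 with probability 0.5, lose $1 with probability 0.5.
p_win, win_amount = 0.5, 2.0
p_lose, lose_amount = 0.5, 1.0

ev_per_bet = p_win * win_amount - p_lose * lose_amount
bets_per_hour = 500
print(ev_per_bet)                  # 0.5   (50 cents per bet)
print(ev_per_bet * bets_per_hour)  # 250.0 (expected hourly gain in dollars)
```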

Mathematical expectation has nothing to do with short-term results. Your opponent, who decided to bet $2 against you, could beat you on the first ten tosses in a row, but you, with a 2-to-1 betting advantage, all else being equal, make 50 cents on every $1 bet under any circumstances. It doesn't matter if you win or lose one bet or several bets, but only on the condition that you have enough cash to easily compensate for the costs. If you keep betting the same way, then over a long period of time your winnings will come up to the sum of expected values ​​in individual rolls.


Every time you make a best bet (a bet that can be profitable in the long run) when the odds are in your favor, you are bound to win something on it, whether you lose it or not in a given hand. Conversely, if you made a worse bet (a bet that is unprofitable in the long run) when the odds are not in your favor, you lose something, whether you win or lose the hand.

You bet with the best outcome if your expectation is positive, and it is positive if the odds are in your favor. By betting with the worst outcome, you have a negative expectation, which happens when the odds are against you. Serious players only bet with the best outcome, with the worst - they fold. What does the odds in your favor mean? You may end up winning more than the actual odds bring. The real odds of hitting tails are 1 to 1, but you get 2 to 1 due to the betting ratio. In this case, the odds are in your favor. You definitely get the best outcome with a positive expectation of 50 cents per bet.


Here is a more complex example of mathematical expectation. The friend writes down the numbers from one to five and bets $5 against your $1 that you won't pick the number. Do you agree to such a bet? What is the expectation here?

On average, you will be wrong four times out of five, so the odds against you guessing the number are 4 to 1, and on each attempt you risk losing a dollar. However, the payout is 5 to 1 while the odds against you are only 4 to 1, so the odds are in your favor; you can take the bet and expect the best outcome. If you make this bet five times, on average you will lose $1 four times and win $5 once; over all five attempts you will earn $1, a positive mathematical expectation of 20 cents per bet.


A player who is going to win more than he bets, as in the example above, is catching the odds. Conversely, he ruins the chances when he expects to win less than he bets. The bettor can have either positive or negative expectation depending on whether he is catching or ruining the odds.

If you bet $50 to win $10 with a 4-to-1 chance of winning, your expectation is negative $2: on average you win $10 four times and lose $50 once, so over five bets you lose $10 in total, that is, $2 per bet. But if you bet $30 to win $10 with the same 4-to-1 odds of winning, your expectation is positive $2, because you again win $10 four times and lose $30 once, for a $10 profit over five bets, or $2 per bet. These examples show that the first bet is bad and the second is good.


Mathematical expectation is at the center of any game situation. When a bookmaker encourages football fans to bet $11 to win $10, he has a positive expectation of 50 cents for every $10. If the casino pays even money on the craps pass line, then the house's positive expectation is approximately $1.40 for every $100 bet: the game is structured so that everyone who bets on this line loses 50.7% of the time and wins 49.3% of the time on average. Undoubtedly, it is this seemingly minimal positive expectation that brings huge profits to casino owners around the world. As Vegas World casino owner Bob Stupak remarked, "A one-thousandth of a percent negative probability over a long enough distance will bankrupt the richest man in the world."


Mathematical expectation when playing poker

The game of poker is the most illustrative and instructive example of the use of the theory and properties of mathematical expectation.


Expected value in poker is the average benefit from a particular decision, provided that the decision can be considered within the framework of the law of large numbers and over a long run. Successful poker consists in always making plays with a positive mathematical expectation.

The mathematical meaning of the mathematical expectation when playing poker is that we often encounter random variables when making a decision (we do not know which cards are in the opponent's hand, which cards will come on subsequent betting rounds). We must consider each of the solutions from the point of view of the theory of large numbers, which says that with a sufficiently large sample, the average value of a random variable will tend to its mathematical expectation.


Among the particular formulas for calculating the mathematical expectation, the following is most applicable in poker:

EV = (probability of winning × amount won) − (probability of losing × amount lost).

When playing poker, the mathematical expectation can be calculated both for bets and for calls. In the first case, fold equity should be taken into account; in the second, the pot odds. When evaluating the mathematical expectation of a particular move, remember that a fold always has zero mathematical expectation. Thus, discarding cards will always be a more profitable decision than any move with negative expectation.
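As an illustration of the EV-of-a-call arithmetic, here is a hedged Python sketch; the pot size, bet to call, and winning probability are hypothetical numbers chosen for illustration, not taken from any particular hand:

```python
def call_ev(pot, bet_to_call, win_probability):
    """Expected value of calling: win the pot (including the opponent's bet)
    with probability win_probability, otherwise lose the amount of the call."""
    lose_probability = 1 - win_probability
    return win_probability * pot - lose_probability * bet_to_call

# Hypothetical spot: $100 already in the pot, $20 to call, 25% chance to win.
print(call_ev(pot=100, bet_to_call=20, win_probability=0.25))  # 10.0 -> calling is +EV
# Folding always has EV = 0, so any option with EV > 0 beats folding.
```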

Expectation tells you what you can expect (profit or loss) for every dollar you risk. Casinos make money because the mathematical expectation of all the games played in them is in favor of the casino. With a sufficiently long series of games, the client can be expected to lose his money, since the "probability" is in favor of the casino. However, professional casino players limit their games to short periods of time, thereby increasing the odds in their favor. The same goes for investing. If your expectation is positive, you can make more money by making many trades in a short period of time. The expectation is your probability of winning times your average profit, minus your probability of losing times your average loss.


Poker can also be considered in terms of mathematical expectation. You may assume that a certain move is profitable, but in some cases it may not be the best one, because another move is more profitable. Let's say you hit a full house in five-card draw poker. Your opponent bets. You know that if you raise, he will call, so raising looks like the best tactic. But if you do raise, the remaining two players will certainly fold, whereas if you just call the bet, you can be fairly sure that the two players behind you will do the same. When you raise, you win one extra bet; by simply calling, you win two. So calling gives you the higher positive expected value and is the better tactic.

The mathematical expectation can also give an idea of ​​which poker tactics are less profitable and which are more profitable. For example, if you play a particular hand and you think your average loss is 75 cents including the antes, then you should play that hand because this is better than folding when the ante is $1.


Another important reason for understanding expected value is that it gives you a sense of peace of mind whether you win a bet or not: if you made a good bet or folded in time, you will know that you have earned or saved a certain amount of money, which a weaker player could not save. It's much harder to fold if you're frustrated that your opponent has a better hand on the draw. That said, the money you save by not playing, instead of betting, is added to your overnight or monthly winnings.

Just remember that if you switched hands, your opponent would call you, and as you'll see in the Fundamental Theorem of Poker article, this is just one of your advantages. You should rejoice when this happens. You can even learn to enjoy losing a hand, because you know that other players in your shoes would lose much more.


As discussed in the coin game example at the beginning, the hourly rate of return is related to the mathematical expectation, and this concept is especially important for professional players. When you are going to play poker, you must mentally estimate how much you can win in an hour of play. In most cases, you will need to rely on your intuition and experience, but you can also use some mathematical calculations. For example, if you are playing draw lowball and you see three players bet $10 and then draw two cards, which is a very bad tactic, you can calculate for yourself that every time they bet $10 they lose about $2. Each of them does this eight times an hour, which means that all three lose about $48 per hour. You are one of the remaining four players, which are approximately equal, so these four players (and you among them) must share $48, and each will earn $12 per hour. Your hourly rate in this case is simply your share of the amount of money lost by three bad players per hour.

Over a long period of time, the total winnings of the player is the sum of his mathematical expectations in separate distributions. The more you play with positive expectation, the more you win, and conversely, the more hands you play with negative expectation, the more you lose. As a result, you should prioritize a game that can maximize your positive expectation or negate your negative one so that you can maximize your hourly gain.


Positive mathematical expectation in game strategy

If you know how to count cards, you may have an advantage over the casino, as long as they don't notice and throw you out. Casinos love drunken gamblers and can't stand card counters. An advantage allows you to win more times than you lose over the long run. Good money management using expectation calculations can help you extract more from your edge and cut your losses. Without an advantage, you're better off giving the money to charity. In stock-exchange trading, the advantage comes from the trading system, which must generate more profit than the losses, price spreads and commissions take away. No amount of money management will save a bad trading system.

A positive expectation is defined by a value greater than zero. The larger this number, the stronger the statistical expectation. If the value is less than zero, then the mathematical expectation will also be negative. The larger the modulus of a negative value, the worse the situation. If the result is zero, then the expectation is break even. You can only win when you have a positive mathematical expectation, a reasonable game system. Playing on intuition leads to disaster.


Mathematical expectation and stock trading

Mathematical expectation is a fairly widely demanded and popular statistical indicator in exchange trading in financial markets. First of all, this parameter is used to analyze the success of trading. It is not difficult to guess that the larger this value, the more reason to consider the trade under study successful. Of course, the analysis of the work of a trader cannot be carried out only with the help of this parameter. However, the calculated value, in combination with other methods of assessing the quality of work, can significantly increase the accuracy of the analysis.


The mathematical expectation is often calculated in trading-account monitoring services, which allows you to quickly evaluate the work performed on a deposit. Exceptions are strategies that rely on "sitting out" losing trades. A trader may be lucky for some time, and therefore his record may show no losses at all. In that case, it is impossible to judge by the expectation alone, because the risks used in the work will not be taken into account.

In trading on the market, mathematical expectation is most often used when predicting the profitability of a trading strategy or when predicting a trader's income based on the statistics of his previous trades.

In terms of money management, it is very important to understand that when making trades with negative expectation, there is no money management scheme that can definitely bring high profits. If you continue to play the exchange under these conditions, then regardless of how you manage your money, you will lose your entire account, no matter how big it was at the beginning.

This axiom is not only true for negative expectation games or trades, it is also true for even odds games. Therefore, the only case where you have a chance to benefit in the long run is when making deals with a positive mathematical expectation.


The difference between negative expectation and positive expectation is the difference between life and death. It doesn't matter how positive or how negative the expectation is; what matters is whether it is positive or negative. Therefore, before considering money management, you must find a game with a positive expectation.

If you don't have that game, then no amount of money management in the world will save you. On the other hand, if you have a positive expectation, then it is possible, through proper money management, to turn it into an exponential growth function. It doesn't matter how small the positive expectation is! In other words, it doesn't matter how profitable a trading system based on one contract is. If you have a system that wins $10 per contract on a single trade (after fees and slippage), you can use money management techniques to make it more profitable than a system that shows an average profit of $1,000 per trade (after deduction of commissions and slippage).


What matters is not how profitable the system was, but how certain it can be said that the system will show at least a minimal profit in the future. Therefore, the most important preparation a trader can make is to make sure that the system shows a positive expected value in the future.

In order to have a positive expected value in the future, it is very important not to limit the degrees of freedom of your system. This is achieved not only by eliminating or reducing the number of parameters to be optimized, but also by reducing as many system rules as possible. Every parameter you add, every rule you make, every tiny change you make to the system reduces the number of degrees of freedom. Ideally, you want to build a fairly primitive and simple system that will constantly bring a small profit in almost any market. Again, it's important that you understand that it doesn't matter how profitable a system is, as long as it's profitable. The money you earn in trading will be earned through effective money management.

A trading system is simply a tool that gives you a positive mathematical expectation so that money management can be used. Systems that work (show at least a minimal profit) in only one or a few markets, or have different rules or parameters for different markets, will most likely not work in real time for long. The problem with most technical traders is that they spend too much time and effort optimizing the various rules and parameters of a trading system. This gives completely opposite results. Instead of wasting energy and computer time on increasing the profits of the trading system, direct your energy to increasing the level of reliability of obtaining a minimum profit.

Knowing that money management is just a number game that requires the use of positive expectations, a trader can stop looking for the "holy grail" of stock trading. Instead, he can start testing his trading method, find out how this method is logically sound, whether it gives positive expectations. Proper money management methods applied to any, even very mediocre trading methods, will do the rest of the work.


To succeed, any trader needs to solve three key tasks: ensure that the number of successful transactions exceeds the inevitable mistakes and miscalculations; set up the trading system so that the opportunity to earn arises as often as possible; and achieve a stable, positive result from operations.

And here, mathematical expectation can be of good help to us working traders. This term is one of the key ones in probability theory. With it, you can give an average estimate of some random value. The mathematical expectation of a random variable is like a center of gravity, if we imagine all its possible values as points with masses equal to their probabilities.


In relation to a trading strategy, the mathematical expectation of profit (or loss) is most often used to evaluate its effectiveness. This parameter is defined as the sum of the products of the given levels of profit and loss and the probabilities of their occurrence. For example, suppose the developed trading strategy assumes that 37% of all operations will bring profit and the remaining 63% will be unprofitable, while the average income from a successful transaction is $7 and the average loss is $1.4. Let's calculate the mathematical expectation of trading with this system:

M = 0.37 × $7 − 0.63 × $1.4 = $2.59 − $0.882 ≈ $1.708.

What does this number mean? It says that, following the rules of this system, we will on average receive 1.708 dollars from each closed transaction. Since the resulting efficiency estimate is greater than zero, such a system can be used for real trading. If the calculated mathematical expectation turns out to be negative, this indicates an average loss, and such trading will lead to ruin.

The amount of profit per trade can also be expressed as a relative value in the form of%. For example:

– percentage of income per 1 transaction - 5%;

– percentage of successful trading operations - 62%;

– loss percentage per 1 trade - 3%;

- the percentage of unsuccessful transactions - 38%;

The expectation is then M = 0.62 × 5% − 0.38 × 3% = 3.1% − 1.14% = 1.96%. That is, the average transaction will bring 1.96%.

It is possible to develop a system that, despite a predominance of losing trades, will give a positive result, since its mathematical expectation is greater than zero.

However, a positive expectation alone is not enough. It is difficult to make money if the system gives very few trading signals; in that case its profitability will be comparable to bank interest. Let each operation bring in only 0.5 dollars on average, but what if the system generates 1000 transactions per year? That would be a very serious amount in a relatively short time. It logically follows that another hallmark of a good trading system is a short holding period for positions.
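Both calculations above can be reproduced with a few lines of Python; the numbers are exactly the ones used in the text:

```python
def trade_expectation(win_rate, avg_win, avg_loss):
    """Expectation per trade: P(win) * average win - P(loss) * average loss."""
    return win_rate * avg_win - (1 - win_rate) * avg_loss

# The dollar example from the text: 37% winners of $7, 63% losers of $1.4.
print(round(trade_expectation(0.37, 7.0, 1.4), 3))   # 1.708 dollars per trade

# The percentage example: 62% winners of 5%, 38% losers of 3%.
print(round(trade_expectation(0.62, 5.0, 3.0), 2))   # 1.96 percent per trade
```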



Mathematical expectation and variance are the most commonly used numerical characteristics of a random variable. They characterize the most important features of the distribution: its position and degree of dispersion. In many problems of practice, a complete, exhaustive description of a random variable - the law of distribution - either cannot be obtained at all, or is not needed at all. In these cases, they are limited to an approximate description of a random variable using numerical characteristics.

The mathematical expectation is often referred to simply as the average value of a random variable. The variance of a random variable is a characteristic of the dispersion, the scattering of the random variable around its mathematical expectation.

Mathematical expectation of a discrete random variable

Let's approach the concept of mathematical expectation starting from the mechanical interpretation of the distribution of a discrete random variable. Let a unit mass be distributed among the points of the x-axis x1, x2, ..., xn, with each material point carrying the corresponding mass p1, p2, ..., pn. It is required to choose one point on the x-axis that characterizes the position of the entire system of material points, taking their masses into account. It is natural to take the center of mass of the system as such a point. This is the weighted average of the random variable X, in which the abscissa of each point xi enters with a "weight" equal to the corresponding probability. The mean value of the random variable X obtained in this way is called its mathematical expectation.

The mathematical expectation of a discrete random variable is the sum of the products of all its possible values and the probabilities of these values:

M(X) = x1·p1 + x2·p2 + … + xn·pn.

Example 1. A win-win lottery is organized. There are 1000 winnings: 400 of 10 rubles each, 300 of 20 rubles each, 200 of 100 rubles each, and 100 of 200 rubles each. What is the average winning for a person who buys one ticket?

Solution. We can find the average winning by dividing the total amount of winnings, 10·400 + 20·300 + 100·200 + 200·100 = 50,000 rubles, by 1000 (the total number of winnings). We get 50,000/1000 = 50 rubles. The expression for calculating the average winning can also be written in the following form:

M(X) = 10·(400/1000) + 20·(300/1000) + 100·(200/1000) + 200·(100/1000) = 50 rubles.

On the other hand, under these conditions the amount of the winning is a random variable that can take the values 10, 20, 100 and 200 rubles with probabilities 0.4, 0.3, 0.2 and 0.1 respectively. Therefore, the expected average winning is equal to the sum of the products of the sizes of the winnings and the probabilities of receiving them.
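A short sketch verifying Example 1 both ways, as a total divided by the number of tickets and as a sum of value times probability:

```python
# Example 1 two ways: total winnings divided by the number of winnings,
# and the sum of value * probability.
winnings = {10: 400, 20: 300, 100: 200, 200: 100}   # prize -> number of such prizes
total_tickets = sum(winnings.values())               # 1000

average = sum(prize * count for prize, count in winnings.items()) / total_tickets
expectation = sum(prize * count / total_tickets for prize, count in winnings.items())
print(average, expectation)  # 50.0 50.0
```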

Example 2. A publisher has decided to publish a new book. He plans to sell the book for 280 rubles, of which 200 will go to him, 50 to the bookstore, and 30 to the author. The table gives information about the cost of publishing the book and the probability of selling a given number of copies.

Find the publisher's expected profit.

Solution. The random variable "profit" equals the difference between the income from sales and the publishing costs. For example, if 500 copies of the book are sold, the income from the sale is 200·500 = 100,000 rubles, while the cost of publishing is 225,000 rubles, so the publisher faces a loss of 125,000 rubles. The following table summarizes the expected values of the random variable "profit":

Copies sold   Profit xi   Probability pi   xi·pi
500           −125000     0.20             −25000
1000          −50000      0.40             −20000
2000          100000      0.25             25000
3000          250000      0.10             25000
4000          400000      0.05             20000
Total:                    1.00             25000

Thus, we obtain the mathematical expectation of the publisher's profit:

M(X) = 25,000 rubles.

Example 3. The probability of hitting the target with one shot is p = 0.2. Determine the number of shells required so that the mathematical expectation of the number of hits equals 5.

Solution. From the same expectation formula we have used so far, we express x, the number of shells: each shot contributes p to the expected number of hits, so M = x·p and x = M/p = 5/0.2 = 25 shells.

Example 4. Determine the mathematical expectation of the random variable x, the number of hits in three shots, if the probability of hitting with each shot is p = 0.4.

Hint: find the probabilities of the values of the random variable using the Bernoulli (binomial) formula.

Expectation Properties

Consider the properties of mathematical expectation.

Property 1. The mathematical expectation of a constant value is equal to that constant: M(C) = C.

Property 2. A constant factor can be taken out of the expectation sign: M(CX) = C·M(X).

Property 3. The mathematical expectation of the sum (difference) of random variables is equal to the sum (difference) of their mathematical expectations: M(X ± Y) = M(X) ± M(Y).

Property 4. The mathematical expectation of the product of mutually independent random variables is equal to the product of their mathematical expectations: M(XY) = M(X)·M(Y).

Property 5. If all values of the random variable X are decreased (increased) by the same number C, then its mathematical expectation decreases (increases) by the same number: M(X − C) = M(X) − C and M(X + C) = M(X) + C.
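A small numeric check of Properties 2 to 4 on two hypothetical independent discrete variables (the value/probability tables below are made up for illustration):

```python
from itertools import product

# Two independent discrete random variables given by value -> probability tables.
X = {0: 0.5, 2: 0.5}
Y = {1: 0.3, 5: 0.7}

def mean(dist):
    return sum(value * p for value, p in dist.items())

m_x, m_y = mean(X), mean(Y)

# Property 2: M(cX) = c * M(X)
print(mean({3 * x: p for x, p in X.items()}), 3 * m_x)
# Property 3: M(X + Y) = M(X) + M(Y); Property 4 (independence): M(XY) = M(X) * M(Y)
m_sum = sum((x + y) * px * py for (x, px), (y, py) in product(X.items(), Y.items()))
m_prod = sum((x * y) * px * py for (x, px), (y, py) in product(X.items(), Y.items()))
print(m_sum, m_x + m_y)   # the two numbers in each printed pair should match
print(m_prod, m_x * m_y)
```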

When the mathematical expectation alone is not enough

In most cases, only the mathematical expectation cannot adequately characterize a random variable.

Let the random variables X and Y be given by the following distribution laws:

Value of X   Probability
−0.1         0.1
−0.01        0.2
0            0.4
0.01         0.2
0.1          0.1

Value of Y   Probability
−20          0.3
−10          0.1
0            0.2
10           0.1
20           0.3

The mathematical expectations of these quantities are the same, both equal to zero: M(X) = M(Y) = 0.

However, their distributions are different. The random variable X can take only values that differ little from the mathematical expectation, while the random variable Y can take values that deviate from it significantly. A similar example: the average wage does not make it possible to judge the proportion of highly and poorly paid workers. In other words, the mathematical expectation does not tell us what deviations from it are possible, at least on average. To find that out, you need the variance of the random variable.

Dispersion of a discrete random variable

The variance (dispersion) of a discrete random variable X is the mathematical expectation of the square of its deviation from the mathematical expectation: D(X) = M[(X − M(X))²].

The standard deviation of a random variable X is the arithmetic (non-negative) value of the square root of its variance:

σ(X) = √D(X).

Example 5. Calculate the variances and standard deviations of the random variables X and Y whose distribution laws are given in the tables above.

Solution. The mathematical expectations of the random variables X and Y, as found above, are equal to zero. With E(X) = E(Y) = 0 the variance formula reduces to the mean of the squared values:

D(X) = 0.1·(−0.1)² + 0.2·(−0.01)² + 0.4·0² + 0.2·0.01² + 0.1·0.1² = 0.00204,

D(Y) = 0.3·(−20)² + 0.1·(−10)² + 0.2·0² + 0.1·10² + 0.3·20² = 260.

Then the standard deviations of the random variables X and Y are

σ(X) = √0.00204 ≈ 0.045 and σ(Y) = √260 ≈ 16.12.

Thus, with the same mathematical expectation, the variance of the random variable X is very small, while that of Y is significant. This is a consequence of the difference between their distributions.
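The same computation as a Python sketch, using the distribution tables of X and Y given above:

```python
import math

# Distribution laws from the tables above (value -> probability).
X = {-0.1: 0.1, -0.01: 0.2, 0: 0.4, 0.01: 0.2, 0.1: 0.1}
Y = {-20: 0.3, -10: 0.1, 0: 0.2, 10: 0.1, 20: 0.3}

def mean(dist):
    return sum(v * p for v, p in dist.items())

def variance(dist):
    m = mean(dist)
    return sum((v - m) ** 2 * p for v, p in dist.items())

for name, dist in (("X", X), ("Y", Y)):
    d = variance(dist)
    print(name, "variance:", round(d, 5), "std dev:", round(math.sqrt(d), 3))
# X: variance 0.00204, std dev ~ 0.045 (values cluster tightly around the mean)
# Y: variance 260.0,   std dev ~ 16.12 (values scatter widely around the same mean)
```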

Example 6 The investor has 4 alternative investment projects. The table summarizes the data on the expected profit in these projects with the corresponding probability.

Project 1      Project 2        Project 3         Project 4
500, P=1       1000, P=0.5      500, P=0.5        500, P=0.5
               0, P=0.5         1000, P=0.25      10500, P=0.25
                                0, P=0.25         −9500, P=0.25

Find for each alternative the mathematical expectation, variance and standard deviation.

Solution. Let us show how these quantities are calculated for the 3rd alternative:

M = 0.5·500 + 0.25·1000 + 0.25·0 = 500,
D = 0.5·(500 − 500)² + 0.25·(1000 − 500)² + 0.25·(0 − 500)² = 125,000,
σ = √125,000 ≈ 353.6.

The table below summarizes the values found for all the alternatives:

Indicator              Project 1   Project 2   Project 3   Project 4
Expectation M          500         500         500         500
Variance D             0           250,000     125,000     50,000,000
Standard deviation σ   0           500         ≈ 353.6     ≈ 7,071

All alternatives have the same mathematical expectation. This means that in the long run everyone has the same income. The standard deviation can be interpreted as a measure of risk - the larger it is, the greater the risk of the investment. An investor who doesn't want much risk will choose project 1 because it has the smallest standard deviation (0). If the investor prefers risk and high returns in a short period, then he will choose the project with the largest standard deviation - project 4.
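A sketch that recomputes the expectation, variance, and standard deviation for each project; note that Project 4's third outcome is taken as −9500 so that all projects have the same expectation, as the text states:

```python
import math

# Profit distributions of the four projects: lists of (profit, probability).
projects = {
    "Project 1": [(500, 1.0)],
    "Project 2": [(1000, 0.5), (0, 0.5)],
    "Project 3": [(500, 0.5), (1000, 0.25), (0, 0.25)],
    "Project 4": [(500, 0.5), (10500, 0.25), (-9500, 0.25)],
}

for name, dist in projects.items():
    m = sum(x * p for x, p in dist)
    d = sum((x - m) ** 2 * p for x, p in dist)
    print(name, m, d, round(math.sqrt(d), 1))
# Every project has the same expectation (500), but the standard deviation,
# i.e. the risk, grows from Project 1 to Project 4.
```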

Dispersion Properties

Let us present the properties of the dispersion.

Property 1. The variance of a constant value is zero: D(C) = 0.

Property 2. A constant factor can be taken out of the variance sign by squaring it: D(CX) = C²·D(X).

Property 3. The variance of a random variable is equal to the mathematical expectation of the square of this variable minus the square of the mathematical expectation of the variable itself: D(X) = M(X²) − [M(X)]², where for a discrete variable M(X²) = Σ xi²·pi.

Property 4. The variance of the sum (or difference) of independent random variables is equal to the sum of their variances: D(X ± Y) = D(X) + D(Y).

Example 7. It is known that a discrete random variable X takes only two values: −3 and 7. The mathematical expectation is also known: E(X) = 4. Find the variance of this discrete random variable.

Solution. Denote by p the probability with which the random variable takes the value x1 = −3. Then the probability of the value x2 = 7 is 1 − p. Let us write the equation for the mathematical expectation:

E(X) = x1·p + x2·(1 − p) = −3p + 7(1 − p) = 4,

whence we get the probabilities p = 0.3 and 1 − p = 0.7.

The law of distribution of a random variable:

X   −3    7
p   0.3   0.7

We calculate the variance of this random variable using the formula from Property 3 of the variance:

D(X) = M(X²) − [M(X)]² = (−3)²·0.3 + 7²·0.7 − 4² = 2.7 + 34.3 − 16 = 21.

Find the mathematical expectation of a random variable yourself, and then see the solution

Example 8. A discrete random variable X takes only two values. It takes the larger value, 3, with probability 0.4. In addition, the variance of the random variable is known: D(X) = 6. Find the mathematical expectation of the random variable.

Example 9. An urn contains 6 white and 4 black balls, and 3 balls are drawn from the urn. The number of white balls among those drawn is a discrete random variable X. Find the mathematical expectation and variance of this random variable.

Solution. The random variable X can take the values 0, 1, 2, 3. The corresponding probabilities can be calculated by counting combinations (the classical definition of probability). The law of distribution of the random variable:

X   0      1      2     3
p   1/30   3/10   1/2   1/6

Hence the mathematical expectation of this random variable:

M(X) = 0·(1/30) + 1·(3/10) + 2·(1/2) + 3·(1/6) = 0.3 + 1 + 0.5 = 1.8.

The variance of this random variable is:

D(X) = M(X²) − [M(X)]² = 0²·(1/30) + 1²·(3/10) + 2²·(1/2) + 3²·(1/6) − 1.8² = 0.3 + 2 + 1.5 − 3.24 = 0.56.
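Example 9 can be checked with a short sketch using math.comb for the combinations:

```python
from math import comb

# Example 9: 6 white and 4 black balls, 3 drawn; X = number of white balls drawn.
white, black, drawn = 6, 4, 3
total = comb(white + black, drawn)                     # C(10, 3) = 120

dist = {k: comb(white, k) * comb(black, drawn - k) / total for k in range(drawn + 1)}
m = sum(k * p for k, p in dist.items())
d = sum(k * k * p for k, p in dist.items()) - m ** 2   # D(X) = M(X^2) - M(X)^2
print(dist)   # probabilities ~ {0: 0.033, 1: 0.3, 2: 0.5, 3: 0.167}
print(m, d)   # approximately 1.8 and 0.56
```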

Mathematical expectation and dispersion of a continuous random variable

For a continuous random variable, the mechanical interpretation of the mathematical expectation will retain the same meaning: the center of mass for a unit mass distributed continuously on the x-axis with density f(x). In contrast to a discrete random variable, for which the function argument xi changes abruptly, for a continuous random variable, the argument changes continuously. But the mathematical expectation of a continuous random variable is also related to its mean value.

To find the mathematical expectation and variance of a continuous random variable, you need to evaluate definite integrals. If the density function of the continuous random variable is given, it enters directly into the integrand. If the probability distribution function is given instead, you first differentiate it to obtain the density function.

The mean of all possible values of a continuous random variable, weighted by its probability density, is called its mathematical expectation, denoted M(X): M(X) = ∫ x f(x) dx, where the integral is taken over the entire range of X.
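As a sketch, the integral M(X) = ∫ x f(x) dx can be approximated numerically; here the uniform density on [0, 1] from the earlier example is used, so the expected output is 0.5:

```python
def expectation_from_density(f, a, b, steps=100_000):
    """Approximate M(X) = integral of x*f(x) dx over [a, b] with the midpoint rule."""
    h = (b - a) / steps
    return sum((a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(steps)) * h

def uniform_density(x):
    return 1.0  # density of the uniform distribution on [0, 1]

print(round(expectation_from_density(uniform_density, 0.0, 1.0), 4))  # 0.5
```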

Probability theory is a special branch of mathematics that is studied only by students of higher educational institutions. Do you love calculations and formulas? Are you not afraid of the prospects of acquaintance with the normal distribution, the entropy of the ensemble, the mathematical expectation and the variance of a discrete random variable? Then this subject will be of great interest to you. Let's get acquainted with some of the most important basic concepts of this section of science.

Let's remember the basics

Even if you remember the simplest concepts of probability theory, do not neglect the first paragraphs of the article. The fact is that without a clear understanding of the basics, you will not be able to work with the formulas discussed below.

So, there is some random event, some experiment. As a result of the actions performed, we can get several outcomes - some of them are more common, others less common. The probability of an event is the ratio of the number of actually obtained outcomes of one type to the total number of possible ones. Only knowing the classical definition of this concept, you can begin to study the mathematical expectation and dispersion of continuous random variables.

Average

Back in school, in mathematics lessons, you started working with the arithmetic mean. This concept is widely used in probability theory, and therefore it cannot be ignored. The main thing for us at the moment is that we will encounter it in the formulas for the mathematical expectation and variance of a random variable.

We have a sequence of numbers and want to find the arithmetic mean. All that is required is to add up everything available and divide by the number of elements in the sequence. Suppose we have the numbers from 1 to 9. The sum of the elements is 45, and dividing this value by 9 gives the answer: 5.

Dispersion

In scientific terms, variance is the average square of the deviations of the obtained values from the arithmetic mean. It is denoted by the capital Latin letter D. What is needed to calculate it? For each element of the sequence, we calculate the difference between the value and the arithmetic mean and square it. There will be exactly as many such values as there are outcomes for the event we are considering. Next, we add up everything obtained and divide by the number of elements in the sequence. If we have five possible outcomes, then we divide by five.

The variance also has properties that you need to remember in order to apply it when solving problems. For example, if a random variable is multiplied by a constant k, the variance is multiplied by k² (that is, by k·k). The variance is never less than zero and does not change when all values are shifted up or down by the same amount. Also, for independent random variables, the variance of the sum is equal to the sum of the variances.

Now we definitely need to consider examples of the variance of a discrete random variable and the mathematical expectation.

Let's say we run 21 experiments and get 7 different outcomes. We observed each of them, respectively, 1, 2, 2, 3, 4, 4 and 5 times. What will be the variance?

First, we calculate the arithmetic mean: the sum of the elements, of course, is 21. We divide it by 7, getting 3. Now we subtract 3 from each number in the original sequence, square each value, and add the results together. It turns out 12. Now it remains for us to divide the number by the number of elements, and, it would seem, that's all. But there is a catch! Let's discuss it.

Dependence on the number of experiments

It turns out that when calculating the variance, the denominator can be one of two numbers: either N or N-1. Here N is the number of experiments performed or the number of elements in the sequence (which is essentially the same thing). What does it depend on?

If the number of trials is large (hundreds), N is put in the denominator; if it is small, N − 1 is used (this is the so-called corrected, or sample, variance). Scientists draw the border quite conventionally: today it runs along the number 30. If we conducted fewer than 30 experiments, we divide the sum by N − 1, and if more, by N.

Task

Let's go back to our example of solving the variance and expectation problem. We got an intermediate sum of 12, which has to be divided by N or N − 1. The sequence has 7 elements and we conducted fewer than 30 experiments, so we choose the second option. So the answer is: the variance is 12 / 6 = 2.
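The same arithmetic can be checked in R; note that the built-in var function already divides by N − 1, so for these 7 counts it returns the same value:

    counts <- c(1, 2, 2, 3, 4, 4, 5)
    sum((counts - mean(counts))^2)                          # intermediate sum: 12
    sum((counts - mean(counts))^2) / (length(counts) - 1)   # 12 / 6 = 2
    var(counts)                                             # also 2: var() divides by N - 1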

Expected value

Let's move on to the second concept, which we must consider in this article. The mathematical expectation is the result of adding all possible outcomes multiplied by the corresponding probabilities. It is important to understand that the resulting value, as well as the result of calculating the variance, is obtained only once for the whole task, no matter how many outcomes it considers.

The mathematical expectation formula is quite simple: we take an outcome, multiply it by its probability, add the same for the second, third outcome, and so on. Everything related to this concept is easy to calculate. For example, the sum of mathematical expectations is equal to the mathematical expectation of the sum. The same holds for the product, provided the random variables are independent. Not every quantity in probability theory allows such simple operations. Let's take a task and calculate both of the concepts we have studied at once. Enough theory; it's time to practice.

One more example

We ran 50 trials and got 10 kinds of outcomes - numbers 0 to 9 - appearing in varying percentages. These are, respectively: 2%, 10%, 4%, 14%, 2%, 18%, 6%, 16%, 10%, 18%. Recall that to get the probabilities, you need to divide the percentage values ​​by 100. Thus, we get 0.02; 0.1 etc. Let us present an example of solving the problem for the variance of a random variable and the mathematical expectation.

We calculate the arithmetic mean using the formula that we remember from elementary school: 50/10 = 5.

Now let's translate the probabilities into the number of outcomes "in pieces" to make it more convenient to count. We get 1, 5, 2, 7, 1, 9, 3, 8, 5 and 9. Subtract the arithmetic mean from each value obtained, after which we square each of the results obtained. See how to do this with the first element as an example: 1 - 5 = (-4). Further: (-4) * (-4) = 16. For other values, do these operations yourself. If you did everything right, then after adding everything you get 90.

Let's continue calculating the variance and mean by dividing 90 by N. Why do we choose N and not N-1? That's right, because the number of experiments performed exceeds 30. So: 90/10 = 9. We got the dispersion. If you get a different number, don't despair. Most likely, you made a banal error in the calculations. Double-check what you wrote, and for sure everything will fall into place.

Finally, let's recall the mathematical expectation formula. We will not give all the calculations, we will only write the answer with which you can check after completing all the required procedures. The expected value will be 5.48. We only recall how to carry out operations, using the example of the first elements: 0 * 0.02 + 1 * 0.1 ... and so on. As you can see, we simply multiply the value of the outcome by its probability.
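For readers who want to check the arithmetic, here is the same computation in R (the vector names are arbitrary):

    outcomes <- 0:9
    probs <- c(0.02, 0.10, 0.04, 0.14, 0.02, 0.18, 0.06, 0.16, 0.10, 0.18)
    counts <- probs * 50                   # 1 5 2 7 1 9 3 8 5 9
    sum((counts - mean(counts))^2) / 10    # dispersion: 90 / 10 = 9
    sum(outcomes * probs)                  # mathematical expectation: 5.48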

Deviation

Another concept closely related to dispersion and mathematical expectation is the standard deviation. It is denoted either by the Latin letters sd, or by the Greek lowercase "sigma". This concept shows how, on average, values ​​deviate from the central feature. To find its value, you need to calculate the square root of the variance.

If you plot a normal distribution and want to see the standard deviation directly on it, this can be done in several steps. Take half of the picture to the left or right of the mode (the central value) and draw a perpendicular to the horizontal axis so that the areas of the resulting figures are equal. The length of the segment between the middle of the distribution and the resulting projection onto the horizontal axis will represent the standard deviation.

Software

As can be seen from the descriptions of the formulas and the examples presented, calculating the variance and mathematical expectation is not the easiest procedure from an arithmetic point of view. In order not to waste time, it makes sense to use the program used in higher education - it is called "R". It has functions that allow you to calculate values ​​for many concepts from statistics and probability theory.

For example, you define a vector of values. This is done as follows: vector<-c(1,5,2…). Now, when you need to calculate some value for this vector, you write the corresponding function and pass the vector as its argument. To find the variance, you use the var function. An example of its use: var(vector). Then you simply press Enter and get the result.
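A slightly fuller sketch, using the counts from the previous example as sample data (any other numbers would do):

    vector <- c(1, 5, 2, 7, 1, 9, 3, 8, 5, 9)
    mean(vector)   # arithmetic mean
    var(vector)    # sample variance; note that var() divides by N - 1
    sd(vector)     # standard deviation, the square root of var(vector)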

Finally

Dispersion and mathematical expectation are basic concepts without which it is difficult to calculate anything further in probability theory. In the main course of lectures at universities, they are covered already in the first months of studying the subject. It is precisely because of a lack of understanding of these simple concepts and an inability to calculate them that many students immediately begin to fall behind in the program and later receive poor marks in the examination session, which deprives them of scholarships.

Practice at least one week for half an hour a day, solving tasks similar to those presented in this article. Then, on any probability theory test, you will cope with examples without extraneous tips and cheat sheets.

As already known, the distribution law completely characterizes a random variable. However, the distribution law is often unknown and one has to limit oneself to lesser information. Sometimes it is even more profitable to use numbers that describe a random variable in total; such numbers are called numerical characteristics of a random variable. Mathematical expectation is one of the important numerical characteristics.

The mathematical expectation, as will be shown below, is approximately equal to the average value of the random variable. To solve many problems, it is enough to know the mathematical expectation. For example, if it is known that the mathematical expectation of the number of points scored by the first shooter is greater than that of the second, then the first shooter, on average, knocks out more points than the second, and therefore shoots better. Although the mathematical expectation gives much less information about a random variable than the law of its distribution, for solving problems like the one given and many others, knowledge of the mathematical expectation is sufficient.

§ 2. Mathematical expectation of a discrete random variable

The mathematical expectation of a discrete random variable is the sum of the products of all its possible values and their probabilities.

Let the random variable X take only the values x₁, x₂, …, xₙ, whose probabilities are respectively p₁, p₂, …, pₙ. Then the mathematical expectation M(X) of the random variable X is defined by the equality

M(X) = x₁p₁ + x₂p₂ + … + xₙpₙ.

If a discrete random variable X takes a countable set of possible values, then

M(X) = Σᵢ xᵢpᵢ ,

and the mathematical expectation exists if the series on the right side of the equality converges absolutely.

Comment. It follows from the definition that the mathematical expectation of a discrete random variable is a non-random (constant) quantity. We recommend remembering this statement, as it is used repeatedly later on. Later it will be shown that the mathematical expectation of a continuous random variable is also a constant value.

Example 1 Find the mathematical expectation of a random variable X, knowing the law of its distribution:

X   3     5     2
p   0.1   0.6   0.3

Solution. The desired mathematical expectation is equal to the sum of the products of all possible values of the random variable and their probabilities:

M(X) = 3·0.1 + 5·0.6 + 2·0.3 = 3.9.

Example 2 Find the mathematical expectation of the number of occurrences of an event A in one trial, if the probability of the event A is p.

Solution. The random variable X, the number of occurrences of the event A in one trial, can take only two values: x₁ = 1 (the event A occurred) with probability p and x₂ = 0 (the event A did not occur) with probability q = 1 − p. The desired mathematical expectation is

M(X) = 1·p + 0·q = p.

So, the mathematical expectation of the number of occurrences of an event in one trial is equal to the probability of this event. This result will be used below.
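This can be illustrated by a simulation in R: for such a 0/1 random variable the average of many observed values is close to p. The probability 0.3 below is an arbitrary choice for the illustration:

    p <- 0.3                                  # assumed probability of the event A
    x <- rbinom(100000, size = 1, prob = p)   # 1 if A occurred in a trial, 0 otherwise
    mean(x)                                   # close to M(X) = p = 0.3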

§ 3. Probabilistic meaning of mathematical expectation

Let n trials be performed, in which the random variable X took the value x₁ m₁ times, the value x₂ m₂ times, …, the value xₖ mₖ times, with m₁ + m₂ + … + mₖ = n. Then the sum of all values taken by X is

x₁m₁ + x₂m₂ + … + xₖmₖ.

Find the arithmetic mean x̄ of all values taken by the random variable, for which we divide the found sum by the total number of trials:

x̄ = (x₁m₁ + x₂m₂ + … + xₖmₖ)/n,

or

x̄ = x₁(m₁/n) + x₂(m₂/n) + … + xₖ(mₖ/n). (*)

Noticing that m₁/n is the relative frequency W₁ of the value x₁, m₂/n is the relative frequency W₂ of the value x₂, etc., we write relation (*) as follows:

x̄ = x₁W₁ + x₂W₂ + … + xₖWₖ. (**)

Let us assume that the number of trials is sufficiently large. Then the relative frequency is approximately equal to the probability of occurrence of the event (this will be proved in Chapter IX, § 6):

W₁ ≈ p₁, W₂ ≈ p₂, …, Wₖ ≈ pₖ.

Replacing the relative frequencies in relation (**) with the corresponding probabilities, we obtain

x̄ ≈ x₁p₁ + x₂p₂ + … + xₖpₖ.

The right side of this approximate equality is M(X). So,

x̄ ≈ M(X).

The probabilistic meaning of the result obtained is as follows: the mathematical expectation is approximately equal (the more accurately, the greater the number of trials) to the arithmetic mean of the observed values of the random variable.
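A quick simulation in R illustrates this probabilistic meaning, using the distribution from Example 1 above (the sample size of 100000 is an arbitrary choice):

    vals <- c(3, 5, 2)
    probs <- c(0.1, 0.6, 0.3)
    x <- sample(vals, size = 100000, replace = TRUE, prob = probs)
    mean(x)   # approaches M(X) = 3.9 as the number of trials grows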

Remark 1. It is easy to see that the mathematical expectation is greater than the smallest and less than the largest possible values. In other words, on the number axis, the possible values ​​are located to the left and right of the expected value. In this sense, the expectation characterizes the location of the distribution and is therefore often referred to as distribution center.

This term is borrowed from mechanics: if masses p₁, p₂, …, pₙ are located at points with abscissas x₁, x₂, …, xₙ, and p₁ + p₂ + … + pₙ = 1, then the abscissa of the center of gravity is

x_c = (x₁p₁ + x₂p₂ + … + xₙpₙ)/(p₁ + p₂ + … + pₙ).

Given that x₁p₁ + x₂p₂ + … + xₙpₙ = M(X) and p₁ + p₂ + … + pₙ = 1, we get M(X) = x_c.

So, the mathematical expectation is the abscissa of the center of gravity of a system of material points, the abscissas of which are equal to the possible values ​​of a random variable, and the masses are equal to their probabilities.

Remark 2. The origin of the term "expectation" is associated with the initial period of the emergence of probability theory (XVI-XVII centuries), when its scope was limited to gambling. The player was interested in the average value of the expected payoff, or, in other words, the mathematical expectation of the payoff.

Random variables, in addition to distribution laws, can also be described by numerical characteristics.

The mathematical expectation M(x) of a random variable is its average value.

The mathematical expectation of a discrete random variable is calculated by the formula

M(x) = x₁p₁ + x₂p₂ + … + xₙpₙ ,

where xᵢ are the values of the random variable and pᵢ are their probabilities.

Consider the properties of the mathematical expectation (a short numerical check follows the list):

1. The mathematical expectation of a constant is equal to the constant itself: M(C) = C

2. If a random variable is multiplied by a certain number k, then the mathematical expectation will be multiplied by the same number

M (kx) = kM (x)

3. The mathematical expectation of the sum of random variables is equal to the sum of their mathematical expectations

M(x₁ + x₂ + … + xₙ) = M(x₁) + M(x₂) + … + M(xₙ)

4. M(x₁ − x₂) = M(x₁) − M(x₂)

5. For independent random variables x 1 , x 2 , … x n the mathematical expectation of the product is equal to the product of their mathematical expectations

M(x₁·x₂·…·xₙ) = M(x₁)·M(x₂)·…·M(xₙ)

6. M(x − M(x)) = M(x) − M(M(x)) = M(x) − M(x) = 0
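A short numerical check of some of these properties in R (the distribution below, values 0, 1, 2 with probabilities 1/4, 1/2, 1/4, is chosen only for illustration):

    x_vals <- c(0, 1, 2)
    p <- c(0.25, 0.5, 0.25)
    M <- function(v) sum(v * p)    # expectation of a variable taking values v with probabilities p
    M(x_vals)              # 1
    M(5 * x_vals)          # property 2: equals 5 * M(x_vals) = 5
    M(x_vals - M(x_vals))  # property 6: equals 0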

Let's calculate the mathematical expectation for the random variable from Example 11 (its values 0, 1, 2 have probabilities 1/4, 1/2, 1/4):

M(x) = 0·1/4 + 1·1/2 + 2·1/4 = 1.

Example 12. Let the random variables x₁, x₂ be given by the following distribution laws, respectively:

x₁ (Table 2)
values   −0.1   −0.01   0     0.01   0.1
p        0.1    0.2     0.4   0.2    0.1

x₂ (Table 3)
values   −20    −10     0     10     20
p        0.3    0.1     0.2   0.1    0.3

Calculate M (x 1) and M (x 2)

M(x₁) = (−0.1)·0.1 + (−0.01)·0.2 + 0·0.4 + 0.01·0.2 + 0.1·0.1 = 0

M(x₂) = (−20)·0.3 + (−10)·0.1 + 0·0.2 + 10·0.1 + 20·0.3 = 0

The mathematical expectations of both random variables are the same: they are equal to zero. However, their distributions are different. The values of x₁ differ little from the mathematical expectation, while the values of x₂ deviate from it considerably, and the probabilities of such deviations are not small. These examples show that the average value alone does not tell what deviations from it occur, either upward or downward. Thus, with the same average annual precipitation in two localities, one cannot say that these localities are equally favorable for agricultural work. Similarly, the average wage does not make it possible to judge the proportions of highly and poorly paid workers. Therefore, a numerical characteristic is introduced, the dispersion D(x), which characterizes the degree of deviation of a random variable from its mean value:

D(x) = M[(x − M(x))²]. (2)

Dispersion is the mathematical expectation of the squared deviation of a random variable from the mathematical expectation. For a discrete random variable, the variance is calculated by the formula:

D(x) = (x₁ − M(x))²p₁ + (x₂ − M(x))²p₂ + … + (xₙ − M(x))²pₙ (3)

It follows from the definition of variance that D(x) ≥ 0.

Dispersion properties:

1. The dispersion of a constant is zero: D(C) = 0

2. If a random variable is multiplied by some number k, then the variance is multiplied by the square of this number

D(kx) = k²D(x)

3. D(x) = M(x²) − M²(x)

4. For pairwise independent random variables x 1 , x 2 , … x n the variance of the sum is equal to the sum of the variances.

D(x₁ + x₂ + … + xₙ) = D(x₁) + D(x₂) + … + D(xₙ)

Let's calculate the variance for the random variable from Example 11.

The mathematical expectation is M(x) = 1. Therefore, according to formula (3), we have:

D(x) = (0 − 1)²·1/4 + (1 − 1)²·1/2 + (2 − 1)²·1/4 = 1/4 + 1/4 = 1/2

Note that it is easier to calculate the variance if we use property 3:

D(x) = M(x²) − M²(x).

Let's calculate the variances for random variables x 1 , x 2 from Example 12 using this formula. The mathematical expectations of both random variables are equal to zero.

D(x₁) = 0.01·0.1 + 0.0001·0.2 + 0.0001·0.2 + 0.01·0.1 = 0.001 + 0.00002 + 0.00002 + 0.001 = 0.00204

D(x₂) = (−20)²·0.3 + (−10)²·0.1 + 10²·0.1 + 20²·0.3 = 240 + 20 = 260
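The same variances can be checked in R using the shortcut formula D(x) = M(x²) − M²(x):

    x1 <- c(-0.1, -0.01, 0, 0.01, 0.1); p1 <- c(0.1, 0.2, 0.4, 0.2, 0.1)
    x2 <- c(-20, -10, 0, 10, 20);       p2 <- c(0.3, 0.1, 0.2, 0.1, 0.3)
    sum(x1^2 * p1) - sum(x1 * p1)^2     # D(x1) = 0.00204
    sum(x2^2 * p2) - sum(x2 * p2)^2     # D(x2) = 260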

The closer the dispersion value is to zero, the smaller the spread of the random variable relative to the mean value.

The value σ(x) = √D(x) is called the standard deviation. The mode Md of a discrete random variable x is the value of the random variable that has the highest probability.

The mode Md of a continuous random variable x is the real number at which the probability distribution density f(x) attains its maximum.

The median Mn of a continuous random variable x is the real number that satisfies the equation P(x < Mn) = P(x > Mn), that is, the value at which the distribution function equals 1/2: F(Mn) = 1/2.
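As a sketch, for the assumed density f(x) = 2x on [0, 1] used earlier, the median can be found numerically in R by solving F(Mn) = 1/2:

    f <- function(x) 2 * x                            # assumed density on [0, 1]
    cdf <- function(t) integrate(f, 0, t)$value       # distribution function F(t) = t^2
    uniroot(function(t) cdf(t) - 0.5, c(0, 1))$root   # median, about 0.707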