Mathematical expectation by distribution density. Examples of problem solving.

In the previous section we gave a number of formulas that allow one to find the numerical characteristics of functions when the distribution laws of the arguments are known. However, in many cases, to find the numerical characteristics of functions it is not even necessary to know the distribution laws of the arguments: it is enough to know only some of their numerical characteristics; then we do without any distribution laws at all. Determining the numerical characteristics of functions from given numerical characteristics of the arguments is widely used in probability theory and makes it possible to significantly simplify the solution of a number of problems. For the most part, such simplified methods relate to linear functions; however, some elementary non-linear functions also allow this approach.

In the present section we present a number of theorems on the numerical characteristics of functions, which together form a very simple apparatus for calculating these characteristics, applicable in a wide range of conditions.

1. Mathematical expectation of a non-random variable

If c is a non-random variable, then M(c) = c. The stated property is rather obvious; it can be proved by considering the non-random variable as a particular case of a random one, with a single possible value c taken with probability one; then, by the general formula for the mathematical expectation,

M(c) = c · 1 = c.

2. Dispersion of a non-random variable

If c is a non-random variable, then

D(c) = M((c − M(c))²) = M(0) = 0.

3. Removal of a non-random variable beyond the sign of mathematical expectation

M(cX) = c·M(X), (10.2.1)

i.e., a non-random value can be taken out of the expectation sign.

Proof.

a) For discrete random variables:

$M(cX)=\sum_i cx_ip_i=c\sum_i x_ip_i=cM(X)$;

b) for continuous random variables:

$M(cX)=\int_{-\infty}^{+\infty}cxf(x)\,dx=c\int_{-\infty}^{+\infty}xf(x)\,dx=cM(X)$.

4. Removal of a non-random variable beyond the sign of the variance and standard deviation

If c is a non-random variable and X is random, then

D(cX) = c²·D(X), (10.2.2)

i.e., a non-random value can be taken out of the dispersion sign by squaring it.

Proof. By the definition of variance,

D(cX) = M((cX − M(cX))²) = M((cX − c·M(X))²) = c²·M((X − M(X))²) = c²·D(X).

Consequence

σ(cX) = |c|·σ(X),

i.e., a non-random variable can be taken out of the sign of the standard deviation by its absolute value. The proof is obtained by extracting the square root from formula (10.2.2) and taking into account that the standard deviation is an essentially positive quantity.

5. Mathematical expectation of the sum of random variables

Let us prove that for any two random variables X and Y

M(X + Y) = M(X) + M(Y), (10.2.3)

i.e. the mathematical expectation of the sum of two random variables is equal to the sum of their mathematical expectations.

This property is known as the expectation addition theorem.

Proof.

a) Let (X, Y) be a system of discrete random variables. Let us apply to the sum of the random variables the general formula (10.1.6) for the mathematical expectation of a function of two arguments:

$M(X+Y)=\sum_i\sum_j(x_i+y_j)p_{ij}=\sum_i\sum_j x_ip_{ij}+\sum_i\sum_j y_jp_{ij}$.

But $\sum_j p_{ij}$ is nothing more than the total probability that X takes the value $x_i$:

$\sum_j p_{ij}=p_i$;

consequently,

$\sum_i\sum_j x_ip_{ij}=\sum_i x_ip_i=M(X)$.

In a similar way we prove that

$\sum_i\sum_j y_jp_{ij}=M(Y)$,

and the theorem is proven.

b) Let (X, Y) be a system of continuous random variables. According to formula (10.1.7),

$M(X+Y)=\iint(x+y)f(x,y)\,dx\,dy=\iint xf(x,y)\,dx\,dy+\iint yf(x,y)\,dx\,dy$. (10.2.4)

We transform the first of the integrals (10.2.4):

$\iint xf(x,y)\,dx\,dy=\int x\left[\int f(x,y)\,dy\right]dx=\int xf_1(x)\,dx=M(X)$;

likewise

$\iint yf(x,y)\,dx\,dy=M(Y)$,

and the theorem is proven.

It should be specially noted that the theorem of addition of mathematical expectations is valid for any random variables - both dependent and independent.

The expectation addition theorem can be generalized to an arbitrary number of terms:

$M(X_1+X_2+\dots+X_n)=M(X_1)+M(X_2)+\dots+M(X_n)$, (10.2.5)

i.e. the mathematical expectation of the sum of several random variables is equal to the sum of their mathematical expectations.

To prove it, it suffices to apply the method of complete induction.
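As a quick numerical illustration, here is a minimal Python sketch that checks the addition theorem on a small discrete system; the joint probabilities are made-up values, and X and Y are deliberately dependent:

```python
# Check M(X+Y) = M(X) + M(Y) on a small discrete joint distribution.
# The joint probabilities p[i][j] are illustrative, not from the text.
x_vals = [0, 1]
y_vals = [1, 2]
p = [[0.3, 0.2],   # P(X=0, Y=1), P(X=0, Y=2)
     [0.1, 0.4]]   # P(X=1, Y=1), P(X=1, Y=2) -- X and Y are dependent here

m_sum = sum(p[i][j] * (x_vals[i] + y_vals[j])
            for i in range(2) for j in range(2))
m_x = sum(p[i][j] * x_vals[i] for i in range(2) for j in range(2))
m_y = sum(p[i][j] * y_vals[j] for i in range(2) for j in range(2))

print(m_sum, m_x + m_y)  # both are ~2.1: the theorem holds even for dependent X, Y
```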

6. Mathematical expectation of a linear function

Consider a linear function of several random arguments $X_1, X_2, \dots, X_n$:

$Y=\sum_{i=1}^n a_iX_i+b$,

where $a_i$, $b$ are non-random coefficients. Let us prove that

$M(Y)=\sum_{i=1}^n a_iM(X_i)+b$, (10.2.6)

i.e., the mean of a linear function is equal to the same linear function of the means of the arguments.

Proof. Using the addition theorem for mathematical expectations and the rule of taking a non-random variable out of the expectation sign, we get:

$M(Y)=M\left(\sum_{i=1}^n a_iX_i+b\right)=\sum_{i=1}^n a_iM(X_i)+b$.
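The practical point of (10.2.6) is that the mean of a linear function requires nothing but the means of the arguments. A minimal sketch with made-up coefficients and means:

```python
# Mean of the linear function Y = a1*X1 + ... + an*Xn + b from the means alone.
# The coefficients a, means m and shift b are illustrative values.
a = [2.0, -1.0, 0.5]   # non-random coefficients a_i
m = [3.0, 4.0, 10.0]   # mathematical expectations M(X_i)
b = 7.0                # non-random shift

m_y = sum(ai * mi for ai, mi in zip(a, m)) + b
print(m_y)  # 2*3 - 1*4 + 0.5*10 + 7 = 14.0
```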

7. Dispersion of the sum of random variables

The variance of the sum of two random variables is equal to the sum of their variances plus twice the correlation moment:

$D(X+Y)=D(X)+D(Y)+2K_{xy}$. (10.2.7)

Proof. Denote

Z = X + Y. (10.2.8)

According to the addition theorem of mathematical expectations,

M(Z) = M(X) + M(Y). (10.2.9)

Let us pass from the random variables X, Y, Z to the corresponding centered variables $\mathring{X}$, $\mathring{Y}$, $\mathring{Z}$. Subtracting equality (10.2.9) from equality (10.2.8) term by term, we have:

$\mathring{Z}=\mathring{X}+\mathring{Y}$.

By the definition of variance,

$D(Z)=M(\mathring{Z}^2)=M(\mathring{X}^2+2\mathring{X}\mathring{Y}+\mathring{Y}^2)=D(X)+2K_{xy}+D(Y)$.

Q.E.D.

Formula (10.2.7) for the variance of the sum can be generalized to any number of terms:

$D\left(\sum_{i=1}^n X_i\right)=\sum_{i=1}^n D(X_i)+2\sum_{i<j}K_{ij}$, (10.2.10)

where $K_{ij}$ is the correlation moment of the variables $X_i$, $X_j$; the sign $i<j$ under the sum means that the summation extends over all possible pairwise combinations of the random variables $X_1,\dots,X_n$.

The proof is similar to the previous one and follows from the formula for the square of a polynomial.

Formula (10.2.10) can be written in another form:

$D\left(\sum_{i=1}^n X_i\right)=\sum_{i=1}^n\sum_{j=1}^n K_{ij}$, (10.2.11)

where $K_{ii}=D(X_i)$, and the double sum extends over all elements of the correlation matrix of the system of variables $X_1,\dots,X_n$, containing both the correlation moments and the variances.

If all the random variables $X_1,\dots,X_n$ in the system are uncorrelated (i.e., $K_{ij}=0$ for $i\ne j$), formula (10.2.10) takes the form:

$D\left(\sum_{i=1}^n X_i\right)=\sum_{i=1}^n D(X_i)$, (10.2.12)

i.e., the variance of the sum of uncorrelated random variables is equal to the sum of the variances of the terms.

This proposition is known as the variance addition theorem.
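The following sketch checks formula (10.2.7) by direct enumeration of a small joint distribution (the table is invented for the illustration), computing the correlation moment from its definition:

```python
# Check D(X+Y) = D(X) + D(Y) + 2*Kxy on a small joint distribution.
# The joint table p is illustrative, not from the text.
x_vals, y_vals = [0, 1], [0, 2]
p = [[0.4, 0.1],
     [0.1, 0.4]]  # a dependent (positively correlated) pair X, Y

def mean(f):
    return sum(p[i][j] * f(x_vals[i], y_vals[j])
               for i in range(2) for j in range(2))

mx, my = mean(lambda x, y: x), mean(lambda x, y: y)
dx = mean(lambda x, y: (x - mx) ** 2)
dy = mean(lambda x, y: (y - my) ** 2)
kxy = mean(lambda x, y: (x - mx) * (y - my))       # correlation moment
d_sum = mean(lambda x, y: (x + y - mx - my) ** 2)  # D(X+Y) computed directly

print(d_sum, dx + dy + 2 * kxy)  # both ~1.85, confirming formula (10.2.7)
```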

8. Dispersion of a linear function

Consider a linear function of several random variables $X_1,\dots,X_n$:

$Y=\sum_{i=1}^n a_iX_i+b$,

where $a_i$, $b$ are non-random coefficients.

Let us prove that the dispersion of this linear function is expressed by the formula

$D(Y)=\sum_{i=1}^n a_i^2D(X_i)+2\sum_{i<j}a_ia_jK_{ij}$, (10.2.13)

where $K_{ij}$ is the correlation moment of the variables $X_i$, $X_j$.

Proof. Let us introduce the notation $Y_i=a_iX_i$. Then

$Y=\sum_{i=1}^n Y_i+b$. (10.2.14)

Applying formula (10.2.10) for the variance of a sum to the right-hand side of expression (10.2.14) and taking into account that $D(b)=0$, we obtain:

$D(Y)=\sum_{i=1}^n D(Y_i)+2\sum_{i<j}\bar K_{ij}$, (10.2.15)

where $\bar K_{ij}$ is the correlation moment of the variables $Y_i$, $Y_j$:

$\bar K_{ij}=M(\mathring{Y}_i\mathring{Y}_j)$.

Let us calculate this moment. We have:

$\mathring{Y}_i=Y_i-M(Y_i)=a_iX_i-a_iM(X_i)=a_i\mathring{X}_i$;

likewise $\mathring{Y}_j=a_j\mathring{X}_j$, whence $\bar K_{ij}=a_ia_jK_{ij}$ and $D(Y_i)=a_i^2D(X_i)$.

Substituting these expressions into (10.2.15), we arrive at formula (10.2.13).

In the particular case when all the variables $X_1,\dots,X_n$ are uncorrelated, formula (10.2.13) takes the form:

$D(Y)=\sum_{i=1}^n a_i^2D(X_i)$, (10.2.16)

i.e., the variance of a linear function of uncorrelated random variables is equal to the sum of the products of the squares of the coefficients and the variances of the corresponding arguments.

9. Mathematical expectation of the product of random variables

The mathematical expectation of the product of two random variables is equal to the product of their mathematical expectations plus the correlation moment:

$M(XY)=M(X)M(Y)+K_{xy}$. (10.2.17)

Proof. We will proceed from the definition of the correlation moment:

$K_{xy}=M(\mathring{X}\mathring{Y})$, where $\mathring{X}=X-m_x$, $\mathring{Y}=Y-m_y$, and $m_x=M(X)$, $m_y=M(Y)$.

We transform this expression using the properties of the mathematical expectation:

$K_{xy}=M((X-m_x)(Y-m_y))=M(XY)-m_xM(Y)-m_yM(X)+m_xm_y=M(XY)-m_xm_y$,

which is obviously equivalent to formula (10.2.17).

If the random variables X and Y are uncorrelated ($K_{xy}=0$), then formula (10.2.17) takes the form:

$M(XY)=M(X)M(Y)$, (10.2.18)

i.e., the mean of the product of two uncorrelated random variables is equal to the product of their means.

This statement is known as the expectation multiplication theorem.

Formula (10.2.17) is nothing but an expression of the second mixed central moment of the system in terms of the second mixed initial moment and the mathematical expectations:

$\mu_{11}=K_{xy}=\alpha_{11}-m_xm_y$. (10.2.19)

This expression is often used in practice when calculating the correlation moment in the same way that for one random variable the variance is often calculated through the second initial moment and the mathematical expectation.
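In practice this means the correlation moment can be estimated from data as the mean of the products minus the product of the means. A sketch with synthetic numpy samples (the coefficient 0.5 and the sample size are arbitrary choices):

```python
# Correlation moment via formula (10.2.19): Kxy = M(XY) - M(X)*M(Y).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)  # correlated with x by construction

kxy = (x * y).mean() - x.mean() * y.mean()  # M(XY) - M(X)M(Y)
print(kxy, np.cov(x, y, bias=True)[0, 1])   # both close to 0.5
```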

The expectation multiplication theorem can also be generalized to an arbitrary number of factors, only in this case it is not enough for its application that the variables be uncorrelated: it is also required that some higher mixed moments vanish, the number of which depends on the number of factors in the product. These conditions are certainly satisfied if the random variables in the product are independent. In this case

$M(X_1X_2\dots X_n)=M(X_1)M(X_2)\dots M(X_n)$, (10.2.20)

i.e. the mathematical expectation of the product of independent random variables is equal to the product of their mathematical expectations.

This proposition can be easily proved by complete induction.

10. Dispersion of the product of independent random variables

Let us prove that for independent variables X and Y

$D(XY)=D(X)D(Y)+m_x^2D(Y)+m_y^2D(X)$. (10.2.21)

Proof. Denote Z = XY. By the definition of variance,

$D(Z)=M(Z^2)-M(Z)^2$. (10.2.22)

Since the variables are independent, $M(Z)=m_xm_y$; for independent X, Y the variables $X^2$, $Y^2$ are also independent; consequently,

$M(Z^2)=M(X^2Y^2)=M(X^2)M(Y^2)$.

But $M(X^2)$ is nothing else than the second initial moment of the variable X and, therefore, is expressed in terms of the variance:

$M(X^2)=D(X)+m_x^2$;

likewise

$M(Y^2)=D(Y)+m_y^2$.

Substituting these expressions into formula (10.2.22) and bringing like terms, we arrive at formula (10.2.21).

In the case when centered random variables are multiplied (values ​​with mathematical expectations equal to zero), formula (10.2.21) takes the form:

$D(XY)=D(X)D(Y)$, (10.2.23)

i.e., the variance of the product of independent centered random variables is equal to the product of their variances.

11. Higher moments of the sum of random variables

In some cases it is necessary to calculate the higher moments of the sum of independent random variables. Let us prove some related relations.

1) If the variables X and Y are independent, then

$\mu_3(X+Y)=\mu_3(X)+\mu_3(Y)$. (10.2.24)

Proof. By the definition of the third central moment,

$\mu_3(X+Y)=M((\mathring{X}+\mathring{Y})^3)=M(\mathring{X}^3)+3M(\mathring{X}^2\mathring{Y})+3M(\mathring{X}\mathring{Y}^2)+M(\mathring{Y}^3)$,

whence, by the expectation multiplication theorem,

$M(\mathring{X}^2\mathring{Y})=M(\mathring{X}^2)M(\mathring{Y})$, $M(\mathring{X}\mathring{Y}^2)=M(\mathring{X})M(\mathring{Y}^2)$.

But the first central moment of any variable is zero; the two middle terms vanish, and formula (10.2.24) is proved.

Relation (10.2.24) can be easily generalized by induction to an arbitrary number of independent terms:

$\mu_3\left(\sum_{i=1}^n X_i\right)=\sum_{i=1}^n\mu_3(X_i)$. (10.2.25)

2) The fourth central moment of the sum of two independent random variables is expressed by the formula

$\mu_4(X+Y)=\mu_4(X)+\mu_4(Y)+6D_xD_y$, (10.2.26)

where $D_x$, $D_y$ are the variances of X and Y.

The proof is exactly the same as the previous one.

Using the method of complete induction, it is easy to prove the generalization of formula (10.2.26) to an arbitrary number of independent terms.

The main numerical characteristics of random variables

The distribution law (e.g., the distribution density) fully characterizes a random variable. But often it is unknown, and one has to confine oneself to less information. Sometimes it is even more profitable to use numbers that describe the random variable in summary form; such numbers are called numerical characteristics of the random variable. Let us consider the main ones.

Definition: The mathematical expectation M(X) of a discrete random variable is the sum of the products of all possible values of this variable and their probabilities:

$M(X)=\sum_{i=1}^n x_ip_i$.

If a discrete random variable X takes a countable set of possible values, then

$M(X)=\sum_{i=1}^{\infty} x_ip_i$;

moreover, the mathematical expectation exists if this series converges absolutely.

It follows from the definition that M(X) of a discrete random variable is a non-random (constant) quantity.

Example: Let X be the number of occurrences of the event A in one trial, with P(A) = p. It is required to find the mathematical expectation of X.

Solution: Let's write the distribution law of X as a table:

X 0 1
P 1-p p

Let's find the mathematical expectation:

M(X) = 0 · (1 − p) + 1 · p = p.

In this way, the mathematical expectation of the number of occurrences of an event in one trial is equal to the probability of this event.

The origin of the term mathematical expectation is associated with the initial period of the emergence of probability theory (16th–17th centuries), when its scope was limited to gambling. The player was interested in the average value of the expected payoff, i.e., the mathematical expectation of winning.

Consider the probabilistic meaning of the mathematical expectation.

Suppose n trials are performed in which the random variable X took the value x₁ m₁ times, the value x₂ m₂ times, and so on, and finally the value x_k m_k times, with m₁ + m₂ + … + m_k = n.

Then the sum of all values taken by the random variable X is equal to x₁m₁ + x₂m₂ + … + x_km_k.

The arithmetic mean of all values taken by the random variable X equals:

$\bar X=\frac{x_1m_1+x_2m_2+\dots+x_km_k}{n}=x_1\frac{m_1}{n}+x_2\frac{m_2}{n}+\dots+x_k\frac{m_k}{n}$,

since $m_i/n$ is the relative frequency of the value $x_i$ for any i = 1, …, k.

As is known, if the number of trials n is large enough, then the relative frequency is approximately equal to the probability of occurrence of the event: $m_i/n\approx p_i$. Therefore,

$\bar X\approx x_1p_1+x_2p_2+\dots+x_kp_k$.

In this way, $\bar X\approx M(X)$.

Conclusion: The mathematical expectation of a discrete random variable is approximately equal (the more accurately, the greater the number of trials) to the arithmetic mean of the observed values of the random variable.
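This conclusion is easy to observe experimentally; a minimal simulation sketch (the distribution itself is made up for the illustration):

```python
# The arithmetic mean of observed values approaches M(X) as n grows.
import random

values, probs = [0, 1, 5], [0.5, 0.3, 0.2]
m_theory = sum(v * p for v, p in zip(values, probs))  # 0*0.5 + 1*0.3 + 5*0.2 = 1.3

for n in (100, 10_000, 1_000_000):
    sample = random.choices(values, weights=probs, k=n)
    print(n, sum(sample) / n)  # tends to m_theory = 1.3 as n increases
```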

Consider the basic properties of mathematical expectation.

Property 1: The mathematical expectation of a constant value is equal to the constant itself:

M(C) = C.

Proof: a constant C can be considered as a discrete random variable that has one possible value C and takes it with probability p = 1. Consequently, M(C) = C · 1 = C.



Let us define the product of a constant C and a discrete random variable X as the discrete random variable CX whose possible values are the products of C and the possible values of X; the probabilities of these values are equal to the probabilities of the corresponding possible values of X:

CX: Cx₁ Cx₂ … Cxₙ
P: p₁ p₂ … pₙ

Property 2:The constant factor can be taken out of the expectation sign:

M(CX) = CM(X).

Proof: Let the random variable X be given by the probability distribution law:

X: x₁ x₂ … xₙ
P: p₁ p₂ … pₙ

Let's write the law of probability distribution of the random variable CX:

CX: Cx₁ Cx₂ … Cxₙ
P: p₁ p₂ … pₙ

M(CX) = Cx₁p₁ + Cx₂p₂ + … + Cxₙpₙ = C(x₁p₁ + x₂p₂ + … + xₙpₙ) = C·M(X).

Definition:Two random variables are called independent if the distribution law of one of them does not depend on what possible values ​​the other variable has taken. Otherwise, the random variables are dependent.

Definition:Several random variables are called mutually independent if the laws of distribution of any number of them do not depend on what possible values ​​the other variables have taken.

Let's define the product of independent discrete random variables X and Y as the discrete random variable XY whose possible values are equal to the products of each possible value of X with every possible value of Y. The probabilities of the possible values of XY are equal to the products of the probabilities of the possible values of the factors.

Let the distributions of the random variables X and Y be given:

X: x₁ x₂
P: p₁ p₂

Y: y₁ y₂
G: g₁ g₂

Then the distribution of the random variable XY looks like:

XY: x₁y₁ x₂y₁ x₁y₂ x₂y₂
P: p₁g₁ p₂g₁ p₁g₂ p₂g₂

Some products may coincide. In this case, the probability of such a possible value of the product is equal to the sum of the corresponding probabilities. For example, if x₁y₂ = x₂y₁, then the probability of this value is p₁g₂ + p₂g₁.

Property 3:The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X) M(Y).

Proof: Let the independent random variables X and Y be given by their probability distribution laws:

X: x₁ x₂
P: p₁ p₂

Y: y₁ y₂
G: g₁ g₂

To simplify calculations, we restrict ourselves to a small number of possible values; in the general case the proof is similar.

Compose the law of distribution of the random variable XY:

XY: x₁y₁ x₂y₁ x₁y₂ x₂y₂
P: p₁g₁ p₂g₁ p₁g₂ p₂g₂

M(XY) = x₁y₁·p₁g₁ + x₂y₁·p₂g₁ + x₁y₂·p₁g₂ + x₂y₂·p₂g₂ = y₁g₁(x₁p₁ + x₂p₂) + y₂g₂(x₁p₁ + x₂p₂) = (y₁g₁ + y₂g₂)(x₁p₁ + x₂p₂) = M(X)·M(Y).

Consequence:The mathematical expectation of the product of several mutually independent random variables is equal to the product of their mathematical expectations.

Proof: Let us prove it for three mutually independent random variables X, Y, Z. The random variables XY and Z are independent, so we get:

M(XYZ) = M(XY · Z) = M(XY) · M(Z) = M(X) · M(Y) · M(Z).

For an arbitrary number of mutually independent random variables, the proof is carried out by the method of mathematical induction.

Example: The independent random variables X and Y are given by the distribution laws:

X: 5 2 4
P: 0.6 0.1 0.3

Y: 7 9
G: 0.8 0.2

It is required to find M(XY).

Solution: Since the random variables X and Y are independent,

M(XY) = M(X) · M(Y) = (5·0.6 + 2·0.1 + 4·0.3) · (7·0.8 + 9·0.2) = 4.4 · 7.4 = 32.56.

Let's define the sum of discrete random variables X and Y as the discrete random variable X+Y whose possible values are equal to the sums of each possible value of X with every possible value of Y. The probabilities of the possible values of X+Y for independent random variables X and Y are equal to the products of the probabilities of the terms; for dependent random variables, they are the products of the probability of one term and the conditional probability of the other.

If x₁ + y₂ = x₂ + y₁ and the probabilities of these values are p₁₂ and p₂₁ respectively, then the probability of x₁ + y₂ (the same as that of x₂ + y₁) is equal to p₁₂ + p₂₁.

Property 4:The mathematical expectation of the sum of two random variables (dependent or independent) is equal to the sum of the mathematical expectations of the terms:

M(X+Y) = M(X) + M(Y).

Proof: Let two random variables X and Y be given by the following distribution laws:

X: x₁ x₂
P: p₁ p₂

Y: y₁ y₂
G: g₁ g₂

To simplify the derivation, we restrict ourselves to two possible values ​​of each of the quantities. In general, the proof is similar.

Compose all possible values of the random variable X+Y (assume, for simplicity, that these values are distinct; if not, the proof is similar):

X+Y: x₁+y₁ x₁+y₂ x₂+y₁ x₂+y₂
P: p₁₁ p₁₂ p₂₁ p₂₂

Let's find the mathematical expectation of this variable:

M(X+Y) = (x₁+y₁)p₁₁ + (x₁+y₂)p₁₂ + (x₂+y₁)p₂₁ + (x₂+y₂)p₂₂ = x₁(p₁₁+p₁₂) + x₂(p₂₁+p₂₂) + y₁(p₁₁+p₂₁) + y₂(p₁₂+p₂₂).

Let us prove that p₁₁ + p₁₂ = p₁.

The event X = x₁ (its probability is P(X = x₁) = p₁) entails the event that the random variable X+Y takes the value x₁+y₁ or x₁+y₂ (the probability of this event, according to the addition theorem, is p₁₁ + p₁₂) and vice versa. Hence p₁₁ + p₁₂ = p₁.

The equalities p₂₁ + p₂₂ = p₂, p₁₁ + p₂₁ = g₁, p₁₂ + p₂₂ = g₂ are proved similarly.

Substituting the right-hand sides of these equalities into the resulting formula for the mathematical expectation, we get:

M(X+Y) = (x₁p₁ + x₂p₂) + (y₁g₁ + y₂g₂) = M(X) + M(Y).

Consequence:The mathematical expectation of the sum of several random variables is equal to the sum of the mathematical expectations of the terms.

Proof: Let us prove it for three random variables X, Y, Z. Applying property 4 first to (X+Y) and Z, and then to X and Y, we get:

M(X+Y+Z) = M((X+Y)+Z) = M(X+Y) + M(Z) = M(X) + M(Y) + M(Z).

For an arbitrary number of random variables, the proof is carried out by the method of mathematical induction.

Example: Find the average value of the sum of the number of points that can fall when throwing two dice.

Solution: Let X be the number of points that can fall on the first die and Y on the second. It is obvious that the random variables X and Y have the same distributions. Let's write the distributions of X and Y in one table:

X 1 2 3 4 5 6
Y 1 2 3 4 5 6
P 1/6 1/6 1/6 1/6 1/6 1/6

M(X) = M(Y) = (1+2+3+4+5+6) · 1/6 = 21/6 = 3.5;

M(X + Y) = M(X) + M(Y) = 3.5 + 3.5 = 7.

So, the average value of the sum of the number of points that can fall out when throwing two dice is 7.

Theorem:The mathematical expectation M(X) of the number of occurrences of event A in n independent trials is equal to the product of the number of trials and the probability of occurrence of the event in each trial: M(X) = np.

Proof: Let X be the number of occurrences of the event A in n independent trials. Obviously, the total number X of occurrences of the event A in these trials is the sum of the numbers of occurrences of the event in the individual trials. So, if X₁ is the number of occurrences of the event in the first trial, X₂ in the second, and so on, and finally Xₙ is the number of occurrences of the event in the nth trial, then the total number of occurrences of the event is

X = X₁ + X₂ + … + Xₙ.

By property 4 of the expectation we have:

M(X) = M(X₁) + M(X₂) + … + M(Xₙ).

Since the mathematical expectation of the number of occurrences of an event in one trial is equal to the probability of the event, then

M(X₁) = M(X₂) = … = M(Xₙ) = p.

Consequently, M(X) = np.

Example: The probability of hitting the target when firing from a gun is p = 0.6. Find the average number of hits if 10 shots are fired.

Solution: The hit at each shot does not depend on the outcomes of other shots, so the events under consideration are independent and, therefore, the desired mathematical expectation is equal to:

M(X) = np = 10 · 0.6 = 6.

So the average number of hits is 6.
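The same answer can be obtained without the theorem by summing k·P(X = k) over the binomial distribution; a short sketch:

```python
# M(X) = n*p for the number of successes in n independent trials:
# check by summing k * P(X = k) with the binomial probabilities.
from math import comb

n, p = 10, 0.6
m = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(m)  # ~6.0, matching M(X) = n*p = 10 * 0.6
```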

Now consider the mathematical expectation of a continuous random variable.

Definition: The mathematical expectation of a continuous random variable X whose possible values belong to the segment [a, b] is the definite integral:

$M(X)=\int_a^b xf(x)\,dx$,

where f(x) is the probability distribution density.

If the possible values of a continuous random variable X belong to the whole axis Ox, then

$M(X)=\int_{-\infty}^{+\infty}xf(x)\,dx$.

It is assumed that this improper integral converges absolutely, i.e., that the integral $\int_{-\infty}^{+\infty}|x|f(x)\,dx$ converges. If this requirement were not met, the value of the integral would depend on the rate at which the lower limit tends to −∞ and the upper limit to +∞ (separately).
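A sketch of this definition in code, using numerical integration; the exponential density is chosen only as an illustration (its expectation is known to be 1/λ):

```python
# M(X) = integral of x*f(x) dx for a continuous random variable.
from scipy import integrate
import numpy as np

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # density of the exponential law, x >= 0

m, _ = integrate.quad(lambda x: x * f(x), 0, np.inf)
print(m)  # approximately 0.5 = 1/lam
```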

It can be proved that all properties of the mathematical expectation of a discrete random variable are preserved for a continuous random variable. The proof is based on the properties of definite and improper integrals.

Obviously, the expectation M(X) is greater than the smallest and less than the largest of the possible values of the random variable X. That is, on the number axis, the possible values of a random variable are located to the left and to the right of its mathematical expectation. In this sense, the mathematical expectation M(X) characterizes the location of the distribution and is therefore often called the distribution center.

Mathematical expectation and variance are the most commonly used numerical characteristics of a random variable. They characterize the most important features of the distribution: its position and degree of dispersion. In many problems of practice, a complete, exhaustive description of a random variable - the law of distribution - either cannot be obtained at all, or is not needed at all. In these cases, they are limited to an approximate description of a random variable using numerical characteristics.

The mathematical expectation is often referred to simply as the average value of a random variable. The dispersion of a random variable is a characteristic of the scatter of the random variable around its mathematical expectation.

Mathematical expectation of a discrete random variable

Let's approach the concept of mathematical expectation proceeding first from the mechanical interpretation of the distribution of a discrete random variable. Let a unit mass be distributed among the points x₁, x₂, …, xₙ of the x-axis, each material point having the corresponding mass p₁, p₂, …, pₙ. It is required to select one point on the x-axis that characterizes the position of the entire system of material points, taking into account their masses. It is natural to take the center of mass of the system as such a point. This is the weighted average of the random variable X, in which the abscissa of each point xᵢ enters with a "weight" equal to the corresponding probability. The mean value of the random variable X thus obtained is called its mathematical expectation.

The mathematical expectation of a discrete random variable is the sum of the products of all its possible values and the probabilities of these values:

$M(X)=x_1p_1+x_2p_2+\dots+x_np_n=\sum_{i=1}^n x_ip_i$.

Example 1. A win-win lottery is organized. There are 1000 winnings, 400 of which are 10 rubles each, 300 are 20 rubles each, 200 are 100 rubles each, and 100 are 200 rubles each. What is the average winning for a person who buys one ticket?

Solution. We find the average winning if we divide the total amount of the winnings, which is 10·400 + 20·300 + 100·200 + 200·100 = 50,000 rubles, by 1000 (the total number of winnings). Then we get 50000/1000 = 50 rubles. But the expression for calculating the average winning can also be represented in the following form:

$\frac{10\cdot 400+20\cdot 300+100\cdot 200+200\cdot 100}{1000}=10\cdot 0.4+20\cdot 0.3+100\cdot 0.2+200\cdot 0.1=50$.

On the other hand, under these conditions the size of the winning is a random variable that can take the values 10, 20, 100 and 200 rubles with probabilities 0.4, 0.3, 0.2 and 0.1 respectively. Therefore, the expected average winning is equal to the sum of the products of the sizes of the winnings and the probabilities of receiving them.
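The same computation in a couple of lines of Python:

```python
# Expected winnings for one lottery ticket: sum of payoff * probability.
payoffs = [10, 20, 100, 200]
probs = [0.4, 0.3, 0.2, 0.1]  # 400, 300, 200, 100 winnings out of 1000

print(sum(x * p for x, p in zip(payoffs, probs)))  # 50.0 rubles
```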

Example 2. A publisher decided to publish a new book. He is going to sell the book for 280 rubles, of which 200 will go to him, 50 to the bookstore, and 30 to the author. The table below gives information about the costs of publishing the book and the probability of selling a given number of copies.

Find the publisher's expected profit.

Solution. The random variable "profit" is equal to the difference between the income from the sale and the cost of the costs. For example, if 500 copies of a book are sold, then the income from the sale is 200 * 500 = 100,000, and the cost of publishing is 225,000 rubles. Thus, the publisher faces a loss of 125,000 rubles. The following table summarizes the expected values ​​of the random variable - profit:

Number of copies  Profit xi  Probability pi  xi·pi
500  −125000  0.20  −25000
1000  −50000  0.40  −20000
2000  100000  0.25  25000
3000  250000  0.10  25000
4000  400000  0.05  20000
Total:  1.00  25000

Thus, we obtain the mathematical expectation of the publisher's profit:

M(X) = 25,000 rubles.

Example 3. The probability of a hit with one shot is p = 0.2. Determine the consumption of shells that provides a mathematical expectation of the number of hits equal to 5.

Solution. From the same expectation formula M(X) = np that we have used so far, we express n, the consumption of shells:

n = M(X)/p = 5/0.2 = 25.

Example 4. Determine the mathematical expectation of the random variable X, the number of hits in three shots, if the probability of a hit with each shot is p = 0.4.

Hint: find the probabilities of the values of the random variable by the Bernoulli formula.
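A sketch of the suggested solution path (Bernoulli probabilities, then the definition of the expectation), with the numbers from the statement of example 4:

```python
# M(X) for the number of hits in 3 shots with hit probability p = 0.4.
from math import comb

n, p = 3, 0.4
pk = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]  # Bernoulli formula
m = sum(k * pk[k] for k in range(n + 1))
print(pk)  # approximately [0.216, 0.432, 0.288, 0.064]
print(m)   # ~1.2, which equals n*p = 3 * 0.4
```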

Expectation Properties

Consider the properties of mathematical expectation.

Property 1. The mathematical expectation of a constant value is equal to this constant:

M(C) = C.

Property 2. A constant factor can be taken out of the expectation sign:

M(CX) = C·M(X).

Property 3. The mathematical expectation of the sum (difference) of random variables is equal to the sum (difference) of their mathematical expectations:

M(X ± Y) = M(X) ± M(Y).

Property 4. The mathematical expectation of the product of independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X)·M(Y).

Property 5. If all values of a random variable X are decreased (increased) by the same number C, then its mathematical expectation decreases (increases) by the same number:

M(X ± C) = M(X) ± C.

When the mathematical expectation alone is not enough

In most cases, only the mathematical expectation cannot adequately characterize a random variable.

Let the random variables X and Y be given by the following distribution laws:

Value X  Probability
−0.1  0.1
−0.01  0.2
0  0.4
0.01  0.2
0.1  0.1
Value Y  Probability
−20  0.3
−10  0.1
0  0.2
10  0.1
20  0.3

The mathematical expectations of these variables are the same, equal to zero:

M(X) = −0.1·0.1 − 0.01·0.2 + 0·0.4 + 0.01·0.2 + 0.1·0.1 = 0,
M(Y) = −20·0.3 − 10·0.1 + 0·0.2 + 10·0.1 + 20·0.3 = 0.

However, their distributions are different. The random variable X can take only values that differ little from the mathematical expectation, while the random variable Y can take values that deviate significantly from the mathematical expectation. A similar example: the average wage does not make it possible to judge the proportion of high- and low-paid workers. In other words, from the mathematical expectation one cannot judge what deviations from it are possible, at least on average. To do this, one needs to find the variance of the random variable.

Dispersion of a discrete random variable

The dispersion (variance) of a discrete random variable X is the mathematical expectation of the square of its deviation from the mathematical expectation:

D(X) = M((X − M(X))²).

The standard deviation of a random variable X is the arithmetic value of the square root of its variance:

σ(X) = √D(X).

Example 5. Calculate the variances and standard deviations of the random variables X and Y whose distribution laws are given in the tables above.

Solution. The mathematical expectations of the random variables X and Y, as found above, are equal to zero. According to the dispersion formula, with E(X) = E(Y) = 0 we get:

D(X) = (−0.1)²·0.1 + (−0.01)²·0.2 + 0²·0.4 + 0.01²·0.2 + 0.1²·0.1 = 0.00204,
D(Y) = (−20)²·0.3 + (−10)²·0.1 + 0²·0.2 + 10²·0.1 + 20²·0.3 = 260.

Then the standard deviations of the random variables X and Y are

σ(X) = √0.00204 ≈ 0.045, σ(Y) = √260 ≈ 16.1.

Thus, with the same mathematical expectations, the variance of the random variable X is very small, while that of Y is significant. This is a consequence of the difference in their distributions.

Example 6. An investor has 4 alternative investment projects. The table summarizes the data on the expected profit in these projects with the corresponding probabilities.

Project 1: 500, P = 1
Project 2: 1000, P = 0.5; 0, P = 0.5
Project 3: 500, P = 0.5; 1000, P = 0.25; 0, P = 0.25
Project 4: 500, P = 0.5; 10500, P = 0.25; −9500 (a loss), P = 0.25

Find for each alternative the mathematical expectation, variance and standard deviation.

Solution. Let us show how these quantities are calculated for the 3rd alternative:

M = 500·0.5 + 1000·0.25 + 0·0.25 = 500,
D = (500 − 500)²·0.5 + (1000 − 500)²·0.25 + (0 − 500)²·0.25 = 125000,
σ = √125000 ≈ 353.6.

The table summarizes the found values for all alternatives:

Project  M  D  σ
1  500  0  0
2  500  250000  500
3  500  125000  353.6
4  500  50000000  7071.1

All alternatives have the same mathematical expectation. This means that in the long run everyone has the same income. The standard deviation can be interpreted as a measure of risk - the larger it is, the greater the risk of the investment. An investor who doesn't want much risk will choose project 1 because it has the smallest standard deviation (0). If the investor prefers risk and high returns in a short period, then he will choose the project with the largest standard deviation - project 4.
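A small sketch reproducing these figures; note that the 9500 outcome of project 4 is taken with a minus sign (a loss), an assumption recovered from the statement that all four projects have the same expected profit:

```python
# Mean, variance and standard deviation for the four investment alternatives.
from math import sqrt

projects = {
    1: [(500, 1.0)],
    2: [(1000, 0.5), (0, 0.5)],
    3: [(500, 0.5), (1000, 0.25), (0, 0.25)],
    4: [(500, 0.5), (10500, 0.25), (-9500, 0.25)],  # -9500 is an assumption
}

for name, dist in projects.items():
    m = sum(x * p for x, p in dist)
    d = sum(p * (x - m) ** 2 for x, p in dist)
    print(name, m, d, round(sqrt(d), 1))
# every project has m = 500; sigma grows from 0 (project 1) to ~7071.1 (project 4)
```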

Dispersion Properties

Let us present the properties of the dispersion.

Property 1. The dispersion of a constant value is zero:

D(C) = 0.

Property 2. A constant factor can be taken out of the dispersion sign by squaring it:

D(CX) = C²·D(X).

Property 3. The variance of a random variable is equal to the mathematical expectation of the square of this variable minus the square of the mathematical expectation of the variable itself:

D(X) = M(X²) − (M(X))²,

where $M(X^2)=\sum_i x_i^2p_i$ for a discrete random variable.

Property 4. The variance of the sum (difference) of independent random variables is equal to the sum of their variances:

D(X ± Y) = D(X) + D(Y).

Example 7. It is known that a discrete random variable X takes only two values: −3 and 7. In addition, the mathematical expectation is known: E(X) = 4. Find the variance of the discrete random variable.

Solution. Denote by p the probability with which the random variable takes the value x₁ = −3. Then the probability of the value x₂ = 7 will be 1 − p. Let's derive the equation for the mathematical expectation:

E(X) = x₁p + x₂(1 − p) = −3p + 7(1 − p) = 4,

whence we get the probabilities: p = 0.3 and 1 − p = 0.7.

The law of distribution of a random variable:

X −3 7
p 0.3 0.7

We calculate the variance of this random variable using the formula from property 3 of the variance:

D(X) = M(X²) − (M(X))² = (−3)²·0.3 + 7²·0.7 − 4² = 2.7 + 34.3 − 16 = 21.
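The same arithmetic as a quick check in code:

```python
# Example 7 check: D(X) = M(X^2) - M(X)^2 for X taking -3 and 7.
xs, ps = [-3, 7], [0.3, 0.7]

m = sum(x * p for x, p in zip(xs, ps))       # ~4.0
m2 = sum(x * x * p for x, p in zip(xs, ps))  # 2.7 + 34.3 = 37.0
print(m2 - m * m)                            # ~21.0
```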


Example 8. A discrete random variable X takes only two values, and it takes the larger value, 3, with probability 0.4. The variance of the random variable is also known: D(X) = 6. Find the mathematical expectation of the random variable.

Example 9 An urn contains 6 white and 4 black balls. 3 balls are taken from the urn. The number of white balls among the drawn balls is a discrete random variable X. Find the mathematical expectation and variance of this random variable.

Solution. The random variable X can take the values 0, 1, 2, 3. The corresponding probabilities can be calculated by the rule of multiplication of probabilities. The law of distribution of the random variable:

X 0 1 2 3
p 1/30 3/10 1/2 1/6

Hence the mathematical expectation of this random variable:

M(X) = 0·1/30 + 1·3/10 + 2·1/2 + 3·1/6 = 3/10 + 1 + 1/2 = 1.8.

The variance of a given random variable is:

D(X) = M(X²) − (M(X))² = 1²·3/10 + 2²·1/2 + 3²·1/6 − 1.8² = 0.3 + 2 + 1.5 − 3.24 = 0.56.
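A sketch that recomputes the distribution series and both characteristics; the probabilities are hypergeometric, which is exactly what the multiplication rule yields here:

```python
# Example 9 check: distribution of the number of white balls among 3 drawn.
from math import comb

white, black, drawn = 6, 4, 3
total = comb(white + black, drawn)                     # C(10,3) = 120
pk = [comb(white, k) * comb(black, drawn - k) / total  # P(X = k)
      for k in range(drawn + 1)]

m = sum(k * p for k, p in enumerate(pk))
d = sum(p * (k - m) ** 2 for k, p in enumerate(pk))
print(pk)    # ~0.0333, 0.3, 0.5, ~0.1667, i.e. 1/30, 3/10, 1/2, 1/6
print(m, d)  # 1.8 and 0.56, matching the text
```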

Mathematical expectation and dispersion of a continuous random variable

For a continuous random variable, the mechanical interpretation of the mathematical expectation will retain the same meaning: the center of mass for a unit mass distributed continuously on the x-axis with density f(x). In contrast to a discrete random variable, for which the function argument xi changes abruptly, for a continuous random variable, the argument changes continuously. But the mathematical expectation of a continuous random variable is also related to its mean value.

To find the mathematical expectation and variance of a continuous random variable, one needs to evaluate definite integrals. If the density function of the continuous random variable is given, it enters directly into the integrand. If the probability distribution function is given, then the density function must first be found by differentiating it.

The arithmetic average of all possible values of a continuous random variable is called its mathematical expectation, denoted by M(X) or m.

A random variable is a variable which, as a result of each trial, takes one previously unknown value, depending on random causes. Random variables are denoted by capital Latin letters: $X,\ Y,\ Z,\ \dots$. By their type, random variables can be discrete and continuous.

A discrete random variable is a random variable whose set of possible values is at most countable, that is, either finite or countable. Countability means that the values of the random variable can be enumerated.

Example 1 . Let us give examples of discrete random variables:

a) the number of hits on the target with $n$ shots; here the possible values are $0,\ 1,\ \dots ,\ n$;

b) the number of heads that fall when tossing a coin $n$ times; here the possible values are $0,\ 1,\ \dots ,\ n$;

c) the number of ships arriving at a port (a countable set of values);

d) the number of calls arriving at the exchange (a countable set of values).

1. Law of probability distribution of a discrete random variable.

A discrete random variable $X$ can take the values ​​$x_1,\dots ,\ x_n$ with probabilities $p\left(x_1\right),\ \dots ,\ p\left(x_n\right)$. The correspondence between these values ​​and their probabilities is called distribution law of a discrete random variable. As a rule, this correspondence is specified using a table, in the first line of which the values ​​of $x_1,\dots ,\ x_n$ are indicated, and in the second line the probabilities corresponding to these values ​​are $p_1,\dots ,\ p_n$.

$\begin{array}{|c|c|c|c|c|}
\hline
X_i & x_1 & x_2 & \dots & x_n \\
\hline
p_i & p_1 & p_2 & \dots & p_n \\
\hline
\end{array}$

Example 2. Let the random variable $X$ be the number of points that fall when tossing a die. Such a random variable $X$ can take the values $1,\ 2,\ 3,\ 4,\ 5,\ 6$. The probabilities of all these values are equal to $1/6$. Then the probability distribution law for the random variable $X$ is:

$\begin{array}{|c|c|c|c|c|c|c|}
\hline
X_i & 1 & 2 & 3 & 4 & 5 & 6 \\
\hline
p_i & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 \\
\hline
\end{array}$

Comment. Since the events $1,\ 2,\ \dots ,\ 6$ form a complete group of events in the distribution law of the discrete random variable $X$, the sum of the probabilities must be equal to one, i.e. $\sum(p_i)=1$.

2. Mathematical expectation of a discrete random variable.

Mathematical expectation of a random variable specifies its "central" value. For a discrete random variable, the mathematical expectation is calculated as the sum of the products of the values $x_1,\dots ,\ x_n$ and the probabilities $p_1,\dots ,\ p_n$ corresponding to these values, i.e.: $M\left(X\right)=\sum^n_{i=1}{p_ix_i}$. In the English literature, the notation $E\left(X\right)$ is also used.

Expectation Properties$M\left(X\right)$:

  1. $M\left(X\right)$ lies between the smallest and the largest values of the random variable $X$.
  2. The mathematical expectation of a constant is equal to the constant itself, i.e. $M\left(C\right)=C$.
  3. The constant factor can be taken out of the expectation sign: $M\left(CX\right)=CM\left(X\right)$.
  4. The mathematical expectation of the sum of random variables is equal to the sum of their mathematical expectations: $M\left(X+Y\right)=M\left(X\right)+M\left(Y\right)$.
  5. The mathematical expectation of the product of independent random variables is equal to the product of their mathematical expectations: $M\left(XY\right)=M\left(X\right)M\left(Y\right)$.

Example 3 . Let's find the mathematical expectation of the random variable $X$ from example $2$.

$$M\left(X\right)=\sum^n_{i=1}{p_ix_i}=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}=3.5.$$

We can notice that $M\left(X\right)$ is between the smallest ($1$) and largest ($6$) values ​​of the random variable $X$.

Example 4 . It is known that the mathematical expectation of the random variable $X$ is equal to $M\left(X\right)=2$. Find the mathematical expectation of the random variable $3X+5$.

Using the above properties, we get $M\left(3X+5\right)=M\left(3X\right)+M\left(5\right)=3M\left(X\right)+5=3\cdot 2 +5=11$.

Example 5 . It is known that the mathematical expectation of the random variable $X$ is equal to $M\left(X\right)=4$. Find the mathematical expectation of the random variable $2X-9$.

Using the above properties, we get $M\left(2X-9\right)=M\left(2X\right)-M\left(9\right)=2M\left(X\right)-9=2\cdot 4 -9=-1$.

3. Dispersion of a discrete random variable.

Possible values ​​of random variables with equal mathematical expectations can scatter differently around their average values. For example, in two student groups, the average score for the exam in probability theory turned out to be 4, but in one group everyone turned out to be good students, and in the other group, only C students and excellent students. Therefore, there is a need for such a numerical characteristic of a random variable, which would show the spread of the values ​​of a random variable around its mathematical expectation. This characteristic is dispersion.

Dispersion of a discrete random variable$X$ is:

$$D\left(X\right)=\sum^n_{i=1}{p_i{\left(x_i-M\left(X\right)\right)}^2}.$$

In the English literature, the notations $V\left(X\right)$, $Var\left(X\right)$ are used. Very often the variance $D\left(X\right)$ is calculated by the formula $D\left(X\right)=\sum^n_{i=1}{p_ix^2_i}-{\left(M\left(X\right)\right)}^2$.

Dispersion Properties$D\left(X\right)$:

  1. The dispersion is always greater than or equal to zero, i.e. $D\left(X\right)\ge 0$.
  2. The dispersion from a constant is equal to zero, i.e. $D\left(C\right)=0$.
  3. The constant factor can be taken out of the dispersion sign, provided that it is squared, i.e. $D\left(CX\right)=C^2D\left(X\right)$.
  4. The variance of the sum of independent random variables is equal to the sum of their variances, i.e. $D\left(X+Y\right)=D\left(X\right)+D\left(Y\right)$.
  5. The variance of the difference of independent random variables is equal to the sum of their variances, i.e. $D\left(X-Y\right)=D\left(X\right)+D\left(Y\right)$.

Example 6 . Let us calculate the variance of the random variable $X$ from example $2$.

$$D\left(X\right)=\sum^n_{i=1}{p_i{\left(x_i-M\left(X\right)\right)}^2}=\frac{1}{6}\cdot{\left(1-3.5\right)}^2+\frac{1}{6}\cdot{\left(2-3.5\right)}^2+\dots+\frac{1}{6}\cdot{\left(6-3.5\right)}^2=\frac{35}{12}\approx 2.92.$$

Example 7 . It is known that the variance of the random variable $X$ is equal to $D\left(X\right)=2$. Find the variance of the random variable $4X+1$.

Using the above properties, we find $D\left(4X+1\right)=D\left(4X\right)+D\left(1\right)=4^2D\left(X\right)+0=16D\left(X\right)=16\cdot 2=32$.

Example 8 . It is known that the variance of $X$ is equal to $D\left(X\right)=3$. Find the variance of the random variable $3-2X$.

Using the above properties, we find $D\left(3-2X\right)=D\left(3\right)+D\left(2X\right)=0+2^2D\left(X\right)=4D\left(X\right)=4\cdot 3=12$.

4. Distribution function of a discrete random variable.

The method of representing a discrete random variable in the form of a distribution series is not the only one, and most importantly, it is not universal, since a continuous random variable cannot be specified using a distribution series. There is another way to represent a random variable - the distribution function.

The distribution function of a random variable $X$ is the function $F\left(x\right)$ that determines the probability that the random variable $X$ takes a value less than some fixed value $x$, i.e. $F\left(x\right)=P\left(X < x\right)$.

Distribution function properties:

  1. $0\le F\left(x\right)\le 1$.
  2. The probability that the random variable $X$ takes values ​​from the interval $\left(\alpha ;\ \beta \right)$ is equal to the difference between the values ​​of the distribution function at the ends of this interval: $P\left(\alpha< X < \beta \right)=F\left(\beta \right)-F\left(\alpha \right)$
  3. $F\left(x\right)$ - non-decreasing.
  4. $\lim_{x\to -\infty}F\left(x\right)=0$, $\lim_{x\to +\infty}F\left(x\right)=1$.

Example 9 . Let us find the distribution function $F\left(x\right)$ for the distribution law of the discrete random variable $X$ from example $2$.

$\begin{array}{|c|c|c|c|c|c|c|}
\hline
X_i & 1 & 2 & 3 & 4 & 5 & 6 \\
\hline
p_i & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6 \\
\hline
\end{array}$

If $x\le 1$, then obviously $F\left(x\right)=0$ (in particular, for $x=1$, $F\left(1\right)=P\left(X< 1\right)=0$).

If $1< x\le 2$, then $F\left(x\right)=P\left(X=1\right)=1/6$.

If $2< x\le 3$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)=1/6+1/6=1/3$.

If $3< x\le 4$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)=1/6+1/6+1/6=1/2$.

If $4< x\le 5$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)=1/6+1/6+1/6+1/6=2/3$.

If $5< x\le 6$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)+P\left(X=5\right)=1/6+1/6+1/6+1/6+1/6=5/6$.

If $x > 6$, then $F\left(x\right)=P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)+P\left(X=4\right)+P\left(X=5\right)+P\left(X=6\right)=1/6+1/6+1/6+1/6+1/6+1/6=1$.

So

$F(x)=\left\{\begin{matrix}
0, & x\le 1,\\
1/6, & 1< x\le 2,\\
1/3, & 2< x\le 3,\\
1/2, & 3< x\le 4,\\
2/3, & 4< x\le 5,\\
5/6, & 5< x\le 6,\\
1, & x > 6.
\end{matrix}\right.$
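A small sketch of this step function in code, built directly from the distribution series and the definition $F\left(x\right)=P\left(X < x\right)$:

```python
# Distribution function of the die from Example 2 as a step function.
def make_cdf(values, probs):
    def F(x):
        return sum(p for v, p in zip(values, probs) if v < x)  # P(X < x)
    return F

F = make_cdf([1, 2, 3, 4, 5, 6], [1/6] * 6)
print(F(1), F(1.5), F(3.5), F(6), F(7))  # 0.0, ~0.167, 0.5, ~0.833, ~1.0
```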

Basic numerical characteristics of discrete and continuous random variables: mathematical expectation, variance and standard deviation. Their properties and examples.

The distribution law (distribution function and distribution series or probability density) fully describe the behavior of a random variable. But in a number of problems it is enough to know some numerical characteristics of the quantity under study (for example, its average value and possible deviation from it) in order to answer the question posed. Consider the main numerical characteristics of discrete random variables.

Definition 7.1. The mathematical expectation of a discrete random variable is the sum of the products of its possible values and their corresponding probabilities:

M(X) = x₁p₁ + x₂p₂ + … + xₙpₙ. (7.1)

If the number of possible values of the random variable is infinite, then $M(X)=\sum_{i=1}^{\infty}x_ip_i$, provided that the resulting series converges absolutely.

Remark 1. The mathematical expectation is sometimes called the weighted average, since it is approximately equal to the arithmetic mean of the observed values of the random variable for a large number of experiments.

Remark 2. From the definition of mathematical expectation, it follows that its value is not less than the smallest possible value of a random variable and not more than the largest.

Remark 3. The mathematical expectation of a discrete random variable is a non-random (constant) quantity. Later we will see that the same is true for continuous random variables.

Example 1. Find the mathematical expectation of the random variable X, the number of standard parts among three selected from a batch of 10 parts that includes 2 defective ones. Let us compose the distribution series for X. It follows from the condition of the problem that X can take the values 1, 2, 3, with $P(X=1)=\frac{C_8^1C_2^2}{C_{10}^3}=\frac{1}{15}$, $P(X=2)=\frac{C_8^2C_2^1}{C_{10}^3}=\frac{7}{15}$, $P(X=3)=\frac{C_8^3}{C_{10}^3}=\frac{7}{15}$. Then M(X) = 1·(1/15) + 2·(7/15) + 3·(7/15) = 36/15 = 2.4.

Example 2. Determine the mathematical expectation of the random variable X, the number of coin tosses until the first appearance of heads. This variable can take an infinite number of values (the set of possible values is the set of natural numbers). Its distribution series has the form:

X: 1 2 … n …
P: 0.5 (0.5)² … (0.5)ⁿ …

Then

M(X) = 1·0.5 + 2·(0.5)² + 3·(0.5)³ + … = 2

(when calculating, the formula for the sum of an infinitely decreasing geometric progression was used twice).

Properties of mathematical expectation.

1) The mathematical expectation of a constant is equal to the constant itself:

M(C) = C. (7.2)

Proof. If we consider C as a discrete random variable that takes only one value C with probability p = 1, then M(C) = C·1 = C.

2) A constant factor can be taken out of the expectation sign:

M(CX) = C·M(X). (7.3)

Proof. If the random variable X is given by the distribution series

X: x₁ x₂ … xₙ
P: p₁ p₂ … pₙ

then M(CX) = Cx₁p₁ + Cx₂p₂ + … + Cxₙpₙ = C(x₁p₁ + x₂p₂ + … + xₙpₙ) = C·M(X).

Definition 7.2. Two random variables are called independent if the distribution law of one of them does not depend on what values the other has taken. Otherwise, the random variables are called dependent.

Definition 7.3. The product of independent random variables X and Y is the random variable XY whose possible values are equal to the products of all possible values of X and all possible values of Y; the probabilities corresponding to them are equal to the products of the probabilities of the factors.

3) The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X)M(Y). (7.4)

Proof. To simplify the calculations, we restrict ourselves to the case when X and Y take only two possible values each:

X: x₁ x₂ / p: p₁ p₂
Y: y₁ y₂ / g: g₁ g₂

Consequently, M(XY) = x₁y₁·p₁g₁ + x₂y₁·p₂g₁ + x₁y₂·p₁g₂ + x₂y₂·p₂g₂ = y₁g₁(x₁p₁ + x₂p₂) + y₂g₂(x₁p₁ + x₂p₂) = (y₁g₁ + y₂g₂)(x₁p₁ + x₂p₂) = M(X)·M(Y).

Remark 1. Similarly, one can prove this property for a larger number of possible values of the factors.

Remark 2. Property 3 is valid for the product of any number of independent random variables, which is proved by the method of mathematical induction.

Definition 7.4. The sum of random variables X and Y is the random variable X + Y whose possible values are equal to the sums of each possible value of X with every possible value of Y; the probabilities of such sums are equal to the products of the probabilities of the terms (for dependent random variables, to the products of the probability of one term and the conditional probability of the second).

4) The mathematical expectation of the sum of two random variables (dependent or independent) is equal to the sum of the mathematical expectations of the terms:

M(X + Y) = M(X) + M(Y). (7.5)

Proof.

Consider again the random variables given by the distribution series from the proof of property 3. Then the possible values of X+Y are x₁+y₁, x₁+y₂, x₂+y₁, x₂+y₂. Denote their probabilities respectively by p₁₁, p₁₂, p₂₁ and p₂₂. Let's find

M(X+Y) = (x₁+y₁)p₁₁ + (x₁+y₂)p₁₂ + (x₂+y₁)p₂₁ + (x₂+y₂)p₂₂ =

= x₁(p₁₁+p₁₂) + x₂(p₂₁+p₂₂) + y₁(p₁₁+p₂₁) + y₂(p₁₂+p₂₂).

Let's prove that p₁₁ + p₁₂ = p₁. Indeed, the event that X+Y takes the value x₁+y₁ or x₁+y₂, whose probability is p₁₁ + p₁₂, coincides with the event X = x₁ (its probability is p₁). Similarly, it is proved that p₂₁ + p₂₂ = p₂, p₁₁ + p₂₁ = g₁, p₁₂ + p₂₂ = g₂. Hence,

M(X+Y) = x₁p₁ + x₂p₂ + y₁g₁ + y₂g₂ = M(X) + M(Y).

Comment. Property 4 implies that the mathematical expectation of the sum of any number of random variables is equal to the sum of the mathematical expectations of the terms.

Example. Find the mathematical expectation of the sum of the number of points rolled when throwing five dice.

Let's find the mathematical expectation of the number of points that fall when throwing one die:

M(X₁) = (1 + 2 + 3 + 4 + 5 + 6)·(1/6) = 3.5.

The same number is the mathematical expectation of the number of points that fall on any die. Therefore, by property 4,

M(X) = 5 · 3.5 = 17.5.

Dispersion.

In order to have an idea of the behavior of a random variable, it is not enough to know only its mathematical expectation. Consider two random variables X and Y given by distribution series of the form

X: 49 50 51
p: 0.1 0.8 0.1

Y: 0 100
p: 0.5 0.5

Let's find M(X) = 49·0.1 + 50·0.8 + 51·0.1 = 50 and M(Y) = 0·0.5 + 100·0.5 = 50. As you can see, the mathematical expectations of both variables are equal, but while for X, M(X) describes the behavior of the random variable well, being its most probable possible value (and the remaining values differ only slightly from 50), the values of Y deviate significantly from M(Y). Therefore, along with the mathematical expectation, it is desirable to know how much the values of the random variable deviate from it. Dispersion is used to characterize this.

Definition 7.5. The dispersion (scattering) of a random variable is the mathematical expectation of the square of its deviation from its mathematical expectation:

D(X) = M((X − M(X))²). (7.6)

Example. Find the variance of the random variable X (the number of standard parts among those selected) from example 1 of this lecture. Let's calculate the squared deviation of each possible value from the mathematical expectation M(X) = 2.4:

(1 − 2.4)² = 1.96; (2 − 2.4)² = 0.16; (3 − 2.4)² = 0.36.

Consequently,

D(X) = 1.96·(1/15) + 0.16·(7/15) + 0.36·(7/15) = 5.6/15 ≈ 0.373.

Remark 1. In the definition of variance, it is not the deviation from the mean itself that is evaluated, but its square. This is done so that the deviations of different signs do not compensate each other.

Remark 2. It follows from the definition of dispersion that this quantity takes only non-negative values.

Remark 3. There is a more convenient formula for calculating the variance, the validity of which is proved in the following theorem:

Theorem 7.1. D(X) = M(X²) − M²(X). (7.7)

Proof.

Using the fact that M(X) is a constant quantity, together with the properties of the mathematical expectation, we transform formula (7.6):

D(X) = M((X − M(X))²) = M(X² − 2X·M(X) + M²(X)) = M(X²) − 2M(X)·M(X) + M²(X) =

= M(X²) − 2M²(X) + M²(X) = M(X²) − M²(X), which was to be proved.

Example. Let us calculate the variances of the random variables X and Y discussed at the beginning of this section:

D(X) = (49²·0.1 + 50²·0.8 + 51²·0.1) − 50² = 2500.2 − 2500 = 0.2,

D(Y) = (0²·0.5 + 100²·0.5) − 50² = 5000 − 2500 = 2500.

So, the dispersion of the second random variable is several thousand times greater than the dispersion of the first. Thus, even without knowing the distribution laws of these variables, we can state from the known values of the dispersion that X deviates little from its mathematical expectation, while for Y this deviation is very significant.

Dispersion properties.

1) The dispersion of a constant C equals zero:

D(C) = 0. (7.8)

Proof. D(C) = M((C − M(C))²) = M((C − C)²) = M(0) = 0.

2) The constant factor can be taken out of the dispersion sign by squaring it:

D(CX) = C² D(X). (7.9)

Proof. D(CX) = M((CX − M(CX))²) = M((CX − C·M(X))²) = M(C²(X − M(X))²) = C²·D(X).

3) The variance of the sum of two independent random variables is equal to the sum of their variances:

D(X+Y) = D(X) + D(Y). (7.10)

Proof. D(X+Y) = M((X+Y)²) − (M(X) + M(Y))² = M(X²) + 2M(X)M(Y) + M(Y²) − M²(X) − 2M(X)M(Y) − M²(Y) =

= (M(X²) − M²(X)) + (M(Y²) − M²(Y)) = D(X) + D(Y),

where M(XY) = M(X)M(Y) was used by virtue of independence.

Consequence 1. The variance of the sum of several mutually independent random variables is equal to the sum of their variances.

Consequence 2. The variance of the sum of a constant and a random variable is equal to the variance of the random variable.

4) The variance of the difference of two independent random variables is equal to the sum of their variances:

D(X-Y) = D(X) + D(Y). (7.11)

Proof. D(X−Y) = D(X) + D(−Y) = D(X) + (−1)²·D(Y) = D(X) + D(Y).

The variance gives the average value of the squared deviation of the random variable from the mean; to estimate the deviation itself, a quantity called the standard deviation is used.

Definition 7.6. The standard deviation σ of a random variable X is the square root of its variance:

σ(X) = √D(X).

Example. In the previous example, the standard deviations of X and Y are respectively σ(X) = √0.2 ≈ 0.45 and σ(Y) = √2500 = 50.
