Application of the theory of probability in the modern world

"Accidents are not accidental" ... It sounds like a philosopher said, but in fact it is the lot of the great science of mathematics to study randomness. In mathematics, chance theory deals with randomness. Formulas and examples of tasks, as well as the main definitions of this science will be presented in the article.

What is probability theory?

Probability theory is one of the mathematical disciplines that studies random events.

To make it a little clearer, let's give a small example: if you toss a coin, it can land "heads" or "tails". As long as the coin is in the air, both outcomes are possible; that is, the chances of the two outcomes are 1:1. If you draw one card from a deck of 36, the probability of any particular card is 1/36. It would seem that there is nothing here to investigate or predict, especially with the help of mathematical formulas. Nevertheless, if you repeat a certain action many times, you can identify a certain pattern and, on its basis, predict the outcome of events under other conditions.

To summarize all of the above: in its classical sense, probability theory expresses, as a numerical value, the possibility that one of the possible events occurs.

From the pages of history

The theory of probability, formulas and examples of the first tasks appeared in the distant Middle Ages, when attempts were first made to predict the outcome of card games.

Initially, the theory of probability had nothing to do with mathematics. It was based on empirical facts or properties of an event that could be reproduced in practice. The first works in this area as a mathematical discipline appeared in the 17th century. The founders were Blaise Pascal and Pierre Fermat. For a long time they studied gambling and saw certain patterns, which they decided to tell the public about.

Christiaan Huygens arrived at the same technique independently, without being familiar with the results of Pascal's and Fermat's research. It was he who introduced the concept of "probability theory", along with formulas and examples that are considered the first in the history of the discipline.

The works of Jacob Bernoulli, Laplace's and Poisson's theorems are also important. They made the theory of probability more like a mathematical discipline. The theory of probability, formulas and examples of basic tasks received their present form thanks to Kolmogorov's axioms. As a result of all the changes, the theory of probability has become one of the mathematical branches.

Basic concepts of probability theory. Events

The main concept of this discipline is "event". There are three types of events:

  • Certain. Events that will happen in any case (the tossed coin will land).
  • Impossible. Events that will not happen under any scenario (the coin will remain hanging in the air).
  • Random. Events that either will or will not happen. They can be influenced by various factors that are very difficult to predict. For a coin, the random factors that can affect the result are the physical characteristics of the coin, its shape, its initial position, the force of the throw, and so on.

In examples, events are denoted by capital Latin letters, with the exception of P, which is reserved for probability. For example:

  • A = "students came to the lecture."
  • Ā = "students did not come to the lecture."

In practical exercises, it is customary to write down events in words.

One of the most important characteristics of events is their equal possibility. That is, if you toss a coin, all variants of how it lands are equally possible until it falls. But events can also be unequally possible. This happens when someone deliberately influences the outcome: for example, with "marked" playing cards, or with dice whose center of gravity has been shifted.

Events can also be compatible and incompatible. Compatible events do not exclude each other. For example:

  • A = "student A came to the lecture."
  • B = "student B came to the lecture."

These events are independent of each other, and the occurrence of one of them does not affect the occurrence of the other. Incompatible events are those where the occurrence of one excludes the occurrence of the other. Speaking of the same coin, "tails" coming up makes it impossible for "heads" to appear in the same experiment.

Actions on events

Events can be multiplied and added; accordingly, the logical connectives "AND" and "OR" are introduced in the discipline.

The sum of events A and B is an event meaning that A occurs, or B occurs, or both occur at the same time. In the case when they are incompatible, the last option is impossible: either A or B will occur.

The product of events A and B consists in the occurrence of both A and B at the same time.

Now let's work through a few examples to better remember the basics of probability theory and its formulas. Examples of problem solving follow.

Exercise 1: The firm is participating in a competition for contracts for three types of work. Possible events that can occur:

  • A = "the firm will receive the first contract."
  • A₁ = "the firm will not receive the first contract."
  • B = "the firm will receive the second contract."
  • B₁ = "the firm will not receive the second contract."
  • C = "the firm will receive the third contract."
  • C₁ = "the firm will not receive the third contract."

Let's try to express the following situations using actions on events:

  • K = "the firm will receive all contracts."

In mathematical form, the equation will look like this: K = ABC.

  • M = "the firm will not receive a single contract."

M = A₁B₁C₁.

Complicating the task: H = "the firm will receive exactly one contract." Since it is not known which contract the firm will receive (the first, the second, or the third), it is necessary to record the entire set of possible events:

H = A₁BC₁ ∪ AB₁C₁ ∪ A₁B₁C.

A₁BC₁ is the event in which the firm does not receive the first and third contracts but does receive the second. The other possible events are written in the same way. The symbol ∪ in the discipline denotes the "OR" connective. Translated into plain language, the expression means that the firm will receive only the first contract, only the second, or only the third. In the same way, you can write down other conditions from the discipline "Theory of Probability". The formulas and examples of solving problems presented above will help you do it yourself.
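These operations on events are easy to verify by direct enumeration. Below is a minimal Python sketch (an illustration, not part of the original exercise) that lists all eight elementary outcomes for the three contracts and counts the outcomes belonging to K, M, and H:

```python
from itertools import product

# Each elementary outcome is a triple of booleans:
# (first contract received, second received, third received).
outcomes = list(product([True, False], repeat=3))

K = [o for o in outcomes if all(o)]       # K = ABC: all three contracts
M = [o for o in outcomes if not any(o)]   # M = A1 B1 C1: no contract at all
H = [o for o in outcomes if sum(o) == 1]  # H: exactly one contract

print(len(outcomes), len(K), len(M), len(H))  # 8 1 1 3
```

As expected, H consists of three elementary outcomes, one for each contract received alone.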

Actually, the probability

Perhaps the central concept of this mathematical discipline is the probability of an event. There are three definitions of probability:

  • classic;
  • statistical;
  • geometric.

Each has its place in the study of probabilities. Probability theory, formulas and examples (grade 9) mainly use the classical definition, which sounds like this:

  • The probability of event A is the ratio of the number of outcomes favorable to its occurrence to the number of all possible outcomes.

The formula looks like this: P(A) = m / n.

Here A is the event in question. The event opposite to A is written Ā or A₁.

m is the number of favorable outcomes.

n is the number of all outcomes that can happen.

For example, A = "draw a card of the heart suit." There are 36 cards in a standard deck, 9 of them are hearts. Accordingly, the formula for solving the problem will look like:

P(A) = 9/36 = 0.25.

As a result, the probability that a heart-suit card is drawn from the deck is 0.25.
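The classical formula is easy to compute exactly with rational arithmetic. A small Python sketch (illustrative only; the function name is ours):

```python
from fractions import Fraction

def classical_probability(favorable: int, total: int) -> Fraction:
    """Classical definition: P(A) = m / n."""
    return Fraction(favorable, total)

# 9 hearts in a standard 36-card deck
p = classical_probability(9, 36)
print(p, float(p))  # 1/4 0.25
```

Using Fraction keeps the result exact; converting to float gives the decimal form used in the article.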

Towards higher mathematics

By now it should be a little clearer what probability theory is, together with the formulas and sample problems found in the school curriculum. However, probability theory also appears in higher mathematics as taught in universities, where the geometric and statistical definitions of probability and more complex formulas are most often used.

The theory of probability is very interesting. It is better to start learning its formulas and examples (higher mathematics) small, with the statistical (or frequency) definition of probability.

The statistical approach does not contradict the classical one but slightly expands it. If in the first case it was necessary to determine the degree of probability with which an event will occur, in this method it is necessary to indicate how often it actually occurs. Here a new concept is introduced, the relative frequency, denoted W_n(A). The formula is no different from the classical one:

W_n(A) = m / n,

where m is the number of trials in which event A occurred and n is the total number of trials.

While the classical formula is calculated for forecasting, the statistical one is calculated from the results of an experiment. Take a small assignment, for example.

The technical control department checks products for quality. Among 100 products, 3 were found to be defective. How do you find the relative frequency of quality products?

A = "the appearance of a quality product."

W_n(A) = 97/100 = 0.97.

Thus, the relative frequency of quality products is 0.97. Where does 97 come from? Of the 100 items checked, 3 were found to be defective; subtracting 3 from 100 gives 97, the number of quality products.
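The same computation as a tiny Python sketch (illustrative; the function name is ours):

```python
def relative_frequency(occurrences: int, trials: int) -> float:
    """Statistical definition: W_n(A) = m / n, taken from observed data."""
    return occurrences / trials

total_checked = 100
defective = 3
w = relative_frequency(total_checked - defective, total_checked)
print(w)  # 0.97
```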

A little about combinatorics

Another tool used in probability theory is combinatorics. Its basic principle, the multiplication rule, states that if a choice A can be made in m different ways, and a choice B can be made in n different ways, then the choice of A and B together can be made in m × n ways.

For example, there are 5 roads leading from city A to city B. There are 4 ways from city B to city C. How many ways can you get from city A to city C?

It's simple: 5 × 4 = 20; that is, you can get from city A to city C in twenty different ways.

Let's complicate the task. How many ways are there to lay out a deck of cards in solitaire? There are 36 cards in the deck, and that is the starting point. To find the number of ways, you "subtract" one card from the starting number at each step and multiply.

That is, 36 × 35 × 34 × ... × 2 × 1 = a result that does not fit on a calculator screen, so it is simply denoted 36!. The sign "!" after a number indicates that all natural numbers from 1 up to that number are multiplied together.
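Both counting arguments, the multiplication rule for the roads and the 36! orderings of the deck, can be checked in Python (a sketch for illustration):

```python
import math

# Multiplication rule: 5 roads from A to B times 4 roads from B to C.
routes = 5 * 4
print(routes)  # 20

# Number of ways to order a 36-card deck: 36! (a 42-digit number).
orderings = math.factorial(36)
print(orderings)
```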

In combinatorics, there are concepts such as permutation, placement, and combination. Each of them has its own formula.

An ordered selection of elements from a set is called an arrangement (placement). Arrangements can be with repetition, when one element may be used several times, or without repetition, when elements are not repeated. Here n is the total number of elements and m is the number of elements taking part in the arrangement. The formula for the number of arrangements without repetition is:

A(n, m) = n! / (n - m)!

Arrangements of n elements that differ from one another only in the order of the elements are called permutations. In mathematical notation: P(n) = n!

Combinations of n elements taken m at a time are selections in which it matters only which elements are included, not their order. The formula looks like:

C(n, m) = n! / (m! × (n - m)!)
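The three combinatorial formulas translate directly into code. A minimal Python sketch (the function names are ours; Python's math module also offers math.perm and math.comb for the same quantities):

```python
import math

def arrangements(n: int, m: int) -> int:
    """A(n, m) = n! / (n - m)!  -- ordered selections without repetition."""
    return math.factorial(n) // math.factorial(n - m)

def permutations(n: int) -> int:
    """P(n) = n!  -- orderings of all n elements."""
    return math.factorial(n)

def combinations(n: int, m: int) -> int:
    """C(n, m) = n! / (m! (n - m)!)  -- selections where order does not matter."""
    return math.factorial(n) // (math.factorial(m) * math.factorial(n - m))

print(arrangements(5, 2), permutations(3), combinations(6, 2))  # 20 6 15
```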

Bernoulli's formula

In the theory of probability, as in every discipline, there are works of outstanding researchers in their field who have taken it to a new level. One of these works is the Bernoulli formula, which allows you to determine the probability of a certain event occurring under independent conditions. This suggests that the appearance of A in an experiment does not depend on the appearance or non-appearance of the same event in previous or subsequent tests.

Bernoulli's equation:

P_n(m) = C(n, m) × p^m × q^(n-m).

The probability p of the occurrence of event A is the same in every trial. The probability that the situation occurs exactly m times in n experiments is calculated by the formula presented above. Accordingly, the question arises of how to find the number q.

If event A occurs with probability p, it may also fail to occur. One is the number used to denote the total probability of all outcomes of a situation in the discipline. Therefore, q is the probability that the event does not occur: q = 1 - p.

Now you know Bernoulli's formula (probability theory). We will consider examples of solving problems (the first level) further.

Assignment 2: A store visitor makes a purchase with probability 0.2. Six visitors entered the store independently. What is the probability that the visitors will make purchases?

Solution: Since it is not known how many visitors should make a purchase, one or all six, it is necessary to calculate all possible probabilities using the Bernoulli formula.

A = "the visitor makes a purchase."

In this case: p = 0.2 (as indicated in the task). Accordingly, q = 1-0.2 = 0.8.

n = 6 (since there are 6 customers in the store). The number m will change from 0 (no customer will make a purchase) to 6 (all visitors to the store will purchase something). As a result, we get the solution:

P_6(0) = C(6, 0) × p^0 × q^6 = q^6 = (0.8)^6 ≈ 0.2621.

So the probability that none of the visitors makes a purchase is about 0.2621.

How else is Bernoulli's formula (probability theory) used? Examples of problem solving (second level) below.

After the example above, the question arises of where C and p have gone. As for p, a number raised to the power 0 equals one. As for C, it can be found by the formula:

C(n, m) = n! / (m! × (n - m)!)

Since m = 0 in the first example, C = 1, which does not affect the result. Using the full formula, let's find the probability that exactly two visitors buy goods.

P_6(2) = C(6, 2) × p^2 × q^4 = (6 × 5 × 4 × 3 × 2 × 1) / ((2 × 1) × (4 × 3 × 2 × 1)) × (0.2)^2 × (0.8)^4 = 15 × 0.04 × 0.4096 ≈ 0.246.
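Bernoulli's formula is a one-liner in Python. The sketch below (illustrative) reproduces both probabilities from the store example:

```python
from math import comb

def bernoulli(n: int, m: int, p: float) -> float:
    """P_n(m) = C(n, m) * p^m * q^(n - m), with q = 1 - p."""
    q = 1.0 - p
    return comb(n, m) * p**m * q**(n - m)

print(round(bernoulli(6, 0, 0.2), 4))  # 0.2621 -- nobody buys
print(round(bernoulli(6, 2, 0.2), 4))  # 0.2458 -- exactly two visitors buy
```

Summing bernoulli(6, m, 0.2) over m = 0..6 gives 1, since the seven cases form a complete group of events.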

The theory of probability is not that complicated. Bernoulli's formula, examples of which are presented above, is a direct proof of this.

Poisson's formula

Poisson's equation is used to calculate the probabilities of unlikely (rare) random events.

Basic formula:

P_n(m) = λ^m / m! × e^(-λ).

Moreover, λ = n x p. Here is such a simple Poisson formula (probability theory). We will consider examples of solving problems further.

Assignment 3: A factory produced 100,000 parts. The probability that a part is defective is p = 0.0001. What is the probability that there will be exactly 5 defective parts in the batch?

As you can see, a defect is an unlikely event, and therefore the Poisson formula (probability theory) is used for the calculation. Examples of solving problems of this kind do not differ in any way from other tasks of the discipline; we substitute the necessary data into the given formula:

A = "a randomly selected part will be defective."

p = 0.0001 (according to the condition of the task).

n = 100000 (number of parts).

m = 5 (defective parts). We substitute the data into the formula and get:

P_100000(5) = 10^5 / 5! × e^(-10) ≈ 0.0378 (here λ = n × p = 100000 × 0.0001 = 10).

Just like Bernoulli's formula (probability theory), examples of solutions with which are written above, Poisson's equation contains the number e. In fact, it can be found from the formula:

e^(-λ) = lim (n → ∞) (1 - λ/n)^n.

However, there are special tables containing almost all the needed values of e^(-λ).
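With math.exp available, the tables are unnecessary, and the Poisson approximation can be compared against the exact Bernoulli value. A Python sketch (illustrative):

```python
from math import comb, exp, factorial

def poisson(n: int, m: int, p: float) -> float:
    """Poisson approximation: P_n(m) ~= lam^m / m! * e^(-lam), lam = n * p."""
    lam = n * p
    return lam**m / factorial(m) * exp(-lam)

approx = poisson(100_000, 5, 0.0001)                    # lambda = 10
exact = comb(100_000, 5) * 0.0001**5 * 0.9999**99_995   # exact Bernoulli value
print(round(approx, 4), round(exact, 4))  # 0.0378 0.0378
```

The two values agree to four decimal places, which is why the approximation is preferred: the exact formula involves astronomically large and small factors.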

Moivre-Laplace theorem

If the number of trials in the Bernoulli scheme is large enough, and the probability of occurrence of event A is the same in every trial, then the probability that event A occurs a certain number of times in a series of trials can be found by the local Laplace formula:

P_n(m) ≈ 1 / √(npq) × φ(x_m), where

x_m = (m - np) / √(npq).

To better remember the Laplace formula (probability theory), here is an example problem. Assignment 4: In 800 independent trials, an event occurs with probability 1/3 in each trial. Find the probability that it occurs exactly 267 times.

First, we find x_m: substituting the data gives x_m = (267 - 800 × 1/3) / √(800 × 1/3 × 2/3) ≈ 0.025. Using the tables, we find φ(0.025) = 0.3988. Now we substitute all the data into the formula:

P_800(267) = 1 / √(800 × 1/3 × 2/3) × 0.3988 = (3/40) × 0.3988 ≈ 0.03.

So the probability that the event occurs exactly 267 times is about 0.03.
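With the standard normal density computed directly, the local Moivre-Laplace estimate needs no tables. A Python sketch (illustrative), using the n = 800, p = 1/3, m = 267 data from the worked numbers above:

```python
from math import sqrt, exp, pi

def local_laplace(n: int, m: int, p: float) -> float:
    """Local Moivre-Laplace: P_n(m) ~= phi(x)/sqrt(npq), x = (m - np)/sqrt(npq)."""
    q = 1.0 - p
    s = sqrt(n * p * q)
    x = (m - n * p) / s
    phi = exp(-x * x / 2) / sqrt(2 * pi)  # standard normal density
    return phi / s

print(round(local_laplace(800, 267, 1/3), 3))  # 0.03
```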

Bayes formula

Bayes' formula (probability theory), examples of solving problems with the help of which will be given below, is an equation that describes the probability of an event, based on the circumstances that could be associated with it. The basic formula looks like this:

P(A | B) = P(B | A) × P(A) / P(B).

A and B are particular events.

P(A | B) is the conditional probability, that is, the probability that event A occurs given that event B has occurred.

P(B | A) is the conditional probability of event B given A.

So, the final part of the short course "Probability Theory" is the Bayes formula, examples of solutions to problems with which are below.

Assignment 5: Phones from three factories were brought to a warehouse. The share of phones manufactured at the first factory is 25%, at the second 60%, and at the third 15%. It is also known that the average percentage of defective products at the first factory is 2%, at the second 4%, and at the third 1%. Find the probability that a randomly selected phone turns out to be defective.

A = "randomly picked phone".

B 1 - the phone that was made by the first factory. Accordingly, input B 2 and B 3 will appear (for the second and third factories).

As a result, we get:

P(B₁) = 25% / 100% = 0.25; P(B₂) = 0.6; P(B₃) = 0.15. Thus we have found the probability of each hypothesis.

Now you need to find the conditional probabilities of the desired event, that is, the probability of defective products in firms:

P(A | B₁) = 2% / 100% = 0.02;

P(A | B₂) = 0.04;

P(A | B₃) = 0.01.

Now we substitute the data into the total probability formula (the denominator of Bayes' formula) and get:

P(A) = 0.25 × 0.02 + 0.6 × 0.04 + 0.15 × 0.01 = 0.0305.
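The same computation in Python, including the Bayes step the formula is usually used for: finding which factory a defective phone most likely came from (a sketch; the dictionary layout is ours):

```python
# Hypotheses: the phone comes from factory B1, B2 or B3.
priors = {"B1": 0.25, "B2": 0.60, "B3": 0.15}   # shares of each factory
defect = {"B1": 0.02, "B2": 0.04, "B3": 0.01}   # P(defective | factory)

# Total probability formula: P(A) = sum over i of P(Bi) * P(A | Bi).
p_a = sum(priors[b] * defect[b] for b in priors)
print(round(p_a, 4))  # 0.0305

# Bayes' formula: P(B1 | A) = P(B1) * P(A | B1) / P(A).
p_b1_given_a = priors["B1"] * defect["B1"] / p_a
print(round(p_b1_given_a, 4))  # 0.1639
```

So although the first factory supplies a quarter of the phones, it accounts for only about 16% of the defective ones.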

The article has presented the theory of probability with formulas and examples of problem solving, but this is only the tip of the iceberg of a vast discipline. After all that has been written, it is logical to ask whether the theory of probability is needed in life. It is difficult for an ordinary person to answer; better to ask someone who has used it to hit the jackpot more than once.

2.1. The choice of the mathematical apparatus of the theory of reliability

The above definition of reliability is clearly insufficient, since it is only of a qualitative nature and does not allow solving various engineering problems in the process of designing, manufacturing, testing and operating aircraft. In particular, it does not allow solving such important tasks as, for example:

Assess the reliability (failure-free operation, recoverability, storability, availability and durability) of existing structures and of new ones being created;

Compare the reliability of different types of elements and systems;

Evaluate the effectiveness of the restoration of faulty aircraft;

Justify repair plans and the composition of spare parts required to support flight plans;

Determine the volume, frequency, cost of performing preparations for the flight, routine maintenance and the entire range of maintenance;

Determine the cost of time, money and funds required to restore faulty technical devices.

The difficulty in determining the quantitative characteristics of reliability stems from the very nature of failures, each of which is the result of the coincidence of a number of unfavorable factors, such as overloads, local deviations from the design operating modes of elements and systems, material flaws, changes in external conditions, etc., linked by causal relationships of varying degree and different nature and causing sudden concentrations of loads that exceed the design load.

Failures of aviation equipment depend on many causes, which can be tentatively classified by their significance as primary or secondary. This makes it possible to treat the number of failures and the times of their appearance as random variables, that is, quantities that, depending on chance, can take on different values that are not known in advance.

Establishing quantitative dependencies by classical methods in such a complicated situation is practically impossible, since numerous secondary random factors play so noticeable a role that the main factors cannot be singled out from the many others. In addition, classical research methods consider, instead of the phenomenon itself, a simplified and idealized model of it, built by taking into account only the main factors and neglecting the secondary ones, and such a model does not always give the right result.

Probability theory and mathematical statistics are the sciences that study regularities in random phenomena, and in some cases they usefully complement the classical methods.

The following advantages should be attributed to these methods:

1) statistical methods, without disclosing the individual causes of each failure, establish patterns based on the results of the mass operation of equipment under real conditions;

2) the results obtained in this way correspond to the entire range of real operating conditions, and not to one or another widespread but highly simplified scheme;

3) on the basis of mass observations of the appearance of failures, it is possible to identify general patterns, the engineering analysis of which opens the way to increasing the reliability of aviation technology in the process of its creation and to maintaining it at a given level in the process of operation.

The indicated advantages of this mathematical apparatus make it so far the only acceptable one for investigating questions of the reliability of aviation technology. At the same time, in practice, one should take into account the specific limitations inherent in statistical methods: they cannot answer the question of whether a given technical device will function flawlessly during the period of interest to us or not. These methods make it possible only to determine the likelihood of failure-free operation of a particular piece of aviation equipment and to assess the risk that a failure will occur during the period of operation that is of interest to us.

The conclusions obtained statistically are always based on past experience in the operation of aviation equipment, and therefore an assessment of future failures will be valid only if the entire set of operating conditions (operating modes, storage conditions) coincides sufficiently closely with that past experience.

To analyze and assess the recoverability of aviation equipment and its readiness for flight, the same methods are used, drawing on the laws of queuing theory and especially some sections of renewal theory.

  • 2.1. Relative frequency. Relative frequency stability
  • 2.2. Limitations of the classical definition of probability. Statistical probability
  • 2.3. Geometric probabilities
  • 2.4. Probability addition theorem
  • 2.5. Complete Event Group
  • 2.6. Opposite events
  • 2.7. The principle of practical impossibility of unlikely events
  • 2.8. Product of events. Conditional probability
  • 2.9. Probability multiplication theorem
  • 2.10. Independent events. Multiplication theorem for independent events
  • 2.11. The probability of occurrence of at least one event
  • Lecture 3. Consequences of the addition and multiplication theorems
  • 3.1. The addition theorem for the probabilities of joint events
  • 3.2. Total Probability Formula
  • 3.3. The probability of hypotheses. Bayes' formulas
  • 4. Repetition of tests
  • 4.1. Bernoulli's formula
  • 4.2. Limit theorems in the Bernoulli scheme
  • 4.3. Local and integral theorems of Moivre-Laplace
  • 4.4. The probability of deviation of the relative frequency from the constant probability in independent tests
  • 5. Random variables
  • 5.1. The concept of a random variable. The distribution law of a random variable
  • 5.2. Distribution law of a discrete random variable. Distribution polygon
  • 5.3. Binomial distribution
  • 5.4. Poisson distribution
  • 5.5. Geometric distribution
  • 5.6. Hypergeometric distribution
  • 6. Mathematical expectation of a discrete random variable
  • 6.1. Numerical characteristics of discrete random variables
  • 6.2. The mathematical expectation of a discrete random variable
  • 6.3. Probabilistic meaning of mathematical expectation
  • 6.4. Mathematical expectation properties
  • 6.5. The expected number of occurrences of an event in independent trials
  • 7. Dispersion of a discrete random variable
  • 7.1. Feasibility of introducing a numerical characteristic of the scattering of a random variable
  • 7.2. Deviation of a random variable from its mathematical expectation
  • 7.3. Dispersion of a discrete random variable
  • 7.4. Formula for calculating variance
  • 7.5. Dispersion properties
  • 7.6. Dispersion of the number of occurrences of an event in independent trials
  • 7.7. Standard deviation
  • 7.8. Standard deviation of the sum of mutually independent random variables
  • 7.9. Identically distributed mutually independent random variables
  • 7.10. Initial and central theoretical points
  • 8. The law of large numbers
  • 8.1. Preliminary remarks
  • 8.2. Chebyshev's inequality
  • 8.3. Chebyshev's theorem
  • 8.4. The essence of Chebyshev's theorem
  • 8.5. The importance of Chebyshev's theorem for practice
  • 8.6. Bernoulli's theorem
  • 9. The probability distribution function of a random variable
  • 9.1. Definition of the distribution function
  • 9.2. Distribution function properties
  • 9.3. Distribution function plot
  • 10. The density of the probability distribution of a continuous random variable
  • 10.1. Determination of the distribution density
  • 10.2. Probability of hitting a continuous random variable in a given interval
  • 10.3. The law of uniform distribution of probabilities
  • 11. Normal distribution
  • 11.1. Numerical characteristics of continuous random variables
  • 11.2. Normal distribution
  • 11.3. Normal curve
  • 11.4. Influence of the parameters of the normal distribution on the shape of the normal curve
  • 11.5. Probability of hitting a given interval of a normal random variable
  • 11.6. Calculating the probability of a given deviation
  • 11.7. The Three Sigma Rule
  • 11.8. The concept of Lyapunov's theorem. Formulation of the central limit theorem
  • 11.9. Estimation of the deviation of the theoretical distribution from the normal one. Asymmetry and kurtosis
  • 11.10. Function of one random argument and its distribution
  • 11.11. The mathematical expectation of a function of one random argument
  • 11.12. A function of two random arguments. Distribution of the sum of independent terms. Stability of the normal distribution
  • 11.13. Chi-square distribution
  • 11.14. Student's t distribution
  • 11.15. Fisher-Snedecor F distribution
  • 12. Exponential distribution
  • 12.1. Determination of the exponential distribution
  • 12.2. The probability of hitting a given interval of an exponentially distributed random variable
  • 12.3. Numerical characteristics of the exponential distribution
  • 12.4. Reliability function
  • 12.5. The exponential law of reliability
  • 12.6. The characteristic property of the exponential law of reliability
  • 1.2. Areas of Probability Theory

    Probability theory methods are widely used in various branches of natural science and technology:

     reliability theory,

     queuing theory,

     theoretical physics,

     geodesy,

     astronomy,

     the theory of shooting,

     the theory of observation errors,

     the theory of automatic control,

     general communication theory, and many other theoretical and applied sciences.

    The theory of probability also serves to substantiate mathematical and applied statistics, which in turn is used in planning and organizing production, in analyzing technological processes, in preventive and acceptance control of product quality, and for many other purposes.

    In recent years, the methods of probability theory have penetrated more and more into various fields of science and technology, contributing to their progress.

    1.3. Brief historical background

    The first works, in which the basic concepts of the theory of probability were born, were attempts to create a theory of gambling (Cardano, Huygens, Pascal, Fermat and others in the 16th-17th centuries).

    The next stage in the development of probability theory is associated with the name of Jacob Bernoulli (1654 - 1705). The theorem he proved, which later received the name "The Law of Large Numbers", was the first theoretical substantiation of the previously accumulated facts.

    Further successes of the theory of probability are due to Moivre, Laplace, Gauss, Poisson and others. The new, most fruitful period is associated with the names of P. L. Chebyshev (1821 - 1894), A. A. Markov (1856 - 1922) and A. M. Lyapunov (1857 - 1918). During this period, the theory of probability became a harmonious mathematical science. Its subsequent development is primarily due to Russian and Soviet mathematicians (S. N. Bernstein, V. I. Romanovsky, A. N. Kolmogorov, A. Ya. Khinchin, B. V. Gnedenko, N. V. Smirnov and others).

    1.4. Trials and events. Types of events

    The basic concepts of probability theory are the concept of an elementary event and the concept of a space of elementary events. As stated above, an event is called random if, when a certain set of conditions S is realized, it may or may not happen. In what follows, instead of saying "the set of conditions S is realized", we will say briefly "the trial was carried out". Thus, an event will be considered as the result of a trial.

    Definition. A random event is any fact that may or may not occur as a result of an experiment.

    Moreover, this or that result of the experience can be obtained with varying degrees of possibility. That is, in some cases, we can say that one event will almost certainly happen, the other almost never.

    Definition. The space of elementary outcomes Ω is a set containing all possible results of a given random experiment, exactly one of which occurs in the experiment. The elements of this set are called elementary outcomes and are denoted by the letter ω ("omega").

    Events are then defined as subsets of the set Ω. We say that, as a result of an experiment, an event A ⊆ Ω occurred if one of the elementary outcomes belonging to the set A occurred in the experiment.

    For simplicity, we will assume that the number of elementary events is finite. A subset of the space of elementary events is called a random event. As a result of a trial, this event may or may not occur (three points coming up when a die is thrown, a phone call at a given moment, etc.).

    Example 1. The shooter shoots at a target divided into four areas. The shot is a test. Hitting a specific target area is an event.

    Example 2. The urn contains colored balls. One ball is taken at random from the urn. Removing the ball from the urn is a test. The appearance of a ball of a certain color is an event.

    In a mathematical model, one can accept the concept of an event as an initial one, which is not given a definition and which is characterized only by its properties. Based on the real meaning of the concept of an event, various types of events can be defined.

    Definition. A random event is called certain if it necessarily occurs (from one to six points coming up when a die is thrown), and impossible if it obviously cannot occur as a result of the experiment (seven points coming up when a die is thrown). A certain event contains all points of the space of elementary events, and an impossible event contains not a single point of this space.

    Definition. Two random events are called inconsistent if they cannot occur simultaneously with the same test outcome. And in general, any number of events are called inconsistent if the appearance of one of them excludes the appearance of others.

    A classic example of incompatible events is the result of a coin toss: heads coming up excludes tails coming up (in the same experiment).

    Another example: a part is taken at random from a box of parts. The appearance of a standard part excludes the appearance of a non-standard one. The events "a standard part appeared" and "a non-standard part appeared" are incompatible.

    Definition. Several events form a complete group if at least one of them necessarily occurs as a result of the trial.

    In other words, the occurrence of at least one event of the complete group is a reliable event. Of particular interest is the case when the events forming a complete group are pairwise incompatible: then one and only one of these events occurs as a result of the trial. This special case is used below.

    Example. Two cash lottery tickets were purchased. One and only one of the following events will certainly happen: "the winnings fell on the first ticket and not on the second", "the winnings fell on the second ticket and not on the first", "the winnings fell on both tickets", "the winnings fell on neither ticket". These events form a complete group of pairwise incompatible events.

    Example. The shooter fired a shot at the target. One of the following two events is bound to happen: a hit or a miss. These two incompatible events form a complete group.

    Example. If one ball is taken at random from a box containing only red and green balls, then the appearance of a white ball is an impossible event, while the appearance of a red ball and the appearance of a green ball form a complete group of events.
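    The two-ticket lottery example can be checked mechanically: the four listed events partition the set of all outcomes, so they are pairwise incompatible and together form a complete group. A sketch (the encoding of outcomes as win/no-win pairs is my own):

```python
from itertools import product

# Each elementary outcome: (first ticket wins?, second ticket wins?).
outcomes = set(product([True, False], repeat=2))

# The four events from the lottery example, as sets of outcomes.
events = {
    "first only":  {(True, False)},
    "second only": {(False, True)},
    "both":        {(True, True)},
    "neither":     {(False, False)},
}

# Pairwise incompatible: no two events share an outcome.
names = list(events)
pairwise_ok = all(not (events[a] & events[b])
                  for i, a in enumerate(names) for b in names[i + 1:])

# Complete group: together the events exhaust all outcomes.
complete = set().union(*events.values()) == outcomes

print(pairwise_ok, complete)  # True True
```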

    Definition. Events are called equally possible if there is reason to believe that none of them is more possible than the other.

    Example. The appearance of the "coat of arms" and the appearance of the inscription when a coin is tossed are equally possible events. Indeed, it is assumed that the coin is made of a homogeneous material, has a regular cylindrical shape, and that the minting does not affect which side of the coin comes up.

    Example. The appearance of any particular number of points on a thrown die is an event equally possible with the others. Indeed, it is assumed that the die is made of a homogeneous material, has the shape of a regular polyhedron, and that the pips do not affect which face comes up.

    In the above example with balls, the appearance of a red ball and the appearance of a green ball are equally possible events if the box contains equal numbers of red and green balls. If there are more red balls in the box than green ones, then the appearance of a green ball is less likely than the appearance of a red one.
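    Under the classical definition, "less likely" becomes a number: the probability of drawing a given colour is the share of balls of that colour among all equally possible draws. A sketch with made-up counts:

```python
from fractions import Fraction

# Hypothetical contents of the box.
red, green = 7, 3
total = red + green

# Classical probability: favourable cases / all equally possible cases.
p_red = Fraction(red, total)
p_green = Fraction(green, total)

print(p_red, p_green)   # 7/10 3/10
print(p_green < p_red)  # True: green is the less likely colour
```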

    Updated 12.09.2009

    A small excursion into the history of the application of the theory of probability in practice.

    Until the end of the 18th century, applied statistics, without which state accounting and control are inconceivable and which had therefore long existed, remained elementary and purely arithmetic in character. Probability theory remained a purely academic discipline, and only gambling served as its relatively complex "application". The improvement of dice-manufacturing technology in the 18th century stimulated the development of probability theory: the players, without intending to, began to stage reproducible experiments on a massive scale, since the dice had become identical and standard. This gave rise to an example of what would later be called a "statistical experiment" - an experiment that can be repeated an unlimited number of times under the same conditions.

    In the 19th and 20th centuries, probability theory penetrated first into science (astronomy, physics, biology), then into practice (agriculture, industry, medicine), and finally, after the invention of computers, into the daily life of anyone who uses modern means of receiving and transmitting information. Let us follow the main stages.

    1. Astronomy.

    It was for use in astronomy that the famous "least squares method" was developed (Legendre 1805, Gauss 1815). The main problem for which it was originally used was the calculation of the orbits of comets, which had to be done from a small number of observations. It is clearly difficult to determine the type of orbit (ellipse or hyperbola) reliably and to calculate its parameters accurately when the orbit is observed only over a small arc. The method proved effective and versatile, and sparked a heated debate over priority. It came to be used in geodesy and cartography. Now, when the art of manual calculation has been lost, it is difficult to imagine that, when maps of the world's oceans were drawn up in England in the 1880s, the least squares method was used to numerically solve a system of about 6,000 equations with several hundred unknowns.

    2. Physics.

    In the second half of the 19th century, statistical mechanics was developed in the works of Maxwell, Boltzmann and Gibbs; it described the state of rarefied systems containing a huge number of particles (of the order of Avogadro's number). Whereas earlier the concept of the distribution of a random variable had mainly been associated with the distribution of measurement errors, now very different quantities turned out to have distributions: velocity, energy, mean free path.

    3. Biometrics.

    In 1870-1900, the Belgian Quetelet and the Britons Francis Galton and Karl Pearson founded a new scientific direction, biometrics, in which the indefinite variability of living organisms and the inheritance of quantitative traits began, for the first time, to be studied systematically and quantitatively. New concepts were introduced into scientific circulation: regression and correlation.

    So, until the beginning of the 20th century, the main applications of the theory of probability were associated with scientific research. The introduction into practice - agriculture, industry, medicine - took place in the 20th century.

    4. Agriculture.

    At the beginning of the 20th century in England, the task was set of quantitatively comparing the effectiveness of various agricultural methods. To solve this problem, the theory of experimental design and the analysis of variance were developed. The main merit in developing this purely practical use of statistics belongs to Sir Ronald Fisher, an astronomer (!) by education, and later a farmer, statistician, geneticist, and president of the Royal Society. Modern mathematical statistics, suitable for wide practical application, was developed in England (Karl Pearson, Student, Fisher). Student was the first to solve the problem of estimating an unknown distribution parameter without using the Bayesian approach.

    5. Industry. The introduction of methods of statistical control in production (Shewhart control charts). The reduction of the number of product quality tests required. Mathematical methods became so important that they began to be classified: a book describing a new technique that made it possible to reduce the number of tests (Wald's "Sequential Analysis") was published only after the end of the Second World War, in 1947.

    6. Medicine. The widespread use of statistical methods in medicine began relatively recently (in the second half of the 20th century). The development of effective methods of treatment (antibiotics, insulin, effective anesthesia, artificial circulation) required reliable methods for assessing their effectiveness. A new concept, "evidence-based medicine", emerged. A more formal, quantitative approach to the treatment of many diseases began to develop: the introduction of protocols and guidelines.

    Since the mid-1980s, a new and important factor has emerged that has revolutionized all applications of probability theory: the widespread availability of fast and affordable computers. One can feel the enormity of this revolution by considering that a single (!) modern personal computer surpasses in speed and memory all (!) the computers of the USSR and the USA available by 1968, the time when projects involving the construction of nuclear power plants, flights to the Moon, and the creation of a thermonuclear bomb were already being implemented. Now, by direct experimentation, one can obtain results that were previously unattainable.

    7. Bioinformatics. Since the 1980s, the number of known protein and nucleic acid sequences has increased rapidly. The volume of accumulated information is such that only computer analysis of this data can solve the problem of information extraction.

    8. Pattern recognition.


    Contents
    Introduction
    1. History of origin
    2. The emergence of the classical definition of probability
    3. The subject of probability theory
    4. Basic concepts of probability theory
    5. Application of probability theory in the modern world
    6. Probability and air transport
    Conclusion
    References


    Introduction

    Chance and randomness: we meet them every day - a chance meeting, a chance breakdown, a chance find, a chance mistake. This list can be continued endlessly. It would seem that there is no place here for mathematics, but even here science has discovered interesting patterns, and they allow a person to feel confident when faced with random events.
    Probability theory can be defined as the branch of mathematics that studies the patterns inherent in random events. Its methods are widely used in the mathematical processing of measurement results, as well as in many problems of economics, statistics, insurance, and queuing theory. Hence it is not difficult to guess that probability theory also finds very wide application in aviation.
    My future dissertation work will be related to satellite navigation. Not only in satellite navigation, but also in traditional means of navigation, the theory of probability has received very wide application, because most of the operational and technical characteristics of radio equipment are quantitatively expressed through probability.


    1. History of origin

    It is now difficult to establish who first raised the question, albeit in an imperfect form, of quantitatively measuring the possibility of the occurrence of a random event. What is clear is that a more or less satisfactory answer required a long time and the considerable efforts of several generations of outstanding researchers. For a long period, researchers limited themselves to considering various kinds of games, especially dice games, since their study allows one to restrict oneself to simple and transparent mathematical models. However, it should be noted that many understood very well what was later formulated by Christian Huygens: "...I believe that upon careful study of the subject, the reader will notice that he is dealing not only with a game, but that the foundations of a very interesting and deep theory are being laid here."
    We will see that in the further progress of the theory of probability, deep considerations of both a natural scientific and a general philosophical character played an important role. This trend continues today: we constantly observe how practical issues - scientific, industrial, defense - pose new problems to the theory of probability and lead to the need to expand the arsenal of ideas, concepts and research methods.
    The development of the theory of probability, and with it the development of the concept of probability, can be divided into the following stages.
    1. Prehistory of the theory of probability. During this period, the beginning of which is lost in the centuries, elementary problems were posed and solved, which would later be referred to the theory of probability. No special methods arise during this period. This period ends with the works of Cardano, Pacioli, Tartaglia and others.
    We meet probabilistic concepts already in antiquity. Democritus, Lucretius Carus and other ancient scientists and thinkers offer deep conjectures about the structure of matter, with the disorderly movement of small particles (molecules), reasoning about equally possible outcomes, and so on. Even in antiquity, attempts were made to collect and analyze certain statistical materials; all this (as well as other manifestations of attention to random phenomena) created the basis for the development of new scientific concepts, including the concept of probability. But ancient science did not go so far as to isolate this concept.
    In philosophy, the question of the accidental, the necessary and the possible has always been one of the main ones. The philosophical development of these problems also influenced the formation of the concept of probability. In the Middle Ages as a whole, we find only scattered attempts at probabilistic reasoning.
    In the works of Pacioli, Tartaglia and Cardano, an attempt is already made to highlight a new concept - the ratio of odds - when solving a number of specific problems, primarily combinatorial ones.
    2. The emergence of probability theory as a science. By the middle of the 17th century, probabilistic questions and problems arising in statistical practice, in the practice of insurance companies, in the processing of observation results and in other areas had attracted the attention of scientists, since they had become topical. This period is primarily associated with the names of Pascal, Fermat and Huygens. During this period, specific concepts were developed, such as mathematical expectation and probability (as a ratio of chances), and the first properties of probability were established and used: the addition and multiplication theorems. At this time, probability theory found application in the insurance business, in demography, and in the assessment of observation errors, making wide use of the concept of probability.
    3. The next period begins with the appearance of Bernoulli's work "The Art of Conjecturing" (1713), in which the first limit theorem, the simplest case of the law of large numbers, was proved. The works of de Moivre, Laplace, Gauss and others belong to this period, which lasted until the middle of the 19th century. Limit theorems were at the center of attention at this time. Probability theory began to be widely applied in various fields of natural science, and although various concepts of probability (geometric probability, statistical probability) began to be used, the classical definition of probability occupied a dominant position.
    4. The next period in the development of probability theory is associated primarily with the St. Petersburg mathematical school. In two centuries of development, the main achievements of probability theory had been its limit theorems, but the limits of their applicability and the possibilities for further generalization had not been clarified. Along with the successes, significant shortcomings in its justification had been identified, expressed in an insufficiently clear understanding of probability itself. A situation had arisen in which the further development of the theory required a clarification of its basic propositions and a strengthening of its research methods.
    This was done by the Russian mathematical school headed by Chebyshev. Among its largest representatives are Markov and Lyapunov.
    During this period, probability theory came to include estimates of the accuracy of limit-theorem approximations, as well as an extension of the class of random variables obeying limit theorems. At this time, certain dependent random variables (Markov chains) began to be considered. New tools emerged, such as the theory of characteristic functions and the theory of moments. Thanks to this, probability theory became widespread in the natural sciences, primarily in physics: statistical physics was created during this period. But this introduction of probabilistic methods and concepts into physics proceeded rather far from the achievements of probability theory itself; the probabilities used in physics were not exactly the same as in mathematics. The existing concepts of probability did not satisfy the needs of the natural sciences, and as a result various interpretations of probability began to arise, which were difficult to reduce to a single definition.
    The development of probability theory at the beginning of the 20th century led to the need to revise and clarify its logical foundations, primarily the concept of probability itself. This was required by the development of physics and its use of probabilistic concepts and the apparatus of probability theory, and also by a growing dissatisfaction with the classical, Laplace-type justification.
    5. The modern period of development of the theory of probability began with the establishment of axiomatics (axiomatics is a system of axioms of any science). This was primarily required by practice, since for the successful application of the theory of probability in physics, biology and other fields of science, as well as in technology and military affairs, it was necessary to clarify and bring into a coherent system its basic concepts. Thanks to axiomatics, probability theory has become an abstract-deductive mathematical discipline closely related to set theory. This led to the breadth of research in the theory of probability.
    The first works of this period are associated with the names of Bernstein, Mises and Borel. The final establishment of the axiomatics took place in the 1930s. An analysis of the trends in the development of probability theory allowed Kolmogorov to create a generally accepted axiomatics. In probabilistic studies, analogies with set theory began to play an essential role, and the ideas of the metric theory of functions penetrated deeper and deeper into probability theory. The need arose for an axiomatization of probability theory based on set-theoretic concepts. This axiomatics was created by Kolmogorov, and it ensured that probability theory finally established itself as a full-fledged mathematical science.
    During this period, the concept of probability penetrates into almost all spheres of human activity. Various definitions of probability arise. The variety of definitions of basic concepts is an essential feature of modern science. Modern definitions in science are a statement of concepts, points of view, which can be many for any fundamental concept, and they all reflect some essential aspect of the concept being defined. This also applies to the concept of probability.


    2. The emergence of the classical definition of probability

    The concept of probability plays a huge role in modern science, and thus is an essential element of the modern worldview in general, modern philosophy. All this gives rise to attention and interest in the development of the concept of probability, which is closely related to the general movement of science. The concepts of probability were significantly influenced by the achievements of many sciences, but this concept, in turn, forced them to refine their approach to the study of the world.
    The formation of basic mathematical concepts represents an important stage in the process of mathematical development. Until the end of the 17th century, science did not arrive at the classical definition of probability, but continued to operate only with the number of chances favorable to a particular event of interest. Individual attempts, noted in Cardano and later researchers, did not lead to a clear understanding of the significance of this innovation and remained foreign bodies in completed works. However, by the 1730s the classical concept of probability had come into general use, and no scientist of those years could any longer limit himself to counting the number of chances favorable to an event. The introduction of the classical definition of probability did not occur as the result of a single act, but took a long period of time, during which the formulation was continuously improved and the transition was made from particular problems to the general case.
    A careful study shows that even in C. Huygens's book "On Calculations in Gambling" (1657) there is no concept of probability as a number between 0 and 1 equal to the ratio of the number of chances favorable to the event to the number of all possible ones. In J. Bernoulli's treatise "The Art of Conjecturing" (1713), however, this concept was introduced, albeit in a far from perfect form, and, which is especially important, was widely used.
    A. de Moivre took the classical definition of probability given by Bernoulli and defined the probability of an event almost exactly as we do now. He wrote: "Therefore, we construct a fraction whose numerator is the number of occurrences of the event and whose denominator is the number of all cases in which it may or may not appear; such a fraction will express the actual probability of its occurrence."
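    De Moivre's fraction can be computed directly for any finite experiment with equally possible cases. As an illustration (my example, not de Moivre's), here is the probability that the sum of two dice equals 7:

```python
from fractions import Fraction
from itertools import product

# All equally possible cases when two dice are thrown.
cases = list(product(range(1, 7), repeat=2))

# De Moivre's fraction for the event "the sum of points is 7":
# numerator = favourable cases, denominator = all cases.
favourable = [c for c in cases if sum(c) == 7]
p = Fraction(len(favourable), len(cases))

print(len(favourable), len(cases), p)  # 6 36 1/6
```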


    3. The subject of probability theory
    The events (phenomena) we observe can be divided into the following three types: reliable, impossible and random.
    An event is called reliable if it necessarily occurs whenever a certain set of conditions S is fulfilled. For example, if a vessel contains water at normal atmospheric pressure and a temperature of 20 °C, then the event "the water in the vessel is in the liquid state" is reliable. In this example, the given atmospheric pressure and water temperature constitute the set of conditions S.
    An event is called impossible if it will not happen if the set of conditions S is fulfilled. For example, the event “water in the vessel is in a solid state” will certainly not happen if the set of conditions of the previous example is fulfilled.
    A random event is an event that, when the set of conditions S is fulfilled, may either happen or not. For example, if a coin is tossed, it can fall so that either the coat of arms or the inscription is on top. Therefore the event "the coat of arms came up when the coin was tossed" is random. Each random event, in particular the fall of the coat of arms, is the consequence of very many random causes (in our example: the force with which the coin is thrown, the shape of the coin, and many others). It is impossible to take into account the influence of all these causes on the result, since their number is very large and the laws of their action are unknown. Therefore, probability theory does not set itself the task of predicting whether a single event will occur or not - it simply cannot do so.
    The situation is different if random events are considered that can be observed many times under the same conditions S, i.e., if we are talking about mass homogeneous random events. It turns out that a sufficiently large number of homogeneous random events, regardless of their specific nature, obeys certain laws, namely, probabilistic laws. The establishment of these regularities is dealt with by the theory of probability.
    So, the subject of probability theory is the study of the probabilistic laws of mass homogeneous random events.


    4. Basic concepts of probability theory

    Each science that develops a general theory of any range of phenomena contains a number of basic concepts on which it is based. Such basic concepts also exist in probability theory. They are: an event, the probability of an event, the frequency of an event or a statistical probability and a random variable.
    Random events are events that may or may not occur when a set of conditions associated with the possibility of their occurrence is fulfilled.
    Random events are denoted by the letters A, B, C, .... Each realization of the set of conditions under consideration is called a trial. The number of trials can grow indefinitely. The ratio of the number m of occurrences of a given random event A in a given series of trials to the total number n of trials in the series is called the frequency of occurrence of event A in the given series (or simply the frequency of event A) and is denoted by P*(A). Thus, P*(A) = m/n.
    The frequency of a random event always lies between zero and one: 0 ≤ P*(A) ≤ 1.
    Mass random events have the property of frequency stability: the values of the frequency of a given random event observed in different series of homogeneous trials (with a sufficiently large number of trials in each series) vary from series to series within fairly narrow limits.
    It is this circumstance that makes it possible to apply mathematical methods to the study of random events, assigning to each mass random event its probability, taken to be the (generally speaking, unknown in advance) number around which the observed frequency of the event fluctuates.
    The probability of a random event A is denoted by P(A). The probability of a random event, like its frequency, lies between zero and one: 0 ≤ P(A) ≤ 1.
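    Frequency stability is easy to observe in simulation: repeated series of coin tosses give frequencies P*(A) = m/n that cluster near the probability P(A) = 1/2. A minimal sketch (the seed is arbitrary, chosen only for reproducibility):

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility

def frequency(n):
    """Frequency P*(A) = m/n of heads in one series of n coin tosses."""
    m = sum(random.random() < 0.5 for _ in range(n))
    return m / n

# Several long series: the observed frequencies vary only within
# narrow limits around P(A) = 0.5.
freqs = [frequency(10_000) for _ in range(5)]
print([round(f, 3) for f in freqs])
```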

    A random variable is a quantity that characterizes the result of an operation and that can take on different values in different operations, no matter how homogeneous the conditions of their realization.

    5. Application of the theory of probability in the modern world
    We should start rightfully with statistical physics. Modern natural science proceeds from the idea that all natural phenomena are of a statistical nature and laws can be accurately formulated only in terms of probability theory. Statistical physics became the basis of all modern physics, and the theory of probability became its mathematical apparatus. In statistical physics, problems are considered that describe the phenomena that determine the behavior of a large number of particles. Statistical physics is used very successfully in various branches of physics. In molecular physics, it is used to explain thermal phenomena, in electromagnetism - the dielectric, conductive and magnetic properties of bodies, in optics it made it possible to create a theory of thermal radiation, molecular scattering of light. In recent years, the range of applications of statistical physics has continued to expand.
    Statistical concepts made it possible to set up the mathematical study of the phenomena of nuclear physics quickly. The emergence of radiophysics and the study of radio signal transmission not only increased the significance of statistical concepts, but also led to progress in mathematical science itself: the emergence of information theory.
    Understanding the nature of chemical reactions, dynamic equilibrium is also impossible without statistical concepts. All physical chemistry, its mathematical apparatus and the models it offers are statistical.
    The processing of observation results, which are always accompanied both by random observation errors and by random changes in the experimental conditions beyond the observer's control, led researchers back in the 19th century to create a theory of observation errors, and this theory rests entirely on statistical concepts.
    Astronomy uses a statistical apparatus in a number of its sections. Stellar astronomy, the study of the distribution of matter in space, the study of cosmic particle fluxes, the distribution of sunspots (centers of solar activity) on the surface of the sun and much more requires the use of statistical representations.
    Biologists have noticed that the spread in the sizes of the organs of living beings of the same species fits perfectly into general probabilistic laws. The famous laws of Mendel, which laid the foundation of modern genetics, require probabilistic and statistical reasoning. The study of such significant problems of biology as the transmission of nervous excitation, the mechanism of memory, the transmission of hereditary properties, the distribution of animals over a territory, and the relationship between predator and prey requires a good knowledge of probability theory and mathematical statistics.
    The humanities combine disciplines of very diverse natures, from linguistics and literature to psychology and economics. Statistical methods are increasingly being drawn into historical research, especially archeology. A statistical approach is used to decipher inscriptions in the languages of ancient peoples; the ideas that guided J. Champollion in deciphering ancient Egyptian hieroglyphic writing are basically statistical. The art of encryption and decryption is based on the statistical laws of language. Other directions concern the study of the repetition of words and letters, the distribution of stress in words, and the calculation of the informativeness of the language of particular writers and poets. Statistical methods are used to establish authorship and expose literary forgeries: for example, the authorship of M. A. Sholokhov's novel "Quiet Don" was established using probabilistic and statistical methods. Revealing the frequency of the sounds of a language in oral and written speech makes it possible to pose the question of the optimal coding of the letters of that language for transmitting information. The frequency of use of letters determines the number of pieces of each letter in a typesetter's case. The arrangement of letters on a typewriter carriage and on a computer keyboard is determined by a statistical study of the frequency of letter combinations in the given language.
    Many problems of pedagogy and psychology also require the involvement of a probabilistic-statistical apparatus. Economic issues cannot but be of interest to society, since all aspects of its development are associated with it. Without statistical analysis, it is impossible to predict changes in the size of the population, its needs, the nature of employment, changes in mass demand, and without this it is impossible to plan economic activities.
    Issues of product quality control are directly related to probabilistic and statistical methods. Often, manufacturing a product takes incomparably less time than checking its quality. For this reason, there is no way to check the quality of each product. Therefore, one has to judge the quality of the batch by a relatively small part of the sample. Statistical methods are also used when testing the quality of products leads to their damage or death.
    Agriculture-related issues have long been resolved with extensive use of statistical methods. Breeding new breeds of animals, new varieties of plants, comparing productivity - this is not a complete list of problems solved by statistical methods.
    It can be said without exaggeration that statistical methods permeate our whole life today. In the well-known poem of the materialist poet Lucretius Carus, "On the Nature of Things", there is a vivid and poetic description of the Brownian motion of dust particles:
    "Look here: whenever sunlight penetrates
    Into our homes and cuts through the darkness with its rays,
    You will see many tiny bodies flickering in the void,
    Rushing to and fro in the radiant glow of the light;
    As if in eternal struggle, they clash in battles and skirmishes,
    Suddenly dashing into combat by squadrons, knowing no rest,
    Now converging, now continually flying apart again.
    From this you may understand how relentlessly
    The first-beginnings of things are tossed in the immense void.
    Thus do small things help to build an understanding of great ones,
    Outlining the paths by which they may be grasped.
    Moreover, it is worth your while to attend
    To the turmoil of the bodies flashing in the sunlight,
    For from it you will learn of matter, and of its motion too."

    The first opportunity to study experimentally the relationship between the random motion of individual particles and the regular motion of their large aggregates arose in 1827, when the botanist R. Brown discovered the phenomenon now named after him, Brownian motion. Observing pollen suspended in water under a microscope, Brown found to his surprise that the suspended particles are in continuous, disorderly motion that cannot be stopped even by the most careful efforts to eliminate external influences. It was soon discovered that this is a general property of any sufficiently small particles suspended in a liquid. Brownian motion is a classic example of a random process.
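    The disorderly motion Brown observed can be imitated with a simple random-walk model. The following is a sketch, not a physical simulation (the step model and all parameters are illustrative assumptions): each step moves the particle a fixed distance in a uniformly random direction, and the mean squared displacement grows linearly with the number of steps, which is the statistical signature of Brownian motion.

```python
import math
import random

def random_walk_2d(steps, step_size=1.0):
    """One 2-D random walk: each step moves the particle a fixed
    distance in a uniformly random direction."""
    x = y = 0.0
    for _ in range(steps):
        angle = random.uniform(0.0, 2.0 * math.pi)
        x += step_size * math.cos(angle)
        y += step_size * math.sin(angle)
    return x, y

# Average the squared displacement over many independent walks:
# it stays close to the number of steps (here 100), growing
# linearly with time rather than with the distance a straight-line
# mover would cover.
random.seed(1)
n_walks, steps = 2000, 100
msd = sum(x * x + y * y
          for x, y in (random_walk_2d(steps) for _ in range(n_walks))) / n_walks
print(f"mean squared displacement after {steps} steps: {msd:.1f} (theory: {steps})")
```

    A single walk tells us almost nothing; only the statistics of many walks reveal the regular law, which is precisely the relationship between individual randomness and aggregate regularity discussed above.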


    6. Probability and air transport
    In the previous chapter we examined applications of probability theory and statistics in various fields of science. In this chapter I would like to give examples of the application of probability theory in air transport.
    Air transport is a concept that covers both the aircraft themselves and the infrastructure needed to operate them: airports, dispatch services, and technical services. A flight is the result of the joint work of many airport services, which draw on various fields of science, and probability theory has a place in almost all of them. I would like to give an example from navigation, where probability theory is widely used.
    With the development of satellite navigation, landing, and communication systems, new reliability indicators have been introduced: the integrity, continuity, and availability of the system. All of these indicators are quantified in terms of probability.
    Integrity is the degree of confidence in the information received from a radio-engineering system and subsequently used by the aircraft. The integrity risk equals the product of the probability of a system failure and the probability that the failure goes undetected, and it must not exceed 10⁻⁷ per flight hour.
    Continuity of service is the ability of the complete system to perform its function without interruption of the operating mode during the planned operation. The probability of a loss of continuity must not exceed 10⁻⁴.
    Availability is the ability of the system to perform its functions at the start of an operation. It must be at least 0.99.
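    The integrity requirement above reduces to a one-line product of probabilities. Here is a sketch with invented example figures (the 10⁻⁴ and 10⁻³ values are assumptions chosen purely to illustrate how the product is compared with the 10⁻⁷ limit, not data for any real system):

```python
def integrity_risk(p_failure, p_undetected):
    """Integrity risk: the probability that the system fails AND the
    failure goes undetected (the two events are assumed independent,
    as in the product formula quoted in the text)."""
    return p_failure * p_undetected

# Hypothetical illustrative figures: a failure probability of 1e-4
# per flight hour, combined with a 1e-3 chance that the monitor
# misses the failure, gives a risk of 1e-7 per flight hour --
# right at the quoted integrity limit.
risk = integrity_risk(1e-4, 1e-3)
print(f"integrity risk = {risk:.1e} per flight hour")
```

    The product form shows why such small totals are achievable in practice: neither factor has to be tiny on its own, since a moderately reliable system paired with a moderately reliable monitor multiplies out to a very small combined risk.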
    Conclusion
    Probabilistic ideas today stimulate the development of the entire complex of knowledge, from the sciences of inanimate nature to the sciences of society. The progress of modern natural science is inseparable from the use and development of probabilistic ideas and methods. In our time it is difficult to name any area of research where probabilistic methods are not applied.


    Bibliography
    1. Wentzel E.S. Probability Theory: Textbook for Universities. Moscow: Higher School, 2006.
    2. Gmurman V.E. Probability Theory and Mathematical Statistics: Textbook for Universities. Moscow: Higher School, 1998.
    3. Gnedenko B.V. An Essay on the Theory of Probability. Moscow: Editorial URSS, 2009.
    4. Maystrov L.Ye. The Development of Probability Theory. Moscow: Nauka, 1980.
    5. Maystrov L.Ye. Probability Theory: A Historical Sketch. Moscow: Nauka, 1967.
    6. Sobolev E.V. Organization of Radio Technical Support of Flights (Part 1). St. Petersburg, 2008.
    7. http://verojatnost.pavlovkashkola.edusite.ru/p8aa1.html
    8. http://shpora.net/index.cgi?act=view&id=4966
