
Simulation of a system of random variables

Lecture



In practice one often encounters systems of random variables, that is, two (or more) different random variables X, Y (and others) that depend on each other. If the variable X has taken some random value, then Y also takes a random value, but one that reflects the value X has already assumed.

For example, if X takes a large value, then Y will also tend to take a fairly large value (when the correlation is positive). It is very likely that a heavy person is also tall. This is NOT a strict rule but a correlation between random variables: there are, although rarely, heavy people of small stature, and light people who are tall. Still, most heavy people are tall, and most short people weigh little.

By definition, if the random variables are independent, then f(x) = f(x1) · f(x2) · … · f(xn), where:

xi is an independent random variable;

f(xi) is the probability density of the independent random variable xi;

f(x) is the probability density of the vector x of independent random variables x1, x2, …, xn.
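
The product rule above is easy to sketch numerically. Below is a minimal illustration in Python, assuming standard normal marginals; the function names are ours, not from the lecture:

```python
import math

def normal_pdf(x, m=0.0, sigma=1.0):
    """Density of a normal random variable with mean m and std sigma."""
    return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def joint_pdf_independent(xs):
    """f(x) = f(x1) * f(x2) * ... * f(xn) for independent components."""
    p = 1.0
    for x in xs:
        p *= normal_pdf(x)
    return p
```

For two independent standard normal variables, for instance, the joint density at the origin is the product of the two marginal densities, 1/(2π).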

If the random variables are dependent, then f(x) = f(x1) · f(x2 | x1) · f(x3 | x2, x1) · … · f(xn | xn−1, xn−2, …, x2, x1), where:

xj | xj−1, …, x1 denotes the dependent random variable: the occurrence of xj given that xj−1, …, x1 have already occurred;

f(xj | xj−1, …, x1) is the conditional probability density of xj given xj−1, …, x1;

f(x) is the probability density of the vector x of dependent random variables.

Let there be, for example, two dependent random variables X and Y, both distributed according to the normal law. X has mean mx and standard deviation σx; Y has mean my and standard deviation σy. The correlation coefficient q shows how closely X and Y are related. If the correlation coefficient equals one, the dependence between X and Y is one-to-one: to one value of X corresponds exactly one value of Y (see fig. 26.1).

Fig. 26.1. Type of dependence of two random variables
with a positive correlation coefficient (q = 1)
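
The correlation coefficient q can be estimated from observed (x, y) pairs. A minimal sketch of the standard sample formula follows; it is our illustration, not part of the lecture:

```python
import math

def sample_correlation(xs, ys):
    """Sample correlation coefficient q of paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)
```

For perfectly linearly dependent data the function returns q = 1 (or q = −1 for a negative slope), matching the one-to-one dependence of fig. 26.1.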

When q is close to one, the picture looks like fig. 26.2: one value of X may correspond to one of several values of Y, chosen at random; in this case X and Y are less strongly correlated, less dependent on each other.

Fig. 26.2. Type of dependence of two random variables
with a positive correlation coefficient (0 < q < 1)

Finally, when the correlation coefficient tends to zero, any value of X can correspond to any value of Y; that is, X and Y do not depend (or almost do not depend) on each other and do not correlate (see fig. 26.3).

Fig. 26.3. Type of dependence of two random variables
with a correlation coefficient close to zero (q → 0)

On all the graphs above the correlation was positive. If q < 0, the graphs look like those shown in fig. 26.4.

Fig. 26.4. Type of dependence of two random variables
with a negative correlation coefficient
a) q = –1; b) –1 < q < 0; c) q → 0

In reality, the random variables X and Y cannot take any value with equal probability, as fig. 26.2 might suggest. For example, in a group of students there are no people of extremely small or extremely large height; mostly, people have a certain average height with some variation around it. Therefore, in some parts of the X axis the events lie more densely, in others more sparsely (the density of random events, the number of points on the graphs, is greatest near mx). The same holds for Y. Fig. 26.2 can therefore be drawn more accurately, as shown in fig. 26.5.

Fig. 26.5. Illustration of a system of random dependent variables

Take, for example, the normal distribution, as the most common one. The expectation marks the most probable events; near it there are more events and the graph is denser. A positive correlation (see fig. 26.2) means that large values of the random variable X cause large values of Y to be generated. A negative correlation (see fig. 26.4) means that large values of X stimulate the generation of smaller values of Y. A correlation equal or close to zero (see fig. 26.3) means that the value of the random variable X is in no way connected with any particular value of the random variable Y. This is easy to see if we first imagine the distributions f(X) and f(Y) separately, and then link them into a system (see fig. 26.6, fig. 26.7 and fig. 26.8).

Fig. 26.6. Generation of a system of random variables
with a positive correlation coefficient
Fig. 26.7. Generation of a system of random variables
with a negative correlation coefficient
Fig. 26.8.

An example of an algorithm for modeling two dependent random variables X and Y. Condition: assume that X and Y are distributed according to the normal law with parameters mx, σx and my, σy respectively. The correlation coefficient q of the two random variables is given; that is, X and Y depend on each other, and Y is not entirely random.

Then a possible algorithm for implementing the model is as follows.

  1. Draw six random numbers b1, b2, b3, b4, b5, b6, uniformly distributed on the interval [0; 1], and find their sum S: S = b1 + b2 + b3 + b4 + b5 + b6. A normally distributed random number x is then found by the formula: x = sqrt(2) · σx · (S − 3) + mx (see section 25).
  2. By the formula m(y|x) = my + q · σy / σx · (x − mx), find the conditional expectation m(y|x) (the notation y|x means that y will take random values given that x has already taken some specific value).
  3. By the formula σ(y|x) = σy · sqrt(1 − q²), find the conditional standard deviation σ(y|x).
  4. Draw six random numbers r1, r2, r3, r4, r5, r6, uniformly distributed on the interval [0; 1], and find their sum k: k = r1 + r2 + r3 + r4 + r5 + r6. A normally distributed random number y is then found by the formula: y = sqrt(2) · σ(y|x) · (k − 3) + m(y|x).
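
The four steps above can be sketched in Python as follows. This is a minimal sketch of the lecture's algorithm; the function names are ours:

```python
import math
import random

def normal_from_six(m, sigma, rng=random):
    """Steps 1 and 4: the sum S of six U[0; 1] numbers has mean 3 and
    variance 6/12 = 0.5, so sqrt(2)*sigma*(S - 3) + m is approximately
    normal with mean m and standard deviation sigma."""
    s = sum(rng.random() for _ in range(6))
    return math.sqrt(2) * sigma * (s - 3) + m

def dependent_pair(m_x, s_x, m_y, s_y, q, rng=random):
    """Generate one (x, y) pair of dependent normal random variables
    with given means, standard deviations and correlation coefficient q."""
    x = normal_from_six(m_x, s_x, rng)          # step 1
    m_yx = m_y + q * s_y / s_x * (x - m_x)      # step 2: conditional mean
    s_yx = s_y * math.sqrt(1 - q * q)           # step 3: conditional std
    y = normal_from_six(m_yx, s_yx, rng)        # step 4
    return x, y
```

Generating many pairs and computing their sample correlation should give a value close to the chosen q, which is a simple way to check the implementation.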

