Lecture. The central limit theorem for identically distributed terms
Different forms of the central limit theorem differ from one another in the conditions imposed on the distributions of the random terms forming the sum. Here we formulate and prove one of the simplest forms of the central limit theorem, which refers to the case of identically distributed terms.
Theorem. If X_1, X_2, ..., X_n are independent random variables having the same distribution law with expectation m and variance D, then as n increases without limit, the distribution law of the sum

Y_n = \sum_{k=1}^{n} X_k    (13.8.1)

approaches the normal law without limit.
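Before turning to the proof, here is a numerical illustration of the statement (this sketch is an addition to the text; NumPy is assumed, and the exponential distribution, the number of terms and the sample size are arbitrary choices): quantiles of the normalized sum are compared with those of the standard normal law.

```python
import numpy as np

# The normalized sum of n i.i.d. terms (here exponential with m = 1, D = 1)
# should be close to N(0, 1) for large n.
rng = np.random.default_rng(0)
n, n_samples = 100, 50_000            # number of terms and of repetitions (arbitrary)
m, D = 1.0, 1.0                       # expectation and variance of one exponential term

X = rng.exponential(scale=1.0, size=(n_samples, n))
Y = X.sum(axis=1)                     # the sum Y_n of formula (13.8.1)
Z = (Y - n * m) / np.sqrt(n * D)      # centered and normalized sum

# Compare a few empirical quantiles with those of the standard normal law.
for p, q_normal in [(0.05, -1.645), (0.50, 0.000), (0.95, 1.645)]:
    print(f"p = {p:.2f}: empirical {np.quantile(Z, p):+.3f}, normal {q_normal:+.3f}")
```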
Proof.
We will carry out the proof for the case of continuous random variables X_1, X_2, ..., X_n (for discrete variables it is analogous).
According to the second property of characteristic functions, proved in the previous section, the characteristic function of the quantity Y_n is the product of the characteristic functions of the terms. The random variables X_1, X_2, ..., X_n have the same distribution law with density f(x) and therefore the same characteristic function

g_x(t) = \int_{-\infty}^{+\infty} e^{itx} f(x)\,dx.    (13.8.2)
Therefore, the characteristic function of the random variable Y_n will be

g_{y_n}(t) = \left[ g_x(t) \right]^n.    (13.8.3)
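Formula (13.8.3) can be checked empirically (an added aside; the uniform distribution and the value of t below are arbitrary choices): the characteristic function of the sum, estimated as the sample mean of e^{itY}, nearly coincides with the n-th power of the characteristic function of a single term.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_samples, t = 5, 200_000, 0.7          # arbitrary choices
X = rng.uniform(-1.0, 1.0, size=(n_samples, n))
Y = X.sum(axis=1)

g_x = np.mean(np.exp(1j * t * X[:, 0]))    # estimate of g_x(t) for one term
g_y = np.mean(np.exp(1j * t * Y))          # estimate of g_{y_n}(t) for the sum Y_n
print(g_y, g_x**n)                         # the two values should nearly coincide
```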
Let us investigate the function g_x(t) in more detail. Represent it in a neighborhood of the point t = 0 by the Maclaurin formula with three terms:

g_x(t) = g_x(0) + g_x'(0)\,t + \left[ \frac{g_x''(0)}{2} + \alpha(t) \right] t^2,    (13.8.4)

where α(t) → 0 as t → 0.
Let us find the values g_x(0), g_x'(0), g_x''(0). Setting t = 0 in formula (13.8.2), we have:

g_x(0) = \int_{-\infty}^{+\infty} f(x)\,dx = 1.    (13.8.5)
Differentiating (13.8.2) with respect to t:

g_x'(t) = i \int_{-\infty}^{+\infty} x\,e^{itx} f(x)\,dx.    (13.8.6)
Setting t = 0 in (13.8.6), we get:

g_x'(0) = i \int_{-\infty}^{+\infty} x\,f(x)\,dx = i\,m.    (13.8.7)
Obviously, without loss of generality, we can put m = 0 (for this it is enough to move the origin to the point m). Then

g_x'(0) = 0.
Let us differentiate (13.8.6) once more:

g_x''(t) = i^2 \int_{-\infty}^{+\infty} x^2 e^{itx} f(x)\,dx = -\int_{-\infty}^{+\infty} x^2 e^{itx} f(x)\,dx,

whence

g_x''(0) = -\int_{-\infty}^{+\infty} x^2 f(x)\,dx.    (13.8.8)
With m = 0, the integral in expression (13.8.8) is nothing other than the variance of the quantity X with density f(x); consequently

g_x''(0) = -D.    (13.8.9)
Substituting g_x(0) = 1, g_x'(0) = 0 and g_x''(0) = -D into (13.8.4), we get:

g_x(t) = 1 + \left[ -\frac{D}{2} + \alpha(t) \right] t^2.    (13.8.10)
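For a concrete density, expansion (13.8.10) is easy to see in numbers (an added check, not in the original): for the uniform density on [-1, 1] one has m = 0, D = 1/3 and g_x(t) = sin(t)/t, so g_x(t) and 1 - Dt²/2 differ only by terms of order t⁴.

```python
import numpy as np

# Expansion (13.8.10) for the uniform density on [-1, 1]:
# m = 0, D = 1/3 and the characteristic function is g_x(t) = sin(t)/t.
D = 1.0 / 3.0
for t in [0.5, 0.1, 0.01]:
    g_exact = np.sin(t) / t
    g_approx = 1.0 - D * t**2 / 2.0
    print(t, g_exact, g_approx, abs(g_exact - g_approx))   # gap shrinks like t**4
```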
Let us turn to the random variable Y_n. We want to prove that its distribution law approaches the normal law as n increases. To do this, we pass from Y_n to another ("normalized") random variable

Z_n = \frac{Y_n}{\sqrt{nD}}.    (13.8.11)
This quantity is convenient because its variance does not depend on n and is equal to one for any n. It is easy to verify this by considering Z_n as a linear function of the independent random variables X_1, X_2, ..., X_n, each of which has variance D (a one-line verification is given below). If we prove that the distribution law of Z_n approaches the normal law, the same will obviously be true for the quantity Y_n, which is connected with Z_n by the linear dependence (13.8.11).
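Spelled out (this short verification is an addition to the original text), using the fact that the variance of a sum of independent random variables equals the sum of their variances:

D[Z_n] = \frac{1}{nD}\,D[Y_n] = \frac{1}{nD} \sum_{k=1}^{n} D[X_k] = \frac{nD}{nD} = 1.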
Instead of proving directly that the distribution law of Z_n approaches the normal law as n increases, we will show that its characteristic function approaches the characteristic function of the normal law.
Let us find the characteristic function of Z_n. From relation (13.8.11), according to the first property of characteristic functions (13.7.8), we obtain

g_{z_n}(t) = g_{y_n}\!\left( \frac{t}{\sqrt{nD}} \right),    (13.8.12)

where g_{y_n}(t) is the characteristic function of the random variable Y_n.
From formulas (13.8.12) and (13.8.3) we get

g_{z_n}(t) = \left[ g_x\!\left( \frac{t}{\sqrt{nD}} \right) \right]^n    (13.8.13)
or, using formula (13.8.10),

g_{z_n}(t) = \left\{ 1 + \left[ -\frac{D}{2} + \alpha\!\left( \frac{t}{\sqrt{nD}} \right) \right] \frac{t^2}{nD} \right\}^n.    (13.8.14)
Let us take the logarithm of expression (13.8.14):

\ln g_{z_n}(t) = n \ln \left\{ 1 + \left[ -\frac{D}{2} + \alpha\!\left( \frac{t}{\sqrt{nD}} \right) \right] \frac{t^2}{nD} \right\}.
Let us introduce the notation

\kappa = \left[ -\frac{D}{2} + \alpha\!\left( \frac{t}{\sqrt{nD}} \right) \right] \frac{t^2}{nD}.    (13.8.15)
Then

\ln g_{z_n}(t) = n \ln(1 + \kappa).    (13.8.16)
Let n increase without limit. The quantity κ, according to formula (13.8.15), then tends to zero. For large n it can therefore be considered very small. Let us expand ln(1 + κ) in a series and keep only the first term of the expansion (the remaining terms become negligibly small):

\ln(1 + \kappa) \approx \kappa.
Then we get

\lim_{n \to \infty} \ln g_{z_n}(t) = \lim_{n \to \infty} n\kappa = \lim_{n \to \infty} \left[ -\frac{D}{2} + \alpha\!\left( \frac{t}{\sqrt{nD}} \right) \right] \frac{t^2}{D}.
By definition, the function α(t) tends to zero as t → 0; consequently,

\lim_{n \to \infty} \alpha\!\left( \frac{t}{\sqrt{nD}} \right) = 0

and

\lim_{n \to \infty} \ln g_{z_n}(t) = -\frac{t^2}{2},

whence

\lim_{n \to \infty} g_{z_n}(t) = e^{-t^2/2}.    (13.8.17)
This is nothing other than the characteristic function of the normal law with parameters m = 0, σ = 1 (see Example 2, Section 13.7).
Thus, it is proved that as n increases, the characteristic function of the random variable Z_n approaches the characteristic function of the normal law without limit; hence we conclude that the distribution law of Z_n (and therefore of the quantity Y_n) approaches the normal law without limit. The theorem is proved.
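As a numerical check of the convergence (13.8.17) (an addition to the text; the uniform distribution on [-1, 1], for which g_x(t) = sin(t)/t and D = 1/3, and the fixed argument t are arbitrary choices), one can watch g_{z_n}(t) = [g_x(t/√(nD))]^n approach e^{-t²/2} as n grows:

```python
import numpy as np

# Convergence (13.8.17) for sums of uniform terms on [-1, 1].
D, t = 1.0 / 3.0, 2.0                      # t is an arbitrary fixed argument
for n in [1, 10, 100, 1000]:
    u = t / np.sqrt(n * D)
    g_z = (np.sin(u) / u) ** n             # g_{z_n}(t) by formula (13.8.13)
    print(n, g_z, np.exp(-t**2 / 2.0))     # approaches exp(-t^2/2) = 0.135...
```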
We have proved the central limit theorem for a particular but important case: that of identically distributed terms. However, under a fairly broad class of conditions it remains valid for non-identically distributed terms as well. For example, A. M. Lyapunov proved the central limit theorem under the following condition:
\lim_{n \to \infty} \frac{\sum_{k=1}^{n} b_k}{\left( \sum_{k=1}^{n} D_k \right)^{3/2}} = 0,    (13.8.18)

where b_k is the third absolute central moment of the quantity X_k:

b_k = M\left[\, |X_k - m_k|^3 \,\right],

and D_k is the variance of the quantity X_k.
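As an illustration of condition (13.8.18) (an addition; the choice of terms is arbitrary), take X_k uniform on [-k, k], so that D_k = k²/3 and b_k = k³/4; the Lyapunov ratio then decreases roughly like n^{-1/2}, so the theorem applies to these non-identically distributed terms.

```python
import numpy as np

# Lyapunov ratio (13.8.18) for X_k uniform on [-k, k]:
# D_k = k**2 / 3, b_k = E|X_k|**3 = k**3 / 4.
for n in [10, 100, 1000, 10_000]:
    k = np.arange(1, n + 1, dtype=float)
    ratio = np.sum(k**3 / 4.0) / np.sum(k**2 / 3.0) ** 1.5
    print(n, ratio)                        # tends to zero roughly like n**-0.5
```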
The most general (necessary and sufficient) condition for the validity of the central limit theorem is the Lindeberg condition: for any τ > 0

\lim_{n \to \infty} \frac{1}{B_n^2} \sum_{k=1}^{n} \int_{|x - m_k| > \tau B_n} (x - m_k)^2 f_k(x)\,dx = 0,

where m_k is the expectation and f_k(x) the distribution density of the random variable X_k, and B_n^2 = \sum_{k=1}^{n} D_k.
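For identically distributed terms with finite variance the Lindeberg condition holds automatically; the following sketch (an addition; exponential terms with m_k = 1, D_k = 1 are an arbitrary choice) evaluates the Lindeberg sum in closed form and shows it tending to zero as n grows.

```python
import numpy as np

# Lindeberg sum for n i.i.d. exponential(1) terms: m_k = 1, D_k = 1, B_n = sqrt(n).
# For c = tau * B_n >= 1 only the right tail contributes, and
#   integral over x > 1 + c of (x - 1)**2 * exp(-x) dx = (c**2 + 2*c + 2) * exp(-(1 + c)),
# which (times n / B_n**2 = 1) is the whole Lindeberg sum.
tau = 0.5
for n in [10, 100, 1000, 10_000]:
    c = tau * np.sqrt(n)
    print(n, (c**2 + 2 * c + 2) * np.exp(-(1 + c)))   # tends to zero
```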