Lecture
The likelihood function introduced by Fisher looks like this:

$$L(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta),$$

where $\theta$ is the unknown parameter of the density $f$, and $x_1, \dots, x_n$ is the observed sample.
As the estimate $\hat\theta$ of the parameter, one chooses the value at which $L$ reaches its maximum. Since $\ln L$ reaches its maximum at the same $\theta$ as $L$, the search for the required estimate reduces to solving the likelihood equation

$$\frac{\partial \ln L}{\partial \theta} = 0.$$

All roots that do not depend on the sample $x_1, \dots, x_n$ should be discarded; only those that do depend on it are kept.
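As a minimal numerical sketch of this procedure: assuming an exponential density $f(x;\lambda)=\lambda e^{-\lambda x}$ and a made-up sample (neither is from the lecture), the likelihood equation has a closed-form root, which we can check against a grid search over the log-likelihood:

```python
import numpy as np

# Hypothetical sample; the exponential density f(x; lam) = lam * exp(-lam * x)
# is an assumed example distribution, not one from the lecture.
x = np.array([0.5, 1.2, 0.3, 2.1, 0.9])
n = len(x)

def log_likelihood(lam):
    # ln L = n * ln(lam) - lam * sum(x_i)
    return n * np.log(lam) - lam * x.sum()

# The likelihood equation d(ln L)/d(lam) = n/lam - sum(x_i) = 0
# has the closed-form root lam_hat = n / sum(x_i) = 1 / mean(x).
lam_hat = n / x.sum()

# Sanity check: ln L is largest near lam_hat on a grid of candidate values.
grid = np.linspace(0.1, 5.0, 500)
assert abs(grid[np.argmax(log_likelihood(grid))] - lam_hat) < 0.05
```

The root $\hat\lambda = 1/\bar x$ depends on the sample, so it is exactly the kind of root the procedure keeps.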
The estimate of a distribution parameter is a random variable: it has a mathematical expectation and some "scattering" (variance) around it. An estimate is called efficient if this scattering around its expectation is minimal.
The following theorem is valid (given without proof): if an efficient estimate $\theta^*$ exists, then the likelihood equation has a unique solution $\hat\theta = \theta^*$, and this solution converges to the true value $\theta$ as the sample size grows.
All of this carries over to the case of several unknown parameters. For example, for the one-dimensional normal law

$$f(x; a, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-a)^2}{2\sigma^2}\right),$$

$$\ln L = -n \ln\!\bigl(\sigma\sqrt{2\pi}\bigr) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - a)^2,$$

$$\frac{\partial \ln L}{\partial a} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - a) = 0,$$

from here $\hat a = \bar x = \dfrac{1}{n}\sum_{i=1}^{n} x_i$.

$$\frac{\partial \ln L}{\partial \sigma} = -\frac{n}{\sigma} + \frac{1}{\sigma^3}\sum_{i=1}^{n}(x_i - a)^2 = 0,$$

from here $\hat\sigma^2 = \dfrac{1}{n}\sum_{i=1}^{n}(x_i - \bar x)^2$.
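The two estimates derived above are easy to check numerically; the sample below is made up for illustration:

```python
import numpy as np

# Hypothetical sample, assumed drawn from a normal law.
x = np.array([2.0, 3.5, 1.5, 4.0, 3.0])
n = len(x)

# Maximum-likelihood estimates derived above:
a_hat = x.mean()                            # a_hat = (1/n) * sum(x_i)
sigma2_hat = ((x - a_hat) ** 2).sum() / n   # divides by n, not n - 1

# numpy's np.var also divides by n by default (ddof=0), so they agree.
assert np.isclose(sigma2_hat, np.var(x))
```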
An estimate is called unbiased if its expectation equals the parameter being estimated. The estimate $\bar x$ is unbiased. Indeed, since $x_1, \dots, x_n$ is a simple random sample from the general population, $M[x_i] = a$ for each $i$, and therefore

$$M[\bar x] = \frac{1}{n}\sum_{i=1}^{n} M[x_i] = a.$$
Let us find out whether the estimate $\hat\sigma^2$ obtained by the maximum-likelihood method (or the method of moments) is unbiased. It is easy to verify that

$$\frac{1}{n}\sum_{i=1}^{n}(x_i - \bar x)^2 = \frac{1}{n}\sum_{i=1}^{n}\bigl((x_i - a) - (\bar x - a)\bigr)^2.$$

Consequently,

$$\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - a)^2 - (\bar x - a)^2.$$
Let us find the expectation of this quantity:

$$M[\hat\sigma^2] = \frac{1}{n}\sum_{i=1}^{n} M\bigl[(x_i - a)^2\bigr] - M\bigl[(\bar x - a)^2\bigr] = \sigma^2 - M\bigl[(\bar x - a)^2\bigr].$$

Since the variance does not depend on the value of $a$, we may choose $a = 0$. Then

$$M\bigl[\bar x^2\bigr] = \frac{1}{n^2}\, M\Bigl[\Bigl(\sum_{i=1}^{n} x_i\Bigr)^{\!2}\Bigr] = \frac{1}{n^2}\Bigl(\sum_{i=1}^{n} M[x_i^2] + \sum_{i \ne j} M[x_i x_j]\Bigr),$$

$$M[x_i^2] = \sigma^2, \qquad M[x_i x_j] = \rho\,\sigma^2,$$

where $\rho$ is the correlation coefficient between $x_i$ and $x_j$ (in this case it equals zero, since $x_i$ and $x_j$ do not depend on each other). Hence $M\bigl[\bar x^2\bigr] = \sigma^2 / n$.
So,

$$M[\hat\sigma^2] = \sigma^2 - \frac{\sigma^2}{n} = \frac{n-1}{n}\,\sigma^2.$$

This shows that the estimate $\hat\sigma^2$ is not unbiased: its expectation is somewhat less than $\sigma^2$. To eliminate this bias, one must multiply $\hat\sigma^2$ by $\frac{n}{n-1}$. As a result, we obtain the unbiased estimate

$$s^2 = \frac{n}{n-1}\,\hat\sigma^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar x)^2.$$
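A quick Monte Carlo check (with assumed values $a = 0$, $\sigma^2 = 1$, $n = 5$, chosen for illustration) makes the bias of $\hat\sigma^2$ and its removal by the factor $n/(n-1)$ visible:

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma2, n = 0.0, 1.0, 5

# Average the biased estimate (1/n) * sum((x_i - xbar)^2) over many
# samples: its mean should be close to (n-1)/n * sigma^2, as derived,
# while the corrected estimate should average close to sigma^2 itself.
trials = 20000
samples = rng.normal(a, np.sqrt(sigma2), size=(trials, n))
s2_biased = samples.var(axis=1, ddof=0)      # divides by n
s2_unbiased = samples.var(axis=1, ddof=1)    # divides by n - 1

assert abs(s2_biased.mean() - (n - 1) / n * sigma2) < 0.03
assert abs(s2_unbiased.mean() - sigma2) < 0.03
```

In NumPy the correction is the `ddof` argument: `ddof=0` gives the maximum-likelihood estimate, `ddof=1` the unbiased one.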