
Linear regression

Lecture



Linear regression is a statistical regression model of the dependence of one (explained, dependent) variable y on one or several other variables (factors, regressors, independent variables) x, in which the dependence function is linear.

The linear regression model is the most widely used and most thoroughly studied model in econometrics. Specifically, what is studied are the properties of parameter estimates obtained by various methods under different assumptions about the probabilistic characteristics of the factors and the random errors of the model. The limiting (asymptotic) properties of estimates of nonlinear models are also derived from approximations of those models by linear ones. It should be noted that, from the econometric point of view, linearity in the parameters is more important than linearity in the model factors.

Content

  • 1 Definition
    • 1.1 Paired and multiple regression
  • 2 Examples
  • 3 Matrix representation
  • 4 Classical linear regression
  • 5 Estimation methods

Definition

Regression model

  $y = f(x, b) + \varepsilon$,

where $b$ are the model parameters and $\varepsilon$ is the random model error, is called linear regression if the regression function $f(x, b)$ has the form

  $f(x, b) = b_0 + b_1 x_1 + b_2 x_2 + \dots + b_k x_k$,

where $b_j$ are the regression parameters (coefficients), $x_j$ are the regressors (model factors), and $k$ is the number of model factors.

The linear regression coefficients show the rate of change of the dependent variable with respect to a given factor, with the other factors held fixed (in a linear model this rate is constant):

  $b_j = \dfrac{\partial f(x, b)}{\partial x_j} = \mathrm{const}$
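
For instance, for the hypothetical function $f(x, b) = 10 + 2 x_1 + 0.5 x_2$, the coefficient of $x_1$ is $\partial f / \partial x_1 = 2$: a unit increase in $x_1$ changes the expected value of $y$ by 2, whatever the fixed value of $x_2$.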

The parameter $b_0$, which is attached to no factor, is often called the constant. Formally, it is the value of the function when all factors are zero. For analytical purposes it is convenient to treat the constant as a parameter attached to a "factor" identically equal to 1 (or to another arbitrary constant, which is why such a factor is also called a constant). If the factors and parameters of the original model are renumbered with this in mind (keeping the designation $k$ for the total number of factors), the linear regression function can be written in the following form, which formally contains no constant:

  $f(x, b) = \sum_{j=1}^{k} b_j x_j = x^T b$,

where $x^T = (x_1, x_2, \dots, x_k)$ is the (row) vector of regressors and $b = (b_1, b_2, \dots, b_k)^T$ is the column vector of parameters (coefficients).

A linear model can be either with a constant or without one; in this representation, the first factor is then either identically equal to one or is an ordinary factor, respectively.
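
A minimal numerical sketch of this convention, assuming NumPy (all values here are hypothetical): the constant is absorbed into the regressor vector as a leading "factor" equal to 1, and the regression function becomes a single dot product.

    import numpy as np

    # Hypothetical parameters: b[0] is the constant, the rest are slope coefficients.
    b = np.array([10.0, 2.0, 0.5])   # column vector of parameters

    # One observation of the regressors, with a leading 1 for the constant "factor".
    x = np.array([1.0, 3.0, 4.0])    # factor 1, then x1 = 3, x2 = 4

    # Linear regression function f(x, b) = x^T b.
    f = x @ b                        # 10 + 2*3 + 0.5*4 = 18.0
    print(f)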

Paired and multiple regression

In the particular case when there is a single factor (not counting the constant), one speaks of paired, or simple, linear regression:

  $y = b_0 + b_1 x + \varepsilon$

When the number of factors (not counting the constant) is greater than one, one speaks of multiple regression.
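
As an illustration (a minimal sketch, assuming NumPy and made-up data), the paired-regression coefficients can be estimated by ordinary least squares using the standard formulas $b_1 = \operatorname{cov}(x, y) / \operatorname{var}(x)$ and $b_0 = \bar{y} - b_1 \bar{x}$:

    import numpy as np

    # Hypothetical sample: x is the factor, y the dependent variable.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

    # Classical least-squares formulas for paired regression.
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)   # slope
    b0 = y.mean() - b1 * x.mean()                    # intercept (constant)

    print(b0, b1)   # y is approximated by b0 + b1 * x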

Examples

1. A model of an organization's costs (with the random error omitted):

  $TC = FC + VC = FC + v \cdot Q$,

where $TC$ is total costs, $FC$ is fixed costs (independent of the volume of production), $VC$ is variable costs (proportional to the volume of production), $v$ is unit or average (per unit of production) variable costs, and $Q$ is the volume of production.
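
For instance, with hypothetical values $FC = 1000$, $v = 5$ and $Q = 200$, the model gives $TC = 1000 + 5 \cdot 200 = 2000$.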

2. The simplest model of consumer spending (the Keynesian consumption function):

  $C = a + b \cdot Y$,

where $C$ is consumer spending, $Y$ is disposable income, $b$ is the "marginal propensity to consume", and $a$ is autonomous (income-independent) consumption.
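
For instance, with a hypothetical $a = 300$ and $b = 0.8$, a disposable income of $Y = 1000$ implies $C = 300 + 0.8 \cdot 1000 = 1100$.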

Matrix representation

Suppose a sample of $n$ observations of the variables $y$ and $x$ is given, and let $t$ be the number of an observation in the sample. Then $y_t$ is the value of the variable $y$ in the $t$-th observation, and $x_{tj}$ is the value of the $j$-th factor in the $t$-th observation. Accordingly, $x_t^T = (x_{t1}, x_{t2}, \dots, x_{tk})$ is the vector of regressors in the $t$-th observation. Then a linear regression dependence holds in each observation:

  $y_t = x_t^T b + \varepsilon_t, \qquad t = 1, \dots, n.$

We introduce the notation:

$y = (y_1, y_2, \dots, y_n)^T$ is the vector of observations of the dependent variable, $X$ is the $n \times k$ matrix of factors, whose $t$-th row is $x_t^T$, and $\varepsilon = (\varepsilon_1, \dots, \varepsilon_n)^T$ is the vector of random errors.

Then the linear regression model can be represented in matrix form:

  $y = X b + \varepsilon$
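
A small simulation sketch of the matrix form, assuming NumPy (the parameter vector and error scale are made up); the shapes follow the definitions above, with the first column of $X$ being the constant "factor":

    import numpy as np

    rng = np.random.default_rng(0)

    n, k = 100, 3                        # observations, factors (incl. constant)
    b = np.array([1.0, 2.0, -0.5])       # hypothetical parameter vector

    # Factor matrix: first column is the constant "factor" 1.
    X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
    eps = rng.normal(scale=0.3, size=n)  # random errors

    y = X @ b + eps                      # the model in matrix form
    print(X.shape, y.shape)              # (100, 3) (100,)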

Classical linear regression

In classical linear regression it is assumed that, along with the standard condition $E(\varepsilon_t \mid x) = 0$, the following assumptions also hold (the Gauss-Markov conditions):

1) Homoscedasticity (constant or identical variance), that is, the absence of heteroscedasticity of the random model errors: $V(\varepsilon_t) = \sigma^2 = \mathrm{const}$;

2) Absence of autocorrelation of the random errors: $\operatorname{cov}(\varepsilon_t, \varepsilon_s) = 0$ for $t \neq s$.

In the matrix representation of the model, these assumptions are formulated as a single assumption about the structure of the covariance matrix of the random error vector: $V(\varepsilon) = \sigma^2 I_n$.

In addition to these assumptions, the factors in the classical model are assumed to be deterministic (non-stochastic). It is also formally required that the matrix $X$ have full rank ($\operatorname{rank}(X) = k$), that is, that there be no perfect collinearity among the factors.

When the classical assumptions hold, the ordinary least squares method yields estimates of the model parameters of sufficiently high quality: they are unbiased, consistent, and the most efficient estimates (in the class of linear unbiased estimators, by the Gauss-Markov theorem).
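
The unbiasedness claim can be illustrated by simulation (a sketch assuming NumPy, with hypothetical parameters): under the conditions above, the OLS estimate $\hat{b} = (X^T X)^{-1} X^T y$ averages out to the true $b$ over repeated samples with the factors held fixed.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 50
    b_true = np.array([1.0, 2.0, -0.5])              # hypothetical true parameters

    # Deterministic full-rank factor matrix, fixed across replications,
    # as the classical model assumes.
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

    estimates = []
    for _ in range(2000):
        eps = rng.normal(scale=0.3, size=n)          # homoscedastic, uncorrelated
        y = X @ b_true + eps
        b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS estimate
        estimates.append(b_hat)

    print(np.mean(estimates, axis=0))  # close to b_true: unbiasedness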

Estimation methods

  • Least squares method
  • Generalized least squares method
  • Instrumental variables method
  • Maximum likelihood method
  • Method of moments
  • Generalized method of moments
  • Quantile regression
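
As an illustration of the first method in this list, a minimal sketch of an OLS fit assuming the statsmodels library is available (the data are made up; generalized least squares and quantile regression have analogous interfaces in the same library):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)

    # Hypothetical data generated from y = 1 + 2*x + noise.
    x = rng.normal(size=200)
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=200)

    X = sm.add_constant(x)           # prepend the constant "factor"
    results = sm.OLS(y, X).fit()     # ordinary least squares
    print(results.params)            # estimates of [constant, slope]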