Econometric Theory/Normal Equations Proof

From Wikibooks, open books for an open world

Below is the proof of the Normal Equations for OLS.

The goal of OLS is to minimize the sum of squared error terms to find the best fit, also called the Residual Sum of Squares (RSS). This is denoted by \sum \hat{\epsilon_i}^2.

Defining the RSS

Known:  \hat{\epsilon_i} = Y_i - \hat{Y_i} = Y_i - \hat{\alpha} - \hat{\beta} X_i

RSS = \sum \hat{\epsilon_i}^2

    = \sum (Y_i - \hat{Y_i})^2

    = \sum (Y_i - \hat{\alpha} - \hat{\beta} X_i)^2
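The RSS can be sketched numerically. A minimal example, where the data values are made up purely for illustration:

```python
import numpy as np

# Hypothetical toy data, for illustration only.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def rss(alpha, beta, X, Y):
    """Residual Sum of Squares for a candidate intercept alpha and slope beta."""
    residuals = Y - alpha - beta * X
    return np.sum(residuals ** 2)

# A line closer to the data gives a smaller RSS.
print(rss(0.0, 2.0, X, Y))  # candidate line Y = 2X
print(rss(0.0, 1.0, X, Y))  # candidate line Y = X (a worse fit for this data)
```

OLS chooses the (\hat{\alpha}, \hat{\beta}) pair that makes this quantity as small as possible.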

Differentiate the RSS (so that we can then minimise it)

First-order condition with respect to \hat{\alpha}:

\frac{\partial \sum \hat{\epsilon_i}^2}{\partial \hat{\alpha}} = \sum 2 \hat{\epsilon_i}\frac{\partial \hat{\epsilon_i}}{\partial \hat{\alpha}} = 2 \sum \hat{\epsilon_i} (-1) = 2 \sum (Y_i - \hat{\alpha} - \hat{\beta} X_i) (-1) = 0

First-order condition with respect to \hat{\beta}:

\frac{\partial \sum \hat{\epsilon_i}^2}{\partial \hat{\beta}} = \sum 2 \hat{\epsilon_i} \frac{\partial \hat{\epsilon_i}}{\partial \hat{\beta}} = 2 \sum \hat{\epsilon_i} (-X_i) = 2 \sum (Y_i - \hat{\alpha} - \hat{\beta} X_i) (-X_i) = 0

So we have two equations:

 \sum (Y_i -  \hat{\alpha} - \hat{\beta} X_i) (-1) = 0

and


 \sum (Y_i - \hat{\alpha} - \hat{\beta} X_i) (-X_i) = 0 (the common factor of 2 has been divided out of both equations)

Rearranging each equation, we get

\sum Y_i = n \hat{\alpha} + \hat{\beta} \sum X_i (This is the first OLS Normal Equation)

and

 \sum Y_i X_i = \hat{\alpha} \sum X_i + \hat{\beta}\sum X_i^2 (This is the second OLS Normal Equation)
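Both normal equations can be checked numerically at the least-squares solution. A minimal sketch, assuming a made-up data set and using a standard library fit to obtain (\hat{\alpha}, \hat{\beta}):

```python
import numpy as np

# Hypothetical toy data, for illustration only.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(X)

# Fit the line with a standard least-squares routine.
# np.polyfit returns coefficients highest-degree first: (slope, intercept).
beta, alpha = np.polyfit(X, Y, deg=1)

# First normal equation:  sum(Y) = n*alpha + beta*sum(X)
assert np.isclose(Y.sum(), n * alpha + beta * X.sum())

# Second normal equation: sum(X*Y) = alpha*sum(X) + beta*sum(X**2)
assert np.isclose((X * Y).sum(), alpha * X.sum() + beta * (X ** 2).sum())

print("both normal equations hold at the OLS solution")
```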

Solve the Normal Equations

Divide the first equation by n


 \frac{1}{n} \sum Y_i = \hat{\alpha} + \frac{1}{n}\hat{\beta} \sum X_i

Using the definition of the sample mean, \frac{1}{n} \sum W_i = \bar{W}, this leaves us with

 \bar{Y} = \hat{\alpha} + \hat{\beta} \bar{X} \Leftrightarrow \hat{\alpha} = \bar{Y} - \hat{\beta} \bar{X}

Now that we know how to get \hat{\alpha}, we can solve for \hat{\beta} by substituting \hat{\alpha} = \bar{Y} - \hat{\beta} \bar{X} into the second normal equation:


 \sum Y_i X_i = \hat{\alpha} \sum X_i + \hat{\beta}\sum X_i^2

             = [\bar{Y} - \hat{\beta} \bar{X}] \sum X_i + \hat{\beta}\sum X_i^2

             = \frac{(\sum X_i)(\sum Y_i)}{n} + \hat{\beta}\left[\sum X_i^2 - \frac{(\sum X_i)^2}{n}\right]

We can now isolate \hat{\beta}:

 \hat{\beta} = \frac{\sum Y_i X_i - \frac{(\sum X_i)(\sum Y_i)}{n}}{\sum X^2_i - \frac{(\sum X_i)^2}{n}} = \frac{\sum (X_i - \bar{X})(Y_i - \bar{Y})}{\sum (X_i - \bar{X})^2}

(Since \frac{(\sum X_i)(\sum Y_i)}{n} = n \bar{X} \bar{Y} and \frac{(\sum X_i)^2}{n} = n \bar{X}^2, the numerator can equivalently be written \sum Y_i X_i - n \bar{X} \bar{Y} and the denominator \sum X_i^2 - n \bar{X}^2.)
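The closed-form estimators can be computed directly and cross-checked against a standard library fit. A minimal sketch, assuming a made-up data set:

```python
import numpy as np

# Hypothetical toy data, for illustration only.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(X)

# Slope from the closed form derived above.
beta_hat = (np.sum(X * Y) - X.sum() * Y.sum() / n) / (np.sum(X ** 2) - X.sum() ** 2 / n)

# Intercept from alpha_hat = Ybar - beta_hat * Xbar.
alpha_hat = Y.mean() - beta_hat * X.mean()

# Cross-check against a standard least-squares routine.
beta_ref, alpha_ref = np.polyfit(X, Y, deg=1)
assert np.isclose(beta_hat, beta_ref) and np.isclose(alpha_hat, alpha_ref)

print(alpha_hat, beta_hat)
```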

And now we have solved the Normal Equations for OLS.

Since the normal equations are two equations in two unknowns, we are able to solve for both estimators (\hat{\alpha}, \hat{\beta}).