Econometric Theory/Ordinary Least Squares (OLS)

Ordinary Least Squares, or OLS, is one of the simplest methods of linear regression. The goal of OLS is to fit a function to the data as closely as possible. It does so by minimizing the sum of squared errors from the data.

Why we Square Errors before Summing

We are not trying to minimize the sum of errors, but rather the sum of squared errors. Let's take a brief look at our sweater story again.

[Figure: data points and fitted lines for Model A and Model B]

model   data point   error from line
A       1              5
A       2             10
A       3             -5
A       4            -10
B       1              3
B       2             -3
B       3              3
B       4             -3

Notice that the sum of Model A's errors is $5 + 10 + (-5) + (-10) = 0$ and that the sum of Model B's errors is $3 + (-3) + 3 + (-3) = 0$.

The errors of both models sum to 0. Does this mean they are both great fits? No!

To account for the signs, we square each error term before summing. That way positive and negative deviations are penalized equally when we minimize the errors of the fitted line.
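As a quick numerical check, here is a minimal Python sketch (written for this example; the error values are taken from the table above) showing that the raw errors cannot distinguish the two models, while the squared errors can.

```python
# Errors of each model, taken from the table above (illustrative example data).
errors_a = [5, 10, -5, -10]
errors_b = [3, -3, 3, -3]

# Raw sums: both are zero, so they cannot tell the models apart.
print(sum(errors_a), sum(errors_b))       # 0 0

# Sums of squared errors: Model B is clearly the better fit.
print(sum(e ** 2 for e in errors_a))      # 250
print(sum(e ** 2 for e in errors_b))      # 36
```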

The Model

These two models each have an intercept term $\beta_0$ and a slope term $\beta_1$ (some textbooks use $\alpha$ instead of $\beta_0$ and $\beta$ instead of $\beta_1$; the $\beta_0, \beta_1$ notation is a much better approach once we move to multivariate formulas). We can represent an arbitrary single-variable model with the formula

$$y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$$

The y-values are related to the x-values by this formula. $y$ is called the dependent variable and $x$ is called the independent variable, since the value of $y$ is determined by the value of $x$. We use the subscript $i$ to denote an observation, so $y_1$ is paired with $x_1$, $y_2$ with $x_2$, and so on. The term $\varepsilon_i$ is the error term, which is the difference between the effect of $x_i$ and the observed value of $y_i$.
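To make the notation concrete, the following sketch (NumPy assumed; the parameter values and sample size are arbitrary choices for illustration) generates observations from this single-variable model.

```python
import numpy as np

rng = np.random.default_rng(0)

beta_0, beta_1 = 2.0, 0.5            # "true" parameters, normally unknown
n = 100                              # number of observations

x = rng.uniform(0.0, 10.0, size=n)   # independent variable x_i
eps = rng.normal(0.0, 1.0, size=n)   # error term epsilon_i
y = beta_0 + beta_1 * x + eps        # dependent variable y_i
```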

Unfortunately, we don't know the values of $\beta_0$ or $\beta_1$. We have to approximate them. We can do this by using the ordinary least squares method. The term "least squares" means that we are trying to minimize the sum of squares, or more specifically, the sum of the squared error terms,

$$S(\beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2 .$$

Since there are two variables that we need to minimize with respect to ($\beta_0$ and $\beta_1$), we have two equations:

$$\frac{\partial S}{\partial \beta_0} = -2 \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i) = 0$$

$$\frac{\partial S}{\partial \beta_1} = -2 \sum_{i=1}^{n} x_i (y_i - \beta_0 - \beta_1 x_i) = 0$$
Call the solutions to these equations $\hat{\beta}_0$ and $\hat{\beta}_1$. Solving, we get:

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}$$

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$$

where $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$ and $\bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$. Deriving these results can be left as an exercise.
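As a sketch of how these formulas are used in practice (NumPy assumed; the function name is our own), the estimates can be computed directly from a sample of x and y values, for example the simulated data above.

```python
import numpy as np

def ols_simple(x, y):
    """Closed-form OLS estimates for y_i = beta_0 + beta_1 * x_i + eps_i."""
    x_bar, y_bar = x.mean(), y.mean()
    beta_1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    beta_0_hat = y_bar - beta_1_hat * x_bar
    return beta_0_hat, beta_1_hat

# Example usage with the simulated data from the previous sketch:
# beta_0_hat, beta_1_hat = ols_simple(x, y)
# As a cross-check, np.polyfit(x, y, 1) returns the same slope and intercept
# (highest-degree coefficient first).
```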

It is important to know that $\hat{\beta}_0$ and $\hat{\beta}_1$ are not the same as $\beta_0$ and $\beta_1$, because they are based on a single sample rather than the entire population. If you took a different sample, you would get different values for $\hat{\beta}_0$ and $\hat{\beta}_1$. We call $\hat{\beta}_0$ and $\hat{\beta}_1$ the OLS estimators of $\beta_0$ and $\beta_1$. One of the main goals of econometrics is to analyze the quality of these estimators and to see under what conditions they are good estimators and under which conditions they are not.
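A small simulation makes this sampling variability visible. The sketch below (NumPy assumed; the true parameter values are chosen arbitrarily) draws several samples from the same population model and shows that each sample produces different estimates, even though $\beta_0$ and $\beta_1$ never change.

```python
import numpy as np

rng = np.random.default_rng(1)
beta_0, beta_1, n = 2.0, 0.5, 50     # fixed population parameters (illustrative)

for sample in range(3):
    # Draw a new sample from the same population model each time.
    x = rng.uniform(0.0, 10.0, size=n)
    y = beta_0 + beta_1 * x + rng.normal(0.0, 1.0, size=n)

    # Closed-form OLS estimates for this particular sample.
    b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0_hat = y.mean() - b1_hat * x.mean()
    print(f"sample {sample}: beta_0_hat = {b0_hat:.3f}, beta_1_hat = {b1_hat:.3f}")
```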

Once we have $\hat{\beta}_0$ and $\hat{\beta}_1$, we can construct two more variables. The first is the fitted values, or estimates of $y$:

$$\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$$

The second is the estimates of the error terms, which we will call the residuals:

$$\hat{\varepsilon}_i = y_i - \hat{y}_i$$

These two variables will be important later on.
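For completeness, here is a minimal sketch (NumPy assumed; it reuses estimates such as those returned by the ols_simple function above) of how the fitted values and residuals are computed.

```python
import numpy as np

def fitted_and_residuals(x, y, beta_0_hat, beta_1_hat):
    # Fitted values: y_hat_i = beta_0_hat + beta_1_hat * x_i
    y_hat = beta_0_hat + beta_1_hat * x
    # Residuals: estimated error terms, eps_hat_i = y_i - y_hat_i
    eps_hat = y - y_hat
    return y_hat, eps_hat

# With an intercept in the model, the first-order conditions imply the
# residuals sum to (numerically) zero:
# y_hat, eps_hat = fitted_and_residuals(x, y, beta_0_hat, beta_1_hat)
# assert np.isclose(eps_hat.sum(), 0.0)
```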