MATH 414 Lecture 4


« previous | Wednesday, January 22, 2014 | next »


Least Squares

Given times $t_1, t_2, \dots, t_n$, we can measure samples of a function $f$ as points $(t_k, f(t_k))$ for $k = 1, \dots, n$.

Suppose we want to find a linear regression $\ell(t) = \alpha t + \beta$ for parameters $\alpha$ and $\beta$ that minimizes the following sum of squares:

$$E(\alpha, \beta) = \sum_{k=1}^{n} \bigl( f(t_k) - (\alpha t_k + \beta) \bigr)^2$$
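As a quick numerical sketch of this minimization (the sample data here are made up for illustration, not from the lecture), NumPy's least-squares solver finds the minimizing $\alpha$ and $\beta$ directly:

```python
import numpy as np

# Hypothetical sample data: n = 5 noisy measurements f(t_k), roughly f(t) = 2t.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
f = np.array([0.1, 1.9, 4.2, 5.8, 8.1])

# Design matrix whose columns span the set of lines alpha*t + beta.
A = np.column_stack([t, np.ones_like(t)])

# Solve min over (alpha, beta) of ||f - (alpha*t + beta)||^2.
(alpha, beta), *_ = np.linalg.lstsq(A, f, rcond=None)
print(alpha, beta)  # slope near 2, intercept near 0
```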

One-Dimensional Case

Let us define $v = \alpha u$, where $u$ is a unit vector ($\|u\| = 1$) and $\alpha \in \mathbb{R}$.

Then $E(\alpha) = \|f - \alpha u\|^2$, where $f = (f(t_1), \dots, f(t_n))$ is our vector of samples.


Let $V = \{\alpha u : \alpha \in \mathbb{R}\}$ be a vector space of all possible regressions for $f$. For the unit vector $u$, we want to find a vector $v \in V$ such that $\|f - v\| \le \|f - w\|$ for all $w \in V$.

Therefore we can find a minimum value for $E(\alpha) = \|f - \alpha u\|^2$ in $V$:

$$E(\alpha) = \langle f - \alpha u, f - \alpha u \rangle = \|f\|^2 - 2\alpha \langle f, u \rangle + \alpha^2$$

We find the minimum of this quadratic at $\alpha = \langle f, u \rangle$, so we find that $v^* = \langle f, u \rangle\, u$, and furthermore $\langle f - v^*, w \rangle = 0$ for all $w \in V$.
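The two conclusions above can be checked numerically: with $\alpha^* = \langle f, u \rangle$, the residual $f - \alpha^* u$ is orthogonal to $u$, and no nearby $\alpha$ gives a smaller error. (The random vectors here are my own test data.)

```python
import numpy as np

rng = np.random.default_rng(0)

# A sample vector f and a unit vector u (arbitrary test data).
f = rng.standard_normal(6)
u = rng.standard_normal(6)
u /= np.linalg.norm(u)

# Claimed minimizer: alpha* = <f, u>.
alpha_star = f @ u


def E(a):
    """Sum-of-squares error ||f - a*u||^2."""
    return np.linalg.norm(f - a * u) ** 2


# Residual f - alpha*·u is orthogonal to u (up to rounding) ...
print(abs((f - alpha_star * u) @ u))

# ... and perturbing alpha in either direction only increases the error.
assert E(alpha_star) <= E(alpha_star + 0.5)
assert E(alpha_star) <= E(alpha_star - 0.5)
```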


Punchline

Theorem. Let $V$ be an inner product space and let $W$ be a subspace of $V$. Also, let $f$ be any vector in $V$. Then, $w^* \in W$ minimizes $\|f - w\|$ over all $w \in W$ if and only if $\langle f - w^*, w \rangle = 0$ for all $w \in W$.

Proof. Follows mutatis mutandis [1] from the special case illustrated above.

quod erat demonstrandum


Finding $w^*$

How do we find $w^*$ (or $v^*$ in the special case above)?

$w^*$ is the orthogonal projection of $f$ onto the vector space $W$.

Let $\{u_1, \dots, u_m\}$ be an orthonormal basis for $W$. Then

$$w^* = \sum_{j=1}^{m} \langle f, u_j \rangle\, u_j$$
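This projection formula can be sketched numerically: build an orthonormal basis for $W$ (here via QR factorization of a matrix whose columns span $W$; the specific vectors are assumed for illustration), sum the components $\langle f, u_j \rangle u_j$, and compare against a direct least-squares solve.

```python
import numpy as np

# Assumed setup: W = span of the two columns of A, a subspace of R^4.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
Q, _ = np.linalg.qr(A)  # columns of Q: an orthonormal basis u_1, u_2 for W
f = np.array([1.0, 2.0, 3.0, 4.0])

# w* = sum_j <f, u_j> u_j  -- the orthogonal projection of f onto W.
w_star = sum((f @ Q[:, j]) * Q[:, j] for j in range(Q.shape[1]))

# Same answer as solving the least-squares problem min_c ||f - A c|| directly.
c, *_ = np.linalg.lstsq(A, f, rcond=None)
print(np.allclose(w_star, A @ c))  # True
```

The agreement between the two computations is exactly the Punchline theorem: the least-squares minimizer over $W$ is the orthogonal projection of $f$ onto $W$.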


Footnotes

  1. mutatis mutandis = "with the necessary changes having been made"