Wednesday, January 22, 2014
Least Squares
Given times $t_1, \dots, t_m$, we can measure samples of a function as points $(t_i, b_i)$ for $i = 1, \dots, m$.
Suppose we want to find a linear regression $y = \alpha + \beta t$ for parameters $\alpha$ and $\beta$ that minimizes the following sum of squares: $E = \sum_{i=1}^{m} \bigl( b_i - (\alpha + \beta t_i) \bigr)^2$.
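As a concrete sketch of this objective (with made-up sample data; the variable names are illustrative, not from the notes), we can write down $E$ directly and compare against NumPy's built-in degree-1 polynomial fit, which minimizes exactly this quantity:

```python
import numpy as np

# Hypothetical sample data: times t_i and measurements b_i.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.8])

def sum_of_squares(alpha, beta):
    """E(alpha, beta) = sum_i (b_i - (alpha + beta * t_i))^2."""
    return np.sum((b - (alpha + beta * t)) ** 2)

# np.polyfit with degree 1 returns [slope, intercept] minimizing E.
beta_hat, alpha_hat = np.polyfit(t, b, 1)

# Any other choice of parameters gives an error at least as large.
assert sum_of_squares(alpha_hat, beta_hat) <= sum_of_squares(1.0, 1.0)
```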
Special Case
Let us define $v = \alpha u + \beta w$, where $u = (1, \dots, 1)^T$ and $w = (t_1, \dots, t_m)^T$.
Then $E = \|b - v\|^2$, where $b = (b_1, \dots, b_m)^T$ is our vector of samples.
Let $W$ be the vector space of all possible regressions $v$. Suppose for now that $W$ is spanned by a single unit vector $\hat{u}$, so that every $v \in W$ can be written as $v = x \hat{u}$ for some scalar $x$.
Therefore we can find a minimum value for $f(x) = \|b - x \hat{u}\|^2$ in $x$:

$f(x) = \|b\|^2 - 2x \, (b \cdot \hat{u}) + x^2 \|\hat{u}\|^2 = \|b\|^2 - 2x \, (b \cdot \hat{u}) + x^2.$

We find the minimum of this quadratic at $x = b \cdot \hat{u}$, so we find that $p = (b \cdot \hat{u}) \, \hat{u}$ and furthermore $\|b - p\| \le \|b - v\|$ for all $v \in W$.
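The one-dimensional case is easy to verify numerically. The sketch below (with an arbitrary sample vector $b$ and unit vector $\hat{u}$, both made up for illustration) checks that $x = b \cdot \hat{u}$ beats every other multiple of $\hat{u}$, and that the residual $b - p$ is orthogonal to $\hat{u}$:

```python
import numpy as np

# A unit vector u_hat and a hypothetical sample vector b.
u_hat = np.array([3.0, 4.0]) / 5.0   # ||u_hat|| = 1
b = np.array([2.0, 1.0])

# Minimizer of f(x) = ||b - x * u_hat||^2 is x = b . u_hat.
x_star = b @ u_hat
p = x_star * u_hat

# p is at least as close to b as any other multiple of u_hat.
for x in np.linspace(-10.0, 10.0, 201):
    assert np.linalg.norm(b - p) <= np.linalg.norm(b - x * u_hat) + 1e-12

# The residual b - p is orthogonal to u_hat.
assert abs((b - p) @ u_hat) < 1e-12
```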
Punchline
Theorem. For any subspace $W \subseteq \mathbb{R}^m$, the quantity $\|b - v\|^2$ over $v \in W$ is minimized by the orthogonal projection $p$ of $b$ onto $W$.
Proof. Follows mutatis mutandis [1] from the special case illustrated above.
quod erat demonstrandum
Finding $p$
How do we find $p$ (or the coefficient $x = b \cdot \hat{u}$ in the special case)?
$p$ is the orthogonal projection of $b$ onto the vector space $W$.
Let $\{u_1, \dots, u_n\}$ be an orthonormal basis for $W$. Then $p = \sum_{i=1}^{n} (b \cdot u_i) \, u_i.$
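A minimal numerical sketch of this formula, reusing hypothetical sample data: build an orthonormal basis for $W = \operatorname{span}\{u, w\}$ via a reduced QR factorization, project $b$ term by term, and compare against the least-squares solution from `np.linalg.lstsq`:

```python
import numpy as np

# Hypothetical sample data: times t_i and measurements b_i.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.8])

# W = span{u, w} with u = (1,...,1)^T and w = (t_1,...,t_m)^T.
A = np.column_stack([np.ones_like(t), t])

# Columns of Q form an orthonormal basis for W (reduced QR).
Q, _ = np.linalg.qr(A)

# p = sum_i (b . u_i) u_i -- the orthogonal projection of b onto W.
p = sum((b @ q) * q for q in Q.T)

# Sanity check: p agrees with A @ x_hat from the least-squares solver.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(p, A @ x_hat)
```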
- ↑ mutatis mutandis = "changing what must be changed"