MATH 417 Lecture 24


Thursday, April 17, 2014


Suppose we want to find a linear approximation $y = c_0 + c_1 t$ for the points $(t_1, y_1), \dots, (t_n, y_n)$:

[Figure: plot of the data points — MATH 417 LinApproxPts.png]

Hence the coefficients for our control points are $c_0$ and $c_1$, giving the coefficient matrix

$A = \begin{bmatrix} 1 & t_1 \\ \vdots & \vdots \\ 1 & t_n \end{bmatrix}$,

and our values are

$y = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}$.

We compute and solve the following for $c = (c_0, c_1)^T$:

$A^T A \, c = A^T y$

Hence our equation is the line $y = c_0 + c_1 t$ with the coefficients found above:

[Figure: data points with the best-fit regression line — MATH 417 LinApproxRegression.png]
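
To make the computation concrete, here is a minimal NumPy sketch that solves the normal equations for a small made-up data set (the lecture's actual points were not preserved, so the values below are purely illustrative):

```python
import numpy as np

# Hypothetical data points (t_i, y_i); illustrative only.
t = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 2.0, 3.0])

# Coefficient matrix: a column of ones (for c0) and the column t (for c1).
A = np.column_stack([np.ones_like(t), t])

# Solve the normal equations A^T A c = A^T y.
c0, c1 = np.linalg.solve(A.T @ A, A.T @ y)
print(c0, c1)  # coefficients of the best-fit line y = c0 + c1 t
```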

General Case

For $b \in \mathbb{R}^n$, let $W$ be a subspace spanned by $a_1, \dots, a_m$, where the $a_i$'s are linearly independent.

We wish to approximate $b$ by some $p \in W$. We do so by minimizing the value of $\| b - x \|$ for all $x \in W$. This is called the best least squares fit.

$p$ is just the projection of $b$ onto $W$.

The normal equations are:

$A^T A \, x = A^T b$, where $A = \begin{bmatrix} a_1 & \cdots & a_m \end{bmatrix}$ and $p = A x$.

The residual (or rejection) $r = b - p$ belongs to the subspace perpendicular to $W$, or $W^\perp$.
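
As a quick sketch (NumPy again, with arbitrary illustrative vectors), we can compute $p$ from the normal equations and check that the residual is orthogonal to the spanning vectors:

```python
import numpy as np

# Illustrative spanning vectors a_1, a_2 of W, as the columns of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 2.0, 4.0])

# Normal equations: A^T A x = A^T b, then p = A x.
x = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x

# Residual r = b - p lies in W-perp: orthogonal to every column of A.
r = b - p
print(A.T @ r)  # ~ [0, 0] up to rounding error
```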

Theorem. If $(b - p) \perp x$ for all $x \in W$, then $p$ is the minimizer for $\| b - x \|$.

Proof. If $p$ is the projection of $b$ onto a subspace $W$, then $b - p$ is the shortest vector from $b$ to $W$. The shortest vector from $b$ to $W$ is orthogonal to $W$ (otherwise $\| b - x \|$ would not be minimized). Hence $b - p$ is orthogonal to $W$.

Take any $x \in W$ and compute

$\| b - x \|^2 = \| (b - p) + (p - x) \|^2 = \| b - p \|^2 + \| p - x \|^2 \ge \| b - p \|^2$,

where the cross term vanishes because $b - p \perp W$ and $p - x \in W$ (the Pythagorean theorem).

Thus $p$ minimizes the value of $\| b - x \|$, and thus $p$ is the best least squares fit.

quod erat demonstrandum
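
A small numerical illustration of the theorem, reusing the illustrative $A$, $b$, and $p$ from the sketch above: every other point of $W$ is at least as far from $b$ as $p$ is.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 2.0, 4.0])
x = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x

# Perturb the optimal coefficients: ||b - p|| never exceeds ||b - q||.
rng = np.random.default_rng(0)
for _ in range(3):
    q = A @ (x + rng.normal(size=2))
    print(np.linalg.norm(b - p) <= np.linalg.norm(b - q))  # True
```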

Special Case

Let $W$ be spanned by an orthonormal basis $u_1, \dots, u_m$, and let $W^\perp$ be spanned by an orthonormal basis $u_{m+1}, \dots, u_n$. Thus $u_1, \dots, u_n$ form an orthonormal basis of $\mathbb{R}^n$.

Now $b$ is given by $b = \sum_{i=1}^{n} (b \cdot u_i) \, u_i$.

To find the projection, we just take the components that are members of $W$:

$p = \sum_{i=1}^{m} (b \cdot u_i) \, u_i$
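
A sketch of this special case: obtain an orthonormal basis for $W$ with a QR factorization (my choice here, not something the lecture prescribes) and sum the components of $b$ along each basis vector.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 2.0, 4.0])

# Columns of Q are an orthonormal basis u_1, ..., u_m for W = col(A).
Q, _ = np.linalg.qr(A)

# p = sum_i (b . u_i) u_i, i.e. Q Q^T b.
p = Q @ (Q.T @ b)
print(p)  # matches the projection obtained from the normal equations
```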