MATH 323 Lecture 23


« previous | Tuesday, November 20, 2012 | next »


Least Squares Problem

Given an <math>m \times n</math> matrix <math>A</math> with rank <math>n</math>, the least squares solution to the equation <math>A \vec{x} = \vec{b}</math> is given by the normal equations

<math>A^T A \vec{x} = A^T \vec{b}</math>

and has the unique solution

<math>\hat{\vec{x}} = \left( A^T A \right)^{-1} A^T \vec{b}</math>

The vector <math>\vec{p} = A \hat{\vec{x}}</math> is the vector in the range of <math>A</math> that is closest to <math>\vec{b}</math>.
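As a quick numerical check (a NumPy sketch added here, not part of the original notes; the matrix and right-hand side are arbitrary example data), the normal equations can be solved directly and compared against a library least squares routine:

<pre>
import numpy as np

# Overdetermined system A x = b with rank(A) = n (arbitrary example data)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Unique least squares solution x_hat = (A^T A)^{-1} A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# p = A x_hat is the vector in the range of A closest to b
p = A @ x_hat

# Cross-check with NumPy's built-in least squares solver
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat, x_lstsq)   # the two solutions agree
</pre>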

Assume the columns of <math>A</math> constitute an orthonormal set of vectors in <math>\mathbb{R}^m</math>.

Theorem 5.5.6

If the column vectors of <math>A</math> form an orthonormal set of vectors in <math>\mathbb{R}^m</math>, then <math>A^T A = I</math> and the solution to the least squares problem is <math>\hat{\vec{x}} = A^T \vec{b}</math>.

Proof

We need to show that <math>A^T A = I</math>.

The <math>(i, j)</math> entry of <math>A^T A</math> is <math>\vec{a}_i^T \vec{a}_j</math>. If the columns <math>\vec{a}_1, \ldots, \vec{a}_n</math> form an orthonormal set, then

<math>\vec{a}_i^T \vec{a}_j = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}</math>

Therefore, the diagonal entries of <math>A^T A</math> will be 1, and all other entries will be 0; that is, <math>A^T A = I</math>, and the normal equations reduce to <math>\hat{\vec{x}} = A^T \vec{b}</math>.

Q.E.D.
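A small NumPy sketch (added for illustration, not from the notes; the random data and seed are arbitrary) confirming that orthonormal columns give <math>A^T A = I</math> and reduce the least squares solution to <math>A^T \vec{b}</math>:

<pre>
import numpy as np

# Build a matrix with orthonormal columns via a QR factorization
# (any matrix with orthonormal columns works; this is just one way to get one)
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))   # Q is 5x3 with orthonormal columns

b = rng.standard_normal(5)

# Theorem 5.5.6: Q^T Q = I, so the least squares solution is simply Q^T b
print(np.allclose(Q.T @ Q, np.eye(3)))             # True
x_hat = Q.T @ b
x_lstsq, *_ = np.linalg.lstsq(Q, b, rcond=None)
print(np.allclose(x_hat, x_lstsq))                 # True
</pre>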

Theorem 5.5.7

Let <math>S</math> be a subspace in <math>\mathbb{R}^m</math> and let <math>\vec{b} \in \mathbb{R}^m</math>.

Let <math>\left\{ \vec{u}_1, \ldots, \vec{u}_k \right\}</math> be an orthonormal basis for <math>S</math>.

If <math>\vec{p} = \sum_{i=1}^{k} c_i \vec{u}_i</math> where <math>c_i = \vec{u}_i^T \vec{b}</math> for all <math>i</math>, then <math>\vec{p} - \vec{b} \in S^\perp</math>.

Proof

Show that <math>\vec{p} - \vec{b}</math> is orthogonal to <math>\vec{u}_i</math> for any <math>i</math>:

<math>\vec{u}_i^T \left( \vec{p} - \vec{b} \right) = \vec{u}_i^T \vec{p} - \vec{u}_i^T \vec{b} = c_i - c_i = 0</math>

Since <math>\vec{p} - \vec{b}</math> is orthogonal to each basis vector of <math>S</math>, it is orthogonal to every vector in <math>S</math>, so <math>\vec{p} - \vec{b} \in S^\perp</math>.
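The orthogonality claim can be verified numerically; this NumPy sketch (not from the notes; the random data are arbitrary) builds an orthonormal basis and checks that <math>\vec{p} - \vec{b}</math> is orthogonal to it:

<pre>
import numpy as np

rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # columns u_1, u_2: orthonormal basis of S
b = rng.standard_normal(6)

# p = sum_i c_i u_i with c_i = u_i^T b
c = U.T @ b
p = U @ c

# p - b should be orthogonal to every basis vector of S (Theorem 5.5.7)
print(np.allclose(U.T @ (p - b), 0))               # True
</pre>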

Theorem 5.5.8

Under the hypotheses of #Theorem 5.5.7, <math>\vec{p}</math> is the element of <math>S</math> that is closest to <math>\vec{b}</math>. That is, <math>\left\| \vec{y} - \vec{b} \right\| > \left\| \vec{p} - \vec{b} \right\|</math> for all <math>\vec{y} \ne \vec{p}</math> in <math>S</math>.
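A numerical spot-check of this closeness property (an illustrative sketch with arbitrary random data, not part of the notes), comparing <math>\left\| \vec{y} - \vec{b} \right\|</math> against <math>\left\| \vec{p} - \vec{b} \right\|</math> for many random elements <math>\vec{y}</math> of <math>S</math>:

<pre>
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # orthonormal basis for S (columns)
b = rng.standard_normal(6)
p = U @ (U.T @ b)                                   # projection of b onto S

# Any other element y of S is at least as far from b as p is (Theorem 5.5.8)
for _ in range(1000):
    y = U @ rng.standard_normal(2)                  # random element of S
    assert np.linalg.norm(y - b) >= np.linalg.norm(p - b)
print("no counterexample found")
</pre>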

Corollary 5.5.9

Let <math>S</math> be a nonzero subspace in <math>\mathbb{R}^m</math> and let <math>\vec{b} \in \mathbb{R}^m</math>.

If <math>\left\{ \vec{u}_1, \ldots, \vec{u}_k \right\}</math> is an orthonormal basis for <math>S</math> and <math>U = \left( \vec{u}_1, \ldots, \vec{u}_k \right)</math>, then the projection of <math>\vec{b}</math> onto <math>S</math> is given by <math>\vec{p} = U U^T \vec{b}</math>.

Proof

Let <math>U = \left( \vec{u}_1, \ldots, \vec{u}_k \right)</math>; then <math>U U^T</math> is the projection matrix for <math>S</math>.

Going back to the formula for the least squares solution, <math>\hat{\vec{x}} = \left( U^T U \right)^{-1} U^T \vec{b} = U^T \vec{b}</math> since <math>U^T U = I</math> by #Theorem 5.5.6, so <math>\vec{p} = U \hat{\vec{x}} = U U^T \vec{b}</math>.

The projection <math>\vec{p}</math> is unique to <math>S</math> regardless of the basis used. However, the formula <math>\vec{p} = U U^T \vec{b}</math> can only be used for orthonormal bases; for a general basis forming the columns of a matrix <math>A</math>, we must use <math>\vec{p} = A \left( A^T A \right)^{-1} A^T \vec{b}</math>.
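The basis-independence of the projection can be illustrated numerically (a NumPy sketch added here, with arbitrary data): the general formula and the orthonormal-basis formula give the same <math>\vec{p}</math>:

<pre>
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 2))     # columns: an arbitrary (non-orthonormal) basis of S
U, _ = np.linalg.qr(A)              # columns: an orthonormal basis of the same subspace S
b = rng.standard_normal(5)

p_general = A @ np.linalg.solve(A.T @ A, A.T @ b)   # p = A (A^T A)^{-1} A^T b
p_orthonormal = U @ (U.T @ b)                       # p = U U^T b  (Corollary 5.5.9)

print(np.allclose(p_general, p_orthonormal))        # True: the projection is basis-independent
</pre>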


Example

Find the best least squares approximation of <math>e^x</math> on <math>[0, 1]</math> by a linear function (a subspace of <math>C[0,1]</math>).

  • Space: <math>C[0,1]</math>
  • Inner product: <math>\langle f, g \rangle = \int_0^1 f(x)\, g(x) \, \mathrm{d}x</math>
  • Subspace: <math>S = \operatorname{Span}(1, x)</math> (linear functions)

Find <math>c</math> such that <math>\langle 1, x - c \rangle = 0</math>:

<math>\int_0^1 (x - c) \, \mathrm{d}x = \frac{1}{2} - c = 0 \quad \Rightarrow \quad c = \frac{1}{2}</math>

Therefore <math>x - \frac{1}{2} \perp 1</math>, and <math>\left\{ 1, \ x - \frac{1}{2} \right\}</math> forms an orthogonal basis for <math>S</math>.


Find an orthonormal basis for <math>S</math>:

<math>\| 1 \| = \left( \int_0^1 1 \, \mathrm{d}x \right)^{1/2} = 1, \qquad \left\| x - \frac{1}{2} \right\| = \left( \int_0^1 \left( x - \frac{1}{2} \right)^2 \mathrm{d}x \right)^{1/2} = \frac{1}{2\sqrt{3}}</math>

Therefore <math>\left\{ u_1, u_2 \right\} = \left\{ 1, \ 2\sqrt{3} \left( x - \frac{1}{2} \right) \right\}</math> forms an orthonormal basis for <math>S</math>.


Find the least squares approximation:

<math>c_1 = \langle u_1, e^x \rangle = \int_0^1 e^x \, \mathrm{d}x = e - 1</math>

<math>c_2 = \langle u_2, e^x \rangle = 2\sqrt{3} \int_0^1 \left( x - \frac{1}{2} \right) e^x \, \mathrm{d}x = \sqrt{3} \, (3 - e)</math>

<math>p(x) = c_1 u_1(x) + c_2 u_2(x) = (e - 1) + 6(3 - e) \left( x - \frac{1}{2} \right) = (4e - 10) + (18 - 6e)\, x</math>
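For a numerical cross-check of this example (a sketch using SciPy quadrature, not part of the original notes; it assumes the function being approximated is <math>e^x</math> as reconstructed above):

<pre>
import numpy as np
from scipy.integrate import quad

# Orthonormal basis for the linear functions on [0, 1] found above
u1 = lambda x: 1.0
u2 = lambda x: 2.0 * np.sqrt(3.0) * (x - 0.5)
f  = lambda x: np.exp(x)                       # function being approximated

# Coefficients c_i = <u_i, f> = integral of u_i(x) f(x) over [0, 1]
c1, _ = quad(lambda x: u1(x) * f(x), 0.0, 1.0)
c2, _ = quad(lambda x: u2(x) * f(x), 0.0, 1.0)

p = lambda x: c1 * u1(x) + c2 * u2(x)

# Compare with the closed form p(x) = (4e - 10) + (18 - 6e) x
xs = np.linspace(0.0, 1.0, 5)
closed = (4 * np.e - 10) + (18 - 6 * np.e) * xs
print(np.allclose(p(xs), closed))              # True
</pre>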

Fourier Approximation

On <math>C[-\pi, \pi]</math> with inner product <math>\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\, g(x) \, \mathrm{d}x</math>, the functions <math>\tfrac{1}{\sqrt{2}}, \cos x, \sin x, \ldots, \cos nx, \sin nx</math> form an orthonormal set, and the best least squares approximation to <math>f</math> by a trigonometric polynomial of degree at most <math>n</math> is

<math>t_n(x) = \frac{a_0}{2} + \sum_{k=1}^{n} \left( a_k \cos kx + b_k \sin kx \right), \qquad a_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos kx \, \mathrm{d}x, \quad b_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin kx \, \mathrm{d}x</math>

Each term can be rewritten as <math>a_k \cos kx + b_k \sin kx = A_k \sin(kx + \varphi_k)</math>. The last formula is the equation for harmonic motion.
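As an illustration of the Fourier coefficient formulas (a sketch added here, not from the notes; the choice <math>f(x) = |x|</math> and the degree <math>n = 5</math> are arbitrary):

<pre>
import numpy as np
from scipy.integrate import quad

f = np.abs          # example: approximate f(x) = |x| on [-pi, pi]
n = 5               # degree of the trigonometric polynomial

# Fourier coefficients a_k = (1/pi) * integral of f(x) cos(kx), b_k likewise with sin(kx)
a = [quad(lambda x, k=k: f(x) * np.cos(k * x), -np.pi, np.pi)[0] / np.pi for k in range(n + 1)]
b = [quad(lambda x, k=k: f(x) * np.sin(k * x), -np.pi, np.pi)[0] / np.pi for k in range(n + 1)]

def t_n(x):
    """Best least squares trigonometric approximation of degree n."""
    return a[0] / 2 + sum(a[k] * np.cos(k * x) + b[k] * np.sin(k * x) for k in range(1, n + 1))

print(t_n(0.0), f(0.0))   # approximation vs. the function at x = 0
</pre>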

Gram-Schmidt Process

For a subspace <math>S</math> of an inner product space <math>V</math> and a basis <math>\left\{ \vec{x}_1, \ldots, \vec{x}_n \right\}</math> of <math>S</math>,

how do we obtain <math>\left\{ \vec{u}_1, \ldots, \vec{u}_n \right\}</math>, an orthonormal basis for <math>S</math>, so that <math>\operatorname{Span}\left( \vec{u}_1, \ldots, \vec{u}_k \right) = \operatorname{Span}\left( \vec{x}_1, \ldots, \vec{x}_k \right)</math> for <math>k = 1, \ldots, n</math>?

Theorem 5.6.1

Let <math>\left\{ \vec{x}_1, \ldots, \vec{x}_n \right\}</math> be a basis for the inner product space <math>V</math>. Let <math>\vec{u}_1 = \frac{1}{\| \vec{x}_1 \|} \vec{x}_1</math> and define <math>\vec{u}_2, \ldots, \vec{u}_n</math> recursively by

<math>\vec{u}_{k+1} = \frac{1}{\left\| \vec{x}_{k+1} - \vec{p}_k \right\|} \left( \vec{x}_{k+1} - \vec{p}_k \right), \qquad k = 1, \ldots, n - 1,</math>

where <math>\vec{p}_k = \langle \vec{x}_{k+1}, \vec{u}_1 \rangle \vec{u}_1 + \langle \vec{x}_{k+1}, \vec{u}_2 \rangle \vec{u}_2 + \cdots + \langle \vec{x}_{k+1}, \vec{u}_k \rangle \vec{u}_k</math> is the projection of <math>\vec{x}_{k+1}</math> onto <math>\operatorname{Span}\left( \vec{u}_1, \ldots, \vec{u}_k \right)</math>.

The set <math>\left\{ \vec{u}_1, \ldots, \vec{u}_n \right\}</math> is an orthonormal basis for <math>V</math>.
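The recursion of Theorem 5.6.1 translates directly into code; this NumPy sketch (added for illustration, with arbitrary example data) orthonormalizes the columns of a matrix:

<pre>
import numpy as np

def gram_schmidt(X):
    """Return an orthonormal basis for the column space of X (columns assumed
    linearly independent), following the recursion of Theorem 5.6.1."""
    m, n = X.shape
    U = np.zeros((m, n))
    for k in range(n):
        # projection of x_{k+1} onto Span(u_1, ..., u_k)
        p = U[:, :k] @ (U[:, :k].T @ X[:, k])
        v = X[:, k] - p
        U[:, k] = v / np.linalg.norm(v)
    return U

# Example: orthonormalize three vectors in R^4
rng = np.random.default_rng(4)
X = rng.standard_normal((4, 3))
U = gram_schmidt(X)
print(np.allclose(U.T @ U, np.eye(3)))   # True: columns of U are orthonormal
</pre>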