MATH 323 Lecture 21


« previous | Thursday, November 8, 2012 | next »


Least Squares Problem

Given a subspace <math>S \subseteq \mathbb{R}^m</math> and a vector <math>b \in \mathbb{R}^m</math>, find the closest approximation <math>p \in S</math>. <math>p</math> is a vector projection, and <math>\alpha</math> is the scalar projection.

When <math>S</math> is represented by a matrix <math>A</math>, <math>S</math> is the column space of <math>A</math>, and <math>\hat{x}</math> is the vector such that <math>A\hat{x} = p</math>. We get <math>r = b - A\hat{x}</math> as the residual vector.
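As a quick numerical sketch of this setup (the matrix <math>A</math> and vector <math>b</math> below are made up, not from the lecture), the key fact is that the residual <math>r = b - A\hat{x}</math> is orthogonal to every column of <math>A</math>:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical example data (not from the lecture): an overdetermined 4x2 system
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Least squares solution x_hat and the projection p = A x_hat
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
p = A @ x_hat          # closest vector to b in the column space of A
r = b - p              # residual vector

# The residual is orthogonal to every column of A: A^T r = 0 (up to rounding)
print(A.T @ r)
</syntaxhighlight>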

Normal Equation

Theorem 5.3.2

<math>A</math> is an <math>m \times n</math> matrix of rank <math>n</math> (rank equal to the number of columns). Then the normal equation <math>A^T A x = A^T b</math> has a unique solution given by

<math>\hat{x} = (A^T A)^{-1} A^T b</math>

And <math>\hat{x}</math> is the unique least squares solution to the system <math>A x = b</math>.
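A minimal numerical check of the theorem (again with made-up data): solving the normal equation directly agrees with a standard least squares solver.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical data (not from the lecture): A has full column rank (rank 2)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Unique solution of the normal equation: x_hat = (A^T A)^{-1} A^T b
x_hat_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer from a standard least squares solver
x_hat_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_hat_normal, x_hat_lstsq))  # True
</syntaxhighlight>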


Proof

Based on the premise that <math>A^T A</math> is nonsingular.

Assume we have some vector <math>z</math> such that <math>A^T A z = 0</math>. For <math>A^T A</math> to be nonsingular, <math>z = 0</math> must be the only solution.

We know that

  • <math>Az \in N(A^T)</math> (since <math>A^T (A z) = 0</math>), and
  • <math>Az \in C(A)</math>, the column space of <math>A</math>, where <math>C(A) = N(A^T)^{\perp}</math>.

Therefore <math>Az = 0</math>, and since the columns of <math>A</math> are linearly independent (rank <math>n</math>), <math>z = 0</math>.

Q.E.D.

Corollary

We already know that <math>p = A \hat{x}</math> and <math>\hat{x} = (A^T A)^{-1} A^T b</math>, so

<math>p = A (A^T A)^{-1} A^T b</math>

The projection matrix <math>P = A (A^T A)^{-1} A^T</math> is interesting because it is idempotent: <math>P^2 = P</math>.
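A quick sketch verifying these properties of <math>P</math> numerically (the matrix below is a made-up full-column-rank example, not from the lecture):

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical full-column-rank matrix (not from the lecture)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])

# Projection matrix onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T

# P is idempotent (P^2 = P) and symmetric (P^T = P)
print(np.allclose(P @ P, P))   # True
print(np.allclose(P.T, P))     # True
</syntaxhighlight>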

Example

Overdetermined system: more equations than unknowns, so in general no exact solution exists; the least squares solution <math>\hat{x}</math> minimizes <math>\lVert b - A x \rVert</math>.
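The specific system worked in lecture is not preserved in these notes; the following is a stand-in overdetermined system solved through the normal equation:

<syntaxhighlight lang="python">
import numpy as np

# Stand-in overdetermined system (not the one worked in lecture):
#   x1 + x2 = 3
#   x1      = 1
#        x2 = 1
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
b = np.array([3.0, 1.0, 1.0])

# No exact solution exists; the least squares solution minimizes ||b - Ax||
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                           # approximately [1.333, 1.333]
print(np.linalg.norm(b - A @ x_hat))   # size of the residual
</syntaxhighlight>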

Regression

Given a set of measurements <math>y_1, \ldots, y_m</math> at points <math>x_1, \ldots, x_m</math>, each pair of values defines a point at <math>(x_i, y_i)</math>.

Linear Regression

Find an equation <math>y = c_0 + c_1 x</math> that approximates the system of equations <math>c_0 + c_1 x_i = y_i</math>, <math>i = 1, \ldots, m</math>.

The solution for <math>c = (c_0, c_1)^T</math> is given by

<math>c = (A^T A)^{-1} A^T y,</math>

where the <math>i</math>th row of <math>A</math> is <math>(1, x_i)</math> and <math>y = (y_1, \ldots, y_m)^T</math>.
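A short sketch of the regression formula with made-up measurement data (not the lecture's numbers):

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical measurements (x_i, y_i) -- not from the lecture
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Each equation c0 + c1*x_i = y_i gives one row (1, x_i) of A
A = np.column_stack([np.ones_like(x), x])

# c = (A^T A)^{-1} A^T y  (intercept c0 and slope c1)
c = np.linalg.solve(A.T @ A, A.T @ y)
c0, c1 = c
print(f"best fit line: y = {c0:.3f} + {c1:.3f} x")
</syntaxhighlight>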


Inner Product Spaces

Let <math>V</math> be a vector space. Suppose we have a function such that for all <math>x, y \in V</math>, the inner product of <math>x</math> and <math>y</math> (notation <math>\langle x, y \rangle</math>) is a real number.

We want the inner product to have the following properties:

  1. <math>\langle x, x \rangle \ge 0</math>, and <math>\langle x, x \rangle</math> is equal to 0 iff <math>x = 0</math>
  2. <math>\langle x, y \rangle = \langle y, x \rangle</math> (commutativity)
  3. <math>\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle</math>, and the same applies for the second component.

Example 1

<math>V = \mathbb{R}^n</math>, and <math>\langle x, y \rangle = x^T y</math> is the scalar product.

Given weights <math>w_1, \ldots, w_n</math> such that <math>w_i > 0</math>, <math>\langle x, y \rangle = \sum_{i=1}^{n} w_i x_i y_i</math> is also an inner product.
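A small numerical sketch of the weighted inner product (the weights and vectors are made up for illustration):

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical weights and vectors in R^3 (not from the lecture)
w = np.array([1.0, 2.0, 3.0])   # all weights positive
x = np.array([1.0, -1.0, 2.0])
y = np.array([0.5, 1.0, 1.0])

def inner(u, v):
    """Weighted inner product <u, v> = sum_i w_i * u_i * v_i."""
    return float(np.sum(w * u * v))

# Axiom checks: positivity and symmetry
print(inner(x, x) > 0)              # True (and 0 only for the zero vector)
print(inner(x, y) == inner(y, x))   # True
</syntaxhighlight>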

Example 2

Given two matrices <math>A, B \in \mathbb{R}^{m \times n}</math>, let <math>\langle A, B \rangle = \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} b_{ij}</math>.
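A sketch of this matrix inner product with made-up matrices; note that the entrywise sum equals <math>\operatorname{tr}(A^T B)</math>:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical 2x3 matrices (not from the lecture)
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])
B = np.array([[0.5, 1.0, 2.0],
              [1.0, 1.0, -1.0]])

# <A, B> = sum over all entries of a_ij * b_ij
inner_AB = np.sum(A * B)

# Equivalently, <A, B> = trace(A^T B)
print(np.isclose(inner_AB, np.trace(A.T @ B)))  # True
</syntaxhighlight>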

Example 3

Given two functions <math>f, g \in C[a, b]</math>, let

<math>\langle f, g \rangle = \int_a^b f(x) \, g(x) \, dx</math>


  1. <math>\langle f, f \rangle = \int_a^b f(x)^2 \, dx \ge 0</math>, and <math>\langle f, f \rangle = 0</math> iff <math>f \equiv 0</math> (since <math>f</math> is continuous).
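A numerical sketch of this function inner product, using made-up functions on <math>[0, 1]</math> (not from the lecture):

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

# Hypothetical functions on [a, b] = [0, 1] (not from the lecture)
f = lambda x: x
g = lambda x: x**2

# <f, g> = integral from a to b of f(x) g(x) dx
inner_fg, _ = quad(lambda x: f(x) * g(x), 0.0, 1.0)
print(inner_fg)   # integral of x^3 on [0, 1] = 1/4
</syntaxhighlight>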

Properties

Given a vector space <math>V</math> and an inner product function <math>\langle \cdot, \cdot \rangle</math>,

We can redefine:

  • length / norm: <math>\lVert v \rVert = \sqrt{\langle v, v \rangle}</math>
  • Orthogonality: <math>u \perp v</math> iff <math>\langle u, v \rangle = 0</math>
  • Scalar projection of <math>u</math> onto <math>v</math>: <math>\alpha = \frac{\langle u, v \rangle}{\lVert v \rVert}</math>
  • Vector projection of <math>u</math> onto <math>v</math>: <math>p = \frac{\langle u, v \rangle}{\langle v, v \rangle} v</math>
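A sketch showing how these redefinitions work for the function inner product on <math>C[0, 1]</math> (the functions <math>u</math> and <math>v</math> below are made up for illustration):

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

# Norm and vector projection defined from an inner product on C[0, 1]
def inner(f, g):
    value, _ = quad(lambda x: f(x) * g(x), 0.0, 1.0)
    return value

def norm(f):
    return np.sqrt(inner(f, f))          # ||f|| = sqrt(<f, f>)

def vector_projection(u, v):
    c = inner(u, v) / inner(v, v)        # p = (<u, v> / <v, v>) v
    return lambda x: c * v(x)

u = lambda x: x**2
v = lambda x: 1.0 + 0.0 * x              # constant function 1

p = vector_projection(u, v)
r = lambda x: u(x) - p(x)                # residual is orthogonal to v
print(norm(v))         # 1.0
print(inner(r, v))     # approximately 0
</syntaxhighlight>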

Theorem 5.4.1

The Pythagorean Law

If <math>\langle u, v \rangle = 0</math>, then

<math>\lVert u + v \rVert^2 = \lVert u \rVert^2 + \lVert v \rVert^2</math>

Proof

<math>\lVert u + v \rVert^2 = \langle u + v, u + v \rangle = \langle u, u \rangle + 2 \langle u, v \rangle + \langle v, v \rangle = \lVert u \rVert^2 + \lVert v \rVert^2,</math>

since <math>\langle u, v \rangle = 0</math>. Q.E.D.


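A quick numerical check of the Pythagorean law with a made-up orthogonal pair in <math>\mathbb{R}^3</math>:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical orthogonal vectors in R^3 (not from the lecture)
u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 5.0])
print(np.dot(u, v))   # 0.0, so u and v are orthogonal

# Pythagorean law: ||u + v||^2 = ||u||^2 + ||v||^2
lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))   # True
</syntaxhighlight>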
Orthogonality of Functions

Functions <math>f, g \in C[a, b]</math> are orthogonal when <math>\langle f, g \rangle = 0</math>, where the inner product is defined as <math>\langle f, g \rangle = \int_a^b f(x) \, g(x) \, dx</math>.
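The particular functions used in lecture are not recorded here; as an illustrative check of my own, <math>\cos x</math> and <math>\sin x</math> are orthogonal under this inner product on <math>[-\pi, \pi]</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

# Illustrative example (not necessarily the one from lecture):
# cos(x) and sin(x) are orthogonal in C[-pi, pi]
inner_val, _ = quad(lambda x: np.cos(x) * np.sin(x), -np.pi, np.pi)
print(abs(inner_val) < 1e-10)   # True: <cos, sin> = 0
</syntaxhighlight>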


Theorem 5.4.2

The Cauchy-Schwarz Inequality

<math>| \langle u, v \rangle | \le \lVert u \rVert \, \lVert v \rVert</math>

Equality holds iff <math>u</math> and <math>v</math> are linearly dependent.
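A numerical sketch of the inequality with made-up vectors, including the equality case for linearly dependent vectors:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical vectors in R^3 (not from the lecture)
u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 4.0])

lhs = abs(np.dot(u, v))                       # |<u, v>|
rhs = np.linalg.norm(u) * np.linalg.norm(v)   # ||u|| ||v||
print(lhs <= rhs)                             # True

# Equality holds only for linearly dependent vectors, e.g. w = 2u
w = 2.0 * u
print(np.isclose(abs(np.dot(u, w)), np.linalg.norm(u) * np.linalg.norm(w)))  # True
</syntaxhighlight>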