Thursday, November 8, 2012
Least Squares Problem
Given a subspace $S \subseteq \mathbb{R}^m$ and a vector $b \in \mathbb{R}^m$, find the closest approximation $p \in S$ to $b$. For $S = \operatorname{span}(a)$, $p = \alpha \frac{a}{\lVert a \rVert}$ is a vector projection, and $\alpha = \frac{a^{\mathsf{T}} b}{\lVert a \rVert}$ is the scalar projection.
When $S$ is represented by a matrix $A$, $S = R(A)$ is the column space of $A$, and $\hat{x}$ is the vector such that $A \hat{x} = p$. We get $r = b - A \hat{x}$ as the residual vector.
Suppose $A$ is an $m \times n$ matrix of rank $n$ (rank equal to its number of columns). Then the normal equation $A^{\mathsf{T}} A x = A^{\mathsf{T}} b$ has a unique solution given by
$\hat{x} = (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}} b,$
and $\hat{x}$ is the unique least squares solution to the system $A x = b$.
This rests on the premise that $A^{\mathsf{T}} A$ is nonsingular.
Assume we have some vector $x$ with $A^{\mathsf{T}} A x = 0$. For $A^{\mathsf{T}} A$ to be nonsingular, $x = 0$ must be the only solution.
We know that $0 = x^{\mathsf{T}} A^{\mathsf{T}} A x = (A x)^{\mathsf{T}} (A x) = \lVert A x \rVert^2$, so $A x = 0$.
We already know that $\operatorname{rank}(A) = n$, and $N(A) = \{ 0 \}$, so $x = 0$ is the only solution and $A^{\mathsf{T}} A$ is indeed nonsingular.
The projection matrix $P = A (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}}$ is interesting because it carries $b$ directly to its projection: $p = A \hat{x} = A (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}} b = P b$.
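As a quick numeric sanity check (a minimal sketch; the matrix $A$ and vector $b$ below are made-up examples, not from the lecture), NumPy can verify both the normal-equation solution and the projection matrix:

```python
import numpy as np

# Hypothetical example: A is 4x2 with rank 2 (full column rank), b is in R^4.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0, 4.0])

# Unique least squares solution from the normal equation A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Projection matrix P = A (A^T A)^{-1} A^T maps b onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T

# The residual r = b - A x_hat is orthogonal to every column of A.
r = b - A @ x_hat
print(np.allclose(P @ b, A @ x_hat))  # True: P b = p = A x_hat
print(np.allclose(A.T @ r, 0.0))      # True: A^T r = 0
```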
Given a set of measurements $y_1, \dots, y_m$ taken at points $x_1, \dots, x_m$, each pair of values defines a point $(x_i, y_i)$.
Find an equation $y = c_0 + c_1 x$ that approximates the system of equations $c_0 + c_1 x_i = y_i$ for $i = 1, \dots, m$.
The solution for $c = (c_0, c_1)^{\mathsf{T}}$ is given by $c = (A^{\mathsf{T}} A)^{-1} A^{\mathsf{T}} y$, where row $i$ of $A$ is $(1, x_i)$.
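A short sketch of this best-fit line computation (the measurement values here are invented for illustration):

```python
import numpy as np

# Hypothetical measurements (x_i, y_i).
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.1, 1.9, 3.2, 3.8])

# Row i of A is (1, x_i), so A c is approximately y for c = (c0, c1).
A = np.column_stack([np.ones_like(xs), xs])

# c = (A^T A)^{-1} A^T y, computed with a linear solve instead of an inverse.
c0, c1 = np.linalg.solve(A.T @ A, A.T @ ys)
print(f"best-fit line: y = {c0:.3f} + {c1:.3f} x")
```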
Inner Product Spaces
Let $V$ be a vector space. Suppose we have a function such that for all $x, y \in V$, the inner product of $x$ and $y$ (notation $\langle x, y \rangle$) is a real number.
We want $\langle \cdot, \cdot \rangle$ to have the following properties:
- $\langle x, x \rangle \ge 0$, and $\langle x, x \rangle$ is equal to 0 iff $x = 0$
- $\langle x, y \rangle = \langle y, x \rangle$
- $\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle$, and the same applies for the second component.
On $\mathbb{R}^n$, $\langle x, y \rangle = x^{\mathsf{T}} y$ is the scalar product.
Given weights $w_1, \dots, w_n$ such that each $w_i > 0$, let $\langle x, y \rangle = \sum_{i=1}^{n} w_i x_i y_i$.
Given two matrices $A, B \in \mathbb{R}^{m \times n}$, let $\langle A, B \rangle = \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} b_{ij}$.
Given two functions $f, g \in C[a, b]$, let $\langle f, g \rangle = \int_a^b f(x) \, g(x) \, dx$.
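A small sketch of these example inner products (the vectors, weights, matrices, functions, and interval below are all assumed for illustration):

```python
import numpy as np
from scipy.integrate import quad

# Weighted inner product on R^3: <x, y> = sum_i w_i x_i y_i with w_i > 0.
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 0.0, -1.0])
w = np.array([0.5, 0.25, 0.25])
print(np.sum(w * x * y))  # 2.0 - 0.75 = 1.25

# Matrix inner product: <A, B> = sum over i, j of a_ij * b_ij.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
print(np.sum(A * B))  # 5.0

# Function inner product on C[0, 1]: <f, g> = integral of f(x) g(x) dx.
val, _ = quad(lambda t: t * t**2, 0.0, 1.0)  # f(x) = x, g(x) = x^2
print(val)  # 0.25
```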
Given a vector space $V$ and an inner product function $\langle \cdot, \cdot \rangle$ on $V$, we can redefine:
- length / norm: $\lVert x \rVert = \sqrt{\langle x, x \rangle}$
- Scalar projection of $x$ onto $y$: $\alpha = \frac{\langle x, y \rangle}{\lVert y \rVert}$
- Vector projection of $x$ onto $y$: $p = \alpha \frac{y}{\lVert y \rVert} = \frac{\langle x, y \rangle}{\langle y, y \rangle} \, y$
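These definitions can be written generically against any inner product; the sketch below plugs in the weighted inner product from above (the weights and vectors are assumed examples):

```python
import numpy as np

def inner(x, y, w):
    # Weighted inner product <x, y> = sum_i w_i x_i y_i (one possible choice).
    return float(np.sum(w * x * y))

def norm(x, w):
    # Induced length / norm: ||x|| = sqrt(<x, x>).
    return inner(x, x, w) ** 0.5

w = np.array([1.0, 2.0, 3.0])              # hypothetical positive weights
x = np.array([1.0, 0.0, 1.0])
y = np.array([0.0, 1.0, 1.0])

alpha = inner(x, y, w) / norm(y, w)        # scalar projection of x onto y
p = (inner(x, y, w) / inner(y, y, w)) * y  # vector projection of x onto y
print(alpha, p)
```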
The Pythagorean Law
If $\langle x, y \rangle = 0$ (i.e. $x \perp y$), then $\lVert x + y \rVert^2 = \lVert x \rVert^2 + \lVert y \rVert^2$, since $\lVert x + y \rVert^2 = \langle x + y, x + y \rangle = \lVert x \rVert^2 + 2 \langle x, y \rangle + \lVert y \rVert^2$.
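A tiny check under the standard scalar product (example vectors assumed; note $x^{\mathsf{T}} y = 0$):

```python
import numpy as np

x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 3.0])  # x @ y == 0, so x and y are orthogonal
lhs = np.linalg.norm(x + y) ** 2
rhs = np.linalg.norm(x) ** 2 + np.linalg.norm(y) ** 2
print(np.isclose(lhs, rhs))  # True: ||x + y||^2 = ||x||^2 + ||y||^2
```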
Orthogonality of Functions
Two functions $f$ and $g$ are orthogonal on $[a, b]$ if $\langle f, g \rangle = 0$, where the inner product is defined as $\langle f, g \rangle = \int_a^b f(x) \, g(x) \, dx$.
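For instance, $\sin x$ and $\cos x$ are orthogonal on $[-\pi, \pi]$ (a standard example, not taken from these notes); a quick numeric check:

```python
import numpy as np
from scipy.integrate import quad

# <sin, cos> = integral over [-pi, pi] of sin(x) cos(x) dx should vanish.
val, _ = quad(lambda t: np.sin(t) * np.cos(t), -np.pi, np.pi)
print(abs(val) < 1e-12)  # True: sin and cos are orthogonal on [-pi, pi]
```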
The Cauchy-Schwarz Inequality
$\lvert \langle x, y \rangle \rvert \le \lVert x \rVert \, \lVert y \rVert$. Equality holds iff $x$ and $y$ are linearly dependent.
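A numeric illustration of both the inequality and its equality case under the scalar product (random vectors, purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

# |<x, y>| <= ||x|| ||y|| for any x, y.
print(abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y))  # True

# Equality iff x, y are linearly dependent, e.g. y = 3x.
y_dep = 3.0 * x
print(np.isclose(abs(x @ y_dep),
                 np.linalg.norm(x) * np.linalg.norm(y_dep)))  # True
```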