Tuesday, November 6, 2012
If $\vec{x} \neq \vec{0}$, then take $\vec{y} = \vec{x}$: $\vec{x}^T\vec{x} = \lVert\vec{x}\rVert^2 > 0$, so $\vec{x}$ is not orthogonal to itself. Hence $S \cap S^\perp = \{\vec{0}\}$.
$S^\perp = \{\,\vec{x} \in \mathbb{R}^n : \vec{x}^T\vec{y} = 0 \text{ for all } \vec{y} \in S\,\}$ is the orthogonal complement of $S$. For example, a plane through the origin in $\mathbb{R}^3$ and the line along its normal vector are orthogonal complements.
The range of an $m \times n$ matrix $A$ is defined as $R(A) = \{A\vec{x} : \vec{x} \in \mathbb{R}^n\} \subseteq \mathbb{R}^m$.
For the transpose matrix, $R(A^T) = \{A^T\vec{y} : \vec{y} \in \mathbb{R}^m\} \subseteq \mathbb{R}^n$.
Note: the range is nothing more than the column space of the matrix (so $R(A^T)$ is the row space of $A$).
Fundamental Subspaces Theorem
If $A$ is an $m \times n$ matrix, then $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.
Prove one; the proof of the second then follows from the first by replacing $A$ with $A^T$:
Let $\vec{x} \in N(A)$; then $A\vec{x} = \vec{0}$, so $\vec{x}$ is orthogonal to each row of $A$, i.e. to each column of $A^T$, and hence to all of $R(A^T)$. The steps reverse, so $N(A) = R(A^T)^\perp$.
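A quick numerical sanity check of $N(A) = R(A^T)^\perp$, using a made-up $2 \times 3$ matrix (not from the lecture):

```python
import numpy as np

# Hypothetical example matrix (not from the notes).
A = np.array([[1.0, 2.0, 1.0],
              [2.0, 4.0, 3.0]])

# A has rank 2, so N(A) is 1-dimensional. From row reduction,
# x = (-2, 1, 0) solves A x = 0.
x = np.array([-2.0, 1.0, 0.0])
assert np.allclose(A @ x, 0.0)   # x is in N(A)

# x is orthogonal to every row of A, i.e. to every column of A^T,
# hence to all of R(A^T) -- the content of N(A) = R(A^T)^perp.
for row in A:
    assert np.isclose(row @ x, 0.0)
```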
If $S$ is a subspace of $\mathbb{R}^n$, then $\dim S + \dim S^\perp = n$.
Furthermore, if $\{\vec{x}_1, \dots, \vec{x}_r\}$ is a basis for $S$ and $\{\vec{x}_{r+1}, \dots, \vec{x}_n\}$ is a basis for $S^\perp$, then $\{\vec{x}_1, \dots, \vec{x}_n\}$ is a basis for $\mathbb{R}^n$.
If $S = \{\vec{0}\}$, then $S^\perp = \mathbb{R}^n$ and the result is trivial, so suppose $\dim S = r > 0$ and $\{\vec{x}_1, \dots, \vec{x}_r\}$ is a basis for $S$.
Let $X$ be the $r \times n$ matrix formed by using the basis vectors as rows of $X$. The rank of $X$ is $r$, and $R(X^T) = S$.
$S^\perp = R(X^T)^\perp = N(X)$ by equation 1 of the previous theorem, so $\dim S^\perp = \dim N(X) = n - r$ by rank-nullity.
Therefore $\dim S + \dim S^\perp = r + (n - r) = n$. This proves the first part of the theorem.
For the second part, check linear independence of the $\vec{x}_i$'s to determine whether $\{\vec{x}_1, \dots, \vec{x}_n\}$ is a valid basis of $\mathbb{R}^n$ (there are $n$ of them, so independence suffices).
Suppose $c_1\vec{x}_1 + \dots + c_n\vec{x}_n = \vec{0}$. In order for this to be true, $\vec{y} = c_1\vec{x}_1 + \dots + c_r\vec{x}_r$ and $\vec{z} = -(c_{r+1}\vec{x}_{r+1} + \dots + c_n\vec{x}_n)$ must be equal, hence elements of both $S$ and $S^\perp$. Since $S$ and $S^\perp$ are orthogonal subspaces, $S \cap S^\perp = \{\vec{0}\}$, so $\vec{y} = \vec{z} = \vec{0}$ and all $c_i = 0$.
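The dimension count can be checked numerically. A minimal sketch, assuming a made-up subspace $S \subseteq \mathbb{R}^4$ spanned by two chosen vectors:

```python
import numpy as np

# Hypothetical S = span of the two rows of X, so r = 2, n = 4.
X = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
r = np.linalg.matrix_rank(X)     # dim S = rank of X

# S^perp = N(X) by the fundamental subspaces theorem, and
# rank-nullity gives dim N(X) = n - r.
n = X.shape[1]
dim_S_perp = n - r
assert r + dim_S_perp == n       # dim S + dim S^perp = n
```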
If $U$ and $V$ are subspaces of a vector space $W$, and each $\vec{w} \in W$ can be written uniquely as a sum $\vec{u} + \vec{v}$, where $\vec{u} \in U$ and $\vec{v} \in V$, then $W$ is a direct sum of $U$ and $V$, written $W = U \oplus V$.
If $S$ is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n = S \oplus S^\perp$. In other words (or lack thereof): every $\vec{x} \in \mathbb{R}^n$ splits uniquely as $\vec{x} = \vec{u} + \vec{v}$ with $\vec{u} \in S$ and $\vec{v} \in S^\perp$.
Let $\{\vec{x}_1, \dots, \vec{x}_r\}$ be a basis for $S$ and $\{\vec{x}_{r+1}, \dots, \vec{x}_n\}$ a basis for $S^\perp$; then each $\vec{x} \in \mathbb{R}^n$ can be written as $\vec{x} = \underbrace{c_1\vec{x}_1 + \dots + c_r\vec{x}_r}_{\vec{u} \in S} + \underbrace{c_{r+1}\vec{x}_{r+1} + \dots + c_n\vec{x}_n}_{\vec{v} \in S^\perp}$.
This representation must be unique since $S \cap S^\perp = \{\vec{0}\}$: if $\vec{u} + \vec{v} = \vec{u}' + \vec{v}'$, then $\vec{u} - \vec{u}' = \vec{v}' - \vec{v}$ lies in both subspaces and is therefore $\vec{0}$.
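The splitting $\vec{x} = \vec{u} + \vec{v}$ can be computed with an orthogonal projection. A sketch using a made-up $S$ (the $xy$-plane in $\mathbb{R}^3$, so $S^\perp$ is the $z$-axis):

```python
import numpy as np

# Hypothetical S = span{(1,0,0), (0,1,0)}; columns of B span S.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([3.0, -1.0, 4.0])

# Orthogonal projection onto S: P = B (B^T B)^{-1} B^T.
P = B @ np.linalg.inv(B.T @ B) @ B.T
u = P @ x                          # component in S: (3, -1, 0)
v = x - u                          # component in S^perp: (0, 0, 4)

assert np.allclose(u + v, x)       # x = u + v
assert np.allclose(B.T @ v, 0.0)   # v is orthogonal to S
```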
Example: given a matrix $A$, find bases for $N(A)$, $R(A^T)$, $N(A^T)$, and $R(A)$.
Row reduce $A$ to echelon form $U$; the nonzero rows of $U$ form a basis for $R(A^T)$.
Solving $U\vec{x} = \vec{0}$ in terms of the free variables gives a basis for $N(A)$.
Repeat the above steps for $A^T$ to obtain bases for $R(A)$ and $N(A^T)$.
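The original example matrix did not survive in these notes; as a stand-in, here is the procedure carried out on a hypothetical rank-1 matrix, with the orthogonality relations of the theorem verified:

```python
import numpy as np

# Hypothetical stand-in matrix (rank 1), not the one from lecture.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Row reduction leaves one nonzero row (1, 2):
basis_RAt = np.array([1.0, 2.0])    # basis for R(A^T)
basis_NA  = np.array([-2.0, 1.0])   # basis for N(A): solves A x = 0
# Columns of A are multiples of (1, 2):
basis_RA  = np.array([1.0, 2.0])    # basis for R(A)
basis_NAt = np.array([-2.0, 1.0])   # basis for N(A^T)

assert np.allclose(A @ basis_NA, 0.0)
assert np.allclose(A.T @ basis_NAt, 0.0)
assert np.isclose(basis_RAt @ basis_NA, 0.0)   # R(A^T) perp N(A)
assert np.isclose(basis_RA @ basis_NAt, 0.0)   # R(A) perp N(A^T)
```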
Section 5.3: Least Squares
Goal: find the best approximation of a vector $\vec{b}$ (outside of a subspace) using a vector $\vec{p}$ (in the subspace).
Let $S \subseteq \mathbb{R}^m$ be a subspace.
For each $\vec{b} \in \mathbb{R}^m$, there is a unique element $\vec{p}$ of $S$ that is closest to $\vec{b}$, i.e. $\lVert \vec{b} - \vec{p} \rVert < \lVert \vec{b} - \vec{y} \rVert$ for all $\vec{y} \in S$ with $\vec{y} \neq \vec{p}$.
Definition: Residual Vector
For the system $A\vec{x} = \vec{b}$, the residual is $\vec{r}(\vec{x}) = \vec{b} - A\vec{x}$.
A vector $\hat{\vec{x}}$ is a solution to the least squares problem iff $\vec{p} = A\hat{\vec{x}}$ is the vector in $R(A)$ that is closest to $\vec{b}$.
Thus we know that $A\hat{\vec{x}}$ is the projection of $\vec{b}$ onto $R(A)$.
$\vec{b} - A\hat{\vec{x}} = \vec{r}(\hat{\vec{x}}) \in R(A)^\perp = N(A^T)$, where $\vec{r}(\hat{\vec{x}})$ is the residual vector.
Thus $\hat{\vec{x}}$ is a solution of the least squares problem iff $A^T(\vec{b} - A\hat{\vec{x}}) = \vec{0}$, i.e. $A^TA\hat{\vec{x}} = A^T\vec{b}$ (the normal equations).
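The normal equations can be solved directly and checked against a library solver. A sketch with hypothetical data (an inconsistent $3 \times 2$ system, i.e. fitting a line to three points):

```python
import numpy as np

# Hypothetical overdetermined system A x = b with no exact solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: A^T A xhat = A^T b.
xhat = np.linalg.solve(A.T @ A, A.T @ b)   # xhat = (2/3, 1/2)

# The residual b - A xhat must lie in N(A^T) = R(A)^perp.
r = b - A @ xhat
assert np.allclose(A.T @ r, 0.0)

# Agrees with NumPy's least-squares solver.
assert np.allclose(xhat, np.linalg.lstsq(A, b, rcond=None)[0])
```

Solving $A^TA\hat{\vec{x}} = A^T\vec{b}$ works when $A$ has full column rank (so $A^TA$ is invertible); library routines like `lstsq` handle the rank-deficient case as well.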