MATH 323 Lecture 20


« previous | Tuesday, November 6, 2012 | next »


Orthogonal Subspaces

Two subspaces $X$ and $Y$ of $\mathbb{R}^n$ are orthogonal, written $X \perp Y$, if $x^\mathsf{T} y = 0$ for every $x \in X$ and $y \in Y$.

Orthogonality must hold for every pair of vectors. If $X$ is the $xy$-plane and $Y$ is the $yz$-plane in $\mathbb{R}^3$, then take $e_2$, which lies in both: $e_2$ is not orthogonal to $e_2$.

$Y^\perp = \{ x \in \mathbb{R}^n : x^\mathsf{T} y = 0 \text{ for all } y \in Y \}$ is the orthogonal complement of $Y$. For example, a plane through the origin and the line spanned by its normal vector are orthogonal complements in $\mathbb{R}^3$.
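A quick pure-Python sanity check of these two examples (the specific spanning vectors are the standard coordinate choices, not recovered from the lecture):

```python
# Orthogonality checks for small subspaces of R^3 (illustrative vectors).
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

# xy-plane = span{e1, e2}; its normal direction is e3.
plane = [e1, e2]
normal = e3
print(all(dot(x, normal) == 0 for x in plane))  # True: plane is orthogonal to its normal

# xy-plane vs. yz-plane: e2 lies in both, and dot(e2, e2) = 1 != 0,
# so the two planes are NOT orthogonal subspaces.
print(dot(e2, e2))  # 1
```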


The range of an $m \times n$ matrix $A$ is defined as

$R(A) = \{ Ax : x \in \mathbb{R}^n \} \subseteq \mathbb{R}^m$

For the transpose matrix,

$R(A^\mathsf{T}) = \{ A^\mathsf{T} y : y \in \mathbb{R}^m \} \subseteq \mathbb{R}^n$

Note: the range $R(A)$ is nothing more than the column space of the matrix $A$.
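To see why the range is the column space, note that $Ax$ is exactly the linear combination $x_1 a_1 + x_2 a_2$ of the columns of $A$. A minimal pure-Python sketch with a made-up $3 \times 2$ matrix:

```python
# A is 3x2, stored as a list of rows; its columns are a1 = (1,0,2), a2 = (3,1,0).
A = [[1, 3],
     [0, 1],
     [2, 0]]
x = [2, -1]  # arbitrary coefficient vector

# Matrix-vector product Ax:
Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(3)]

# The same vector as a linear combination of the columns of A:
combo = [x[0] * A[i][0] + x[1] * A[i][1] for i in range(3)]

print(Ax == combo)  # True: Ax lies in the column space of A
```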

Theorem 5.2.1

Fundamental subspaces theorem: if $A$ is an $m \times n$ matrix, then

1. $N(A) = R(A^\mathsf{T})^\perp$
2. $N(A^\mathsf{T}) = R(A)^\perp$

Prove one, then the proof of the second follows from the first (applied to $A^\mathsf{T}$): Let $x \in N(A)$, so $Ax = 0$. Then for any $y \in \mathbb{R}^m$, $(A^\mathsf{T} y)^\mathsf{T} x = y^\mathsf{T} (Ax) = 0$, so $x$ is orthogonal to every vector in $R(A^\mathsf{T})$. Conversely, if $x$ is orthogonal to every vector in $R(A^\mathsf{T})$, it is in particular orthogonal to each row of $A$, so $Ax = 0$.
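The containment can be checked numerically on a small example (the matrix is my own illustration): every solution of $Ax = 0$ is orthogonal to each row of $A$, and the rows span $R(A^\mathsf{T})$.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 2, 1],
     [2, 4, 2]]  # rank 1: the second row is twice the first

# Basis for N(A), found by solving x1 + 2*x2 + x3 = 0 (free variables x2, x3):
nullspace = [(-2, 1, 0), (-1, 0, 1)]

# Every null-space vector is orthogonal to every row of A,
# i.e. N(A) is orthogonal to R(A^T):
print(all(dot(n, row) == 0 for n in nullspace for row in A))  # True
```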


Theorem 5.2.2

If $S$ is a subspace of $\mathbb{R}^n$, then

$\dim S + \dim S^\perp = n$

Furthermore, if $\{x_1, \dots, x_r\}$ is a basis for $S$ and $\{x_{r+1}, \dots, x_n\}$ is a basis for $S^\perp$, then $\{x_1, \dots, x_n\}$ is a basis for $\mathbb{R}^n$.


If $S \neq \{0\}$ and $\{x_1, \dots, x_r\}$ is a basis for $S$, then $\dim S = r$.

Let $X$ be the $r \times n$ matrix formed by using the basis vectors as rows of $X$. The rank of $X$ is $r$, and $R(X^\mathsf{T}) = S$.

$S^\perp = R(X^\mathsf{T})^\perp = N(X)$ by equation 1 of the previous theorem, so

$\dim S^\perp = \dim N(X) = n - r$

Therefore $\dim S + \dim S^\perp = r + (n - r) = n$. This proves the first part of the theorem.

Check linear independence of the $x_i$'s to determine whether $\{x_1, \dots, x_n\}$ is a valid basis of $\mathbb{R}^n$.

In order for $c_1 x_1 + \dots + c_n x_n = 0$ to be true, $y = c_1 x_1 + \dots + c_r x_r$ and $z = -(c_{r+1} x_{r+1} + \dots + c_n x_n)$ must be equal elements of $S \cap S^\perp$. Since $S$ and $S^\perp$ are orthogonal subspaces, $S \cap S^\perp = \{0\}$, so $y = z = 0$. Linear independence of each basis then forces every $c_i = 0$.

Direct Sum

If $U$ and $V$ are subspaces of a vector space $W$, and each $w \in W$ can be written uniquely as a sum $u + v$, where $u \in U$ and $v \in V$, then $W$ is a direct sum of $U$ and $V$, written

$W = U \oplus V$

Theorem 5.2.3

If $S$ is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n$ is the direct sum of $S$ and $S^\perp$. In other words (or lack thereof):

$\mathbb{R}^n = S \oplus S^\perp$


Let $\{x_1, \dots, x_n\}$ be the basis for $\mathbb{R}^n$ obtained in Theorem 5.2.2 by combining a basis for $S$ with a basis for $S^\perp$, then any $x \in \mathbb{R}^n$ can be written

$x = \underbrace{c_1 x_1 + \dots + c_r x_r}_{u \,\in\, S} + \underbrace{c_{r+1} x_{r+1} + \dots + c_n x_n}_{v \,\in\, S^\perp}$

This decomposition must be unique since $S \cap S^\perp = \{0\}$: if $x = u + v = u' + v'$, then $u - u' = v' - v \in S \cap S^\perp = \{0\}$.
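The decomposition can be computed directly when spanning vectors are orthogonal. A pure-Python illustration (the line $S = \mathrm{span}\{(1,1)\}$ in $\mathbb{R}^2$, with $S^\perp = \mathrm{span}\{(1,-1)\}$, is my own example):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

s = (1, 1)   # spans S, a line in R^2
t = (1, -1)  # spans S-perp: dot(s, t) == 0

b = (3, 1)

# Projection of b onto each (mutually orthogonal) direction:
u = [dot(b, s) / dot(s, s) * c for c in s]  # component in S
v = [dot(b, t) / dot(t, t) * c for c in t]  # component in S-perp

print(u, v)  # [2.0, 2.0] [1.0, -1.0]
print([u[i] + v[i] for i in range(2)] == [3.0, 1.0])  # True: b = u + v
```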

Theorem 5.2.4

If $S$ is a subspace of $\mathbb{R}^n$, then $(S^\perp)^\perp = S$.
Example: find bases for $N(A)$, $R(A^\mathsf{T})$, $N(A^\mathsf{T})$, and $R(A)$.

Row-reduce $A$: the nonzero rows of the row echelon form are linearly independent and span the row space. Therefore they form a basis for $R(A^\mathsf{T})$.

Solving $Ax = 0$ by back substitution gives one basis vector per free variable, so those solution vectors form a basis for $N(A)$.

Repeat above steps for $A^\mathsf{T}$ to obtain bases for $R(A)$ and $N(A^\mathsf{T})$.
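The worked matrix from lecture did not survive in these notes, so here is the same procedure on a made-up rank-1 example small enough to do by hand:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Made-up 2x2 example; the second row is twice the first, so rank(A) = 1.
A = [[1, 2],
     [2, 4]]

# Row reduction leaves one nonzero row (1, 2), so R(A^T) = span{(1, 2)},
# and solving x1 + 2*x2 = 0 gives N(A) = span{(-2, 1)}.
basis_row_space = (1, 2)
basis_null_space = (-2, 1)

n = 2
dims = 1 + 1  # dim R(A^T) + dim N(A)
print(dot(basis_row_space, basis_null_space))  # 0: N(A) is orthogonal to R(A^T)
print(dims == n)                               # True: Theorem 5.2.2 dimension count
```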

Section 5.3: Least Squares

Find the best approximation of a vector $b$ (outside of a subspace) using a vector $p$ (in the subspace).

Theorem 5.3.1

Let $S \subseteq \mathbb{R}^m$ be a subspace.

For each $b \in \mathbb{R}^m$, there is a unique element $p$ of $S$ that is closest to $b$, i.e.

$\| b - y \| > \| b - p \|$

for any $y \neq p$ in $S$.


Definition: Residual Vector

The residual of the system $Ax = b$ is $r(x) = b - Ax$. A vector $\hat{x}$ is a solution to the least squares problem iff $p = A\hat{x}$ is the vector in $R(A)$ that is closest to $b$.

Thus we know that $p$ is the projection of $b$ onto $R(A)$.

$b - A\hat{x} = b - p = r(\hat{x})$, where $r(\hat{x})$ is the residual vector.

Thus $\hat{x}$ is a solution of the least squares problem iff $b - A\hat{x} \in R(A)^\perp = N(A^\mathsf{T})$.
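The condition $b - A\hat{x} \in N(A^\mathsf{T})$ means $A^\mathsf{T}(b - A\hat{x}) = 0$, i.e. the normal equations $A^\mathsf{T} A \hat{x} = A^\mathsf{T} b$; this step is the standard continuation of the argument in §5.3, not something recorded in these notes. A pure-Python sketch on a made-up overdetermined system (fitting a line to three points of my own choosing):

```python
# Least squares via the normal equations A^T A x = A^T b.
# Fit y = c0 + c1*t to the made-up data points (0,1), (1,2), (2,4).
A = [[1, 0], [1, 1], [1, 2]]
b = [1, 2, 4]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

cols = list(zip(*A))  # columns of A
AtA = [[dot(c, d) for d in cols] for c in cols]
Atb = [dot(c, b) for c in cols]

# Solve the 2x2 system AtA x = Atb by Cramer's rule:
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
c0 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
c1 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det

# Residual r = b - A x_hat must lie in N(A^T),
# i.e. be orthogonal to both columns of A:
r = [b[i] - (c0 * A[i][0] + c1 * A[i][1]) for i in range(3)]
print(round(c0, 4), round(c1, 4))                  # 0.8333 1.5
print(all(abs(dot(c, r)) < 1e-12 for c in cols))   # True
```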