Thursday, November 8, 2012
Least Squares Problem
Given a subspace $S \subseteq \mathbb{R}^m$ and a vector $\vec{b} \in \mathbb{R}^m$, find the closest approximation $\vec{p} \in S$ to $\vec{b}$.
$\vec{p}$ is a vector projection, and $\alpha$ is the scalar projection.
When $S$ is represented by a matrix $A$, $S$ is the column space of $A$, and $\hat{x}$ is the vector such that $A\hat{x} = \vec{p}$. We get that
$$\vec{r} = \vec{b} - A\hat{x}$$
is the residual vector.
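A minimal numeric sketch of these definitions (assuming NumPy; $\vec{a}$ and $\vec{b}$ are made-up data): project $\vec{b}$ onto the span of a single vector $\vec{a}$ and check that the residual is orthogonal to the subspace.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # spans the subspace S (hypothetical data)
b = np.array([3.0, 1.0, 4.0])   # vector to approximate

alpha = a @ b / np.linalg.norm(a)   # scalar projection of b onto a
p = (a @ b / (a @ a)) * a           # vector projection: closest point in S
r = b - p                           # residual vector

print(alpha, p)
print(np.isclose(a @ r, 0.0))       # residual is orthogonal to S -> True
```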
Normal Equation
Theorem 5.3.2
$A$ is an $m \times n$ matrix of rank $n$ (rank equal to the number of columns). Then the normal equation
$$A^T A \vec{x} = A^T \vec{b}$$
has a unique solution given by
$$\hat{x} = (A^T A)^{-1} A^T \vec{b}.$$
And $\hat{x}$ is the unique least squares solution to the system $A\vec{x} = \vec{b}$.
Proof
Based on the premise that $A^T A$ is nonsingular.
Assume we have some vector $\vec{z}$ with $A^T A \vec{z} = \vec{0}$. For $A^T A$ to be nonsingular, $\vec{z} = \vec{0}$ must be the only solution.
We know that
$$\vec{z}^T A^T A \vec{z} = (A\vec{z})^T (A\vec{z}) = \|A\vec{z}\|^2 = 0,$$
- so $A\vec{z} = \vec{0}$, and
- since $A$ has rank $n$, its columns are linearly independent, so $A\vec{z} = \vec{0}$ forces $\vec{z} = \vec{0}$.
Therefore $A^T A$ is nonsingular.
Q.E.D.
Corollary
We already know that $A\hat{x} = \vec{p}$, and $\hat{x} = (A^T A)^{-1} A^T \vec{b}$, so
$$\vec{p} = A (A^T A)^{-1} A^T \vec{b}.$$
The projection matrix $P = A (A^T A)^{-1} A^T$ is interesting because it is idempotent:
$$P^2 = P,$$
so applying the projection twice gives the same result as applying it once.
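A quick check of the corollary (a NumPy sketch with made-up data): solve the normal equation for a small full-rank $A$, form $P$, and verify $P^2 = P$ and $P\vec{b} = A\hat{x}$.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                  # rank 2: independent columns
b = np.array([1.0, 2.0, 4.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equation A^T A x = A^T b
P = A @ np.linalg.inv(A.T @ A) @ A.T        # projection matrix onto C(A)

print(np.allclose(P @ P, P))                # idempotent: P^2 = P
print(np.allclose(P @ b, A @ x_hat))        # P b equals p = A x_hat
```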
Example
Overdetermined system: a system $A\vec{x} = \vec{b}$ with more equations than unknowns generally has no exact solution, so we find the least squares solution $\hat{x}$ from the normal equation instead.
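One possible concrete instance (made-up numbers, not necessarily the lecture's original example): a $3 \times 2$ system with no exact solution, where $\hat{x}$ from the normal equation gives a smaller residual than any perturbed candidate.

```python
import numpy as np

# 3 equations, 2 unknowns (hypothetical data): no exact solution
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 3.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
r = b - A @ x_hat
print(np.linalg.norm(r))                    # smallest possible residual norm

# any other candidate x gives a strictly larger residual
for dx in (np.array([0.1, 0.0]), np.array([0.0, -0.1])):
    print(np.linalg.norm(b - A @ (x_hat + dx)) > np.linalg.norm(r))  # True
```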
Regression
Given a set of measurements $y_1, \dots, y_m$ at points $x_1, \dots, x_m$, each pair of values defines a point $(x_i, y_i)$.
Linear Regression
Find an equation $y = c_0 + c_1 x$ that approximates the system of equations $c_0 + c_1 x_i = y_i$, i.e. $A\vec{c} = \vec{y}$, where the $i$-th row of $A$ is $(1, x_i)$. The solution for $\vec{c} = (c_0, c_1)^T$ is given by
$$\vec{c} = (A^T A)^{-1} A^T \vec{y}.$$
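A sketch of this formula in NumPy (made-up measurements), cross-checked against np.polyfit:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])     # measurement points (made-up)
y = np.array([1.1, 1.9, 3.2, 3.8])     # measured values (made-up)

A = np.column_stack([np.ones_like(x), x])   # rows are (1, x_i)
c = np.linalg.solve(A.T @ A, A.T @ y)       # c = (c0, c1)

print(c)                                    # intercept c0 and slope c1
print(np.allclose(c[::-1], np.polyfit(x, y, 1)))  # matches NumPy's fit
```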
Inner Product Spaces
Let $V$ be a vector space. Suppose we have a function such that for all $\vec{x}, \vec{y} \in V$, the inner product of $\vec{x}$ and $\vec{y}$ (notation $\langle \vec{x}, \vec{y} \rangle$) is a real number.
We want it to have the following properties:
- $\langle \vec{x}, \vec{x} \rangle \ge 0$, and is equal to 0 iff $\vec{x} = \vec{0}$
- $\langle \vec{x}, \vec{y} \rangle = \langle \vec{y}, \vec{x} \rangle$ (commutativity)
- $\langle \alpha \vec{x} + \beta \vec{y}, \vec{z} \rangle = \alpha \langle \vec{x}, \vec{z} \rangle + \beta \langle \vec{y}, \vec{z} \rangle$, and the same applies for the second component.
Example 1
$V = \mathbb{R}^n$, and $\langle \vec{x}, \vec{y} \rangle = \vec{x}^T \vec{y}$ is the scalar product.
Given weights $w_1, \dots, w_n$ such that $w_i > 0$ for all $i$,
$$\langle \vec{x}, \vec{y} \rangle = \sum_{i=1}^{n} w_i x_i y_i$$
is also an inner product.
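A small NumPy check (arbitrary positive weights and vectors) that this weighted product behaves like an inner product:

```python
import numpy as np

w = np.array([1.0, 2.0, 0.5])           # positive weights (arbitrary)

def inner(x, y):
    return np.sum(w * x * y)            # <x, y> = sum_i w_i x_i y_i

x = np.array([1.0, -1.0, 2.0])
y = np.array([0.0, 3.0, 1.0])

print(inner(x, y) == inner(y, x))       # commutativity
print(inner(x, x) > 0)                  # positivity for x != 0
```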
Example 2
Given two matrices $A, B \in \mathbb{R}^{m \times n}$, let
$$\langle A, B \rangle = \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} b_{ij}.$$
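This is the Frobenius-style inner product; a quick NumPy check (made-up matrices), including the equivalent trace form $\langle A, B \rangle = \operatorname{tr}(A^T B)$:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

inner = np.sum(A * B)                        # sum of entrywise products
print(inner)                                 # 2 + 3 = 5
print(np.isclose(inner, np.trace(A.T @ B)))  # equivalent trace form -> True
```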
Example 3
Given two functions $f, g \in C[a, b]$, let
$$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, dx,$$
and, given a positive weight function $w(x)$,
$$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, w(x)\, dx.$$
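A numeric sketch of the integral inner product (simple trapezoid rule; the functions $f(x) = x$ and $g(x) = 3x^2$ on $[0, 1]$ are made-up examples):

```python
import numpy as np

a, b = 0.0, 1.0
x = np.linspace(a, b, 10001)

f = lambda t: t            # f(x) = x      (made-up example function)
g = lambda t: 3 * t**2     # g(x) = 3x^2   (made-up example function)

# trapezoid rule approximation of <f, g> = integral of f(x) g(x) over [a, b]
h = f(x) * g(x)
inner = np.sum(0.5 * (h[1:] + h[:-1]) * np.diff(x))

print(np.isclose(inner, 0.75))   # exact value: integral of 3x^3 on [0,1] = 3/4
```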
Properties
Given a vector space $V$ and an inner product function $\langle \cdot, \cdot \rangle$, we can redefine:
- length / norm: $\|\vec{v}\| = \sqrt{\langle \vec{v}, \vec{v} \rangle}$
- Orthogonality: $\vec{u} \perp \vec{v}$ iff $\langle \vec{u}, \vec{v} \rangle = 0$
- Scalar projection: $\alpha = \dfrac{\langle \vec{u}, \vec{v} \rangle}{\|\vec{v}\|}$
- Vector projection: $\vec{p} = \dfrac{\langle \vec{u}, \vec{v} \rangle}{\langle \vec{v}, \vec{v} \rangle}\, \vec{v}$
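These generalized definitions can be exercised numerically; a sketch using the weighted inner product from Example 1 (arbitrary weights and vectors):

```python
import numpy as np

w = np.array([2.0, 1.0, 3.0])                 # weights for a sample inner product

def inner(u, v):
    return np.sum(w * u * v)

def norm(v):
    return np.sqrt(inner(v, v))               # ||v|| = sqrt(<v, v>)

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])

alpha = inner(u, v) / norm(v)                 # scalar projection of u onto v
p = (inner(u, v) / inner(v, v)) * v           # vector projection

print(norm(u), alpha, p)
print(np.isclose(inner(u - p, v), 0.0))       # residual orthogonal to v -> True
```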
Theorem 5.4.1
The Pythagorean Law
If $\vec{u} \perp \vec{v}$, then
$$\|\vec{u} + \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2.$$
Proof
$$\|\vec{u} + \vec{v}\|^2 = \langle \vec{u} + \vec{v}, \vec{u} + \vec{v} \rangle = \langle \vec{u}, \vec{u} \rangle + 2\,\langle \vec{u}, \vec{v} \rangle + \langle \vec{v}, \vec{v} \rangle = \|\vec{u}\|^2 + \|\vec{v}\|^2,$$
since $\langle \vec{u}, \vec{v} \rangle = 0$ when $\vec{u} \perp \vec{v}$. Q.E.D.
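A quick numeric check with made-up orthogonal vectors under the standard inner product:

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 5.0])
print(np.isclose(u @ v, 0.0))         # u is orthogonal to v

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))           # Pythagorean law holds -> True
```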
Orthogonality of Functions
$f \perp g$ for $f, g \in C[a, b]$, where the inner product is defined as $\langle f, g \rangle = \int_a^b f(x)\, g(x)\, dx$.
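A standard concrete case, checked here with a simple trapezoid rule: $\cos x \perp \sin x$ on $[-\pi, \pi]$.

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 100001)
h = np.cos(x) * np.sin(x)

# trapezoid rule for the integral of cos(x) sin(x) over [-pi, pi]
integral = np.sum(0.5 * (h[1:] + h[:-1]) * np.diff(x))
print(np.isclose(integral, 0.0, atol=1e-8))   # True: cos is orthogonal to sin
```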
Theorem 5.4.2
The Cauchy-Schwarz Inequality
$$|\langle \vec{u}, \vec{v} \rangle| \le \|\vec{u}\|\, \|\vec{v}\|$$
Equality holds iff $\vec{u}$ and $\vec{v}$ are linearly dependent.
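A numeric illustration (random made-up vectors): the inequality holds strictly for generic $\vec{u}, \vec{v}$, with equality when one is a scalar multiple of the other.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = abs(u @ v)
rhs = np.linalg.norm(u) * np.linalg.norm(v)
print(lhs <= rhs)                     # True for any u, v

# equality iff linearly dependent: v = 2u gives |<u,v>| = ||u|| ||v||
print(np.isclose(abs(u @ (2 * u)),
                 np.linalg.norm(u) * np.linalg.norm(2 * u)))
```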