# MATH 323 Lecture 24

« previous | Tuesday, November 27, 2012 | next »

## Theorem 5.6.1 (Gram-Schmidt Process)

Let $\{x_{1},\ldots ,x_{n}\}$ be a basis for an inner product space $V$ with inner product $\langle \cdot ,\cdot \rangle$ . Let $u_{1}={\tfrac {x_{1}}{\|x_{1}\|}}$ and define $u_{2},\ldots ,u_{n}$ recursively by

$u_{k+1}={\frac {x_{k+1}-p_{k}}{\|x_{k+1}-p_{k}\|}},$ where $p_{k}=\langle x_{k+1},u_{1}\rangle u_{1}+\langle x_{k+1},u_{2}\rangle u_{2}+\dots +\langle x_{k+1},u_{k}\rangle u_{k}$ is the projection of $x_{k+1}$ onto $V_{k}=\mathrm {Span} (u_{1},\ldots ,u_{k})=\mathrm {Span} (x_{1},\ldots ,x_{k})$ .

The set $\{u_{1},\ldots ,u_{n}\}$ is an orthonormal basis for $V$ .
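For the standard inner product $\langle x,y\rangle = x\cdot y$ on $\mathbb{R}^m$, the recursion above can be sketched numerically. This is a minimal illustration, not from the lecture; the helper name `gram_schmidt` and the sample matrix are made up for the example.

```python
import numpy as np

def gram_schmidt(X):
    """Orthonormalize the columns of X, assumed linearly independent.

    Follows the recursion of Theorem 5.6.1 with the standard
    inner product on R^m.
    """
    m, n = X.shape
    U = np.zeros((m, n))
    for k in range(n):
        v = X[:, k].astype(float)
        # subtract p_{k-1}, the projection onto Span(u_1, ..., u_{k-1})
        for i in range(k):
            v -= (X[:, k] @ U[:, i]) * U[:, i]
        U[:, k] = v / np.linalg.norm(v)   # normalize x_k - p_{k-1}
    return U

# a basis (as columns) for a 2-dimensional subspace of R^3
X = np.array([[1., 1.],
              [0., 1.],
              [1., 0.]])
U = gram_schmidt(X)
print(np.allclose(U.T @ U, np.eye(2)))  # True: columns are orthonormal
```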

### Notation

From now on,

${\begin{aligned}r_{kk}&=\|x_{k}-p_{k-1}\|\\r_{ik}&=\langle x_{k},u_{i}\rangle \end{aligned}}$

### Proof by Induction

Base case: $\mathrm {Span} (x_{1})=\mathrm {Span} (u_{1})$ and $\{u_{1}\}$ is an orthonormal basis for $V_{1}$ .

Inductive step: suppose $u_{1},\ldots ,u_{k}$ are constructed so that they form an orthonormal basis for $V_{k}$ and $\mathrm {Span} \{x_{1},\ldots ,x_{k}\}=\mathrm {Span} \{u_{1},\ldots ,u_{k}\}$ .

$p_{k}\in \mathrm {Span} \{u_{1},\ldots ,u_{k}\}=\mathrm {Span} \{x_{1},\ldots ,x_{k}\}$ , so $x_{k+1}-p_{k}\in \mathrm {Span} \{x_{1},\ldots ,x_{k},x_{k+1}\}$ . Moreover, $x_{k+1}-p_{k}\neq 0$ : otherwise $x_{k+1}\in \mathrm {Span} \{x_{1},\ldots ,x_{k}\}$ , contradicting the linear independence of $x_{1},\ldots ,x_{k+1}$ .

$x_{k+1}-p_{k}=x_{k+1}-\sum _{i=1}^{k}c_{i}\,u_{i}$ , where $c_{i}=\langle x_{k+1},u_{i}\rangle$ , so for each $j\leq k$ we have $\langle x_{k+1}-p_{k},u_{j}\rangle =\langle x_{k+1},u_{j}\rangle -c_{j}=0$ . Therefore $u_{1},\ldots ,u_{k+1}$ is an orthonormal set of vectors in $\mathrm {Span} \{x_{1},\ldots ,x_{k+1}\}$ , and hence a basis for that subspace of dimension $k+1$ .

Thus the theorem holds by induction.

Q.E.D.

### Example

Find an orthonormal basis for the range of the following matrix:

$A={\begin{bmatrix}1&-1&4\\1&4&-2\\1&4&2\\1&-1&0\end{bmatrix}}$

The column space is $\mathrm {Span} \left\{{\begin{pmatrix}1\\1\\1\\1\end{pmatrix}},{\begin{pmatrix}-1\\4\\4\\-1\end{pmatrix}},{\begin{pmatrix}4\\-2\\2\\0\end{pmatrix}}\right\}=\mathrm {Span} \{{\vec {a}}_{1},{\vec {a}}_{2},{\vec {a}}_{3}\}$ .

${\begin{aligned}r_{11}&=\left\|{\vec {a}}_{1}\right\|={\sqrt {1^{2}+1^{2}+1^{2}+1^{2}}}=2\\{\vec {q}}_{1}&={\frac {{\vec {a}}_{1}}{r_{11}}}=\left\langle {\tfrac {1}{2}},{\tfrac {1}{2}},{\tfrac {1}{2}},{\tfrac {1}{2}}\right\rangle \\r_{12}&=\langle {\vec {a}}_{2},{\vec {q}}_{1}\rangle =3\\{\vec {p}}_{1}&=r_{12}\,{\vec {q}}_{1}=3{\vec {q}}_{1}\\{\vec {a}}_{2}-{\vec {p}}_{1}&=\left\langle -{\tfrac {5}{2}},{\tfrac {5}{2}},{\tfrac {5}{2}},-{\tfrac {5}{2}}\right\rangle \\r_{22}&=\|{\vec {a}}_{2}-{\vec {p}}_{1}\|=5\\{\vec {q}}_{2}&={\frac {{\vec {a}}_{2}-{\vec {p}}_{1}}{r_{22}}}=\left\langle -{\tfrac {1}{2}},{\tfrac {1}{2}},{\tfrac {1}{2}},-{\tfrac {1}{2}}\right\rangle \\r_{13}&=\langle {\vec {a}}_{3},{\vec {q}}_{1}\rangle =2\\r_{23}&=\langle {\vec {a}}_{3},{\vec {q}}_{2}\rangle =-2\\{\vec {p}}_{2}&=r_{13}\,{\vec {q}}_{1}+r_{23}\,{\vec {q}}_{2}=\left\langle 2,0,0,2\right\rangle \\{\vec {a}}_{3}-{\vec {p}}_{2}&=\left\langle 2,-2,2,-2\right\rangle \\r_{33}&=\|{\vec {a}}_{3}-{\vec {p}}_{2}\|=4\\{\vec {q}}_{3}&={\frac {{\vec {a}}_{3}-{\vec {p}}_{2}}{r_{33}}}=\left\langle {\tfrac {1}{2}},-{\tfrac {1}{2}},{\tfrac {1}{2}},-{\tfrac {1}{2}}\right\rangle \end{aligned}}$

Therefore, $\{{\vec {q}}_{1},{\vec {q}}_{2},{\vec {q}}_{3}\}=\left\{{\begin{pmatrix}{\frac {1}{2}}\\{\frac {1}{2}}\\{\frac {1}{2}}\\{\frac {1}{2}}\end{pmatrix}},{\begin{pmatrix}-{\frac {1}{2}}\\{\frac {1}{2}}\\{\frac {1}{2}}\\-{\frac {1}{2}}\end{pmatrix}},{\begin{pmatrix}{\frac {1}{2}}\\-{\frac {1}{2}}\\{\frac {1}{2}}\\-{\frac {1}{2}}\end{pmatrix}}\right\}$ is an orthonormal basis for the range of $A$ .
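As a check on the arithmetic, the steps can be replayed numerically. A sketch assuming NumPy; the variable names mirror the $r_{ij}$ and ${\vec {q}}_{i}$ above.

```python
import numpy as np

a1 = np.array([1., 1., 1., 1.])
a2 = np.array([-1., 4., 4., -1.])
a3 = np.array([4., -2., 2., 0.])

r11 = np.linalg.norm(a1)            # 2
q1 = a1 / r11
r12 = a2 @ q1                       # 3
p1 = r12 * q1
r22 = np.linalg.norm(a2 - p1)       # 5
q2 = (a2 - p1) / r22
r13, r23 = a3 @ q1, a3 @ q2         # 2, -2
p2 = r13 * q1 + r23 * q2
r33 = np.linalg.norm(a3 - p2)       # 4
q3 = (a3 - p2) / r33

print(q1, q2, q3, sep="\n")
```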

## Theorem 5.6.2 (Gram-Schmidt QR Factorization)

If $A$ is an $m\times n$ matrix of rank $n$ , then $A$ can be factored into a product $QR$ , where $Q$ is an $m\times n$ matrix with orthonormal column vectors and $R$ is an upper triangular $n\times n$ matrix whose diagonal entries are all positive.

In the Gram-Schmidt orthogonalization process, ${\vec {p}}_{i}$ denotes the projection of ${\vec {x}}_{i+1}$ onto $\mathrm {Span} \{{\vec {q}}_{1},\ldots ,{\vec {q}}_{i}\}$ .

$r_{ij}={\begin{cases}\|{\vec {x}}_{1}\|&i=j=1\\\|{\vec {x}}_{i}-{\vec {p}}_{i-1}\|&i=j>1\\\langle {\vec {q}}_{i},{\vec {x}}_{j}\rangle &i<j\\0&i>j\end{cases}}$

These values form an upper triangular matrix $R=(r_{ij})$ such that $A=QR$ , where $Q=\left({\vec {q}}_{1},\ldots ,{\vec {q}}_{n}\right)$ .
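For comparison, NumPy's `numpy.linalg.qr` computes the same reduced factorization (via Householder reflections rather than Gram-Schmidt). Its $R$ may carry negative diagonal entries, which a sign flip fixes to match the theorem's positive-diagonal convention. A sketch using the matrix $A$ from the earlier example:

```python
import numpy as np

A = np.array([[1., -1.,  4.],
              [1.,  4., -2.],
              [1.,  4.,  2.],
              [1., -1.,  0.]])

Q, R = np.linalg.qr(A)   # reduced QR: Q is 4x3, R is 3x3 upper triangular

# enforce a positive diagonal on R, as in Theorem 5.6.2
s = np.sign(np.diag(R))
Q, R = Q * s, s[:, None] * R

print(np.allclose(A, Q @ R))  # True
print(np.diag(R))             # positive diagonal: 2, 5, 4
```

Since the full-rank QR factorization with positive diagonal is unique, this recovers exactly the $r_{ij}$ computed by hand.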

### Example

From previous example,

${\begin{aligned}R&={\begin{bmatrix}r_{11}&r_{12}&r_{13}\\0&r_{22}&r_{23}\\0&0&r_{33}\end{bmatrix}}={\begin{bmatrix}2&3&2\\0&5&-2\\0&0&4\end{bmatrix}}\\Q&=({\vec {q}}_{1},{\vec {q}}_{2},{\vec {q}}_{3})\end{aligned}}$

### Polynomial Space Example
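Multiplying the factors back together confirms the factorization. A quick numerical check, assuming NumPy:

```python
import numpy as np

Q = 0.5 * np.array([[ 1, -1,  1],
                    [ 1,  1, -1],
                    [ 1,  1,  1],
                    [ 1, -1, -1]], dtype=float)
R = np.array([[2, 3,  2],
              [0, 5, -2],
              [0, 0,  4]], dtype=float)
A = np.array([[1, -1,  4],
              [1,  4, -2],
              [1,  4,  2],
              [1, -1,  0]], dtype=float)

print(np.allclose(Q @ R, A))          # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q has orthonormal columns
```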

Consider $P_{3}$ , the space of polynomials of degree less than $3$ (i.e. quadratic and lower), with an inner product:

$\langle p,q\rangle =\sum _{i=1}^{3}p(c_{i})\,q(c_{i})$ , where ${\vec {c}}=\langle -1,0,1\rangle$ .

The polynomials $1$ , $x$ , and $x^{2}$ form a basis for $P_{3}$ , but they are not orthonormal w.r.t. this inner product.

Find an orthonormal basis $\{u_{1},u_{2},u_{3}\}$ for $P_{3}$ .

${\begin{aligned}\|x_{1}\|&=\|1\|={\sqrt {\langle 1,1\rangle }}={\sqrt {1\cdot 1+1\cdot 1+1\cdot 1}}={\sqrt {3}}\\u_{1}&={\frac {x_{1}}{\|x_{1}\|}}={\frac {1}{\sqrt {3}}}\\p_{1}&=\left\langle x,{\frac {1}{\sqrt {3}}}\right\rangle {\frac {1}{\sqrt {3}}}=0\\x_{2}-p_{1}&=x\\\|x_{2}-p_{1}\|&={\sqrt {\langle x,x\rangle }}={\sqrt {2}}\\u_{2}&={\frac {x}{\sqrt {2}}}\\p_{2}&=\left\langle x^{2},{\frac {1}{\sqrt {3}}}\right\rangle {\frac {1}{\sqrt {3}}}+\left\langle x^{2},{\frac {x}{\sqrt {2}}}\right\rangle {\frac {x}{\sqrt {2}}}={\frac {2}{3}}\\\|x^{2}-p_{2}\|&={\sqrt {\left\langle x^{2}-{\frac {2}{3}},x^{2}-{\frac {2}{3}}\right\rangle }}={\sqrt {\frac {2}{3}}}={\frac {\sqrt {6}}{3}}\\u_{3}&={\frac {x^{2}-{\frac {2}{3}}}{\sqrt {2/3}}}={\sqrt {\frac {3}{2}}}\left(x^{2}-{\frac {2}{3}}\right)\end{aligned}}$

## Eigenvalues and Eigenvectors
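Because this inner product only uses the values $p(c_{i})$ , each polynomial can be represented by its vector of values at ${\vec {c}}$ , and the computation above can be replayed numerically. A sketch assuming NumPy; `ip` is a hypothetical helper implementing the discrete inner product.

```python
import numpy as np

c = np.array([-1., 0., 1.])   # evaluation points defining the inner product

def ip(p, q):
    """<p, q> = sum_i p(c_i) q(c_i), with p, q given by their values on c."""
    return np.sum(p * q)

# represent 1, x, x^2 by their values at the points c
x1, x2, x3 = np.ones(3), c, c**2

u1 = x1 / np.sqrt(ip(x1, x1))                   # 1/sqrt(3)
p1 = ip(x2, u1) * u1                            # = 0
u2 = (x2 - p1) / np.sqrt(ip(x2 - p1, x2 - p1))  # x/sqrt(2)
p2 = ip(x3, u1) * u1 + ip(x3, u2) * u2          # = 2/3
u3 = (x3 - p2) / np.sqrt(ip(x3 - p2, x3 - p2))

# Gram matrix of the result: identity means orthonormal w.r.t. this inner product
G = np.array([[ip(a, b) for b in (u1, u2, u3)] for a in (u1, u2, u3)])
print(np.allclose(G, np.eye(3)))  # True
```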

Let $A$ be an $n\times n$ matrix.

A scalar $\lambda$ is an eigenvalue of $A$ if there is a nonzero vector ${\vec {x}}\in \mathbb {C} ^{n}$ such that

$A\,{\vec {x}}=\lambda \,{\vec {x}}$ ${\vec {x}}$ is said to be an eigenvector belonging to $\lambda$ .

All of the following are equivalent: $\lambda$ is an eigenvalue of $A$ iff

• $(A-\lambda \,I)\,{\vec {x}}={\vec {0}}$ has a nonzero solution
• $\left|A-\lambda \,I\right|=0$ (this is important, since it gives the characteristic equation)
• $A-\lambda \,I$ is singular
• $N(A-\lambda \,I)\neq \{{\vec {0}}\}$

### Example

$A={\begin{bmatrix}4&-2\\1&1\end{bmatrix}}$ , ${\vec {x}}={\begin{bmatrix}2\\1\end{bmatrix}}$

$A\,{\vec {x}}={\begin{bmatrix}6\\3\end{bmatrix}}=3\,{\vec {x}}$ .

Therefore, $\lambda =3$ is an eigenvalue, and ${\vec {x}}=\langle 2,1\rangle$ is an eigenvector belonging to $\lambda =3$ .
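This check is easy to reproduce (a minimal sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[4., -2.],
              [1.,  1.]])
x = np.array([2., 1.])

print(A @ x)                      # [6. 3.]
print(np.allclose(A @ x, 3 * x))  # True: x is an eigenvector for lambda = 3
```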

If ${\vec {x}}$ is an eigenvector belonging to $\lambda$ , then $\mu \,{\vec {x}}$ is also an eigenvector belonging to $\lambda$ (for nonzero $\mu$ ):

$A(\mu \,{\vec {x}})=\mu \,A\,{\vec {x}}=\mu \,\lambda \,{\vec {x}}=\lambda \,(\mu \,{\vec {x}})$

### Characteristic Equation

$p(\lambda )=\left|A-\lambda \,I\right|$ is called the characteristic polynomial

$\left|A-\lambda \,I\right|=0$ is called the characteristic equation

Either way, the form is

${\begin{vmatrix}a_{11}-\lambda &\dots &a_{1n}\\\vdots &\ddots &\vdots \\a_{n1}&\dots &a_{nn}-\lambda \end{vmatrix}}$

#### Example 1

$A={\begin{bmatrix}3&2\\3&-2\end{bmatrix}}$

The characteristic equation is ${\begin{vmatrix}3-\lambda &2\\3&-2-\lambda \end{vmatrix}}=\lambda ^{2}-\lambda -12=0$ .

Solutions are $\lambda =4,-3$ .

Eigenvectors can be obtained by solving $(A-\lambda \,I){\vec {x}}=0$ .
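For the example above, `numpy.linalg.eig` recovers both eigenvalues and corresponding eigenvectors. A sketch; note the returned eigenvectors are normalized and may differ from a hand computation by a nonzero scalar, as the scaling remark above allows.

```python
import numpy as np

A = np.array([[3., 2.],
              [3., -2.]])

lam, V = np.linalg.eig(A)   # eigenvalues, eigenvectors (as columns)
print(np.sort(lam))         # eigenvalues -3 and 4
for l, v in zip(lam, V.T):
    print(np.allclose(A @ v, l * v))  # True: A v = lambda v
```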