# MATH 323 Lecture 26

"Thursday", November 4, 2012

### Theorem 6.3.1

If ${\displaystyle \lambda _{1},\ldots ,\lambda _{k}}$ are distinct eigenvalues of an ${\displaystyle n\times n}$ matrix ${\displaystyle A}$ with corresponding eigenvectors ${\displaystyle {\vec {x}}_{1},\ldots ,{\vec {x}}_{k}}$, then ${\displaystyle {\vec {x}}_{1},\ldots ,{\vec {x}}_{k}}$ are linearly independent.

Sketch of proof: let

${\displaystyle {\begin{aligned}V&=\mathrm {Span} \{{\vec {x}}_{1},\ldots ,{\vec {x}}_{k}\}\\\dim V&=r\end{aligned}}}$

and suppose ${\displaystyle r<k}$; this leads to a contradiction, so ${\displaystyle r=k}$.
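As a quick numerical illustration of the theorem (a sketch, not part of the lecture; the ${\displaystyle 2\times 2}$ matrix is the one used in the example further below): if the eigenvalues are distinct, the matrix whose columns are the eigenvectors has full rank.

```python
import numpy as np

# A has distinct eigenvalues 1 and -4, so by Theorem 6.3.1 its
# eigenvectors are linearly independent.
A = np.array([[2.0, -3.0],
              [2.0, -5.0]])
eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
rank = np.linalg.matrix_rank(eigvecs)
print(rank)  # 2 -> the eigenvectors span R^2
```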

## Diagonalization

${\displaystyle A}$ is said to be diagonalizable if there is a nonsingular matrix ${\displaystyle X}$ with

${\displaystyle X^{-1}AX=D={\begin{pmatrix}\lambda _{1}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}\end{pmatrix}}}$

where ${\displaystyle D}$ is a diagonal matrix.

Therefore ${\displaystyle A\sim D}$ (that is, ${\displaystyle A}$ is similar to a diagonal matrix), since ${\displaystyle A=XDX^{-1}}$.

### Theorem 6.3.2

An ${\displaystyle n\times n}$ matrix ${\displaystyle A}$ is diagonalizable iff ${\displaystyle A}$ has ${\displaystyle n}$ linearly independent eigenvectors.

${\displaystyle {\begin{aligned}A:&{\vec {x}}_{1},\ldots ,{\vec {x}}_{n}\quad {\text{linearly independent eigenvectors}}\\&\lambda _{1},\ldots ,\lambda _{n}\quad {\text{eigenvalues (not necessarily distinct)}}\\X&=({\vec {x}}_{1},\ldots ,{\vec {x}}_{n})\\D&={\begin{bmatrix}\lambda _{1}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}\end{bmatrix}}\end{aligned}}}$

#### Corollary

1. If ${\displaystyle A}$ is diagonalizable and ${\displaystyle A=XDX^{-1}}$, then diagonal entries of ${\displaystyle D}$ are eigenvalues of ${\displaystyle A}$.
2. ${\displaystyle X=({\vec {x}}_{1},\ldots ,{\vec {x}}_{n})}$, but ${\displaystyle X}$ is not unique.
3. If the eigenvalues are distinct, then ${\displaystyle A}$ is diagonalizable.
4. If they are not distinct, then ${\displaystyle A}$ may or may not be diagonalizable.
5. If ${\displaystyle A=XDX^{-1}}$, then ${\displaystyle A^{k}=XD^{k}X^{-1}=X{\begin{pmatrix}\lambda _{1}^{k}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}^{k}\end{pmatrix}}X^{-1}}$.
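Item 5 is easy to check numerically (a sketch using NumPy, with a hypothetical ${\displaystyle 2\times 2}$ matrix chosen only for illustration):

```python
import numpy as np

# If A = X D X^{-1}, then A^k = X D^k X^{-1}: powers of A reduce to
# powers of the diagonal entries of D.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])             # eigenvalues 5 and 2
lam, X = np.linalg.eig(A)              # A = X diag(lam) X^{-1}
k = 5
Ak_direct = np.linalg.matrix_power(A, k)
Ak_diag = X @ np.diag(lam**k) @ np.linalg.inv(X)
print(np.allclose(Ak_direct, Ak_diag))  # True
```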

#### Example

${\displaystyle {\begin{aligned}A&={\begin{bmatrix}2&-3\\2&-5\end{bmatrix}}\\\lambda _{1}&=1&\lambda _{2}&=-4\\{\vec {x}}_{1}&=\left\langle 3,1\right\rangle &{\vec {x}}_{2}&=\left\langle 1,2\right\rangle \\X&={\begin{pmatrix}3&1\\1&2\end{pmatrix}}\\X^{-1}AX=D&={\begin{pmatrix}1&0\\0&-4\end{pmatrix}}\\&={\frac {1}{5}}{\begin{bmatrix}2&-1\\-1&3\end{bmatrix}}\,{\begin{bmatrix}2&-3\\2&-5\end{bmatrix}}\,{\begin{bmatrix}3&1\\1&2\end{bmatrix}}\end{aligned}}}$
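The worked example above can be verified directly (a NumPy check, not from the lecture):

```python
import numpy as np

# With X built from the eigenvectors <3,1> and <1,2>,
# X^{-1} A X should be diag(1, -4).
A = np.array([[2.0, -3.0],
              [2.0, -5.0]])
X = np.array([[3.0, 1.0],
              [1.0, 2.0]])
D = np.linalg.inv(X) @ A @ X
print(np.round(D, 10))  # [[ 1.  0.]
                        #  [-0. -4.]]
```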

## Matrix Exponential

${\displaystyle \mathrm {e} ^{x}=\sum _{n=0}^{\infty }{\frac {1}{n!}}x^{n}}$ is a very important function.

We similarly define ${\displaystyle \mathrm {e} ^{A}}$ for a matrix ${\displaystyle A}$ to be

${\displaystyle \mathrm {e} ^{A}=\sum _{n=0}^{\infty }{\frac {1}{n!}}A^{n}}$

If ${\displaystyle A}$ is diagonalizable, then ${\displaystyle \mathrm {e} ^{A}=X\mathrm {e} ^{D}X^{-1}}$, where ${\displaystyle \mathrm {e} ^{D}}$ is given by

${\displaystyle \mathrm {e} ^{D}={\begin{bmatrix}\mathrm {e} ^{\lambda _{1}}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\mathrm {e} ^{\lambda _{n}}\end{bmatrix}}}$

### Example

${\displaystyle A={\begin{bmatrix}-2&-6\\1&3\end{bmatrix}}={\begin{bmatrix}-2&-3\\1&1\end{bmatrix}}\,{\begin{bmatrix}1&0\\0&0\end{bmatrix}}\,{\begin{bmatrix}1&3\\-1&-2\end{bmatrix}}}$

${\displaystyle \mathrm {e} ^{A}={\begin{bmatrix}-2&-3\\1&1\end{bmatrix}}\,{\begin{bmatrix}\mathrm {e} &0\\0&1\end{bmatrix}}\,{\begin{bmatrix}1&3\\-1&-2\end{bmatrix}}={\begin{bmatrix}3-2\mathrm {e} &6-6\mathrm {e} \\\mathrm {e} -1&3\mathrm {e} -2\end{bmatrix}}}$
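This result can be checked against SciPy's built-in matrix exponential (a sketch assuming SciPy is available; `scipy.linalg.expm` computes ${\displaystyle \mathrm {e} ^{A}}$ directly):

```python
import numpy as np
from scipy.linalg import expm  # assumes SciPy is installed

# The example above claims e^A = [[3-2e, 6-6e], [e-1, 3e-2]].
A = np.array([[-2.0, -6.0],
              [1.0, 3.0]])
e = np.e
expected = np.array([[3 - 2*e, 6 - 6*e],
                     [e - 1,   3*e - 2]])
print(np.allclose(expm(A), expected))  # True
```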

### Application: Differential Equation

The solution to the differential equation ${\displaystyle y'=ay}$ with ${\displaystyle y(0)=y_{0}}$ is ${\displaystyle y=\mathrm {e} ^{at}\,y_{0}}$.

Similarly, the system of differential equations ${\displaystyle {\vec {Y}}'=A{\vec {Y}}}$ with ${\displaystyle {\vec {Y}}(0)={\vec {Y}}_{0}}$ has the solution ${\displaystyle {\vec {Y}}=\mathrm {e} ^{tA}{\vec {Y}}_{0}}$.
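As a sketch (SciPy assumed available, with a hypothetical initial condition ${\displaystyle {\vec {Y}}_{0}}$), one can confirm that ${\displaystyle {\vec {Y}}(t)=\mathrm {e} ^{tA}{\vec {Y}}_{0}}$ satisfies ${\displaystyle {\vec {Y}}'=A{\vec {Y}}}$ by comparing a finite-difference derivative with ${\displaystyle A{\vec {Y}}(t)}$:

```python
import numpy as np
from scipy.linalg import expm  # assumes SciPy is installed

A = np.array([[-2.0, -6.0],
              [1.0, 3.0]])
Y0 = np.array([1.0, 2.0])      # hypothetical initial condition

def Y(t):
    """Candidate solution Y(t) = e^{tA} Y0."""
    return expm(t * A) @ Y0

# Central-difference approximation of Y'(t) at t = 1 should match A Y(t).
t, h = 1.0, 1e-6
Y_prime = (Y(t + h) - Y(t - h)) / (2 * h)
print(np.allclose(Y_prime, A @ Y(t), atol=1e-4))  # True
```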

## Final Exam Review

5 problems and an 11-point bonus problem

- Linear transformations (Null Space, Range, Basis)
- Orthogonality
- Orthogonalization (Gram-Schmidt process, ${\displaystyle QR}$ factorization)
- Eigenvalues and Eigenvectors
- Least Squares Problems

## Note on Factoring Cubics

For a cubic polynomial ${\displaystyle p(\lambda )=a\lambda ^{3}+b\lambda ^{2}+c\lambda +d}$ with integer coefficients, if ${\displaystyle p(m)=0}$ for some ${\displaystyle m\in \mathbb {Z} }$, then ${\displaystyle m}$ divides ${\displaystyle d}$:

${\displaystyle {\begin{aligned}p(\lambda )&=a\lambda ^{3}+b\lambda ^{2}+c\lambda +d\\&=(\alpha \lambda ^{2}+\beta \lambda +\gamma )(\lambda -m)\end{aligned}}}$
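The note above suggests a simple root-finding strategy: test only the divisors of the constant term. A minimal sketch (the function name `integer_roots` is made up for illustration, and it assumes ${\displaystyle d\neq 0}$):

```python
def integer_roots(a, b, c, d):
    """Integer roots of a*x^3 + b*x^2 + c*x + d (integer coeffs, d != 0)."""
    # An integer root m must divide d, so only test divisors of d.
    divisors = [m for m in range(1, abs(d) + 1) if d % m == 0]
    candidates = [s * m for m in divisors for s in (1, -1)]
    return sorted(m for m in candidates
                  if a*m**3 + b*m**2 + c*m + d == 0)

# lambda^3 - 6 lambda^2 + 11 lambda - 6 = (lambda-1)(lambda-2)(lambda-3)
print(integer_roots(1, -6, 11, -6))  # [1, 2, 3]
```

Once a root ${\displaystyle m}$ is found, dividing by ${\displaystyle \lambda -m}$ leaves a quadratic, which factors by the quadratic formula.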