# MATH 323 Lecture 26

"Thursday", November 4, 2012

### Theorem 6.3.1

If $\lambda _{1},\ldots ,\lambda _{k}$ are distinct eigenvalues of an $n\times n$ matrix $A$ with corresponding eigenvectors ${\vec {x}}_{1},\ldots ,{\vec {x}}_{k}$ , then ${\vec {x}}_{1},\ldots ,{\vec {x}}_{k}$ are linearly independent.

In the proof, consider

${\begin{aligned}V&=\mathrm {Span} \{{\vec {x}}_{1},\ldots ,{\vec {x}}_{k}\}\\\dim V&=r\end{aligned}}$

## Diagonalization

$A$ is said to be diagonalizable if there is a nonsingular matrix $X$ with

$X^{-1}AX=D={\begin{pmatrix}\lambda _{1}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}\end{pmatrix}}$ where $D$ is a diagonal matrix.

Therefore $A\sim D$ by $A=XDX^{-1}$ .

### Theorem 6.3.2

An $n\times n$ matrix $A$ is diagonalizable iff $A$ has $n$ linearly independent eigenvectors.

${\begin{aligned}A:&{\vec {x}}_{1},\ldots ,{\vec {x}}_{n}\quad {\text{linearly independent eigenvectors}}\\&\lambda _{1},\ldots ,\lambda _{n}\quad {\text{eigenvalues (not necessarily distinct)}}\\X&=({\vec {x}}_{1},\ldots ,{\vec {x}}_{n})\\D&={\begin{bmatrix}\lambda _{1}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}\end{bmatrix}}\end{aligned}}$

#### Corollary

1. If $A$ is diagonalizable and $A=XDX^{-1}$ , then diagonal entries of $D$ are eigenvalues of $A$ .
2. $X=({\vec {x}}_{1},\ldots ,{\vec {x}}_{n})$ , but $X$ is not unique.
3. If the eigenvalues are distinct, then $A$ is diagonalizable.
4. If they are not distinct, then $A$ may or may not be diagonalizable.
5. If $A=XDX^{-1}$ , then $A^{k}=XD^{k}X^{-1}=X{\begin{pmatrix}\lambda _{1}^{k}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\lambda _{n}^{k}\end{pmatrix}}X^{-1}$ .
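Corollary 5 can be checked numerically. The sketch below (NumPy is an assumption of this illustration, not part of the lecture) uses the $2\times 2$ matrix from the worked example that follows and compares $XD^{k}X^{-1}$ against direct repeated multiplication:

```python
import numpy as np

# Matrix from the worked example: eigenvalues 1 and -4
A = np.array([[2.0, -3.0],
              [2.0, -5.0]])

# np.linalg.eig returns the eigenvalues and a matrix X whose
# columns are the corresponding eigenvectors
eigvals, X = np.linalg.eig(A)

# A^k = X D^k X^{-1}: powering the diagonal matrix D just
# powers its diagonal entries, which is much cheaper than
# multiplying A by itself k times
k = 10
Ak_diag = X @ np.diag(eigvals**k) @ np.linalg.inv(X)
Ak_direct = np.linalg.matrix_power(A, k)

print(np.allclose(Ak_diag, Ak_direct))  # True
```

The payoff is that computing $A^{k}$ costs one diagonalization plus $n$ scalar powers, independent of $k$.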

#### Example

${\begin{aligned}A&={\begin{bmatrix}2&-3\\2&-5\end{bmatrix}}\\\lambda _{1}&=1&\lambda _{2}&=-4\\{\vec {x}}_{1}&=\left\langle 3,1\right\rangle &{\vec {x}}_{2}&=\left\langle 1,2\right\rangle \\X&={\begin{pmatrix}3&1\\1&2\end{pmatrix}}\\X^{-1}AX=D&={\begin{pmatrix}1&0\\0&-4\end{pmatrix}}\\&={\frac {1}{5}}{\begin{bmatrix}2&-1\\-1&3\end{bmatrix}}\,{\begin{bmatrix}2&-3\\2&-5\end{bmatrix}}\,{\begin{bmatrix}3&1\\1&2\end{bmatrix}}\end{aligned}}$

## Matrix Exponential

$\mathrm {e} ^{x}=\sum _{n=0}^{\infty }{\frac {1}{n!}}x^{n}$ is a very important function.

We similarly define $\mathrm {e} ^{A}$ for a matrix $A$ to be

$\mathrm {e} ^{A}=\sum _{n=0}^{\infty }{\frac {1}{n!}}A^{n}$

If $A$ is diagonalizable, then $\mathrm {e} ^{A}=X\mathrm {e} ^{D}X^{-1}$ , where $\mathrm {e} ^{D}$ is given by

$\mathrm {e} ^{D}={\begin{bmatrix}\mathrm {e} ^{\lambda _{1}}&\dots &0\\\vdots &\ddots &\vdots \\0&\dots &\mathrm {e} ^{\lambda _{n}}\end{bmatrix}}$

### Example

$A={\begin{bmatrix}-2&-6\\1&3\end{bmatrix}}={\begin{bmatrix}-2&-3\\1&1\end{bmatrix}}\,{\begin{bmatrix}1&0\\0&0\end{bmatrix}}\,{\begin{bmatrix}1&3\\-1&-2\end{bmatrix}}$

$\mathrm {e} ^{A}={\begin{bmatrix}-2&-3\\1&1\end{bmatrix}}\,{\begin{bmatrix}\mathrm {e} &0\\0&1\end{bmatrix}}\,{\begin{bmatrix}1&3\\-1&-2\end{bmatrix}}={\begin{bmatrix}3-2\mathrm {e} &6-6\mathrm {e} \\\mathrm {e} -1&3\mathrm {e} -2\end{bmatrix}}$

### Application: Differential Equation

The solution to the differential equation $y'=ay$ with initial value $y(0)=y_{0}$ is $y=\mathrm {e} ^{at}\,y_{0}$ .

Similarly, the system of differential equations ${\vec {Y}}'=A{\vec {Y}}$ with ${\vec {Y}}(0)={\vec {Y}}_{0}$ has the solution ${\vec {Y}}=\mathrm {e} ^{tA}{\vec {Y}}_{0}$ .
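The matrix-exponential example above can be verified numerically. This NumPy sketch (an illustration, not part of the lecture) forms $\mathrm {e} ^{A}=X\mathrm {e} ^{D}X^{-1}$ from the eigendecomposition and compares it against the closed form computed in the example:

```python
import numpy as np

# Matrix from the example above; eigenvalues are 1 and 0
A = np.array([[-2.0, -6.0],
              [ 1.0,  3.0]])

def expm_diag(A, t=1.0):
    """e^{tA} = X e^{tD} X^{-1} for a diagonalizable matrix A."""
    eigvals, X = np.linalg.eig(A)
    return X @ np.diag(np.exp(t * eigvals)) @ np.linalg.inv(X)

eA = expm_diag(A)

# Closed form from the lecture: [[3-2e, 6-6e], [e-1, 3e-2]]
e = np.e
expected = np.array([[3 - 2*e, 6 - 6*e],
                     [e - 1,   3*e - 2]])
print(np.allclose(eA, expected))  # True
```

With `t` varying, `expm_diag(A, t)` gives $\mathrm {e} ^{tA}$ , so ${\vec {Y}}(t)=\mathrm {e} ^{tA}{\vec {Y}}_{0}$ solves the system for any initial vector ${\vec {Y}}_{0}$.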

## Final Exam Review

5 problems and an 11-point bonus problem

• Linear transformations (Null Space, Range, Basis)
• Orthogonality
• Orthogonalization (Gram-Schmidt process, $QR$ factorization)
• Eigenvalues and Eigenvectors
• Least Squares Problems

## Note on Factoring Cubics

For a cubic polynomial $p(\lambda )=a\lambda ^{3}+b\lambda ^{2}+c\lambda +d$ with integer coefficients, any integer root $m$ (that is, $p(m)=0$ with $m\in \mathbb {Z}$ ) must divide $d$ :

${\begin{aligned}p(\lambda )&=a\lambda ^{3}+b\lambda ^{2}+c\lambda +d\\&=(\alpha \lambda ^{2}+\beta \lambda +\gamma )(\lambda -m)\end{aligned}}$

where $\alpha =a$ , and $\beta ,\gamma$ are found by polynomial (synthetic) division.
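This root-finding strategy can be sketched in a few lines of Python (the example polynomial is an illustration, not from the lecture): test the divisors of $d$ as candidate roots, then deflate by $(\lambda -m)$ via synthetic division to obtain the remaining quadratic factor.

```python
def integer_roots(a, b, c, d):
    """Integer roots of a*x^3 + b*x^2 + c*x + d.

    Assumes integer coefficients and d != 0, so any integer
    root must be a divisor of d.
    """
    candidates = [m for m in range(-abs(d), abs(d) + 1)
                  if m != 0 and d % m == 0]
    return [m for m in candidates
            if a*m**3 + b*m**2 + c*m + d == 0]

def deflate(a, b, c, d, m):
    """Synthetic division by (x - m): returns the quadratic
    coefficients (alpha, beta, gamma). The remainder d + gamma*m
    is zero exactly when m is a root."""
    alpha = a
    beta = b + alpha*m
    gamma = c + beta*m
    return alpha, beta, gamma

# Example: p(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
print(integer_roots(1, -6, 11, -6))   # [1, 2, 3]
print(deflate(1, -6, 11, -6, 1))      # (1, -5, 6), i.e. x^2 - 5x + 6
```

Once the quadratic factor is in hand, its roots follow from the quadratic formula, which is how the remaining two eigenvalues of a $3\times 3$ characteristic polynomial are typically recovered on an exam.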