# MATH 323 Lecture 17

« previous | Thursday, October 25, 2012 | next »

## Linear Transformation

${\displaystyle \forall L:\mathbb {R} ^{n}\to \mathbb {R} ^{m}\exists A\in \mathbb {R} ^{m\times n}}$

${\displaystyle A=({\vec {a}}_{1},\ldots ,{\vec {a}}_{n})}$, where ${\displaystyle {\vec {a}}_{i}=L({\vec {e}}_{i})}$

### Rotation Transformation

Let ${\displaystyle L}$ be a rotation by angle ${\displaystyle \theta }$ about the origin.

{\displaystyle {\begin{aligned}L({\vec {e}}_{1})&={\begin{pmatrix}\cos {\theta }\\\sin {\theta }\end{pmatrix}}\\L({\vec {e}}_{2})&={\begin{pmatrix}-\sin {\theta }\\\cos {\theta }\end{pmatrix}}\\A&={\begin{pmatrix}\cos {\theta }&-\sin {\theta }\\\sin {\theta }&\cos {\theta }\end{pmatrix}}\end{aligned}}}
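As a numerical check (a NumPy sketch; the angle ${\displaystyle \theta =\pi /3}$ is an arbitrary choice), the columns of ${\displaystyle A}$ are exactly the images of the standard basis vectors:

```python
import numpy as np

theta = np.pi / 3  # arbitrary sample angle

# Rotation matrix: columns are L(e1) and L(e2)
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# First column is L(e1) = (cos θ, sin θ), second is L(e2) = (-sin θ, cos θ)
assert np.allclose(A @ e1, [np.cos(theta), np.sin(theta)])
assert np.allclose(A @ e2, [-np.sin(theta), np.cos(theta)])

# A rotation preserves lengths
v = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(A @ v), np.linalg.norm(v))
```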

### In General

Consider an arbitrary ${\displaystyle L:V\to W}$, where ${\displaystyle E=[v_{1},\ldots ,v_{n}]}$ is a basis of ${\displaystyle V}$ and ${\displaystyle F=[w_{1},\ldots ,w_{m}]}$ is a basis of ${\displaystyle W}$.

${\displaystyle v\mapsto [v]_{E}\in \mathbb {R} ^{n}}$ is the coordinate vector of ${\displaystyle v}$ w.r.t. ${\displaystyle E}$: if ${\displaystyle v=\sum _{i=1}^{n}x_{i}\,v_{i}}$, then ${\displaystyle [v]_{E}=\left\langle x_{1},\ldots ,x_{n}\right\rangle }$.

For some ${\displaystyle \sum _{i=1}^{m}y_{i}\,w_{i}=L(v)\in W}$, we can represent ${\displaystyle [L(v)]_{F}}$ as ${\displaystyle \left\langle y_{1},\ldots ,y_{m}\right\rangle \in \mathbb {R} ^{m}}$.

${\displaystyle {\vec {y}}=A{\vec {x}}\iff [L(v)]_{F}=A[v]_{E}}$ [1], so what is ${\displaystyle A}$?

${\displaystyle {\vec {a}}_{j}=[L(v_{j})]_{F}}$ for ${\displaystyle j=1,\ldots ,n}$
${\displaystyle A=({\vec {a}}_{1},\ldots ,{\vec {a}}_{n})}$

### Theorem 4.2.2

Matrix representation theorem

If ${\displaystyle E=[v_{1},\ldots ,v_{n}]}$ and ${\displaystyle F=[w_{1},\ldots ,w_{m}]}$ are ordered bases for vector spaces ${\displaystyle V}$ and ${\displaystyle W}$ respectively, then corresponding to each linear transformation ${\displaystyle L:V\to W}$ there is an ${\displaystyle m\times n}$ matrix ${\displaystyle A}$ such that

${\displaystyle [L(v)]_{F}=A[v]_{E}}$ for each ${\displaystyle v\in V}$

${\displaystyle A}$ is the matrix representing ${\displaystyle L}$ relative to the ordered bases ${\displaystyle E}$ and ${\displaystyle F}$.

In fact, ${\displaystyle A=({\vec {a}}_{1},\ldots ,{\vec {a}}_{n})}$, where ${\displaystyle {\vec {a}}_{j}=[L(v_{j})]_{F}}$ for ${\displaystyle j=1,\ldots ,n}$.

### Examples

Let ${\displaystyle {\vec {b}}_{1}=\left\langle 1,1\right\rangle }$ and ${\displaystyle {\vec {b}}_{2}=\left\langle -1,1\right\rangle }$ form a basis ${\displaystyle B=[{\vec {b}}_{1},{\vec {b}}_{2}]}$ of ${\displaystyle \mathbb {R} ^{2}}$. For ${\displaystyle {\vec {x}}\in \mathbb {R} ^{3}}$, find the matrix ${\displaystyle A}$ representing ${\displaystyle L({\vec {x}})=x_{1}{\vec {b}}_{1}+(x_{2}+x_{3}){\vec {b}}_{2}}$ w.r.t. the ordered bases ${\displaystyle [{\vec {e}}_{1},{\vec {e}}_{2},{\vec {e}}_{3}]}$ (the standard basis of ${\displaystyle \mathbb {R} ^{3}}$) and ${\displaystyle B}$.

Just take ${\displaystyle A=\left({\vec {a}}_{j}=[L({\vec {e}}_{j})]_{B}\right)={\begin{pmatrix}1&0&0\\0&1&1\end{pmatrix}}}$
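A quick NumPy check (a sketch; the test vector ${\displaystyle {\vec {x}}}$ is arbitrary) that ${\displaystyle A}$ carries the standard coordinates of ${\displaystyle {\vec {x}}}$ to the ${\displaystyle B}$-coordinates of ${\displaystyle L({\vec {x}})}$:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 1, 1]])
B = np.array([[1, -1],
              [1,  1]])  # columns are b1 = <1,1> and b2 = <-1,1>

def L(x):
    # L(x) = x1*b1 + (x2 + x3)*b2, written in standard R^2 coordinates
    return x[0] * B[:, 0] + (x[1] + x[2]) * B[:, 1]

x = np.array([2.0, 3.0, -1.0])  # arbitrary test vector
# [L(x)]_B = A x  means  L(x) = B (A x)
assert np.allclose(L(x), B @ (A @ x))
```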

${\displaystyle L(\alpha \,{\vec {b}}_{1}+\beta \,{\vec {b}}_{2})=(\alpha +\beta ){\vec {b}}_{1}+2\beta \,{\vec {b}}_{2}}$, where ${\displaystyle [{\vec {b}}_{1},{\vec {b}}_{2}]}$ is a basis for ${\displaystyle \mathbb {R} ^{2}}$. Find ${\displaystyle A}$.

${\displaystyle A=\left([L({\vec {b}}_{1})]_{[{\vec {b}}_{1},{\vec {b}}_{2}]},[L({\vec {b}}_{2})]_{[{\vec {b}}_{1},{\vec {b}}_{2}]}\right)={\begin{pmatrix}1&1\\0&2\end{pmatrix}}}$, since ${\displaystyle L({\vec {b}}_{1})=1\cdot {\vec {b}}_{1}+0\cdot {\vec {b}}_{2}}$ and ${\displaystyle L({\vec {b}}_{2})=1\cdot {\vec {b}}_{1}+2\cdot {\vec {b}}_{2}}$
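In ${\displaystyle [{\vec {b}}_{1},{\vec {b}}_{2}]}$-coordinates the action is ${\displaystyle \left\langle \alpha ,\beta \right\rangle \mapsto \left\langle \alpha +\beta ,2\beta \right\rangle }$, which a short NumPy check confirms (the sample values of ${\displaystyle \alpha ,\beta }$ are arbitrary):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 2]])
alpha, beta = 2.0, 5.0  # arbitrary coordinates w.r.t. [b1, b2]

# L(alpha*b1 + beta*b2) = (alpha+beta)*b1 + 2*beta*b2, so in coordinates:
assert np.allclose(A @ [alpha, beta], [alpha + beta, 2 * beta])
```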

${\displaystyle D:P_{3}\to P_{2}}$, where ${\displaystyle D(p(x))=p'(x)}$.

${\displaystyle [x^{2},x,1]}$ forms a basis for ${\displaystyle P_{3}}$, and ${\displaystyle [x,1]}$ forms a basis for ${\displaystyle P_{2}}$

{\displaystyle {\begin{aligned}D(x^{2})&=2x=2\cdot x+0\cdot 1\\D(x)&=1=0\cdot x+1\cdot 1\\D(1)&=0=0\cdot x+0\cdot 1\\D(ax^{2}+bx+c)&=2ax+b\end{aligned}}}

Thus ${\displaystyle A={\begin{bmatrix}2&0&0\\0&1&0\end{bmatrix}}}$

{\displaystyle {\begin{aligned}{[p(x)]}_{[x^{2},x,1]}&=\left\langle a,b,c\right\rangle \\{[D(p(x))]}_{[x,1]}&=\left\langle 2a,b\right\rangle \end{aligned}}}
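The coordinate computation above can be checked numerically (a sketch; the coefficients ${\displaystyle a,b,c}$ are a sample polynomial):

```python
import numpy as np

# Matrix of D: P3 -> P2 w.r.t. bases [x^2, x, 1] and [x, 1]
A = np.array([[2, 0, 0],
              [0, 1, 0]])

a, b, c = 3.0, -1.0, 4.0        # sample p(x) = 3x^2 - x + 4
coords = np.array([a, b, c])    # [p]_{[x^2, x, 1]}
dcoords = A @ coords            # [p']_{[x, 1]}

assert np.allclose(dcoords, [2 * a, b])
# Cross-check against NumPy's polynomial derivative (coefficients high-to-low)
assert np.allclose(np.polyder([a, b, c]), dcoords)
```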

## Reversing Linear Transformations

### Theorem 4.2.3

Suppose ${\displaystyle E=[{\vec {u}}_{1},\ldots ,{\vec {u}}_{n}]}$ and ${\displaystyle F=[{\vec {b}}_{1},\ldots ,{\vec {b}}_{m}]}$ are bases for ${\displaystyle \mathbb {R} ^{n}}$ and ${\displaystyle \mathbb {R} ^{m}}$, respectively.

If ${\displaystyle A}$ is the matrix representing ${\displaystyle L:\mathbb {R} ^{n}\to \mathbb {R} ^{m}}$ w.r.t. ${\displaystyle E}$ and ${\displaystyle F}$, then

${\displaystyle {\vec {a}}_{j}=B^{-1}L({\vec {u}}_{j})}$ for ${\displaystyle j=1,\ldots ,n}$, where ${\displaystyle B=({\vec {b}}_{1},\ldots ,{\vec {b}}_{m})}$

#### Corollary 4.2.4

If ${\displaystyle A}$ is the matrix representing the linear transformation ${\displaystyle L:\mathbb {R} ^{n}\to \mathbb {R} ^{m}}$ w.r.t. ${\displaystyle E}$ and ${\displaystyle F}$, then the rref of ${\displaystyle ({\vec {b}}_{1},\ldots ,{\vec {b}}_{m}\mid L({\vec {u}}_{1}),\ldots ,L({\vec {u}}_{n}))}$ is ${\displaystyle (I\mid A)}$.

### Example

${\displaystyle L:\mathbb {R} ^{2}\to \mathbb {R} ^{3}}$,

Basis ${\displaystyle [{\vec {u}}_{1},{\vec {u}}_{2}]}$ is ${\displaystyle {\vec {u}}_{1}=\left\langle 1,2\right\rangle }$, ${\displaystyle {\vec {u}}_{2}=\left\langle 3,1\right\rangle }$

Basis ${\displaystyle [{\vec {b}}_{1},{\vec {b}}_{2},{\vec {b}}_{3}]}$ is ${\displaystyle {\vec {b}}_{1}=\left\langle 1,0,0\right\rangle }$, ${\displaystyle {\vec {b}}_{2}=\left\langle 1,1,0\right\rangle }$, ${\displaystyle {\vec {b}}_{3}=\left\langle 1,1,1\right\rangle }$

${\displaystyle L({\vec {x}})={\begin{pmatrix}x_{2}\\x_{1}+x_{2}\\x_{1}-x_{2}\end{pmatrix}}}$

What is ${\displaystyle A}$ w.r.t. ${\displaystyle [{\vec {u}}_{1},{\vec {u}}_{2}]}$ and ${\displaystyle [{\vec {b}}_{1},{\vec {b}}_{2},{\vec {b}}_{3}]}$?

{\displaystyle {\begin{aligned}L({\vec {u}}_{1})&=\left\langle 2,3,-1\right\rangle \\L({\vec {u}}_{2})&=\left\langle 1,4,2\right\rangle \end{aligned}}}

${\displaystyle \left[{\begin{array}{ccc|cc}1&1&1&2&1\\0&1&1&3&4\\0&0&1&-1&2\end{array}}\right]\quad \longrightarrow \quad \left[{\begin{array}{ccc|cc}1&0&0&-1&-3\\0&1&0&4&2\\0&0&1&-1&2\end{array}}\right]}$

${\displaystyle A={\begin{pmatrix}-1&-3\\4&2\\-1&2\end{pmatrix}}}$
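The row reduction above amounts to computing ${\displaystyle A=B^{-1}\left(L({\vec {u}}_{1}),L({\vec {u}}_{2})\right)}$ as in Theorem 4.2.3; a NumPy sketch:

```python
import numpy as np

U = np.array([[1, 3],
              [2, 1]])           # columns u1 = <1,2>, u2 = <3,1>
B = np.array([[1, 1, 1],
              [0, 1, 1],
              [0, 0, 1]])        # columns b1, b2, b3

def L(x):
    return np.array([x[1], x[0] + x[1], x[0] - x[1]])

# Stack L(u1), L(u2) as columns, then solve B A = (L(u1), L(u2))
LU = np.column_stack([L(U[:, 0]), L(U[:, 1])])
A = np.linalg.solve(B, LU)

assert np.allclose(A, [[-1, -3],
                       [ 4,  2],
                       [-1,  2]])
```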

## Footnotes

1. The correspondence between ${\displaystyle V}$ and ${\displaystyle \mathbb {R} ^{n}}$ given by ${\displaystyle {\vec {x}}=[v]_{E}}$, and between ${\displaystyle w=L(v)\in W}$ and ${\displaystyle A{\vec {x}}=[w]_{F}\in \mathbb {R} ^{m}}$, is called an isomorphism.