# MATH 323 Lecture 17

« previous | Thursday, October 25, 2012 | next »

## Linear Transformation

For every linear transformation $L:\mathbb{R}^n \to \mathbb{R}^m$ there exists a matrix $A \in \mathbb{R}^{m \times n}$ such that $L(\vec{x}) = A\vec{x}$; its columns are the images of the standard basis vectors:

$$A = (\vec{a}_1, \ldots, \vec{a}_n), \quad \text{where } \vec{a}_i = L(\vec{e}_i)$$

### Rotation Transformation

Let $L$ be a rotation by angle $\theta$ about the origin.

$$L(\vec{e}_1) = \begin{pmatrix}\cos\theta\\\sin\theta\end{pmatrix}, \qquad L(\vec{e}_2) = \begin{pmatrix}-\sin\theta\\\cos\theta\end{pmatrix}$$

$$A = \begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix}$$

### In General

Now consider an arbitrary $L:V \to W$, where $E = [v_1, \ldots, v_n]$ is a basis for $V$ and $F = [w_1, \ldots, w_m]$ is a basis for $W$.

$[v]_E \in \mathbb{R}^n$ is the coordinate vector of $v$ w.r.t. $E$: if $v = \sum_{i=1}^{n} x_i\,v_i$, then $[v]_E = \left\langle x_1, \ldots, x_n \right\rangle$.

Likewise, writing $L(v) = \sum_{i=1}^{m} y_i\,w_i \in W$, we can represent $[L(v)]_F$ as $\left\langle y_1, \ldots, y_m \right\rangle \in \mathbb{R}^m$.

We seek a matrix satisfying $\vec{y} = A\vec{x}$, i.e. $[L(v)]_F = A[v]_E$. So what is $A$?

Take $\vec{a}_j = [L(v_j)]_F$ for $j = 1, \ldots, n$; then

$$A = (\vec{a}_1, \ldots, \vec{a}_n)$$

### Theorem 4.2.2

Matrix representation theorem

If $E=[v_{1},\ldots ,v_{n}]$ and $F=[w_{1},\ldots ,w_{m}]$ are ordered bases for vector spaces $V$ and $W$ respectively, then corresponding to each linear transformation $L:V\to W$ there is an $m\times n$ matrix $A$ such that

$$[L(v)]_F = A[v]_E \quad \text{for each } v \in V$$

$A$ is the matrix representing $L$ relative to the ordered bases $E$ and $F$.

In fact, $A=({\vec {a}}_{1},\ldots ,{\vec {a}}_{n})$ , where ${\vec {a}}_{j}=[L(v_{j})]_{F}$ for $j=1,\ldots ,n$ .
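The column-by-column construction in Theorem 4.2.2 is easy to check numerically. The sketch below (not part of the lecture; the angle is an arbitrary choice) builds the matrix of the rotation transformation from the images of the standard basis vectors and compares it to the rotation matrix derived above:

```python
import numpy as np

theta = np.pi / 3  # arbitrary example angle

def L(x):
    """Rotation of a 2D vector by theta counterclockwise about the origin."""
    return np.array([x[0] * np.cos(theta) - x[1] * np.sin(theta),
                     x[0] * np.sin(theta) + x[1] * np.cos(theta)])

# Theorem 4.2.2: column j of A is [L(e_j)]_F; with the standard basis
# for both domain and codomain, that is just L(e_j).
A = np.column_stack([L(e) for e in np.eye(2)])

expected = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
assert np.allclose(A, expected)
```

Since both bases are standard here, no change of coordinates is needed; for non-standard bases the columns would be the coordinate vectors $[L(v_j)]_F$ instead.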

### Examples

**Example 1.** Let $B = [\vec{b}_1 = \left\langle 1,1 \right\rangle, \vec{b}_2 = \left\langle -1,1 \right\rangle]$ be a basis for $\mathbb{R}^2$. For $\vec{x} \in \mathbb{R}^3$, find the matrix $A$ representing $L(\vec{x}) = x_1\vec{b}_1 + (x_2 + x_3)\vec{b}_2$ w.r.t. the ordered bases $[\vec{e}_1, \vec{e}_2, \vec{e}_3]$ (standard 3D basis) and $B$.

Just take the columns $\vec{a}_j = [L(\vec{e}_j)]_B$:

$$A = \begin{pmatrix}1&0&0\\0&1&1\end{pmatrix}$$

**Example 2.** $L(\alpha\,\vec{b}_1 + \beta\,\vec{b}_2) = (\alpha + \beta)\vec{b}_1 + 2\beta\,\vec{b}_2$, where $B = [\vec{b}_1, \vec{b}_2]$ is a basis for $\mathbb{R}^2$. Find $A$.

$$A = \left([L(1\cdot\vec{b}_1 + 0\cdot\vec{b}_2)]_B,\ [L(0\cdot\vec{b}_1 + 1\cdot\vec{b}_2)]_B\right) = \begin{pmatrix}1&1\\0&2\end{pmatrix}$$

**Example 3.** $D: P_3 \to P_2$, where $D(p(x)) = p'(x)$.

$[x^2, x, 1]$ is an ordered basis for $P_3$, and $[x, 1]$ is an ordered basis for $P_2$.

$$\begin{aligned}D(x^2) &= 2x = 2\cdot x + 0\cdot 1\\ D(x) &= 1 = 0\cdot x + 1\cdot 1\\ D(1) &= 0 = 0\cdot x + 0\cdot 1\end{aligned}$$

Thus

$$A = \begin{bmatrix}2&0&0\\0&1&0\end{bmatrix}$$

In coordinates, $D(ax^2 + bx + c) = 2ax + b$ becomes:

$$\begin{aligned}[p(x)]_{[x^2,x,1]} &= \left\langle a,b,c \right\rangle\\ [D(p(x))]_{[x,1]} &= \left\langle 2a,b \right\rangle\end{aligned}$$

## Reversing Linear Transformations

### Theorem 4.2.3

Suppose $E = [\vec{u}_1, \ldots, \vec{u}_n]$ and $F = [\vec{b}_1, \ldots, \vec{b}_m]$ are bases for $\mathbb{R}^n$ and $\mathbb{R}^m$ respectively.

If $A$ is the matrix representing $L:\mathbb {R} ^{n}\to \mathbb {R} ^{m}$ w.r.t. $E$ and $F$ , then

$$\vec{a}_j = B^{-1}L(\vec{u}_j) \quad \text{for } j = 1, \ldots, n, \quad \text{where } B = (\vec{b}_1, \ldots, \vec{b}_m)$$

#### Corollary 4.2.4

If $A$ is the matrix representing the linear transformation $L:\mathbb {R} ^{n}\to \mathbb {R} ^{m}$ w.r.t. $E$ and $F$ , then the rref of $({\vec {b}}_{1},\ldots ,{\vec {b}}_{m}\mid L({\vec {u}}_{1}),\ldots ,L({\vec {u}}_{n}))$ is $(I\mid A)$ .
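Row-reducing $(B \mid L(\vec{u}_1), \ldots, L(\vec{u}_n))$ to $(I \mid A)$ amounts to solving $BA = (L(\vec{u}_1), \ldots, L(\vec{u}_n))$. A small NumPy sketch on a hypothetical example (not the one from lecture): $L(\vec{x}) = \left\langle x_1 + x_2,\ x_1 - x_2 \right\rangle$ on $\mathbb{R}^2$, with $E$ the standard basis and $F$ given by $\vec{b}_1 = \left\langle 1,0 \right\rangle$, $\vec{b}_2 = \left\langle 1,1 \right\rangle$:

```python
import numpy as np

def L(x):
    """Hypothetical example map L(x) = (x1 + x2, x1 - x2)."""
    return np.array([x[0] + x[1], x[0] - x[1]])

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # columns are b1, b2

# Images of the E-basis vectors (here E is the standard basis)
LU = np.column_stack([L(e) for e in np.eye(2)])

# Solving B A = LU is the same computation the row reduction performs
A = np.linalg.solve(B, LU)

# Check: expressing A[v]_E back in the b-basis reproduces L(v)
v = np.array([3.0, -2.0])
assert np.allclose(B @ (A @ v), L(v))
```

The `assert` verifies the defining property: the $F$-coordinates $A[v]_E$, expanded against the columns of $B$, recover $L(v)$ itself.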

### Example

$L:\mathbb {R} ^{2}\to \mathbb {R} ^{3}$ ,

Basis for $\mathbb{R}^2$: $\vec{u}_1 = \left\langle 1,2 \right\rangle$, $\vec{u}_2 = \left\langle 3,1 \right\rangle$.

Basis for $\mathbb{R}^3$: $\vec{b}_1 = \left\langle 1,0,0 \right\rangle$, $\vec{b}_2 = \left\langle 1,1,0 \right\rangle$, $\vec{b}_3 = \left\langle 1,1,1 \right\rangle$.

$$L(\vec{x}) = \begin{pmatrix}x_2\\x_1+x_2\\x_1-x_2\end{pmatrix}$$

What is $A$ w.r.t. $[\vec{u}_1, \vec{u}_2]$ and $[\vec{b}_1, \vec{b}_2, \vec{b}_3]$?

$$\begin{aligned}L(\vec{u}_1) &= \left\langle 2,3,-1 \right\rangle\\ L(\vec{u}_2) &= \left\langle 1,4,2 \right\rangle\end{aligned}$$

Row-reduce the augmented matrix from Corollary 4.2.4:

$$\left[\begin{array}{ccc|cc}1&1&1&2&1\\0&1&1&3&4\\0&0&1&-1&2\end{array}\right] \quad \longrightarrow \quad \left[\begin{array}{ccc|cc}1&0&0&-1&-3\\0&1&0&4&2\\0&0&1&-1&2\end{array}\right]$$

$$A = \begin{pmatrix}-1&-3\\4&2\\-1&2\end{pmatrix}$$

## Footnotes

1. The correspondence between $v \in V$ and $\vec{x} = [v]_E \in \mathbb{R}^n$, and between $w = L(v) \in W$ and $A\vec{x} = [w]_F \in \mathbb{R}^m$, is called an isomorphism.
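Via this coordinate isomorphism, the differentiation example above reduces to a matrix product. A quick NumPy check (a sketch, not part of the lecture):

```python
import numpy as np

# D: P_3 -> P_2 in coordinates: p(x) = a x^2 + b x + c  <->  (a, b, c),
# and p'(x) = 2a x + b  <->  (2a, b). The matrix from the example:
A = np.array([[2, 0, 0],
              [0, 1, 0]])

# p(x) = 3x^2 - 5x + 7, so p'(x) = 6x - 5
p = np.array([3, -5, 7])
assert np.array_equal(A @ p, np.array([6, -5]))
```

Multiplying the coordinate vector by $A$ differentiates the polynomial, exactly as the theorem promises.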