# MATH 323 Lecture 16

« previous | Tuesday, October 23, 2012 | next »

## Linear Transformations

Let ${\displaystyle L:V\to W}$ be a linear transformation. Then for all ${\displaystyle v_{i}\in V}$ and ${\displaystyle \alpha _{i}\in \mathbb {R} }$:

1. ${\displaystyle L({\vec {0}}_{V})={\vec {0}}_{W}}$
2. ${\displaystyle L(\sum _{i=1}^{n}\alpha _{i}\,v_{i})=\sum _{i=1}^{n}\alpha _{i}\,L(v_{i})}$
3. ${\displaystyle L(-{\vec {v}})=-L({\vec {v}})}$

In general, linear transformations from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} ^{m}}$ are of the form

${\displaystyle L_{A}({\vec {x}})=A{\vec {x}}}$.
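As a quick sanity check, a map of this form satisfies properties (1) and (2) above. A minimal sketch in plain Python, using an arbitrary example matrix (the matrix and vectors are assumptions, not from the lecture):

```python
def mat_vec(A, x):
    """Apply the matrix A to the vector x: (A x)_i = sum_j A[i][j] * x[j]."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[2.0, 1.0], [0.0, 3.0]]     # arbitrary 2x2 example matrix
x, y = [1.0, 2.0], [4.0, -1.0]   # arbitrary example vectors
alpha, beta = 2.0, -3.0

# Property (2) with two terms: L(alpha x + beta y) = alpha L(x) + beta L(y)
lhs = mat_vec(A, [alpha * xi + beta * yi for xi, yi in zip(x, y)])
rhs = [alpha * lx + beta * ly for lx, ly in zip(mat_vec(A, x), mat_vec(A, y))]
assert lhs == rhs

# Property (1): the zero vector maps to the zero vector
assert mat_vec(A, [0.0, 0.0]) == [0.0, 0.0]
```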

### Transdimensional Transformations

If ${\displaystyle L_{A}}$ is a transformation from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} ^{m}}$, then ${\displaystyle A}$ will be an ${\displaystyle m\times n}$ matrix.

Let ${\displaystyle L({\vec {x}})=x_{1}+x_{2}}$ be a linear transformation from 2D space to 1D space.

${\displaystyle {\begin{aligned}L(\alpha \,{\vec {x}}+\beta \,{\vec {y}})&=L\left({\begin{bmatrix}\alpha \,x_{1}+\beta \,y_{1}\\\alpha \,x_{2}+\beta \,y_{2}\end{bmatrix}}\right)\\&=\alpha \,x_{1}+\beta \,y_{1}+\alpha \,x_{2}+\beta \,y_{2}\\&=\alpha (x_{1}+x_{2})+\beta (y_{1}+y_{2})\\&=\alpha \,L({\vec {x}})+\beta \,L({\vec {y}})\end{aligned}}}$

The following transformation is not linear: it does send ${\displaystyle {\vec {0}}}$ to ${\displaystyle 0}$, but it fails (3) above, since ${\displaystyle M(-{\vec {v}})=M({\vec {v}})\neq -M({\vec {v}})}$ whenever ${\displaystyle {\vec {v}}\neq {\vec {0}}}$:

${\displaystyle M({\vec {x}})={\sqrt {x_{1}^{2}+x_{2}^{2}}}}$
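The failure can be checked numerically. A short sketch using the definition of ${\displaystyle M}$ above (the test vector is an assumption):

```python
import math

def M(x):
    """The Euclidean norm M(x) = sqrt(x1^2 + x2^2), which is NOT linear."""
    return math.sqrt(x[0] ** 2 + x[1] ** 2)

v = [3.0, 4.0]          # arbitrary nonzero test vector
neg_v = [-v[0], -v[1]]

assert M(neg_v) == M(v) == 5.0   # M(-v) equals M(v) ...
assert M(neg_v) != -M(v)         # ... so it cannot equal -M(v)
```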

### Identity Transformation

${\displaystyle I:V\to V}$ such that ${\displaystyle I({\vec {v}})={\vec {v}}}$.

### Image and Kernel

Given ${\displaystyle L:V\to W}$:

• **kernel**: the set of vectors ${\displaystyle v\in V}$ such that ${\displaystyle L(v)={\vec {0}}}$. Very analogous to the null space of a matrix.
• **image**: written ${\displaystyle L(V)}$; the set of vectors ${\displaystyle w\in W}$ such that ${\displaystyle w=L(v)}$ for some vector ${\displaystyle v\in V}$. Any subspace of ${\displaystyle V}$ has its image contained in ${\displaystyle L(V)}$.

#### Theorem 4.1.1

Let ${\displaystyle L:V\to W}$ be a linear transformation, and ${\displaystyle S\subseteq V}$ be a subspace. Then

1. The kernel of ${\displaystyle L}$ is a subspace of ${\displaystyle V}$
2. ${\displaystyle L(S)}$ is a subspace of ${\displaystyle W}$. In particular, ${\displaystyle L(V)}$ is a subspace of ${\displaystyle W}$

### Example

Let ${\displaystyle L:\mathbb {R} ^{3}\to \mathbb {R} ^{2}}$ be defined as follows:

${\displaystyle L({\vec {x}})={\begin{pmatrix}x_{1}+x_{2}\\x_{2}+x_{3}\end{pmatrix}}}$

Kernel is ${\displaystyle \{{\vec {x}}~|~L({\vec {x}})={\vec {0}}\}}$.

${\displaystyle x_{1}+x_{2}=x_{2}+x_{3}=0}$, so ${\displaystyle x_{1}=-x_{2}}$ and ${\displaystyle x_{3}=-x_{2}}$; therefore the kernel of ${\displaystyle L}$ consists of the vectors ${\displaystyle \left\langle a,-a,a\right\rangle }$.
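A quick numerical check that every vector of this form lands on the zero vector, using the definition of ${\displaystyle L}$ above (the sample values of ${\displaystyle a}$ are arbitrary):

```python
def L(x):
    """The transformation L(x) = (x1 + x2, x2 + x3) from R^3 to R^2."""
    return [x[0] + x[1], x[1] + x[2]]

# Every vector of the form (a, -a, a) maps to the zero vector:
for a in [-2.0, 0.0, 1.0, 3.5]:
    assert L([a, -a, a]) == [0.0, 0.0]
```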

For the subspace ${\displaystyle S=\mathrm {Span} ({\vec {e}}_{1},{\vec {e}}_{3})=\left\langle a,0,b\right\rangle }$, the image is ${\displaystyle L(S)=\left\langle a+0,0+b\right\rangle =\left\langle a,b\right\rangle =\mathbb {R} ^{2}}$.

## Matrix Representations of Linear Transformations

Let ${\displaystyle A}$ be an ${\displaystyle m\times n}$ matrix, and ${\displaystyle L_{A}:\mathbb {R} ^{n}\to \mathbb {R} ^{m}}$. Then ${\displaystyle L_{A}({\vec {x}})=A\,{\vec {x}}}$

${\displaystyle A}$ is called the standard matrix representation of ${\displaystyle L}$.

### Theorem 4.2.1

If ${\displaystyle L}$ is a linear transformation mapping ${\displaystyle \mathbb {R} ^{n}}$ into ${\displaystyle \mathbb {R} ^{m}}$, there is an ${\displaystyle m\times n}$ matrix ${\displaystyle A}$ such that ${\displaystyle L({\vec {x}})=A\,{\vec {x}}}$ for each ${\displaystyle {\vec {x}}\in \mathbb {R} ^{n}}$. In fact, the ${\displaystyle j}$th column vector of ${\displaystyle A}$ is given by

${\displaystyle {\vec {a}}_{j}=L({\vec {e}}_{j})}$ for ${\displaystyle j=1,\ldots ,n}$

### Example

${\displaystyle L({\vec {x}})={\begin{pmatrix}x_{1}+x_{2}\\x_{2}+x_{3}\end{pmatrix}}}$. Find the standard matrix representation.

• ${\displaystyle {\vec {a}}_{1}=L({\vec {e}}_{1})={\begin{pmatrix}1+0\\0+0\end{pmatrix}}={\begin{pmatrix}1\\0\end{pmatrix}}}$
• ${\displaystyle {\vec {a}}_{2}=L({\vec {e}}_{2})={\begin{pmatrix}0+1\\1+0\end{pmatrix}}={\begin{pmatrix}1\\1\end{pmatrix}}}$
• ${\displaystyle {\vec {a}}_{3}=L({\vec {e}}_{3})={\begin{pmatrix}0+0\\0+1\end{pmatrix}}={\begin{pmatrix}0\\1\end{pmatrix}}}$

So ${\displaystyle A=(a_{1},a_{2},a_{3})={\begin{pmatrix}1&1&0\\0&1&1\end{pmatrix}}}$
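The column-by-column construction from Theorem 4.2.1 can be sketched in plain Python for this example; the helper names are assumptions:

```python
def L(x):
    """The transformation L(x) = (x1 + x2, x2 + x3) from R^3 to R^2."""
    return [x[0] + x[1], x[1] + x[2]]

def mat_vec(A, x):
    """Apply the matrix A (list of rows) to the vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Per Theorem 4.2.1, the jth column of A is L(e_j):
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
cols = [L(e) for e in basis]            # columns: (1,0), (1,1), (0,1)
A = [list(row) for row in zip(*cols)]   # transpose columns into rows
assert A == [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]

# The matrix reproduces L on an arbitrary test vector:
x = [2.0, 3.0, 5.0]
assert mat_vec(A, x) == L(x)            # both give [5.0, 8.0]
```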