# MATH 323 Lecture 15

« previous | Thursday, October 18, 2012 | next »

## Review

${\displaystyle m\times n}$ matrix ${\displaystyle A}$

• Row space is the span of row vectors and a subset of ${\displaystyle \mathbb {R} ^{n}}$
• Column space is span of column vectors and subset of ${\displaystyle \mathbb {R} ^{m}}$

${\displaystyle \mathrm {rank} (A)=\dim({\mbox{row space of }}A)}$

Theorem 3.6.1: Two row equivalent matrices have the same row space.

Theorem 3.6.2: The system ${\displaystyle A{\vec {x}}={\vec {b}}}$ has a solution iff ${\displaystyle {\vec {b}}}$ is contained in the column space of ${\displaystyle A}$.

Theorem 3.6.3: Let ${\displaystyle A}$ be an ${\displaystyle m\times n}$ matrix.

• The linear system ${\displaystyle A{\vec {x}}={\vec {b}}}$ is consistent for every ${\displaystyle {\vec {b}}\in \mathbb {R} ^{m}}$ iff the column vectors of ${\displaystyle A}$ span ${\displaystyle \mathbb {R} ^{m}}$.
• The system ${\displaystyle A{\vec {x}}={\vec {b}}}$ has at most one solution for every ${\displaystyle {\vec {b}}\in \mathbb {R} ^{m}}$ iff the column vectors of ${\displaystyle A}$ are linearly independent.

## Properties of Row and Column Space

${\displaystyle \mathrm {Span} \{{\mbox{columns of }}A\}=\mathbb {R} ^{m}\implies n\geq m}$

If the columns of ${\displaystyle A}$ are linearly independent, then ${\displaystyle n\leq m}$

If columns of ${\displaystyle A}$ form a basis for ${\displaystyle \mathbb {R} ^{m}}$, then ${\displaystyle n=m}$

Corollary: for a square ${\displaystyle n\times n}$ matrix ${\displaystyle A}$, the following are equivalent (each characterizes ${\displaystyle A}$ being nonsingular):

• ${\displaystyle |A|=|A^{T}|\neq 0}$
• columns form a basis for ${\displaystyle \mathbb {R} ^{n}}$
• rows form a basis for ${\displaystyle \mathbb {R} ^{n}}$
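As a quick numerical sketch of this corollary (using NumPy; the ${\displaystyle 3\times 3}$ matrix below is a hypothetical example, not one from the lecture), a nonsingular matrix has ${\displaystyle |A|=|A^{T}|\neq 0}$ and full rank, so its rows and columns each form a basis for ${\displaystyle \mathbb {R} ^{n}}$:

```python
import numpy as np

# Hypothetical nonsingular 3x3 matrix chosen for illustration
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

det_A = np.linalg.det(A)
det_AT = np.linalg.det(A.T)

print(det_A, det_AT)             # equal and nonzero
# full rank: the 3 columns (and 3 rows) form a basis for R^3
print(np.linalg.matrix_rank(A))  # 3
```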

Given ${\displaystyle A\to U}$, where ${\displaystyle U}$ is in row echelon form, the column space of ${\displaystyle A}$ is NOT in general equal to the column space of ${\displaystyle U}$: row operations preserve the row space but not the column space. However, the dimensions agree:

${\displaystyle \dim({\mbox{column space of }}U)=\dim({\mbox{column space of }}A)}$

• the columns of ${\displaystyle U}$ containing the leading variables form a basis for the column space of ${\displaystyle U}$
• the corresponding columns of ${\displaystyle A}$ form a basis for the column space of ${\displaystyle A}$
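This recipe can be sketched with SymPy (the ${\displaystyle 3\times 3}$ matrix below is a hypothetical example whose third column is the sum of the first two): `rref` reports the pivot (leading-variable) columns, and the corresponding columns of ${\displaystyle A}$ give a basis for ${\displaystyle A}$'s column space.

```python
from sympy import Matrix

# Hypothetical matrix: the third column is the sum of the first two
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])

# rref returns the reduced row echelon form and the pivot-column indices
U, pivots = A.rref()
print(pivots)           # (0, 1): leading variables sit in columns 1 and 2
# hence columns 1 and 2 of A form a basis for A's column space,
# even though the column space of U differs from that of A
print(A.columnspace())
```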

### Rank-Nullity Theorem

If ${\displaystyle A}$ is an ${\displaystyle m\times n}$ matrix, then ${\displaystyle \mathrm {rank} (A)+\mathrm {nullity} (A)=n}$

${\displaystyle \mathrm {nullity} (A)=\dim(N(A))}$

${\displaystyle A}$ can be reduced to a row echelon form matrix ${\displaystyle U}$, and the two homogeneous systems have the same solutions: ${\displaystyle A{\vec {x}}={\vec {0}}\iff U{\vec {x}}={\vec {0}}}$.

${\displaystyle \mathrm {rank} (A)=\mathrm {rank} (U)={\mbox{num. nonzero rows}}=r}$

The number of free variables is ${\displaystyle n-r}$, and ${\displaystyle \dim N(A)}$ equals the number of free variables, so ${\displaystyle \mathrm {rank} (A)+\mathrm {nullity} (A)=r+(n-r)=n}$.

#### Example

${\displaystyle A={\begin{bmatrix}1&2&-1&1\\2&4&-3&0\\1&2&1&5\end{bmatrix}}\longrightarrow U={\begin{bmatrix}1&2&0&3\\0&0&1&2\\0&0&0&0\end{bmatrix}}}$

Thus (1, 2, 0, 3) and (0, 0, 1, 2) form a basis for the row space of ${\displaystyle A}$

${\displaystyle \mathrm {rank} (A)=2}$, and there are ${\displaystyle 4-2=2}$ free variables.

${\displaystyle {\begin{bmatrix}x_{1}\\x_{2}\\x_{3}\\x_{4}\end{bmatrix}}={\begin{bmatrix}-2\alpha -3\beta \\\alpha \\-2\beta \\\beta \end{bmatrix}}=\alpha {\begin{bmatrix}-2\\1\\0\\0\end{bmatrix}}+\beta {\begin{bmatrix}-3\\0\\-2\\1\end{bmatrix}}}$

The vectors (-2, 1, 0, 0) and (-3, 0, -2, 1) form a basis for the null space of ${\displaystyle A}$, thus ${\displaystyle \dim N(A)=2}$
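This example can be checked with SymPy (a sketch; `nullspace`, `rank`, and `cols` are standard SymPy `Matrix` methods):

```python
from sympy import Matrix

# The example matrix A from above
A = Matrix([[1, 2, -1, 1],
            [2, 4, -3, 0],
            [1, 2, 1, 5]])

basis = A.nullspace()         # basis vectors for N(A)
print(len(basis))             # 2, the nullity
print(A.rank() + len(basis))  # 4 = n, confirming rank + nullity = n
```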

### Theorem 3.6.6

If ${\displaystyle A}$ is an ${\displaystyle m\times n}$ matrix, the dimension of the row space of ${\displaystyle A}$ equals the dimension of the column space of ${\displaystyle A}$.

#### Proof

${\displaystyle \mathrm {rank} (A)=r={\mbox{num. lead vars}}={\mbox{num. nonzero rows}}}$

Let ${\displaystyle U_{L}}$ be the matrix obtained from ${\displaystyle U}$ by deleting the columns corresponding to free variables, and let ${\displaystyle A_{L}}$ be obtained by deleting the same columns from ${\displaystyle A}$. Both matrices are of size ${\displaystyle m\times r}$

${\displaystyle A_{L}\sim U_{L}}$, so if ${\displaystyle A_{L}{\vec {x}}={\vec {0}}}$, then ${\displaystyle U_{L}{\vec {x}}={\vec {0}}}$ and ${\displaystyle {\vec {x}}={\vec {0}}}$ because columns of ${\displaystyle U_{L}}$ are linearly independent. Therefore the columns of ${\displaystyle A_{L}}$ are linearly independent.

${\displaystyle A_{L}}$ has ${\displaystyle r}$ linearly independent columns, so the dimension of the column space of ${\displaystyle A}$ is at least ${\displaystyle r}$, which is the dimension of the row space of ${\displaystyle A}$.

Applying the same argument to ${\displaystyle A^{T}}$: the column space of ${\displaystyle A^{T}}$ is the row space of ${\displaystyle A}$, so ${\displaystyle \dim({\mbox{row space of }}A)\geq \dim({\mbox{column space of }}A)}$. Combining the two inequalities, the dimensions of the row and column spaces must be equal.

Q.E.D.

#### Example

${\displaystyle A={\begin{bmatrix}1&-2&1&1&2\\-1&3&0&2&-2\\0&1&1&3&4\\1&2&5&13&5\end{bmatrix}}\quad \longrightarrow \quad U={\begin{bmatrix}1&-2&1&1&2\\0&1&1&3&0\\0&0&0&0&1\\0&0&0&0&0\end{bmatrix}}}$.

Thus ${\displaystyle {\vec {u}}_{1},{\vec {u}}_{2},{\vec {u}}_{5}}$ form a basis for the column space of ${\displaystyle U}$, and ${\displaystyle {\vec {a}}_{1},{\vec {a}}_{2},{\vec {a}}_{5}}$ form a basis for the column space of ${\displaystyle A}$.

${\displaystyle U(1,:),U(2,:),U(3,:)}$ form a basis for the row space of ${\displaystyle U}$ and hence of ${\displaystyle A}$ (row equivalent matrices have the same row space).

The nullity of ${\displaystyle A}$ is thus the number of columns − num. lead variables = 5 − 3 = 2.
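The same computation, sketched in SymPy for this second example (`rref` is used here in place of the hand row reduction; its pivot indices identify the basis columns):

```python
from sympy import Matrix

# The example matrix A from above
A = Matrix([[ 1, -2, 1,  1,  2],
            [-1,  3, 0,  2, -2],
            [ 0,  1, 1,  3,  4],
            [ 1,  2, 5, 13,  5]])

U, pivots = A.rref()
print(A.rank())           # 3
print(pivots)             # (0, 1, 4): columns 1, 2, 5 are the pivot columns
print(A.cols - A.rank())  # 2, the nullity
```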

## Linear Transformation

Let ${\displaystyle V,W}$ be vector spaces.

${\displaystyle L:V\to W}$ is a linear transformation if for all ${\displaystyle v,v_{1},v_{2}\in V}$ and for all ${\displaystyle \alpha ,\beta \in \mathbb {R} }$,

1. ${\displaystyle L(v_{1}+v_{2})=L(v_{1})+L(v_{2})}$
2. ${\displaystyle L(\alpha v)=\alpha L(v)}$
3. (combination of 1 and 2) ${\displaystyle L(\alpha v_{1}+\beta v_{2})=\alpha L(v_{1})+\beta L(v_{2})}$

Therefore, if ${\displaystyle v\in V}$, then ${\displaystyle L(v)\in W}$, where ${\displaystyle L(v)}$ is the image of ${\displaystyle v}$.

A linear transformation ${\displaystyle L:V\to V}$ from a vector space to itself is called a linear operator on ${\displaystyle V}$.

For example, the following are linear operators on ${\displaystyle \mathbb {R} ^{2}}$:

1. ${\displaystyle L({\vec {x}})=2{\vec {x}}}$ (scale each vector by a factor of 2)
2. ${\displaystyle L({\vec {x}})=x_{1}{\vec {e}}_{1}}$ (projection onto ${\displaystyle x}$-axis)
3. ${\displaystyle L({\vec {x}})=\left\langle x_{1},-x_{2}\right\rangle }$ (reflect vector about ${\displaystyle x}$-axis)
4. ${\displaystyle L({\vec {x}})=\left\langle -x_{2},x_{1}\right\rangle }$ (rotate by 90° CCW)
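Each of these operators can be written as a ${\displaystyle 2\times 2}$ matrix acting by multiplication. A small NumPy sketch (the vector ${\displaystyle {\vec {x}}=(3,4)}$ and the variable names are arbitrary choices for illustration):

```python
import numpy as np

x = np.array([3.0, 4.0])

scale    = 2 * np.eye(2)                          # 1. L(x) = 2x
project  = np.array([[1.0, 0.0], [0.0, 0.0]])     # 2. projection onto x-axis
reflect  = np.array([[1.0, 0.0], [0.0, -1.0]])    # 3. reflection about x-axis
rotate90 = np.array([[0.0, -1.0], [1.0, 0.0]])    # 4. rotation by 90 deg CCW

print(scale @ x)     # [6. 8.]
print(project @ x)   # [3. 0.]
print(reflect @ x)   # [ 3. -4.]
print(rotate90 @ x)  # [-4.  3.]
```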