# MATH 323 Lecture 10

« previous | Thursday, September 27, 2012 | next »

## Vector Space

Closure criteria:

1. ${\displaystyle {\vec {x}}+{\vec {y}}\in V\quad \forall {\vec {x}},{\vec {y}}\in V}$
2. ${\displaystyle \alpha \,{\vec {x}}\in V\quad \forall {\vec {x}}\in V\quad \forall \alpha \in \mathbb {R} }$

### Subspace

A vector space ${\displaystyle V}$ automatically has two subspaces:

1. ${\displaystyle S_{1}=\{0\}}$
2. ${\displaystyle S_{2}=V}$

When ${\displaystyle S\neq \{0\}}$ and ${\displaystyle S\neq V}$, then ${\displaystyle S}$ is called a proper subspace.

### Null Space of Matrix

Let ${\displaystyle A}$ be an ${\displaystyle m\times n}$ matrix, and let ${\displaystyle N(A)}$ be the set of all solutions to the system ${\displaystyle A\,{\vec {x}}={\vec {0}}}$:

${\displaystyle N(A)=\left\{{\vec {x}}\in \mathbb {R} ^{n}~\mid ~A\,{\vec {x}}={\vec {0}}\right\}}$

${\displaystyle N(A)}$ is called the null space (also called nullspace or kernel) of ${\displaystyle A}$.

${\displaystyle N(A)}$ is a vector space since, for ${\displaystyle {\vec {x}},{\vec {y}}\in N(A)}$, ${\displaystyle A({\vec {x}}+{\vec {y}})=A\,{\vec {x}}+A\,{\vec {y}}={\vec {0}}}$ and ${\displaystyle A(\alpha \,{\vec {x}})=\alpha \,A\,{\vec {x}}=\alpha \,{\vec {0}}={\vec {0}}}$

#### Example

Determine ${\displaystyle N(A)}$ if ${\displaystyle A={\begin{bmatrix}1&1&1&0\\2&1&0&1\end{bmatrix}}}$

Gauss-Jordan reduction of ${\displaystyle A}$ gives ${\displaystyle \left[{\begin{array}{cccc|c}1&0&-1&1&0\\0&1&2&-1&0\end{array}}\right]}$, so the solutions of ${\displaystyle A\,{\vec {x}}={\vec {0}}}$ are

{\displaystyle {\begin{aligned}{\vec {x}}&=\left\langle \alpha -\beta ,-2\alpha +\beta ,\alpha ,\beta \right\rangle \quad \forall \alpha ,\beta \in \mathbb {R} \\&=\alpha {\begin{pmatrix}1\\-2\\1\\0\end{pmatrix}}+\beta {\begin{pmatrix}-1\\1\\0\\1\end{pmatrix}}\end{aligned}}}
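The computation above can be checked with SymPy (a sketch, assuming SymPy is installed): `Matrix.nullspace()` returns a basis for ${\displaystyle N(A)}$, which should have two vectors since the reduced matrix has two free variables.

```python
from sympy import Matrix

# The matrix from the example above
A = Matrix([[1, 1, 1, 0],
            [2, 1, 0, 1]])

# nullspace() returns a basis for N(A) as a list of column vectors
basis = A.nullspace()

# Two free variables (x3, x4) => the null space has dimension 2
assert len(basis) == 2

# Every basis vector must satisfy A*v = 0
for v in basis:
    assert A * v == Matrix([0, 0])
```

The two basis vectors SymPy returns correspond (up to scaling and ordering) to ${\displaystyle \left\langle 1,-2,1,0\right\rangle }$ and ${\displaystyle \left\langle -1,1,0,1\right\rangle }$ from the hand computation.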

## Linear Combinations and Span

Let ${\displaystyle v_{1},\ldots ,v_{n}\in V}$.

The sum ${\displaystyle \alpha _{1}\,v_{1}+\alpha _{2}\,v_{2}+\dots +\alpha _{n}\,v_{n}}$, where ${\displaystyle \alpha _{1},\ldots ,\alpha _{n}\in \mathbb {R} }$, is called a linear combination of ${\displaystyle v_{1},\ldots ,v_{n}}$.

The set of all such linear combinations is called the span of ${\displaystyle v_{1},\ldots ,v_{n}}$:

${\displaystyle \mathrm {Span} (v_{1},\ldots ,v_{n})=\left\{v\in V~\mid ~v=\alpha _{1}\,v_{1}+\dots +\alpha _{n}\,v_{n}{\text{ for some }}\alpha _{1},\ldots ,\alpha _{n}\in \mathbb {R} \right\}}$

### Example

${\displaystyle {\vec {b}}_{1}=\left\langle 1,0,0\right\rangle }$ and ${\displaystyle {\vec {b}}_{2}=\left\langle 0,1,0\right\rangle }$ are in ${\displaystyle \mathbb {R} ^{3}}$

The span of ${\displaystyle {\vec {b}}_{1}}$ and ${\displaystyle {\vec {b}}_{2}}$ is the set of all vectors ${\displaystyle {\vec {x}}=\alpha \,{\vec {b}}_{1}+\beta \,{\vec {b}}_{2}=\left\langle \alpha ,\beta ,0\right\rangle }$.

These are the first two standard basis vectors of ${\displaystyle \mathbb {R} ^{3}}$; their span is the ${\displaystyle xy}$-plane.

### Theorem

${\displaystyle \mathrm {Span} (v_{1},\ldots ,v_{n})}$ is a subspace of ${\displaystyle V}$

{\displaystyle {\begin{aligned}{\vec {x}}&=\alpha _{1}\,v_{1}+\dots +\alpha _{n}\,v_{n}\\{\vec {y}}&=\beta _{1}\,v_{1}+\dots +\beta _{n}\,v_{n}\\{\vec {x}}+{\vec {y}}&=(\alpha _{1}+\beta _{1})v_{1}+\dots +(\alpha _{n}+\beta _{n})v_{n}\in \mathrm {Span} (v_{1},\ldots ,v_{n})\\\lambda \,{\vec {x}}&=(\lambda \,\alpha _{1})v_{1}+\dots +(\lambda \,\alpha _{n})v_{n}\in \mathrm {Span} (v_{1},\ldots ,v_{n})\end{aligned}}}

### Spanning Set

The set ${\displaystyle S=\{v_{1},\ldots ,v_{n}\}}$ is a spanning set for ${\displaystyle V}$ if ${\displaystyle V=\mathrm {Span} (v_{1},\ldots ,v_{n})}$

If a subset of ${\displaystyle S}$ is a spanning set of ${\displaystyle V}$, then ${\displaystyle S}$ itself is a spanning set of ${\displaystyle V}$ (take the coefficients of the remaining vectors to be 0).

#### Example

${\displaystyle \mathrm {Span} ({\hat {\imath }},{\hat {\jmath }},{\hat {k}})=\mathbb {R} ^{3}}$

These are the standard basis vectors for 3D space.

#### Example

Is ${\displaystyle \left\{{\begin{pmatrix}1\\1\\1\end{pmatrix}},{\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\0\end{pmatrix}}\right\}}$ a spanning set of ${\displaystyle \mathbb {R} ^{3}}$?

{\displaystyle {\begin{aligned}\alpha _{1}\left\langle 1,1,1\right\rangle +\alpha _{2}\left\langle 1,1,0\right\rangle +\alpha _{3}\left\langle 1,0,0\right\rangle &=\left\langle x_{1},x_{2},x_{3}\right\rangle \\\alpha _{1}+\alpha _{2}+\alpha _{3}&=x_{1}\\\alpha _{1}+\alpha _{2}&=x_{2}\\\alpha _{1}&=x_{3}\\\end{aligned}}}

This triangular system has a solution for any ${\displaystyle {\vec {x}}\in \mathbb {R} ^{3}}$, so it is indeed a spanning set. (The three vectors are noncoplanar, which makes this easy to visualize.)
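The triangular system above can be solved by back substitution: ${\displaystyle \alpha _{1}=x_{3}}$, ${\displaystyle \alpha _{2}=x_{2}-x_{3}}$, ${\displaystyle \alpha _{3}=x_{1}-x_{2}}$. A quick numerical check with NumPy, using an arbitrary example target vector:

```python
import numpy as np

v1 = np.array([1, 1, 1])
v2 = np.array([1, 1, 0])
v3 = np.array([1, 0, 0])

x = np.array([5.0, 3.0, 2.0])  # arbitrary target vector (example values)

# Back-substitute the triangular system:
# alpha1 = x3, alpha2 = x2 - x3, alpha3 = x1 - x2
a1 = x[2]
a2 = x[1] - x[2]
a3 = x[0] - x[1]

# The combination reproduces x exactly
assert np.allclose(a1 * v1 + a2 * v2 + a3 * v3, x)
```

Since the same back substitution works for every choice of ${\displaystyle x_{1},x_{2},x_{3}}$, the three vectors span ${\displaystyle \mathbb {R} ^{3}}$.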

#### Example

${\displaystyle P_{3}}$ is the set of polynomials of degree < 3.

${\displaystyle P_{3}=\mathrm {Span} (1,x,x^{2})}$

With ${\displaystyle v_{1}=1}$, ${\displaystyle v_{2}=x}$, ${\displaystyle v_{3}=x^{2}}$, we have ${\displaystyle c\,v_{1}+b\,v_{2}+a\,v_{3}=a\,x^{2}+b\,x+c\in P_{3}}$

## Linear Independence

Given ${\displaystyle {\vec {x}}_{1}=\left\langle 1,-1,2\right\rangle }$, ${\displaystyle {\vec {x}}_{2}=\left\langle -2,3,1\right\rangle }$, and ${\displaystyle {\vec {x}}_{3}=\left\langle -1,3,8\right\rangle }$,

${\displaystyle S=\mathrm {Span} ({\vec {x}}_{1},{\vec {x}}_{2},{\vec {x}}_{3})=\mathrm {Span} ({\vec {x}}_{1},{\vec {x}}_{2})}$

This is true since ${\displaystyle {\vec {x}}_{3}=3{\vec {x}}_{1}+2{\vec {x}}_{2}}$
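The dependence relation can be verified numerically (a sketch using NumPy); the rank of the matrix whose rows are the three vectors is 2, confirming the set is linearly dependent:

```python
import numpy as np

x1 = np.array([1, -1, 2])
x2 = np.array([-2, 3, 1])
x3 = np.array([-1, 3, 8])

# x3 is a linear combination of x1 and x2
assert np.array_equal(3 * x1 + 2 * x2, x3)

# Equivalently, stacking the vectors as rows gives rank 2 < 3,
# so the three vectors are linearly dependent
assert np.linalg.matrix_rank(np.array([x1, x2, x3])) == 2
```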

The set ${\displaystyle \{{\vec {x}}_{1},{\vec {x}}_{2},{\vec {x}}_{3}\}}$ is called linearly dependent.

1. If ${\displaystyle v_{1},\ldots ,v_{n}}$ span ${\displaystyle V}$ and one of these vectors can be written as a linear combination of the ${\displaystyle n-1}$ others, then those ${\displaystyle n-1}$ vectors span ${\displaystyle V}$.
2. Given ${\displaystyle n}$ vectors ${\displaystyle v_{1},\ldots ,v_{n}}$, it is possible to write one of the vectors as a linear combination of the other ${\displaystyle n-1}$ vectors iff there exist scalars ${\displaystyle c_{1},\ldots ,c_{n}}$ (not all zero!) such that ${\displaystyle c_{1}\,v_{1}+\dots +c_{n}\,v_{n}=0}$.