# MATH 323 Lecture 12

« previous | Thursday, October 4, 2012 | next »

Substitute lecture

## Chapter 3.4: Basis and Dimension

### Definitions

We say that vectors ${\displaystyle {\vec {v}}_{1},\ldots ,{\vec {v}}_{k}}$ are linearly independent if ${\displaystyle a_{1}\,{\vec {v}}_{1}+\dots +a_{k}\,{\vec {v}}_{k}={\vec {0}}}$ only when ${\displaystyle a_{1}=\dots =a_{k}=0}$.

We say that ${\displaystyle {\vec {w}}_{1},\ldots ,{\vec {w}}_{k}\in V}$ span ${\displaystyle V}$ if any ${\displaystyle {\vec {u}}\in V}$ can be written as a linear combination of ${\displaystyle {\vec {w}}_{1},\ldots ,{\vec {w}}_{k}}$.

Let ${\displaystyle V}$ be a vector space.

The set ${\displaystyle {\vec {e}}_{1},\ldots ,{\vec {e}}_{n}\in V}$ is called a basis in ${\displaystyle V}$ if

1. ${\displaystyle {\vec {e}}_{1},\ldots ,{\vec {e}}_{n}}$ are linearly independent
2. They span ${\displaystyle V}$

The number of vectors in the basis is called the dimension of ${\displaystyle V}$
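As a numerical sketch (not part of the lecture), the two basis conditions can be checked with NumPy: stacking the candidate vectors as columns, full column rank means linear independence, and having ${\displaystyle \dim V}$ independent vectors then guarantees spanning. The vectors used are ${\displaystyle {\vec {v}}_{1}=\left\langle 1,1\right\rangle }$, ${\displaystyle {\vec {v}}_{2}=\left\langle 1,0\right\rangle }$ from the example further below.

```python
import numpy as np

# Check that v1 = <1, 1> and v2 = <1, 0> form a basis of R^2:
# stack them as columns and test the rank of the resulting matrix.
V = np.column_stack(([1, 1], [1, 0]))

rank = np.linalg.matrix_rank(V)
print(rank)                 # full rank -> the columns are linearly independent
print(rank == V.shape[1])   # 2 independent vectors in R^2 -> they also span
```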

### Theorem

All bases in ${\displaystyle V}$ have the same number of elements; this number is called the dimension of ${\displaystyle V}$.

Note: it is possible for a vector space to have infinite dimension (e.g. the space of all polynomials).

### Theorem 3.4.1

If ${\displaystyle \{v_{1},\ldots ,v_{n}\}}$ is a spanning set of ${\displaystyle V}$ and ${\displaystyle m>n}$, then any vectors ${\displaystyle w_{1},\ldots ,w_{m}}$ are linearly dependent.

Proof. (substitution/elimination process).

If some ${\displaystyle {\vec {w}}_{j}}$, ${\displaystyle 1\leq j\leq m}$, is the zero vector, then the set is automatically linearly dependent and there is nothing to prove:

${\displaystyle 0\,{\vec {w}}_{1}+\dots +\alpha \,{\vec {w}}_{j}+\dots +0\,{\vec {w}}_{m}={\vec {0}}}$ for any ${\displaystyle \alpha \neq 0}$.

If ${\displaystyle {\vec {w}}_{1}\neq {\vec {0}}}$, then ${\displaystyle {\vec {w}}_{1}=a_{1}\,{\vec {v}}_{1}+\dots +a_{n}\,{\vec {v}}_{n}}$. At least one of the ${\displaystyle a}$'s is nonzero. Without loss of generality, we assume that it is ${\displaystyle a_{1}}$ which is nonzero.

Therefore ${\displaystyle {\vec {v}}_{1}={\frac {{\vec {w}}_{1}-a_{2}{\vec {v}}_{2}-\dots -a_{n}\,{\vec {v}}_{n}}{a_{1}}}}$, so ${\displaystyle \{{\vec {w}}_{1},{\vec {v}}_{2},\ldots ,{\vec {v}}_{n}\}}$ is a spanning set for ${\displaystyle V}$.

Repeat the process with the above spanning set and ${\displaystyle {\vec {w}}_{2},\ldots ,{\vec {w}}_{n}}$ in turn, replacing one ${\displaystyle {\vec {v}}}$ by one ${\displaystyle {\vec {w}}}$ at each step. After ${\displaystyle n}$ steps, we end up with ${\displaystyle \{{\vec {w}}_{1},\ldots ,{\vec {w}}_{n}\}}$ as a spanning set for ${\displaystyle V}$, so ${\displaystyle {\vec {w}}_{n+1}=k_{1}\,{\vec {w}}_{1}+\dots +k_{n}\,{\vec {w}}_{n}}$, which shows that ${\displaystyle {\vec {w}}_{1},\ldots ,{\vec {w}}_{m}}$ are linearly dependent.

#### Example

${\displaystyle \mathbb {R} ^{2}}$ has bases ${\displaystyle {\vec {e}}_{1}=\left\langle 1,0\right\rangle }$, ${\displaystyle {\vec {e}}_{2}=\left\langle 0,1\right\rangle }$ and ${\displaystyle {\vec {v}}_{1}=\left\langle 1,1\right\rangle }$, ${\displaystyle {\vec {v}}_{2}=\left\langle 1,0\right\rangle }$.

Let ${\displaystyle {\vec {w}}_{1}=\left\langle 1,2\right\rangle }$, ${\displaystyle {\vec {w}}_{2}=\left\langle 2,3\right\rangle }$, and ${\displaystyle {\vec {w}}_{3}=\left\langle 3,4\right\rangle }$

Note: ${\displaystyle \{{\vec {e}}_{1},{\vec {e}}_{2}\}}$ is a spanning set of ${\displaystyle \mathbb {R} ^{2}}$.

${\displaystyle {\vec {w}}_{1}=1\,{\vec {e}}_{1}+2\,{\vec {e}}_{2}\quad \rightarrow \quad {\vec {e}}_{1}={\vec {w}}_{1}-2\,{\vec {e}}_{2}}$, so ${\displaystyle \{{\vec {w}}_{1},{\vec {e}}_{2}\}}$ spans ${\displaystyle \mathbb {R} ^{2}}$. Repeating with ${\displaystyle {\vec {w}}_{2}=2\,{\vec {e}}_{1}+3\,{\vec {e}}_{2}=2\,{\vec {w}}_{1}-{\vec {e}}_{2}}$ gives ${\displaystyle {\vec {e}}_{2}=2\,{\vec {w}}_{1}-{\vec {w}}_{2}}$, so ${\displaystyle \{{\vec {w}}_{1},{\vec {w}}_{2}\}}$ spans ${\displaystyle \mathbb {R} ^{2}}$. In particular ${\displaystyle {\vec {w}}_{3}=-{\vec {w}}_{1}+2\,{\vec {w}}_{2}}$, so the three vectors are linearly dependent.
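As a quick numerical check (a sketch, not part of the lecture), we can solve for the coefficients that express ${\displaystyle {\vec {w}}_{3}}$ in terms of ${\displaystyle {\vec {w}}_{1}}$ and ${\displaystyle {\vec {w}}_{2}}$, confirming that three vectors in ${\displaystyle \mathbb {R} ^{2}}$ are linearly dependent (Theorem 3.4.1 with ${\displaystyle n=2}$, ${\displaystyle m=3}$):

```python
import numpy as np

# w1, w2, w3 from the example above.
w1 = np.array([1.0, 2.0])
w2 = np.array([2.0, 3.0])
w3 = np.array([3.0, 4.0])

# Solve [w1 w2] c = w3 for the coefficients c = (c1, c2).
c = np.linalg.solve(np.column_stack((w1, w2)), w3)
print(c)  # -> [-1.  2.], i.e. w3 = -1*w1 + 2*w2
```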

#### Corollary 3.4.2

If ${\displaystyle \{v_{1},\ldots ,v_{n}\}}$ and ${\displaystyle \{w_{1},\ldots ,w_{m}\}}$ are two bases in ${\displaystyle V}$, then ${\displaystyle m=n}$.

Proof. Assume the ${\displaystyle v}$'s form a spanning set and ${\displaystyle m>n}$. Then by Theorem 3.4.1, ${\displaystyle w_{1},\ldots ,w_{m}}$ are linearly dependent, so they are not a basis (contradiction).

Flip the roles (${\displaystyle n>m}$) and the same argument gives a contradiction. Therefore ${\displaystyle m=n}$.

### Infinite Dimension

If the vector space ${\displaystyle V}$ has no basis of finitely many vectors, we say that ${\displaystyle V}$ has infinite dimension.

If ${\displaystyle V=\{{\vec {0}}\}}$, we say that ${\displaystyle \dim V=0}$

### Example

Find the dimension of the space of solutions to the system

{\displaystyle {\begin{aligned}x+y+z&=0\\x-2y+3z&=0\end{aligned}}}

Three variables, two independent constraints, so we expect dimension ${\displaystyle 3-2=1}$.

Note that the solutions form a vector space, and the set of solutions will be the nullspace of the coefficient matrix ${\displaystyle {\begin{bmatrix}1&1&1\\1&-2&3\end{bmatrix}}}$.

The solution is of the form ${\displaystyle \alpha \left\langle -5,2,3\right\rangle }$, so the basis has only one vector. Therefore ${\displaystyle \dim N=1}$
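As a numerical sketch (not from the lecture), the nullspace dimension equals the number of (near-)zero singular values of the coefficient matrix, and an SVD also produces a nullspace vector, recovering ${\displaystyle \left\langle -5,2,3\right\rangle }$ up to scaling:

```python
import numpy as np

# Coefficient matrix of the homogeneous system.
A = np.array([[1.0,  1.0, 1.0],
              [1.0, -2.0, 3.0]])

# nullity = (number of columns) - rank, with rank counted via singular values.
_, s, Vt = np.linalg.svd(A)
nullity = A.shape[1] - np.sum(s > 1e-10)
print(nullity)  # -> 1

# The last row of Vt spans the nullspace; check that it solves A x = 0.
x = Vt[-1]
print(np.allclose(A @ x, 0))  # -> True
```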

### Example

Find the dimension of the nullspace of the operator ${\displaystyle A=x\,{\frac {\mathrm {d} }{\mathrm {d} x}}-1}$ on ${\displaystyle P_{2}}$ (polynomials of degree at most 1).

In other words, ${\displaystyle A}$ does ${\displaystyle p(x)\mapsto x\,{\frac {\mathrm {d} p(x)}{\mathrm {d} x}}-p(x)}$. For a polynomial of degree ≤ 1, ${\displaystyle a+bx\mapsto x(b)-(a+bx)=-a}$. If ${\displaystyle A(a+bx)=0}$, then ${\displaystyle a=0}$, and ${\displaystyle b}$ is anything. Therefore, ${\displaystyle N(A)=\{bx~\mid ~b\in \mathbb {R} \}}$, the basis is ${\displaystyle \{x\}}$, and ${\displaystyle \dim N(A)=1}$.
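The same computation can be sketched numerically (an illustration, not part of the lecture) by representing ${\displaystyle A}$ as a matrix in the coordinate basis ${\displaystyle \{1,x\}}$, where ${\displaystyle a+bx\leftrightarrow (a,b)}$:

```python
import numpy as np

# Columns of the matrix of A = x d/dx - 1 in the basis {1, x}:
# A(1) = x*0 - 1 = -1  -> first column (-1, 0)
# A(x) = x*1 - x =  0  -> second column (0, 0)
M = np.array([[-1.0, 0.0],
              [ 0.0, 0.0]])

nullity = M.shape[1] - np.linalg.matrix_rank(M)
print(nullity)  # -> 1, matching dim N(A) = 1 with basis {x}
```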

### Example

A basis in ${\displaystyle \mathbb {R} ^{2\times 2}}$ (the space of ${\displaystyle 2\times 2}$ real matrices) would be:

{\displaystyle {\vec {e}}_{1}={\begin{bmatrix}1&0\\0&0\end{bmatrix}},\quad {\vec {e}}_{2}={\begin{bmatrix}0&1\\0&0\end{bmatrix}},\quad {\vec {e}}_{3}={\begin{bmatrix}0&0\\1&0\end{bmatrix}},\quad {\vec {e}}_{4}={\begin{bmatrix}0&0\\0&1\end{bmatrix}}}
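As a sketch (not in the lecture), any ${\displaystyle 2\times 2}$ matrix is a unique combination of ${\displaystyle {\vec {e}}_{1},\ldots ,{\vec {e}}_{4}}$; flattening a matrix reads off its coordinates in this basis, so ${\displaystyle \dim \mathbb {R} ^{2\times 2}=4}$. The matrix below is an arbitrary illustrative choice:

```python
import numpy as np

# The four basis matrices e1..e4 from the text.
E = [np.array([[1, 0], [0, 0]]),
     np.array([[0, 1], [0, 0]]),
     np.array([[0, 0], [1, 0]]),
     np.array([[0, 0], [0, 1]])]

M = np.array([[5, 7], [2, 9]])   # arbitrary example matrix
coords = M.flatten()             # coordinates of M in the basis e1..e4
recon = sum(c * e for c, e in zip(coords, E))
print(np.array_equal(recon, M))  # -> True: the basis reconstructs M uniquely
```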

### Theorem 3.4.3

Let ${\displaystyle V}$ be a vector space, and ${\displaystyle \dim V=n>0}$. Then

1. any ${\displaystyle n}$ linearly independent vectors ${\displaystyle f_{1},\ldots ,f_{n}}$ span ${\displaystyle V}$
2. any ${\displaystyle n}$ vectors that span ${\displaystyle V}$ are linearly independent.

Proof.

1. Assume they do not span ${\displaystyle V}$: then there is a vector ${\displaystyle {\vec {g}}}$ that is not a linear combination of the ${\displaystyle f}$'s, so ${\displaystyle \{f_{1},\ldots ,f_{n},{\vec {g}}\}}$ would be ${\displaystyle n+1}$ linearly independent vectors. This contradicts Theorem 3.4.1, since a spanning set of only ${\displaystyle n}$ vectors (a basis) exists.
2. Assume they are linearly dependent: then ${\displaystyle c_{1}\,f_{1}+\dots +c_{n}\,f_{n}={\vec {0}}}$, where the ${\displaystyle c_{i}}$ are not all 0. Then one vector can be written as a linear combination of the others, so ${\displaystyle n-1}$ of the vectors would already span ${\displaystyle V}$. By Theorem 3.4.1, the ${\displaystyle n}$ vectors of a basis would then be linearly dependent, a contradiction.
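Statement (1) can be illustrated numerically (a sketch, not part of the lecture): for the two independent vectors ${\displaystyle {\vec {v}}_{1}=\left\langle 1,1\right\rangle }$, ${\displaystyle {\vec {v}}_{2}=\left\langle 1,0\right\rangle }$ from the earlier example, any target vector (here an arbitrary choice) is solvable as a combination of them:

```python
import numpy as np

f1 = np.array([1.0, 1.0])
f2 = np.array([1.0, 0.0])
F = np.column_stack((f1, f2))
assert np.linalg.matrix_rank(F) == 2      # n = 2 linearly independent vectors

g = np.array([4.0, -3.0])                 # arbitrary target vector
c = np.linalg.solve(F, g)                 # always solvable since F is invertible
print(np.allclose(c[0] * f1 + c[1] * f2, g))  # -> True: f1, f2 span R^2
```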

### Theorem 3.4.4

${\displaystyle \dim V=n>0}$

1. No set of fewer than ${\displaystyle n}$ vectors spans ${\displaystyle V}$
2. Any ${\displaystyle m<n}$ linearly independent vectors could be extended (by more vectors) to form a basis.
3. If ${\displaystyle v_{1},\ldots ,v_{N}}$, ${\displaystyle N>n}$, span ${\displaystyle V}$, we can pare them down to ${\displaystyle n}$ vectors, which would be a basis.
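Part (3) can be sketched as a simple greedy procedure (an illustration, not from the lecture): walk through the spanning set and keep each vector only if it increases the rank of the vectors kept so far. The input vectors are ${\displaystyle {\vec {w}}_{1},{\vec {w}}_{2},{\vec {w}}_{3}}$ from the example above, which span ${\displaystyle \mathbb {R} ^{2}}$:

```python
import numpy as np

# A spanning set of R^2 with one redundant vector.
vectors = [np.array([1.0, 2.0]),
           np.array([2.0, 3.0]),
           np.array([3.0, 4.0])]

# Keep a vector only if it is independent of the ones already kept.
basis = []
for v in vectors:
    candidate = basis + [v]
    if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
        basis.append(v)

print(len(basis))  # -> 2 = dim R^2: the pared-down set is a basis
```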