# MATH 323 Lecture 11

« previous | Tuesday, October 2, 2012 | next »

## Linear Independence

1. If ${\displaystyle v_{1},\ldots ,v_{n}}$ span ${\displaystyle V}$ and one of these vectors can be written as a linear combination of the ${\displaystyle n-1}$ others, then those ${\displaystyle n-1}$ vectors span ${\displaystyle V}$.
2. Given ${\displaystyle n}$ vectors ${\displaystyle v_{1},\ldots ,v_{n}\in V}$, it is possible to write one of the vectors as a linear combination of the other ${\displaystyle n-1}$ vectors iff there exist scalars ${\displaystyle c_{1},\ldots ,c_{n}}$ (not all zero!) such that ${\displaystyle c_{1}\,v_{1}+\dots +c_{n}\,v_{n}=0}$.

Recall that ${\displaystyle c_{1}\,v_{1}+\dots +c_{n}\,v_{n}}$ is a linear combination of ${\displaystyle v_{1},\ldots ,v_{n}}$.

The vectors ${\displaystyle v_{1},\ldots ,v_{n}\in V}$ are said to be linearly independent if ${\displaystyle c_{1}\,v_{1}+\dots +c_{n}\,v_{n}=0}$ implies that all the scalars ${\displaystyle c_{1},\dots ,c_{n}}$ are zero.

### Example

For two vectors ${\displaystyle v_{1},v_{2}\in V}$, suppose ${\displaystyle c_{1}\,v_{1}+c_{2}\,v_{2}=0}$ with ${\displaystyle c_{1},c_{2}}$ not both zero:

1. If ${\displaystyle c_{1}\neq 0}$, then ${\displaystyle v_{1}=-{\frac {c_{2}}{c_{1}}}\,v_{2}}$
2. If ${\displaystyle c_{2}\neq 0}$, then ${\displaystyle v_{2}=-{\frac {c_{1}}{c_{2}}}\,v_{1}}$

In either case, one vector is a scalar multiple of the other, so the two vectors are linearly dependent.

### Example

Which of the following collections are linearly independent?

1. ${\displaystyle {\begin{pmatrix}1\\1\\1\end{pmatrix}},{\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\0\end{pmatrix}}}$ Yes.
2. ${\displaystyle {\begin{pmatrix}1\\0\\1\end{pmatrix}},{\begin{pmatrix}0\\1\\0\end{pmatrix}}}$ Yes.
3. ${\displaystyle {\begin{pmatrix}1\\2\\4\end{pmatrix}},{\begin{pmatrix}2\\1\\3\end{pmatrix}},{\begin{pmatrix}4\\-1\\1\end{pmatrix}}}$ No. The matrix of the system ${\displaystyle c_{1}\,v_{1}+c_{2}\,v_{2}+c_{3}\,v_{3}=0}$ is singular, so there are nontrivial solutions ${\displaystyle \left\langle c_{1},c_{2},c_{3}\right\rangle }$ that satisfy the equation. (See #Theorem)
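The singularity claim in case 3 can be verified computationally. The sketch below (Python, for illustration; not part of the lecture) forms the matrix whose columns are the three vectors and computes its ${\displaystyle 3\times 3}$ determinant by cofactor expansion along the first row; a zero determinant means the matrix is singular.

```python
def det3(m):
    """Determinant of a 3x3 matrix (given as a list of rows),
    by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Columns are the vectors from case 3: (1,2,4), (2,1,3), (4,-1,1)
X = [[1, 2, 4],
     [2, 1, -1],
     [4, 3, 1]]
print(det3(X))  # 0, so X is singular and the vectors are linearly dependent
```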

## Theorem

Let ${\displaystyle x_{1},\ldots ,x_{n}}$ be ${\displaystyle n}$ vectors in ${\displaystyle \mathbb {R} ^{n}}$ and let ${\displaystyle X=(x_{1},\ldots ,x_{n})}$ be the ${\displaystyle n\times n}$ matrix formed by using the ${\displaystyle x}$ vectors as columns.

The vectors ${\displaystyle x_{1},\ldots ,x_{n}}$ are linearly dependent iff ${\displaystyle X}$ is singular, and linearly independent iff ${\displaystyle X}$ is nonsingular.

### Proof

Let ${\displaystyle {\vec {c}}=\left\langle c_{1},\ldots ,c_{n}\right\rangle }$, then ${\displaystyle X\,{\vec {c}}={\vec {0}}}$ has a nontrivial solution iff ${\displaystyle X}$ is singular.

### For Non-Square Matrix

For ${\displaystyle x_{1},\ldots ,x_{k}\in \mathbb {R} ^{n}}$, the system ${\displaystyle c_{1}\,x_{1}+\dots +c_{k}\,x_{k}=0}$ can be written as ${\displaystyle X\,{\vec {c}}={\vec {0}}}$, where ${\displaystyle X=(x_{1},\ldots ,x_{k})}$ is an ${\displaystyle n\times k}$ matrix. Therefore if ${\displaystyle n\neq k}$, the determinant of ${\displaystyle X}$ is not defined and #Theorem does not apply.

However, the system has nontrivial solutions (i.e. ${\displaystyle x_{1},\ldots ,x_{k}}$ are linearly dependent) iff the row echelon form of ${\displaystyle X}$ has at least one free variable.

#### Example

${\displaystyle x_{1}=\left\langle 1,-1,2,3\right\rangle }$, ${\displaystyle x_{2}=\left\langle -2,3,1,-2\right\rangle }$, and ${\displaystyle x_{3}=\left\langle 1,0,7,7\right\rangle }$

${\displaystyle {\begin{bmatrix}1&-2&1&0\\-1&3&0&0\\2&1&7&0\\3&-2&7&0\end{bmatrix}}\longrightarrow {\begin{bmatrix}1&-2&1&0\\0&1&1&0\\0&0&0&0\\0&0&0&0\end{bmatrix}}}$

Therefore, since ${\displaystyle c_{3}}$ is a free variable, ${\displaystyle \{x_{1},x_{2},x_{3}\}}$ is linearly dependent.
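The row reduction above can be reproduced mechanically. This sketch (Python, for illustration) reduces the coefficient matrix to row echelon form with exact rational arithmetic and counts pivot columns; any column without a pivot corresponds to a free variable.

```python
from fractions import Fraction

def row_echelon(rows):
    """Reduce a matrix (list of rows) to row echelon form using exact
    arithmetic; return the reduced matrix and the list of pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivot_cols = []
    r = 0
    for c in range(ncols):
        # Find a pivot entry in column c at or below row r.
        pivot = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot here: column c is a free variable
        m[r], m[pivot] = m[pivot], m[r]
        # Eliminate entries below the pivot.
        for i in range(r + 1, nrows):
            factor = m[i][c] / m[r][c]
            m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivot_cols.append(c)
        r += 1
    return m, pivot_cols

# Columns are x1, x2, x3 from the example above.
X = [[1, -2, 1],
     [-1, 3, 0],
     [2, 1, 7],
     [3, -2, 7]]
_, pivots = row_echelon(X)
free = len(X[0]) - len(pivots)
print(len(pivots), free)  # 2 pivots, 1 free variable -> linearly dependent
```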

## Theorem

Let ${\displaystyle v_{1},\ldots ,v_{n}\in V}$. A vector ${\displaystyle w\in \mathrm {Span} (v_{1},\ldots ,v_{n})}$ can be written uniquely as a linear combination of ${\displaystyle v_{1},\ldots ,v_{n}}$ iff ${\displaystyle v_{1},\ldots ,v_{n}}$ are linearly independent.

### Proof

${\displaystyle w\in \mathrm {Span} (v_{1},\ldots ,v_{n})\iff w=c_{1}\,v_{1}+\dots +c_{n}\,v_{n}}$ for some scalars ${\displaystyle c_{1},\ldots ,c_{n}}$.

Assume the representation is not unique, that is, ${\displaystyle w=\alpha _{1}\,v_{1}+\dots +\alpha _{n}\,v_{n}=\beta _{1}\,v_{1}+\dots +\beta _{n}\,v_{n}}$ where ${\displaystyle \alpha _{i}\neq \beta _{i}}$ for some ${\displaystyle i}$. Then ${\displaystyle 0=(\alpha _{1}-\beta _{1})v_{1}+\dots +(\alpha _{n}-\beta _{n})v_{n}}$ with ${\displaystyle \alpha _{i}-\beta _{i}\neq 0}$, so ${\displaystyle v_{1},\ldots ,v_{n}}$ would be linearly dependent.

Conversely, if ${\displaystyle v_{1},\ldots ,v_{n}}$ are linearly dependent, then there exist ${\displaystyle c_{1},\ldots ,c_{n}}$ (not all zero) such that

{\displaystyle {\begin{aligned}0&=c_{1}\,v_{1}+\dots +c_{n}\,v_{n}\\w&=\alpha _{1}\,v_{1}+\dots +\alpha _{n}\,v_{n}\\w&=(\alpha _{1}+c_{1})v_{1}+\dots +(\alpha _{n}+c_{n})v_{n}\end{aligned}}}

The second and third lines give two different representations of ${\displaystyle w}$, so the representation is not unique.

### Example

Let ${\displaystyle p_{1}(x)=x^{2}-2x+3}$, ${\displaystyle p_{2}(x)=2x^{2}+x+8}$, and ${\displaystyle p_{3}(x)=x^{2}+8x+7}$ be in ${\displaystyle P_{3}}$.

{\displaystyle {\begin{aligned}c_{1}\,p_{1}(x)+c_{2}\,p_{2}(x)+c_{3}\,p_{3}(x)&=0\\c_{1}(x^{2}-2x+3)+c_{2}(2x^{2}+x+8)+c_{3}(x^{2}+8x+7)&=0\\(c_{1}+2c_{2}+c_{3})x^{2}+(-2c_{1}+c_{2}+8c_{3})x+(3c_{1}+8c_{2}+7c_{3})&=0x^{2}+0x+0\end{aligned}}}

Therefore, since coefficients of terms must be equal,

{\displaystyle {\begin{aligned}c_{1}+2c_{2}+c_{3}&=0\\-2c_{1}+c_{2}+8c_{3}&=0\\3c_{1}+8c_{2}+7c_{3}&=0\end{aligned}}}

${\displaystyle {\begin{vmatrix}1&2&1\\-2&1&8\\3&8&7\end{vmatrix}}=0}$, so the matrix is singular, and therefore the polynomials ${\displaystyle p_{1}(x),p_{2}(x),p_{3}(x)}$ are linearly dependent.

## Wronskian Theorem

The following determinant of the matrix of functions and their derivatives

${\displaystyle W[f_{1},\ldots ,f_{n}](x)={\begin{vmatrix}f_{1}(x)&f_{2}(x)&\dots &f_{n}(x)\\f_{1}'(x)&f_{2}'(x)&\dots &f_{n}'(x)\\\vdots &\vdots &\ddots &\vdots \\f_{1}^{(n-1)}(x)&f_{2}^{(n-1)}(x)&\dots &f_{n}^{(n-1)}(x)\end{vmatrix}}}$

is called the Wronskian of ${\displaystyle f_{1},\ldots ,f_{n}}$.

Let ${\displaystyle f_{1},\ldots ,f_{n}}$ be functions that are ${\displaystyle (n-1)}$-times differentiable on the interval ${\displaystyle [a,b]}$. If there exists a point ${\displaystyle x_{0}\in [a,b]}$ such that ${\displaystyle W[f_{1},\ldots ,f_{n}](x_{0})\neq 0}$, then ${\displaystyle f_{1},\ldots ,f_{n}}$ are linearly independent.
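As a quick worked example (not from the lecture), take ${\displaystyle f_{1}(x)=1}$, ${\displaystyle f_{2}(x)=x}$, ${\displaystyle f_{3}(x)=x^{2}}$ on any interval. The Wronskian matrix is upper triangular, so its determinant is the product of the diagonal entries:

${\displaystyle W[1,x,x^{2}](x)={\begin{vmatrix}1&x&x^{2}\\0&1&2x\\0&0&2\end{vmatrix}}=2\neq 0}$

Since the Wronskian is nonzero at every point, ${\displaystyle 1,x,x^{2}}$ are linearly independent.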