# MATH 409 Lecture 23

Thursday, November 21, 2013


## Infinite Series

Given a sequence ${\displaystyle \left\{a_{n}\right\}}$ of real numbers, the expression

${\displaystyle a_{1}+a_{2}+\dots =\sum _{n=1}^{\infty }a_{n}}$

is called an infinite series with terms ${\displaystyle a_{n}}$.

The partial sum of order ${\displaystyle n}$ is given by

${\displaystyle s_{n}=a_{1}+\dots +a_{n}}$

If the sequence ${\displaystyle \left\{s_{n}\right\}}$ converges to limit ${\displaystyle s\in \mathbb {R} }$, we say the series converges to ${\displaystyle s}$ or that ${\displaystyle s}$ is the sum of the series and write ${\displaystyle \sum _{n=1}^{\infty }a_{n}=s}$

Otherwise the series diverges
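These definitions are easy to experiment with numerically. The following sketch (an illustration, not part of the lecture) computes the partial sums ${\displaystyle s_{n}}$ of the series ${\displaystyle \sum _{n=1}^{\infty }2^{-n}}$, for which ${\displaystyle s_{n}=1-2^{-n}\to 1}$:

```python
# Illustrative sketch (not from the lecture): partial sums of a series.
# For sum_{n=1}^infty 2^{-n}, the partial sums are s_n = 1 - 2^{-n} -> 1.

def partial_sums(terms):
    """Yield the partial sums s_1, s_2, ... of an iterable of terms a_n."""
    total = 0.0
    for a in terms:
        total += a
        yield total

s = list(partial_sums(2.0 ** -n for n in range(1, 51)))
print(s[4])   # s_5 = 1 - 2^{-5} = 0.96875
print(s[-1])  # s_50, within 10^{-15} of the sum 1
```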

### Cauchy Criterion

Theorem. [Cauchy Criterion]. An infinite series ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ converges if and only if for every ${\displaystyle \epsilon >0}$ there exists ${\displaystyle N\in \mathbb {N} }$ such that ${\displaystyle m\geq n\geq N}$ implies ${\displaystyle \left|a_{n}+a_{n+1}+\dots +a_{m}\right|<\epsilon }$

Proof. Let ${\displaystyle \left\{s_{n}\right\}}$ be the sequence of partial sums. Then

${\displaystyle a_{n}+a_{n+1}+\dots +a_{m}=s_{m}-s_{n-1}}$

Consequently, the condition of the theorem is equivalent to the condition that ${\displaystyle \left\{s_{n}\right\}}$ be a Cauchy sequence. As we know, a sequence is convergent if and only if it is a Cauchy sequence.

quod erat demonstrandum

### Examples

${\displaystyle {\frac {1}{2}}+{\frac {1}{2^{2}}}+\dots +{\frac {1}{2^{n}}}+\dots =1}$

The partial sums ${\displaystyle s_{n}}$ of this series satisfy ${\displaystyle s_{n}=1-2^{-n}}$ for all ${\displaystyle n\in \mathbb {N} }$. Thus ${\displaystyle s_{n}\to 1}$ as ${\displaystyle n\to \infty }$

${\displaystyle {\frac {1}{1\cdot 2}}+{\frac {1}{2\cdot 3}}+\dots +{\frac {1}{n(n+1)}}+\dots =1}$

Since ${\displaystyle {\frac {1}{n(n+1)}}={\frac {1}{n}}-{\frac {1}{n+1}}}$, the partial sums ${\displaystyle s_{n}}$ of this series satisfy ${\displaystyle s_{n}=1-{\frac {1}{n+1}}}$. Thus ${\displaystyle s_{n}\to 1}$ as ${\displaystyle n\to \infty }$
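The telescoping identity can be verified in exact arithmetic; this sketch (my illustration, using Python's `fractions` module) confirms ${\displaystyle s_{n}=1-{\frac {1}{n+1}}}$ for ${\displaystyle n=100}$:

```python
# Exact check of the telescoping sum s_n = 1 - 1/(n+1)
# for the series sum 1/(n(n+1)).

from fractions import Fraction

n_max = 100
s = sum(Fraction(1, n * (n + 1)) for n in range(1, n_max + 1))
assert s == 1 - Fraction(1, n_max + 1)
print(s)  # 100/101
```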

${\displaystyle \sum _{n=1}^{\infty }\left(-1\right)^{n}=-1+1-1+\dots }$ diverges

The partial sums ${\displaystyle s_{n}}$ satisfy ${\displaystyle s_{n}=-1}$ for odd ${\displaystyle n}$ and ${\displaystyle s_{n}=0}$ for even ${\displaystyle n}$. Hence the sequence ${\displaystyle \left\{s_{n}\right\}}$ has no limit.

The geometric series ${\displaystyle \sum _{n=0}^{\infty }x^{n}}$ converges if and only if ${\displaystyle \left|x\right|<1}$, in which case its sum is ${\displaystyle {\frac {1}{1-x}}}$.

In the case ${\displaystyle \left|x\right|\geq 1}$, the terms ${\displaystyle x^{n}}$ do not converge to zero, so the series diverges by the Divergence Test. For any ${\displaystyle x\neq 1}$, the partial sums of the geometric series satisfy

${\displaystyle s_{n}=1+x+x^{2}+\dots +x^{n}={\frac {1-x^{n+1}}{1-x}}}$

For ${\displaystyle \left|x\right|<1}$, ${\displaystyle s_{n}\to {\frac {1}{1-x}}}$ as ${\displaystyle n\to \infty }$.
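A quick numerical sanity check (illustrative) of the closed form ${\displaystyle s_{n}={\frac {1-x^{n+1}}{1-x}}}$ and the limit ${\displaystyle {\frac {1}{1-x}}}$:

```python
# Compare the direct sum 1 + x + ... + x^n with the closed form
# (1 - x^{n+1})/(1 - x), and watch s_n approach 1/(1 - x) for |x| < 1.

def geometric_partial_sum(x, n):
    return sum(x ** k for k in range(n + 1))

x = 0.5
for n in (1, 5, 20):
    closed = (1 - x ** (n + 1)) / (1 - x)
    assert abs(geometric_partial_sum(x, n) - closed) < 1e-12

print(geometric_partial_sum(x, 60))  # very close to 1/(1 - 0.5) = 2
```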

## Properties of Infinite Series

Theorem. [Divergence Test]. If the terms of an infinite series do not converge to zero, then the series diverges.

Theorem. [Linearity]. If ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ and ${\displaystyle \sum _{n=1}^{\infty }b_{n}}$ are convergent series, then

${\displaystyle \sum _{n=1}^{\infty }(a_{n}+b_{n})=\sum _{n=1}^{\infty }a_{n}+\sum _{n=1}^{\infty }b_{n}}$

and

${\displaystyle \sum _{n=1}^{\infty }\left(r\,a_{n}\right)=r\,\sum _{n=1}^{\infty }a_{n}}$

for any ${\displaystyle r\in \mathbb {R} }$.

Theorem. If ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ and ${\displaystyle \sum _{n=1}^{\infty }b_{n}}$ are convergent series, and ${\displaystyle a_{n}\leq b_{n}}$ for all ${\displaystyle n\in \mathbb {N} }$, then

${\displaystyle \sum _{n=1}^{\infty }a_{n}\leq \sum _{n=1}^{\infty }b_{n}}$

### Series with Nonnegative Terms

Suppose that a series ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ has nonnegative terms ${\displaystyle a_{n}\geq 0}$ for all ${\displaystyle n\in \mathbb {N} }$. Then the sequence of partial sums ${\displaystyle s_{n}=a_{1}+\dots +a_{n}}$ is increasing. It follows that ${\displaystyle \left\{s_{n}\right\}}$

• converges to a finite limit if bounded and
• diverges to ${\displaystyle +\infty }$ otherwise.

In the latter case, we write ${\displaystyle \sum _{n=1}^{\infty }a_{n}=\infty }$.

#### Comparison Test

Theorem. [Comparison Test]. Suppose that ${\displaystyle a_{n},b_{n}\geq 0}$ for all ${\displaystyle n\in \mathbb {N} }$ and ${\displaystyle a_{n}\leq b_{n}}$ for large enough ${\displaystyle n}$. Then

• convergence of the series ${\displaystyle \sum _{n=1}^{\infty }b_{n}}$ (larger terms) implies convergence of ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ (smaller terms), while
• divergence of the series ${\displaystyle \sum _{n=1}^{\infty }a_{n}=\infty }$ (smaller terms) implies divergence of ${\displaystyle \sum _{n=1}^{\infty }b_{n}=\infty }$ (larger terms)

Proof. Since changing a finite number of terms does not affect convergence of a series, it is no loss of generality to assume that ${\displaystyle a_{n}\leq b_{n}}$ for all ${\displaystyle n\in \mathbb {N} }$. Then the partial sums ${\displaystyle s_{n}=\sum _{k=1}^{n}a_{k}}$ and ${\displaystyle t_{n}=\sum _{k=1}^{n}b_{k}}$ satisfy ${\displaystyle s_{n}\leq t_{n}}$ for all ${\displaystyle n}$. Consequently, if ${\displaystyle s_{n}\to +\infty }$ as ${\displaystyle n\to \infty }$, then also ${\displaystyle t_{n}\to +\infty }$ as ${\displaystyle n\to \infty }$.

Conversely, if ${\displaystyle \left\{t_{n}\right\}}$ is bounded, then so is ${\displaystyle \left\{s_{n}\right\}}$, so convergence of the larger series implies convergence of the smaller one.

quod erat demonstrandum

#### Integral Test

Theorem. [Integral test]. Suppose a function ${\displaystyle f:\left[1,\infty \right)\to \mathbb {R} }$ is positive and decreasing on ${\displaystyle \left[1,\infty \right)}$. Then

1. the sequence ${\displaystyle \left\{y_{n}\right\}}$ is bounded, where ${\displaystyle y_{n}=f(1)+f(2)+\dots +f(n)-\int _{1}^{n}f(x)\,\mathrm {d} x}$ for ${\displaystyle n=1,2,\ldots }$
2. the series ${\displaystyle \sum _{n=1}^{\infty }f(n)}$ is convergent if and only if the function ${\displaystyle f}$ is improperly integrable on ${\displaystyle \left[1,\infty \right)}$.

Proof. To prove the theorem, we need the following lemma:

Lemma. Any monotone function ${\displaystyle g:\left[a,b\right]\to \mathbb {R} }$ is integrable on ${\displaystyle \left[a,b\right]}$.

Idea of the proof. Any monotone function has only jump discontinuities, and any function has at most countably many jump discontinuities. Moreover, a monotone function on a closed interval ${\displaystyle \left[a,b\right]}$ is clearly bounded.

The lemma implies that the function ${\displaystyle f}$ is integrable on every closed interval ${\displaystyle J=[a,b]\subset \left[1,\infty \right)}$. Then for any partition ${\displaystyle P}$ of the interval ${\displaystyle J}$, the lower Darboux sum ${\displaystyle L(f,P)}$ and the upper Darboux sum ${\displaystyle U(f,P)}$ satisfy

${\displaystyle L(f,P)\leq \int _{a}^{b}f(x)\,\mathrm {d} x\leq U(f,P)}$

Let ${\displaystyle P=\left\{x_{0},x_{1},\ldots ,x_{k}\right\}}$, where ${\displaystyle x_{0}<x_{1}<\dots <x_{k}}$. Then ${\displaystyle \sup {f\left(\left[x_{j-1},x_{j}\right]\right)}=f(x_{j-1})}$ and ${\displaystyle \inf {f\left(\left[x_{j-1},x_{j}\right]\right)}=f(x_{j})}$ since ${\displaystyle f}$ is decreasing. In the case ${\displaystyle J=\left[1,n\right]}$, where ${\displaystyle n\in \mathbb {N} }$, and ${\displaystyle P=\left\{1,2,\ldots ,n\right\}}$, we obtain

• ${\displaystyle L(f,P)=f(2)+f(3)+\dots +f(n)}$,
• ${\displaystyle U(f,P)=f(1)+f(2)+\dots +f(n-1)}$.

Then the above inequalities imply that ${\displaystyle 0<y_{n}\leq f(1)}$ for all ${\displaystyle n}$. Thus the sequence ${\displaystyle \left\{y_{n}\right\}}$ is bounded.

Now for the second part of the theorem: since ${\displaystyle f}$ is positive, the series ${\displaystyle \sum _{n=1}^{\infty }f(n)}$ either converges or else diverges to ${\displaystyle +\infty }$, and likewise the improper integral ${\displaystyle \int _{1}^{\infty }f(x)\,\mathrm {d} x}$ either converges or else diverges to ${\displaystyle +\infty }$. Since the partial sums and the integrals ${\displaystyle \int _{1}^{n}f(x)\,\mathrm {d} x}$ differ by the bounded sequence ${\displaystyle \left\{y_{n}\right\}}$, the series and the integral either both converge or both diverge.

quod erat demonstrandum
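Part 1 is easy to see numerically. Taking ${\displaystyle f(x)=x^{-2}}$ (an illustrative choice, not from the lecture), we have ${\displaystyle \int _{1}^{n}x^{-2}\,\mathrm {d} x=1-{\frac {1}{n}}}$, and the sketch below checks the bound ${\displaystyle 0<y_{n}\leq f(1)=1}$:

```python
# y_n = f(1) + ... + f(n) - int_1^n f(x) dx for f(x) = 1/x^2,
# where the integral evaluates to 1 - 1/n.
# The Integral Test (part 1) says {y_n} is bounded: 0 < y_n <= f(1) = 1.

def y(n):
    s = sum(1.0 / k ** 2 for k in range(1, n + 1))
    return s - (1.0 - 1.0 / n)

for n in (1, 10, 100, 1000):
    assert 0.0 < y(n) <= 1.0

print(y(1000))  # bounded, as the theorem guarantees
```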

#### Examples

Theorem. [P-Series Test]. [Riemann Zeta Function]. ${\displaystyle \sum _{n=1}^{\infty }{\frac {1}{n^{p}}}}$ is convergent for any ${\displaystyle p>1}$ and divergent for any ${\displaystyle p\leq 1}$ (the borderline case ${\displaystyle p=1}$, the harmonic series, is treated separately below).

Proof. For any ${\displaystyle p\neq 1}$, we have ${\displaystyle \int x^{-p}\,\mathrm {d} x={\frac {x^{1-p}}{1-p}}+C}$ on the interval ${\displaystyle \left[1,\infty \right)}$. The antiderivative converges to a finite limit at ${\displaystyle +\infty }$ in the case ${\displaystyle p>1}$ and diverges to ${\displaystyle +\infty }$ for ${\displaystyle p<1}$. Hence the function ${\displaystyle f(x)=x^{-p}}$ is improperly integrable on ${\displaystyle \left[1,\infty \right)}$ for ${\displaystyle p>1}$, but not for ${\displaystyle p<1}$. By the Integral Test, the series is convergent for ${\displaystyle p>1}$ and divergent for ${\displaystyle 0\leq p<1}$.

If ${\displaystyle p<0}$ then the Integral Test does not apply since ${\displaystyle f}$ is not decreasing. In this case, the series is divergent since the terms ${\displaystyle {\frac {1}{n^{p}}}}$ do not converge to ${\displaystyle 0}$ as ${\displaystyle n\to \infty }$.

quod erat demonstrandum
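The contrast between ${\displaystyle p>1}$ and ${\displaystyle p<1}$ shows up immediately in the partial sums. An illustrative computation (the value ${\displaystyle {\frac {\pi ^{2}}{6}}}$ for ${\displaystyle p=2}$ is Euler's classical result, not proved here):

```python
# Partial sums of the p-series for p = 2 (converges, to pi^2/6)
# and p = 1/2 (diverges; the partial sums grow roughly like 2*sqrt(n)).

import math

def p_series_partial(p, n):
    return sum(1.0 / k ** p for k in range(1, n + 1))

assert abs(p_series_partial(2.0, 10_000) - math.pi ** 2 / 6) < 1e-3
assert p_series_partial(0.5, 10_000) > 100  # growing without bound
print(p_series_partial(2.0, 10_000))
```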

Harmonic Series. ${\displaystyle \sum _{n=1}^{\infty }{\frac {1}{n}}}$ diverges.

Indeed, ${\displaystyle \int _{1}^{n}{\frac {1}{x}}\,\mathrm {d} x=\log {n}\to +\infty }$ as ${\displaystyle n\to \infty }$. By the Integral Test, the series is divergent.

Moreover, the sequence ${\displaystyle y_{n}=\sum _{k=1}^{n}k^{-1}-\log {n}}$ is bounded (actually, it is decreasing and hence convergent)

This was the bonus problem on the test.
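Numerically (an illustration of the bonus-problem fact), ${\displaystyle y_{n}=\sum _{k=1}^{n}k^{-1}-\log {n}}$ decreases toward the Euler–Mascheroni constant ${\displaystyle \gamma \approx 0.5772}$, a standard fact that the lecture does not prove:

```python
# y_n = H_n - log n is decreasing and bounded below, hence convergent;
# its limit is the Euler-Mascheroni constant gamma ~ 0.5772.

import math

def y(n):
    return sum(1.0 / k for k in range(1, n + 1)) - math.log(n)

values = [y(n) for n in (10, 100, 1000, 10_000)]
assert all(a > b for a, b in zip(values, values[1:]))  # decreasing
assert abs(values[-1] - 0.5772) < 1e-3
print(values)
```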

${\displaystyle \sum _{n=2}^{\infty }{\frac {1}{n\log ^{2}{n}}}}$ converges.

The antiderivative of ${\displaystyle f(x)=\left(x\,\log ^{2}{x}\right)^{-1}}$ on ${\displaystyle (1,\infty )}$ is

${\displaystyle \int {\frac {1}{x\,\log ^{2}{x}}}\,\mathrm {d} x=-{\frac {1}{\log {x}}}+C}$

Since the antiderivative converges to a finite limit at ${\displaystyle +\infty }$, the function ${\displaystyle f}$ is improperly integrable on ${\displaystyle \left[2,\infty \right)}$.
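An illustrative tail estimate from the same antiderivative: since the terms are positive and decreasing and ${\displaystyle \int _{N}^{\infty }{\frac {\mathrm {d} x}{x\log ^{2}{x}}}={\frac {1}{\log {N}}}}$, the partial sum up to ${\displaystyle N}$ undershoots the full sum by at most ${\displaystyle {\frac {1}{\log {N}}}}$:

```python
# Bracket the sum of 1/(n log^2 n): the partial sum up to N is a lower
# bound, and the integral tail 1/log(N) bounds the remainder from above.

import math

def term(n):
    return 1.0 / (n * math.log(n) ** 2)

N = 10_000
partial = sum(term(n) for n in range(2, N + 1))
tail_bound = 1.0 / math.log(N)

assert 0 < tail_bound < 0.11          # the bracket shrinks as N grows
print(partial, partial + tail_bound)  # the sum lies between these values
```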

${\displaystyle \sum _{n=1}^{\infty }{\frac {1}{1+n^{2}}}}$ converges.

Indeed, ${\displaystyle 0<{\frac {1}{1+n^{2}}}\leq {\frac {1}{n^{2}}}}$ for all ${\displaystyle n\in \mathbb {N} }$. Since the series ${\displaystyle \sum _{n=1}^{\infty }{\frac {1}{n^{2}}}}$ is convergent, it remains to apply the comparison test. Alternatively, we can use the integral test.

${\displaystyle \int {\frac {1}{1+x^{2}}}\,\mathrm {d} x=\arctan {x}+C}$ converges to a finite limit (${\displaystyle {\frac {\pi }{2}}+C}$) at ${\displaystyle +\infty }$ so the function ${\displaystyle f(x)={\frac {1}{1+x^{2}}}}$ is improperly integrable on ${\displaystyle \left[1,\infty \right)}$.

${\displaystyle \sum _{n=1}^{\infty }\mathrm {e} ^{-n^{2}}}$ converges (really fast!)

We have ${\displaystyle 0<\mathrm {e} ^{-n^{2}}\leq \mathrm {e} ^{-n}}$ for all ${\displaystyle n\in \mathbb {N} }$. The geometric series ${\displaystyle \sum _{n=1}^{\infty }\mathrm {e} ^{-n}=\sum _{n=1}^{\infty }\left({\frac {1}{e}}\right)^{n}}$ is convergent since ${\displaystyle 0<\mathrm {e} ^{-1}<1}$. By the comparison test, ${\displaystyle \sum _{n=1}^{\infty }\mathrm {e} ^{-n^{2}}}$ is convergent as well.
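The comparison can be watched directly (an illustration): term by term ${\displaystyle \mathrm {e} ^{-n^{2}}\leq \mathrm {e} ^{-n}}$, and the dominating geometric series sums to ${\displaystyle {\frac {1}{e-1}}}$:

```python
# Partial sums of sum e^{-n^2} are dominated by the geometric series
# sum e^{-n}, whose full sum is 1/(e - 1) ~ 0.582.

import math

s_fast = sum(math.exp(-n * n) for n in range(1, 30))
s_geo = sum(math.exp(-n) for n in range(1, 30))

assert s_fast <= s_geo < 1.0 / (math.e - 1)
print(s_fast)  # ~ 0.3863; the terms shrink so fast that 30 terms suffice
```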