# MATH 409 Lecture 20

« previous | Thursday, November 7, 2013 | next »

Lecture Slides

Exam next Thursday

Homework problems only require the definition of the improper Riemann integral; see the hint for problem 5.3.12

## Integral with Variable Limit

Theorem. Suppose ${\displaystyle f}$ is integrable on ${\displaystyle [a,b]}$ and let ${\displaystyle F(x)=\int _{a}^{x}f(t)\,\mathrm {d} t}$. If ${\displaystyle f}$ is continuous at a point ${\displaystyle x\in [a,b]}$, then ${\displaystyle F}$ is differentiable at ${\displaystyle x}$ and ${\displaystyle F'(x)=f(x)}$.

Proof. For any ${\displaystyle x,y\in [a,b]}$ such that ${\displaystyle x<y}$, we have

${\displaystyle \int _{a}^{y}f(t)\,\mathrm {d} t=\int _{a}^{x}f(t)\,\mathrm {d} t+\int _{x}^{y}f(t)\,\mathrm {d} t\,\!}$

Then

${\displaystyle F(y)-F(x)-f(x)(y-x)=\int _{x}^{y}f(t)\,\mathrm {d} t-\int _{x}^{y}f(x)\,\mathrm {d} t}$

So that

${\displaystyle \left|F(y)-F(x)-f(x)(y-x)\right|=\left|\int _{x}^{y}\left(f(t)-f(x)\right)\,\mathrm {d} t\right|\leq \int _{x}^{y}\left|f(t)-f(x)\right|\,\mathrm {d} t\leq \sup _{t\in [x,y]}\left|f(t)-f(x)\right|\,(y-x)}$

Dividing both sides by ${\displaystyle (y-x)}$ gives

${\displaystyle \left|{\frac {F(y)-F(x)}{y-x}}-f(x)\right|\leq \sup _{t\in [x,y]}\left|f(t)-f(x)\right|}$

If the function ${\displaystyle f}$ is right continuous at ${\displaystyle x}$ (i.e. ${\displaystyle f(y)\to f(x)}$ as ${\displaystyle y\to x^{+}}$), then ${\displaystyle \sup _{t\in [x,y]}\left|f(t)-f(x)\right|\to 0}$ as ${\displaystyle y\to x^{+}}$. It follows that ${\displaystyle f(x)}$ is the right-hand derivative of ${\displaystyle F}$ at ${\displaystyle x}$. Likewise, one can prove that left continuity of f at x implies ${\displaystyle f(x)}$ is the left derivative at ${\displaystyle x}$.

quod erat demonstrandum
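As a numerical sanity check (a sketch, not part of the lecture), the difference quotient of ${\displaystyle F}$ should approach ${\displaystyle f(x)}$ when ${\displaystyle f}$ is continuous at ${\displaystyle x}$. The helper `integral` is a hypothetical name, and ${\displaystyle f=\cos }$ is an arbitrary choice:

```python
import math

def integral(f, a, b, n=100_000):
    """Midpoint Riemann sum approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# F(x) = integral of f from a to x; its right-hand difference quotient at x
# should approach f(x) when f is continuous at x.
f = math.cos
a, x, dy = 0.0, 1.0, 1e-4
F = lambda y: integral(f, a, y)
quotient = (F(x + dy) - F(x)) / dy
assert abs(quotient - f(x)) < 1e-3
```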

## The Fundamental Theorem of Calculus

Theorem. [Part I]. If a function ${\displaystyle f}$ is continuous on an interval ${\displaystyle [a,b]}$, then the function

${\displaystyle F(x)=\int _{a}^{x}f(t)\,\mathrm {d} t\qquad x\in [a,b]}$

is continuously differentiable on ${\displaystyle [a,b]}$. Moreover, ${\displaystyle F'(x)=f(x)}$ for all ${\displaystyle x\in [a,b]}$.

Proof. Since ${\displaystyle f}$ is continuous, it is also integrable on ${\displaystyle [a,b]}$. As proved earlier, integrability of ${\displaystyle f}$ implies that the function ${\displaystyle F}$ is well-defined on ${\displaystyle [a,b]}$. Moreover, ${\displaystyle F'(x)=f(x)}$ whenever ${\displaystyle f}$ is continuous at the point ${\displaystyle x}$. Therefore, the continuity of ${\displaystyle f}$ on ${\displaystyle [a,b]}$ implies that ${\displaystyle F'(x)=f(x)}$ for all ${\displaystyle x\in [a,b]}$. In particular, ${\displaystyle F}$ is continuously differentiable on ${\displaystyle [a,b]}$.

quod erat demonstrandum

Theorem. [Part II]. If a function ${\displaystyle F}$ is differentiable on ${\displaystyle [a,b]}$ and the derivative ${\displaystyle F'}$ is integrable on ${\displaystyle [a,b]}$, then

${\displaystyle \int _{a}^{x}F'(t)\,\mathrm {d} t=F(x)-F(a)}$ for all ${\displaystyle x\in [a,b]}$

Proof. The case ${\displaystyle x=a}$ is trivial: ${\displaystyle 0=0}$. Now assume ${\displaystyle x\in (a,b]}$. Since ${\displaystyle F'}$ is integrable on ${\displaystyle [a,b]}$, it is also integrable on every subinterval ${\displaystyle [a,x]}$ with ${\displaystyle x\in (a,b)}$. Hence there is no loss of generality in assuming ${\displaystyle x=b}$.

Consider an arbitrary partition ${\displaystyle P=\left\{x_{0},x_{1},\ldots ,x_{n}\right\}}$ of ${\displaystyle [a,b]}$. Let us choose samples ${\displaystyle t_{j}\in [x_{j-1},x_{j}]}$ for the Riemann sum ${\displaystyle {\mathcal {S}}(F',P,t_{j})}$ so that ${\displaystyle F(x_{j})-F(x_{j-1})=F'(t_{j})\,(x_{j}-x_{j-1})}$ (this is possible due to the Mean Value Theorem). Then

{\displaystyle {\begin{aligned}{\mathcal {S}}(F',P,t_{j})&=\sum _{j=1}^{n}F'(t_{j})\,(x_{j}-x_{j-1})\\&=\sum _{j=1}^{n}\left(F(x_{j})-F(x_{j-1})\right)\\&=F(x_{n})-F(x_{0})\\&=F(b)-F(a)\end{aligned}}}

Since the sums ${\displaystyle {\mathcal {S}}(F',P,t_{j})}$ converge to ${\displaystyle \int _{a}^{b}F'(t)\,\mathrm {d} t}$ as ${\displaystyle \left\|P\right\|\to 0}$, the theorem follows: ${\displaystyle \int _{a}^{b}F'(t)\,\mathrm {d} t=F(b)-F(a)}$.

quod erat demonstrandum
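The Riemann-sum argument in the proof can be observed numerically (a sketch; the example ${\displaystyle F(x)=x^{3}}$ and the helper name `riemann_sum` are my own choices, not from the lecture):

```python
def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f on a uniform partition of [a, b]."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

# Example: F(x) = x^3, so F'(x) = 3x^2 on [0, 2].
F = lambda x: x**3
Fprime = lambda x: 3 * x**2
a, b = 0.0, 2.0
approx = riemann_sum(Fprime, a, b, 1_000_000)
# As the mesh of the partition tends to 0, the sums approach F(b) - F(a) = 8.
assert abs(approx - (F(b) - F(a))) < 1e-3
```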

### The Indefinite Integral

Given a function ${\displaystyle f:[a,b]\to \mathbb {R} }$, a function ${\displaystyle F:[a,b]\to \mathbb {R} }$ is called the indefinite integral (or antiderivative, primitive integral, or the primitive) of ${\displaystyle f}$ if ${\displaystyle F'(x)=f(x)}$ for all ${\displaystyle x\in [a,b]}$. Notation:

${\displaystyle \int f(x)\,\mathrm {d} x}$

If the function ${\displaystyle f}$ is continuous on ${\displaystyle [a,b]}$, then the function ${\displaystyle F(x)=\int _{a}^{x}f(t)\,\mathrm {d} t}$ for ${\displaystyle x\in [a,b]}$ is an indefinite integral of ${\displaystyle f}$ due to the Fundamental Theorem of Calculus.

Suppose ${\displaystyle F}$ is an antiderivative of ${\displaystyle f}$. If ${\displaystyle G}$ is another antiderivative of ${\displaystyle f}$, then ${\displaystyle G'=F'}$ on ${\displaystyle [a,b]}$. Hence ${\displaystyle (G-F)'=G'-F'=0}$ on ${\displaystyle [a,b]}$. It follows that ${\displaystyle G-F}$ is a constant function. Conversely, for any constant ${\displaystyle C}$, the function ${\displaystyle G(x)=F(x)+C}$ is also an antiderivative of ${\displaystyle f}$.

${\displaystyle \int f(x)\,\mathrm {d} x=F(x)+C}$

Where ${\displaystyle C}$ is an arbitrary constant.

#### Examples

• ${\displaystyle \int x^{\alpha }\,\mathrm {d} x={\frac {x^{\alpha +1}}{\alpha +1}}+C}$ on ${\displaystyle (0,\infty )}$ for ${\displaystyle \alpha \neq -1}$.
• ${\displaystyle \int {\frac {1}{x}}\,\mathrm {d} x=\log {x}+C}$ on ${\displaystyle (0,\infty )}$.
• ${\displaystyle \int \sin {x}\,\mathrm {d} x=-\cos {x}+C}$
• ${\displaystyle \int \cos {x}\,\mathrm {d} x=\sin {x}+C}$
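Each formula can be spot-checked by differentiating the claimed antiderivative numerically (a sketch; the exponent ${\displaystyle \alpha =1/2}$ and the point ${\displaystyle x=2}$ are arbitrary choices):

```python
# Check d/dx [x^(alpha+1)/(alpha+1)] = x^alpha on (0, inf), here at x = 2.
alpha = 0.5                   # arbitrary exponent with alpha != -1
F = lambda x: x ** (alpha + 1) / (alpha + 1)
x, h = 2.0, 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)   # central difference quotient
assert abs(deriv - x ** alpha) < 1e-6
```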

### Integration by Parts

Theorem. Suppose that functions ${\displaystyle f}$ and ${\displaystyle g}$ are differentiable on ${\displaystyle [a,b]}$ with derivatives ${\displaystyle f'}$ and ${\displaystyle g'}$ integrable on ${\displaystyle [a,b]}$. Then

${\displaystyle \int _{a}^{b}f(x)\,g'(x)\,\mathrm {d} x=f(b)\,g(b)-f(a)\,g(a)-\int _{a}^{b}f'(x)\,g(x)\,\mathrm {d} x}$

Proof. By the product rule, ${\displaystyle (f\,g)'=f'\,g+f\,g'}$ on ${\displaystyle [a,b]}$. Since ${\displaystyle f,f',g,g'}$ are integrable on ${\displaystyle [a,b]}$ by the hypothesis, so are the products ${\displaystyle f'\,g}$ and ${\displaystyle f\,g'}$. Then ${\displaystyle (f\,g)'}$ is integrable on ${\displaystyle [a,b]}$ as well. By the Fundamental Theorem of Calculus,

{\displaystyle {\begin{aligned}f(b)\,g(b)-f(a)\,g(a)&=\int _{a}^{b}(f\,g)'(x)\,\mathrm {d} x\\&=\int _{a}^{b}f'(x)\,g(x)\,\mathrm {d} x+\int _{a}^{b}f(x)\,g'(x)\,\mathrm {d} x\end{aligned}}}
quod erat demonstrandum

Corollary. Suppose that functions ${\displaystyle f,g}$ are continuously differentiable on ${\displaystyle [a,b]}$. Then

${\displaystyle \int f(x)\,g'(x)\,\mathrm {d} x=f(x)\,g(x)-\int f'(x)\,g(x)\,\mathrm {d} x}$ on ${\displaystyle [a,b]}$

To simplify notation, it is convenient to use the Leibniz differential ${\displaystyle \mathrm {d} f}$ of a function ${\displaystyle f}$ defined by ${\displaystyle \mathrm {d} f(x)=f'(x)\,\mathrm {d} x={\frac {\mathrm {d} f}{\mathrm {d} x}}\,\mathrm {d} x}$. Another convenient notation is ${\displaystyle \left.f(x)\right|_{x=a}^{b}}$ or simply ${\displaystyle \left.f(x)\right|_{a}^{b}}$, which denotes the difference ${\displaystyle f(b)-f(a)}$.

Now the formula of integration by parts can be rewritten as

${\displaystyle \int _{a}^{b}f(x)\,\mathrm {d} g(x)=\left.f(x)\,g(x)\right|_{a}^{b}-\int _{a}^{b}g(x)\,\mathrm {d} f(x)}$

for definite integrals, and as

${\displaystyle \int f\,\mathrm {d} g=f\,g-\int g\,\mathrm {d} f}$

for indefinite integrals.

#### Examples

• ${\displaystyle \int \log {x}\,\mathrm {d} x=x\,\log {x}-\int x\,\mathrm {d} (\log {x})=x\,\log {x}-\int \mathrm {d} x=x\,\log {x}-x+C}$
• ${\displaystyle \int _{0}^{\frac {\pi }{2}}x\,\sin {x}\,\mathrm {d} x=\left.-x\,\cos {x}\right|_{0}^{\frac {\pi }{2}}-\int _{0}^{\frac {\pi }{2}}(-\cos {x})\,\mathrm {d} x=\int _{0}^{\frac {\pi }{2}}\cos {x}\,\mathrm {d} x=\left.\sin {x}\right|_{0}^{\frac {\pi }{2}}=1}$.
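Both worked examples can be verified numerically (a sketch; `midpoint_integral` is an assumed helper name, and the second check uses ${\displaystyle \int _{1}^{e}\log {x}\,\mathrm {d} x=\left.(x\log {x}-x)\right|_{1}^{e}=1}$):

```python
import math

def midpoint_integral(f, a, b, n=200_000):
    """Midpoint Riemann sum approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Integration by parts gives int_0^{pi/2} x sin x dx = 1.
parts_value = midpoint_integral(lambda x: x * math.sin(x), 0.0, math.pi / 2)
assert abs(parts_value - 1.0) < 1e-8

# The antiderivative x log x - x gives int_1^e log x dx = 1.
log_value = midpoint_integral(math.log, 1.0, math.e)
assert abs(log_value - 1.0) < 1e-8
```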

## Change of the variable in an integral

(commonly called u-substitution as ${\displaystyle \phi }$ is often replaced by ${\displaystyle u}$ in practice.)

Theorem. If ${\displaystyle \phi }$ is continuously differentiable on a closed, nondegenerate interval ${\displaystyle [a,b]}$ and ${\displaystyle f}$ is continuous on ${\displaystyle \phi ([a,b])}$, then

${\displaystyle \int _{\phi (a)}^{\phi (b)}f(t)\,\mathrm {d} t=\int _{a}^{b}f(\phi (x))\,\phi '(x)\,\mathrm {d} x=\int _{a}^{b}f(\phi (x))\,\mathrm {d} \phi (x)}$

Be aware that ${\displaystyle t=\phi (x)}$ is a proper change of variable only if the function ${\displaystyle \phi }$ is strictly monotone. However, the theorem holds even without this assumption.

Proof. Let us define two functions

{\displaystyle {\begin{aligned}F(u)&=\int _{\phi (a)}^{u}f(t)\,\mathrm {d} t\qquad u\in \phi ([a,b])\\G(x)&=\int _{a}^{x}f(\phi (s))\,\phi '(s)\,\mathrm {d} s\qquad x\in [a,b]\end{aligned}}}

It follows from the Fundamental Theorem of Calculus that ${\displaystyle F'(u)=f(u)}$ and ${\displaystyle G'(x)=f(\phi (x))\,\phi '(x)}$. By the Chain Rule, ${\displaystyle (F\circ \phi )'(x)=F'(\phi (x))\,\phi '(x)=f(\phi (x))\,\phi '(x)=G'(x)}$.

Therefore, ${\displaystyle \left(F(\phi (x))-G(x)\right)'=0}$ for all ${\displaystyle x\in [a,b]}$. It follows that the function ${\displaystyle F(\phi (x))-G(x)}$ is constant on ${\displaystyle [a,b]}$. In particular, ${\displaystyle F(\phi (b))-G(b)=F(\phi (a))-G(a)=0-0=0}$. Hence ${\displaystyle F(\phi (b))=G(b)}$.

quod erat demonstrandum
Note: It is possible that ${\displaystyle \phi (a)\geq \phi (b)}$. To make sense of this case, we set

${\displaystyle \int _{c}^{d}f(t)\,\mathrm {d} t=-\int _{d}^{c}f(t)\,\mathrm {d} t}$

whenever ${\displaystyle c>d}$, and we set the integral to be ${\displaystyle 0}$ if ${\displaystyle c=d}$.
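The substitution formula can be checked numerically (a sketch; the choices ${\displaystyle \phi (x)=x^{2}}$, ${\displaystyle f=\cos }$ on ${\displaystyle [0,1]}$ are my own, and `midpoint_integral` is an assumed helper name). Here both sides equal ${\displaystyle \sin {1}}$:

```python
import math

def midpoint_integral(f, a, b, n=200_000):
    """Midpoint Riemann sum approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

phi = lambda x: x ** 2        # substitution t = phi(x)
dphi = lambda x: 2 * x        # phi'(x)
f = math.cos
a, b = 0.0, 1.0
lhs = midpoint_integral(f, phi(a), phi(b))                    # int_{phi(a)}^{phi(b)} f(t) dt
rhs = midpoint_integral(lambda x: f(phi(x)) * dphi(x), a, b)  # int_a^b f(phi(x)) phi'(x) dx
assert abs(lhs - rhs) < 1e-8
```

Note that ${\displaystyle \phi (x)=x^{2}}$ happens to be strictly monotone on ${\displaystyle [0,1]}$; the theorem would apply even if it were not.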