# MATH 409 Lecture 21

Tuesday, November 12, 2013

This lecture concludes the Exam 2 content.

## Exam 2 Review

### Topics

Derivatives

• Derivative of a function
• Differentiability Theorems
• derivative of inverse function
• Mean Value Theorem
• Rolle's Theorem
• Generalized Mean Value Theorem
• Taylor's Formula
• l'Hôpital's rule

Integrals

• Definitions
• Darboux Sums
• Riemann Sums
• Riemann Integral
• Properties of Integrals
• Fundamental Theorem of Calculus (both parts)
• Integration by Parts
• Change of variable in an integral

Textbook sections 4.1–4.5 and 5.1–5.3

#### Differentiability Theorems

• sum / difference rules
• product rule
• quotient rule
• chain rule
• differentiability implies continuity
• Rolle's Theorem: If a function ${\displaystyle f}$ is continuous on a closed interval ${\displaystyle [a,b]}$, differentiable on ${\displaystyle (a,b)}$, and ${\displaystyle f(a)=f(b)}$, then ${\displaystyle f'(c)=0}$ for some ${\displaystyle c\in (a,b)}$
• Mean Value Theorem: If a function ${\displaystyle f}$ is continuous on ${\displaystyle [a,b]}$ and differentiable on ${\displaystyle (a,b)}$, then there exists ${\displaystyle c\in (a,b)}$ such that ${\displaystyle f(b)-f(a)=f'(c)\,(b-a)}$.
• ${\displaystyle f}$ is increasing on ${\displaystyle [a,b]}$ if and only if ${\displaystyle f'\geq 0}$ on ${\displaystyle (a,b)}$
• ${\displaystyle f}$ is decreasing on ${\displaystyle [a,b]}$ if and only if ${\displaystyle f'\leq 0}$ on ${\displaystyle (a,b)}$
• ${\displaystyle f}$ is constant on ${\displaystyle [a,b]}$ if and only if ${\displaystyle f'=0}$ on ${\displaystyle (a,b)}$
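The Mean Value Theorem above can be illustrated numerically. This is a sketch only, not part of the lecture: the choice ${\displaystyle f(x)=x^{3}}$ on ${\displaystyle [0,2]}$ is an assumed example, and since ${\displaystyle f'}$ happens to be increasing there, the point ${\displaystyle c}$ can be located by bisection.

```python
# Numeric illustration of the Mean Value Theorem (assumed example f(x) = x**3
# on [0, 2]): find c in (a, b) with f(b) - f(a) = f'(c) * (b - a).

def f(x):
    return x ** 3

def fprime(x):
    return 3 * x ** 2

a, b = 0.0, 2.0
slope = (f(b) - f(a)) / (b - a)   # average rate of change over [a, b]

# f' is increasing on [0, 2], so bisect f'(x) - slope to locate c
lo, hi = a, b
for _ in range(60):
    mid = (lo + hi) / 2
    if fprime(mid) < slope:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2
print(c)  # ~ 2 / sqrt(3)
```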

#### Properties of Integrals

• Linearity:
• ${\displaystyle \int _{a}^{b}\left(f(x)+g(x)\right)\,\mathrm {d} x=\int _{a}^{b}f(x)\,\mathrm {d} x+\int _{a}^{b}g(x)\,\mathrm {d} x}$
• ${\displaystyle \int _{a}^{b}\left(\alpha \,f(x)\right)\,\mathrm {d} x=\alpha \,\int _{a}^{b}f(x)\,\mathrm {d} x}$
• Subinterval property: ${\displaystyle \int _{a}^{b}f(x)\,\mathrm {d} x=\int _{a}^{c}f(x)\,\mathrm {d} x+\int _{c}^{b}f(x)\,\mathrm {d} x}$
• Comparison Theorem: If ${\displaystyle f(x)\leq g(x)}$ for all ${\displaystyle x\in [a,b]}$, then ${\displaystyle \int _{a}^{b}f(x)\,\mathrm {d} x\leq \int _{a}^{b}g(x)\,\mathrm {d} x}$

#### Fundamental Theorem of Calculus

• Part 1: If a function ${\displaystyle f}$ is continuous on ${\displaystyle [a,b]}$, then ${\displaystyle F(x)=\int _{a}^{x}f(t)\,\mathrm {d} t}$, ${\displaystyle x\in [a,b]}$, is continuously differentiable on ${\displaystyle [a,b]}$ and ${\displaystyle F'(x)=f(x)}$ for all ${\displaystyle x\in [a,b]}$
• Part 2: If a function ${\displaystyle F}$ is differentiable on ${\displaystyle [a,b]}$, and the derivative ${\displaystyle F'}$ is integrable on ${\displaystyle [a,b]}$, then ${\displaystyle \int _{a}^{x}F'(t)\,\mathrm {d} t=F(x)-F(a)}$ for all ${\displaystyle x\in [a,b]}$.
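Part 2 can be checked numerically on a concrete pair. This is a sanity-check sketch, not part of the lecture: ${\displaystyle F(x)=\sin {x}}$, ${\displaystyle F'(x)=\cos {x}}$ is an assumed example, and a midpoint Riemann sum stands in for the integral.

```python
import math

# Numeric sanity check of FTC Part 2 with the assumed example F(x) = sin(x),
# F'(x) = cos(x): a midpoint Riemann sum of F' over [0, x] should
# approximate F(x) - F(0).

def riemann_midpoint(g, a, b, n=10_000):
    h = (b - a) / n
    return sum(g(a + (j + 0.5) * h) for j in range(n)) * h

x = 1.0
integral = riemann_midpoint(math.cos, 0.0, x)
print(integral, math.sin(x) - math.sin(0.0))  # the two values nearly agree
```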

### Sample Problems

#### Problem 1: Prove the Chain Rule

Theorem. If a function ${\displaystyle f}$...

Proved in class (except for one small part)

#### Problem 2: Find the Limits

${\displaystyle \lim _{x\to 0}\left(1+x\right)^{\frac {1}{x}}}$

The function ${\displaystyle f(x)=\left(1+x\right)^{\frac {1}{x}}}$ is well-defined on ${\displaystyle (-1,\infty )}$ except at ${\displaystyle 0}$. Since ${\displaystyle f(x)>0}$ for all such ${\displaystyle x}$, the function ${\displaystyle g(x)=\log {f(x)}}$ is well-defined on ${\displaystyle (-1,\infty )}$ except at ${\displaystyle 0}$ as well. For any such ${\displaystyle x}$, we have ${\displaystyle g(x)=\log {\left(1+x\right)^{\frac {1}{x}}}=x^{-1}\,\log {\left(1+x\right)}}$. Hence ${\displaystyle g={\frac {h_{1}}{h_{2}}}}$, where ${\displaystyle h_{1}(x)=\log {\left(1+x\right)}}$ and ${\displaystyle h_{2}(x)=x}$ are continuously differentiable on ${\displaystyle \left(-1,\infty \right)}$. Since ${\displaystyle h_{1}(0)=h_{2}(0)=0}$, it follows that ${\displaystyle \lim _{x\to 0}h_{1}(x)=\lim _{x\to 0}h_{2}(x)=0}$. Moreover, ${\displaystyle h_{1}'(x)={\frac {1}{1+x}}}$ and ${\displaystyle h_{2}'(x)=1}$, so ${\displaystyle h_{1}'(0)=h_{2}'(0)=1}$. By l'Hôpital's rule, ${\displaystyle \lim _{x\to 0}g(x)=\lim _{x\to 0}{\frac {h_{1}'(x)}{h_{2}'(x)}}={\frac {1}{1}}=1}$

Since ${\displaystyle f(x)=\mathrm {e} ^{g(x)}}$, i.e., ${\displaystyle f}$ is the composition of ${\displaystyle g}$ with the continuous exponential function, it follows that ${\displaystyle \lim _{x\to 0}f(x)=\mathrm {e} ^{1}=\mathrm {e} }$

${\displaystyle \lim _{x\to +\infty }\left(1+x\right)^{\frac {1}{x}}}$

We argue as above. This time ${\displaystyle \lim _{x\to +\infty }h_{1}(x)=\lim _{x\to +\infty }h_{2}(x)=+\infty }$. At the same time, ${\displaystyle h_{1}'(x)={\frac {1}{1+x}}\to 0}$ as ${\displaystyle x\to +\infty }$, while ${\displaystyle h_{2}'}$ is identically ${\displaystyle 1}$. Using l'Hôpital's rule, we obtain ${\displaystyle \lim _{x\to +\infty }g(x)=\lim _{x\to +\infty }{\frac {h_{1}'(x)}{h_{2}'(x)}}={\frac {0}{1}}=0}$

Since ${\displaystyle f(x)=\mathrm {e} ^{g(x)}}$, a composition of ${\displaystyle g}$ with a continuous function, it follows that ${\displaystyle \lim _{x\to +\infty }f(x)=\mathrm {e} ^{0}=1}$
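Both limits can be checked numerically. This is a sketch, not a proof; the sample points ${\displaystyle 10^{-8}}$ and ${\displaystyle 10^{8}}$ are arbitrary choices.

```python
import math

# Numeric check of both limits of (1 + x)**(1/x) (a sketch, not a proof):
# near 0 the values approach e; as x -> +infinity they approach 1.

def f(x):
    return (1 + x) ** (1 / x)

near_zero = f(1e-8)     # should be close to e ~ 2.71828
at_infinity = f(1e8)    # should be close to 1
print(near_zero, at_infinity)
```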

#### Problem 3: Limit of a sequence

Find the limit of a sequence ${\displaystyle x_{n}={\frac {1^{k}+2^{k}+\dots +n^{k}}{n^{k+1}}}}$ for ${\displaystyle n\in \mathbb {N} }$, where ${\displaystyle k}$ is a natural number.

The general element of the sequence can be represented as

${\displaystyle x_{n}={\frac {1^{k}+2^{k}+\dots +n^{k}}{n^{k}}}\cdot {\frac {1}{n}}=\left({\frac {1}{n}}\right)^{k}{\frac {1}{n}}+\left({\frac {2}{n}}\right)^{k}{\frac {1}{n}}+\dots +\left({\frac {n}{n}}\right)^{k}{\frac {1}{n}}}$

which shows that ${\displaystyle x_{n}}$ is a Riemann sum of the function ${\displaystyle f(x)=x^{k}}$ on the interval ${\displaystyle [0,1]}$ that corresponds to the partition ${\displaystyle P_{n}=\left\{0,{\frac {1}{n}},{\frac {2}{n}},\ldots ,{\frac {n-1}{n}},1\right\}}$ and samples ${\displaystyle t_{j}={\frac {j}{n}}}$, ${\displaystyle j=1,2,\ldots ,n}$. The norm of the partition is ${\displaystyle \left\|P_{n}\right\|={\frac {1}{n}}}$. Since ${\displaystyle \left\|P_{n}\right\|\to 0}$ as ${\displaystyle n\to \infty }$ and the function ${\displaystyle f}$ is integrable on ${\displaystyle [0,1]}$, the Riemann sums ${\displaystyle x_{n}}$ converge to the integral:

${\displaystyle \lim _{n\to \infty }x_{n}=\int _{0}^{1}x^{k}\,\mathrm {d} x=\left.{\frac {x^{k+1}}{k+1}}\right|_{x=0}^{1}={\frac {1}{k+1}}}$
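The Riemann-sum argument above can be checked numerically. This is an illustration only; the choices ${\displaystyle k=3}$ and ${\displaystyle n=100000}$ are arbitrary.

```python
# Numeric check of Problem 3 (assumed sample values k = 3, n = 100000):
# x_n = (1**k + 2**k + ... + n**k) / n**(k+1) should be close to 1 / (k + 1),
# since x_n is a Riemann sum of x**k on [0, 1].
k = 3
n = 100_000
x_n = sum(j ** k for j in range(1, n + 1)) / n ** (k + 1)
print(x_n, 1 / (k + 1))  # the two values nearly agree
```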

#### Problem 4: Find/evaluate indefinite/definite integrals

##### Subproblem 1

${\displaystyle \int {\frac {x^{2}}{1-x}}\,\mathrm {d} x}$

A standard way to evaluate this type of function is to split it into the sum of a polynomial and a simple fraction:

${\displaystyle {\frac {x^{2}}{1-x}}={\frac {x^{2}-1+1}{1-x}}={\frac {x^{2}-1}{1-x}}+{\frac {1}{1-x}}=-x-1-{\frac {1}{x-1}}}$

Since the domain of the function is ${\displaystyle (-\infty ,1)\cup (1,\infty )}$, the indefinite integral has different representations on the intervals ${\displaystyle (-\infty ,1)}$ and ${\displaystyle (1,\infty )}$:

${\displaystyle \int {\frac {x^{2}}{1-x}}\,\mathrm {d} x={\begin{cases}-{\frac {x^{2}}{2}}-x-\log {\left(1-x\right)}+C_{1}&x<1\\-{\frac {x^{2}}{2}}-x-\log {\left(x-1\right)}+C_{2}&x>1\end{cases}}}$
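The antiderivative on the branch ${\displaystyle x<1}$ can be verified by numeric differentiation. This is a quick sketch; the sample point ${\displaystyle x=0.3}$ and step ${\displaystyle h=10^{-6}}$ are arbitrary.

```python
import math

# Sanity check of the antiderivative on x < 1 (sample point and step size
# are arbitrary choices): differentiating F(x) = -x**2/2 - x - log(1 - x)
# should recover the integrand x**2 / (1 - x).

def F(x):
    return -x ** 2 / 2 - x - math.log(1 - x)

def integrand(x):
    return x ** 2 / (1 - x)

x, h = 0.3, 1e-6
numeric_derivative = (F(x + h) - F(x - h)) / (2 * h)  # central difference
print(numeric_derivative, integrand(x))  # the two values nearly agree
```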

##### Subproblem 2

${\displaystyle \int _{0}^{\pi }\sin ^{2}{(2x)}\,\mathrm {d} x}$

To integrate this function, we use a trigonometric formula ${\displaystyle 1-\cos {2\alpha }=2\sin ^{2}{\alpha }}$ and a new variable ${\displaystyle u=4x}$:

${\displaystyle \int _{0}^{\pi }\sin ^{2}{(2x)}\,\mathrm {d} x=\int _{0}^{\pi }{\frac {1-\cos {(4x)}}{2}}\,\mathrm {d} x=\int _{0}^{\pi }{\frac {1-\cos {(4x)}}{8}}\,\mathrm {d} (4x)=\int _{0}^{4\pi }{\frac {1-\cos {u}}{8}}\,\mathrm {d} u=\left.{\frac {u-\sin {u}}{8}}\right|_{u=0}^{4\pi }={\frac {\pi }{2}}}$
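The value ${\displaystyle {\frac {\pi }{2}}}$ can be sanity-checked numerically; the midpoint Riemann sum below is an assumed verification device, not part of the lecture.

```python
import math

# Numeric check that the integral of sin^2(2x) over [0, pi] equals pi/2
# (midpoint Riemann sum; a sketch, not a proof).
n = 100_000
h = math.pi / n
total = sum(math.sin(2 * (j + 0.5) * h) ** 2 for j in range(n)) * h
print(total, math.pi / 2)  # the two values nearly agree
```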

##### Subproblem 3

${\displaystyle \int \log ^{3}{x}\,\mathrm {d} x}$

To find this indefinite integral, we integrate by parts:

{\displaystyle {\begin{aligned}\int \log ^{3}{x}\,\mathrm {d} x&=x\,\log ^{3}{x}-\int x\,\mathrm {d} (\log ^{3}{x})\\&=x\,\log ^{3}{x}-\int x\left(\log ^{3}{x}\right)'\,\mathrm {d} x\\&=x\,\log ^{3}{x}-\int 3\log ^{2}{x}\,\mathrm {d} x\\&=x\,\log ^{3}{x}-3x\,\log ^{2}{x}+\int x\,\mathrm {d} \left(3\log ^{2}{x}\right)\\&=x\,\log ^{3}{x}-3x\,\log ^{2}{x}+\int 6\log {x}\,\mathrm {d} x\\&=x\,\log ^{3}{x}-3x\,\log ^{2}{x}+6x\,\log {x}-\int x\,\mathrm {d} \left(6\log {x}\right)\\&=x\,\log ^{3}{x}-3x\,\log ^{2}{x}+6x\,\log {x}-\int 6\,\mathrm {d} x\\&=x\,\log ^{3}{x}-3x\,\log ^{2}{x}+6x\,\log {x}-6x+C\end{aligned}}}
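The result of the repeated integration by parts can be checked by differentiating the antiderivative numerically. This is a sketch; the sample point ${\displaystyle x=2}$ and step size are arbitrary.

```python
import math

# Quick numeric check of the subproblem 3 antiderivative (sample point and
# step size are arbitrary): F(x) = x log^3 x - 3x log^2 x + 6x log x - 6x
# should satisfy F'(x) = log^3 x.

def F(x):
    L = math.log(x)
    return x * L ** 3 - 3 * x * L ** 2 + 6 * x * L - 6 * x

x, h = 2.0, 1e-6
numeric_derivative = (F(x + h) - F(x - h)) / (2 * h)  # central difference
print(numeric_derivative, math.log(x) ** 3)  # the two values nearly agree
```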

##### Subproblem 4

${\displaystyle \int _{0}^{\frac {1}{2}}{\frac {x}{\sqrt {1-x^{2}}}}\,\mathrm {d} x}$

To integrate this function, we introduce a new variable ${\displaystyle u=1-x^{2}}$:

{\displaystyle {\begin{aligned}\int _{0}^{\frac {1}{2}}{\frac {x}{\sqrt {1-x^{2}}}}\,\mathrm {d} x&=-{\frac {1}{2}}\,\int _{0}^{\frac {1}{2}}{\frac {\left(1-x^{2}\right)'}{\sqrt {1-x^{2}}}}\,\mathrm {d} x\\&=-{\frac {1}{2}}\,\int _{0}^{\frac {1}{2}}{\frac {1}{\sqrt {1-x^{2}}}}\,\mathrm {d} \left(1-x^{2}\right)\\&=-{\frac {1}{2}}\,\int _{1}^{\frac {3}{4}}{\frac {1}{\sqrt {u}}}\,\mathrm {d} u\\&=\ldots \\&=1-{\frac {\sqrt {3}}{2}}\end{aligned}}}
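The value ${\displaystyle 1-{\frac {\sqrt {3}}{2}}}$ can be confirmed numerically; the midpoint Riemann sum below is an assumed verification device, not part of the lecture.

```python
import math

# Numeric check of subproblem 4: the integral of x / sqrt(1 - x^2) over
# [0, 1/2] should equal 1 - sqrt(3)/2 (midpoint Riemann sum as a sketch).
n = 100_000
a, b = 0.0, 0.5
h = (b - a) / n
total = sum((a + (j + 0.5) * h) / math.sqrt(1 - (a + (j + 0.5) * h) ** 2)
            for j in range(n)) * h
print(total, 1 - math.sqrt(3) / 2)  # the two values nearly agree
```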

##### Subproblem 5

${\displaystyle \int _{0}^{1}{\frac {1}{\sqrt {4-x^{2}}}}\,\mathrm {d} x}$

To integrate this function, we use the substitution ${\displaystyle x=2\sin {t}}$. Observe that ${\displaystyle x}$ changes from ${\displaystyle 0}$ to ${\displaystyle 1}$ as ${\displaystyle t}$ changes from ${\displaystyle 0}$ to ${\displaystyle {\frac {\pi }{6}}}$:

{\displaystyle {\begin{aligned}\int _{0}^{1}{\frac {1}{\sqrt {4-x^{2}}}}\,\mathrm {d} x&=\int _{0}^{\frac {\pi }{6}}{\frac {1}{\sqrt {4-(2\sin {t})^{2}}}}\,\mathrm {d} (2\sin {t})\\&=\int _{0}^{\frac {\pi }{6}}{\frac {\left(2\sin {t}\right)'}{\sqrt {4-4\sin ^{2}{t}}}}\,\mathrm {d} t\\&=\int _{0}^{\frac {\pi }{6}}{\frac {2\cos {t}}{\sqrt {4\cos ^{2}{t}}}}\,\mathrm {d} t\\&=\int _{0}^{\frac {\pi }{6}}1\,\mathrm {d} t\\&={\frac {\pi }{6}}\end{aligned}}}
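The answer ${\displaystyle {\frac {\pi }{6}}=\arcsin {\frac {1}{2}}}$ can be confirmed numerically; the midpoint Riemann sum below is an assumed verification device, not part of the lecture.

```python
import math

# Numeric check of subproblem 5: the integral of 1 / sqrt(4 - x^2) over
# [0, 1] should equal pi/6 = arcsin(1/2) (midpoint Riemann sum as a sketch).
n = 100_000
h = 1.0 / n
total = sum(1 / math.sqrt(4 - ((j + 0.5) * h) ** 2) for j in range(n)) * h
print(total, math.pi / 6)  # the two values nearly agree
```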

#### Bonus Problem 5

Suppose ${\displaystyle p:\mathbb {R} \to \mathbb {R} }$ is locally a polynomial, which means that for every ${\displaystyle c\in \mathbb {R} }$, there exists ${\displaystyle \epsilon >0}$ such that ${\displaystyle p}$ coincides with a polynomial on the interval ${\displaystyle \left(c-\epsilon ,c+\epsilon \right)}$. Prove that ${\displaystyle p}$ is a polynomial.

Proof. For any ${\displaystyle c\in \mathbb {R} }$ let ${\displaystyle p_{c}}$ denote a polynomial and ${\displaystyle \epsilon _{c}}$ denote a positive number such that ${\displaystyle p(x)=p_{c}(x)}$ for all ${\displaystyle x\in (c-\epsilon _{c},c+\epsilon _{c})}$. Consider two sets:

{\displaystyle {\begin{aligned}E_{+}&=\left\{x>0~\mid ~p(x)\neq p_{0}(x)\right\}\\E_{-}&=\left\{x<0~\mid ~p(x)\neq p_{0}(x)\right\}\end{aligned}}}

We are going to show that ${\displaystyle E_{+}=E_{-}=\emptyset }$.

Assume that the set ${\displaystyle E_{+}}$ is not empty. Clearly, ${\displaystyle E_{+}}$ is bounded below, hence ${\displaystyle d=\inf {E_{+}}}$ is a well-defined real number. Note that ${\displaystyle E_{+}\subset \left[\epsilon _{0},\infty \right)}$; therefore ${\displaystyle d\geq \epsilon _{0}>0}$.

Observe that ${\displaystyle p(x)=p_{0}(x)}$ for ${\displaystyle x\in (0,d)}$ and ${\displaystyle p(x)=p_{d}(x)}$ for ${\displaystyle x\in \left(d-\epsilon _{d},d+\epsilon _{d}\right)}$. The interval ${\displaystyle (0,d)}$ overlaps with the interval ${\displaystyle \left(d-\epsilon _{d},d+\epsilon _{d}\right)}$. Hence ${\displaystyle p_{d}}$ coincides with ${\displaystyle p_{0}}$ on the intersection ${\displaystyle (0,d)\cap (d-\epsilon _{d},d+\epsilon _{d})}$. Equivalently, the difference ${\displaystyle p_{d}-p_{0}}$ is zero on ${\displaystyle (0,d)\cap (d-\epsilon _{d},d+\epsilon _{d})}$. Since ${\displaystyle p_{d}-p_{0}}$ is a polynomial and any nonzero polynomial has only finitely many roots, we conclude that ${\displaystyle p_{d}-p_{0}}$ is identically 0. Then the polynomials ${\displaystyle p_{d}}$ and ${\displaystyle p_{0}}$ are the same. It follows that ${\displaystyle p(x)=p_{0}(x)}$ for ${\displaystyle x\in (0,d+\epsilon _{d})}$, so no point of ${\displaystyle E_{+}}$ lies below ${\displaystyle d+\epsilon _{d}}$, which contradicts ${\displaystyle d=\inf {E_{+}}}$.

Thus ${\displaystyle E_{+}=\emptyset }$. Similarly, we prove that the set ${\displaystyle E_{-}}$ is empty as well. Since ${\displaystyle E_{+}=E_{-}=\emptyset }$, the function ${\displaystyle p}$ coincides with the polynomial ${\displaystyle p_{0}}$ everywhere.

#### Bonus Problem 6

Show that a function ${\displaystyle f(x)={\begin{cases}\mathrm {e} ^{-{\frac {1}{1-x^{2}}}}&\left|x\right|<1\\0&\left|x\right|\geq 1\end{cases}}}$

is infinitely differentiable on ${\displaystyle \mathbb {R} }$.

Sketch: on ${\displaystyle (-1,1)}$ the function is a composition of infinitely differentiable functions, and for ${\displaystyle \left|x\right|>1}$ it is constant, so the only points in question are ${\displaystyle x=\pm 1}$. One checks by induction that on ${\displaystyle (-1,1)}$ every derivative of ${\displaystyle f}$ has the form ${\displaystyle R(x)\,\mathrm {e} ^{-{\frac {1}{1-x^{2}}}}}$ for some rational function ${\displaystyle R}$, and the exponential factor tends to 0 so fast that each such expression tends to 0 as ${\displaystyle x\to \pm 1}$ from inside the interval. Hence all one-sided derivatives of every order exist and equal 0 at ${\displaystyle \pm 1}$, so ${\displaystyle f}$ is infinitely differentiable on ${\displaystyle \mathbb {R} }$.
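A numeric illustration (not a proof; the sample point 0.999 is arbitrary) of why the gluing at ${\displaystyle x=\pm 1}$ is smooth: the exponential factor crushes both the function values and the difference quotients to essentially 0 near the endpoints.

```python
import math

# Numeric illustration for the function of bonus problem 6 (illustration
# only, not a proof; the sample point 0.999 is an arbitrary choice).

def f(x):
    return math.exp(-1 / (1 - x ** 2)) if abs(x) < 1 else 0.0

x = 0.999
print(f(x))                           # astronomically small already
print((f(1.0) - f(x)) / (1.0 - x))    # difference quotient at 1 is tiny too
```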