# MATH 409 Lecture 15

« previous | Tuesday, October 22, 2013 | next »

Lecture Slides

## Review

Derivative is defined as ${\displaystyle f'(a)=\lim _{h\to 0}{\frac {f(a+h)-f(a)}{h}}}$ or equivalently ${\displaystyle f'(a)=\lim _{x\to a}{\frac {f(x)-f(a)}{x-a}}}$.

Notations:

• Lagrange: ${\displaystyle f'}$
• Newton: ${\displaystyle {\dot {f}}}$
• Leibniz: ${\displaystyle {\frac {\mathrm {d} f}{\mathrm {d} x}}}$
• Euler: ${\displaystyle D_{x}\,f}$
• ${\displaystyle f^{(1)}}$ (higher order derivatives)

Derivative at a point denoted ${\displaystyle \left.(f(x))'\right|_{x=a}}$.

• Sum and Difference Rules
• Product Rule
• Quotient Rule

## Derivatives of Elementary Functions

### Higher-Order Derivatives

Defined inductively:

For ${\displaystyle n\geq 2}$ and any ${\displaystyle a\in \mathbb {R} }$, the ${\displaystyle n}$-th derivative of ${\displaystyle f}$ at a point ${\displaystyle a}$, denoted ${\displaystyle f^{(n)}(a)}$, is defined by ${\displaystyle f^{(n)}(a)=(f^{(n-1)})'(a)}$.
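The inductive definition lends itself to a quick numerical sketch (plain Python, not from the lecture; `deriv` and `nth_deriv` are ad-hoc names) that iterates a central-difference approximation of the first derivative:

```python
def deriv(f, h=1e-5):
    # Central-difference approximation of f'
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

def nth_deriv(f, n, h=1e-3):
    # Inductive definition: f^(n) = (f^(n-1))'
    g = f
    for _ in range(n):
        g = deriv(g, h)
    return g

cube = lambda x: x**3
print(nth_deriv(cube, 2)(1.0))  # ≈ 6.0, the value of (x^3)'' at x = 1
```

Nested difference quotients amplify rounding error, which is why a fairly coarse step `h` is used for the iterated derivative.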

#### Derivative Spaces

Let ${\displaystyle I}$ be an interval of the real line ${\displaystyle \mathbb {R} }$. We denote ${\displaystyle C(I)}$ or ${\displaystyle C^{0}(I)}$ as the set of all continuous functions on ${\displaystyle I}$.

For any ${\displaystyle n\in \mathbb {N} }$, we denote ${\displaystyle C^{n}(I)}$ as the set of all functions that are ${\displaystyle n}$ times continuously differentiable on ${\displaystyle I}$. (that is, the ${\displaystyle n}$-th derivative is continuous)

${\displaystyle C^{\infty }(I)}$ denotes the set of all functions that are infinitely differentiable on ${\displaystyle I}$.

### Examples

${\displaystyle f(0)=0}$, and ${\displaystyle f(x)=x\sin {\frac {1}{x}}}$ for ${\displaystyle x\neq 0}$.

Product and Chain rules show that ${\displaystyle f}$ is differentiable on ${\displaystyle \mathbb {R} \setminus \left\{0\right\}}$.

For ${\displaystyle x\neq 0}$,

${\displaystyle f'(x)=\left(x\,\sin {\frac {1}{x}}\right)'=\sin {\frac {1}{x}}+x\,\left(\sin {\frac {1}{x}}\right)'=\sin {\frac {1}{x}}-{\frac {1}{x}}\,\cos {\frac {1}{x}}}$

The function ${\displaystyle f}$ is continuous at ${\displaystyle 0}$ but not differentiable there. Indeed, ${\displaystyle {\frac {f(h)-f(0)}{h}}=\sin {\frac {1}{h}}}$ has no limit as ${\displaystyle h\to 0}$.

${\displaystyle g(0)=0}$, and ${\displaystyle g(x)=x^{2}\sin {\frac {1}{x}}}$ for ${\displaystyle x\neq 0}$.

As above, the Product and Chain rules show that ${\displaystyle g}$ is differentiable on ${\displaystyle \mathbb {R} \setminus \left\{0\right\}}$.

For ${\displaystyle x\neq 0}$,

{\displaystyle {\begin{aligned}g'(x)&=\left(x\cdot x\sin {\frac {1}{x}}\right)'\\&=x\,\sin {\frac {1}{x}}+x\,\left(\sin {\frac {1}{x}}-{\frac {1}{x}}\,\cos {\frac {1}{x}}\right)\\&=2x\,\sin {\frac {1}{x}}-\cos {\frac {1}{x}}\end{aligned}}}

The function is differentiable at 0. Indeed ${\displaystyle {\frac {g(h)-g(0)}{h}}=h\,\sin {\frac {1}{h}}\to 0}$ as ${\displaystyle h\to 0}$.

Note that ${\displaystyle g}$ is not continuously differentiable on ${\displaystyle \mathbb {R} }$ since ${\displaystyle g'}$ is not continuous at ${\displaystyle 0}$. Namely, ${\displaystyle \lim _{x\to 0}g'(x)}$ does not exist.
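The contrast between the two examples is easy to see numerically; in this sketch (plain Python, with names matching the examples above) the difference quotient of ${\displaystyle g}$ at ${\displaystyle 0}$ is squeezed to zero while that of ${\displaystyle f}$ keeps oscillating:

```python
import math

def f(x):
    # f(0) = 0, f(x) = x sin(1/x) otherwise
    return 0.0 if x == 0 else x * math.sin(1.0 / x)

def g(x):
    # g(0) = 0, g(x) = x^2 sin(1/x) otherwise
    return 0.0 if x == 0 else x * x * math.sin(1.0 / x)

for h in (1e-1, 1e-3, 1e-5):
    # f's quotient is sin(1/h): bounded, but without a limit
    # g's quotient is h sin(1/h): tends to 0 by the squeeze theorem
    print(h, f(h) / h, g(h) / h)
```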

### Power Rule

Theorem. ${\displaystyle (x^{n})'=n\,x^{n-1}}$ for all ${\displaystyle x\in \mathbb {R} }$ and ${\displaystyle n\in \mathbb {N} }$.

Proof by induction. In the case ${\displaystyle n=1}$, we have ${\displaystyle x'=1=1\,x^{0}}$ for all ${\displaystyle x\in \mathbb {R} }$.

Assume that ${\displaystyle (x^{n})'=n\,x^{n-1}}$ for some ${\displaystyle n\in \mathbb {N} }$ and all ${\displaystyle x\in \mathbb {R} }$. Using the product rule, we obtain ${\displaystyle \left(x^{n+1}\right)'=\left(x^{n}\,x\right)'=(x^{n})'\,x+x^{n}\,x'=n\,x^{n-1}\,x+x^{n}=(n+1)\,x^{n}}$.

quod erat demonstrandum

Note: The theorem can also be proved using the formula ${\displaystyle {\frac {x^{n}-a^{n}}{x-a}}=x^{n-1}+x^{n-2}\,a+\dots +x\,a^{n-2}+a^{n-1}}$.

Similarly, ${\displaystyle (x^{-n})'=-n\,x^{-n-1}}$ for all ${\displaystyle x\neq 0}$ and ${\displaystyle n\in \mathbb {N} }$.

Using the reciprocal rule, we obtain ${\displaystyle (x^{-n})'=\left({\frac {1}{x^{n}}}\right)'={\frac {-(x^{n})'}{(x^{n})^{2}}}={\frac {-n\,x^{n-1}}{x^{2n}}}=-n\,x^{-n-1}}$.
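Both power-rule formulas can be spot-checked with a symmetric difference quotient (a sketch; `num_deriv` is an ad-hoc helper, not part of the lecture):

```python
def num_deriv(f, x, h=1e-6):
    # Symmetric difference quotient approximating f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.7
for n in (1, 2, 5):
    # (x^n)' = n x^(n-1)
    assert abs(num_deriv(lambda t: t**n, x) - n * x**(n - 1)) < 1e-6
    # (x^-n)' = -n x^(-n-1), valid for x != 0
    assert abs(num_deriv(lambda t: t**(-n), x) + n * x**(-n - 1)) < 1e-6
print("power rule verified at x =", x)
```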

## Derivative of Inverse Function

Theorem. Suppose ${\displaystyle f}$ is an invertible continuous function. If ${\displaystyle f}$ is differentiable at a point ${\displaystyle a}$ and ${\displaystyle f'(a)\neq 0}$, then the inverse function is differentiable at the point ${\displaystyle b=f(a)}$ and

${\displaystyle (f^{-1})'(b)={\frac {1}{f'(a)}}={\frac {1}{f'(f^{-1}(b))}}}$

Proof. Since ${\displaystyle f}$ is differentiable at ${\displaystyle a}$, we know that ${\displaystyle f}$ is defined on an open interval ${\displaystyle I=(c,d)}$ containing ${\displaystyle a}$. Since ${\displaystyle f}$ is continuous and invertible, it follows from the Intermediate Value Theorem that ${\displaystyle f}$ is strictly monotone on ${\displaystyle I}$, the image ${\displaystyle f(I)}$ is an open interval containing ${\displaystyle b}$, and the inverse function ${\displaystyle f^{-1}}$ is continuous and strictly monotone on ${\displaystyle f(I)}$.

We have ${\displaystyle \lim _{x\to a}{\frac {f(x)-f(a)}{x-a}}=f'(a)}$. Since ${\displaystyle f'(a)\neq 0}$, it follows that ${\displaystyle \lim _{x\to a}{\frac {x-a}{f(x)-f(a)}}={\frac {1}{f'(a)}}}$. Since ${\displaystyle f^{-1}}$ is continuous and monotone on the interval ${\displaystyle f(I)}$, we obtain that ${\displaystyle f^{-1}(y)\to a}$ and ${\displaystyle f^{-1}(y)\neq a}$ when ${\displaystyle y\to b}$ and ${\displaystyle y\neq b}$.

Therefore ${\displaystyle \lim _{y\to b}{\frac {f^{-1}(y)-a}{y-b}}=\lim _{y\to b}{\frac {f^{-1}(y)-a}{f(f^{-1}(y))-b}}=\lim _{x\to a}{\frac {x-a}{f(x)-f(a)}}={\frac {1}{f'(a)}}}$.
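The theorem can be sanity-checked numerically. The sketch below picks the hypothetical example ${\displaystyle f(x)=x^{3}+x}$ (strictly increasing, hence invertible), inverts it by bisection, and compares a difference quotient of ${\displaystyle f^{-1}}$ at ${\displaystyle b=f(1)=2}$ with ${\displaystyle 1/f'(1)=1/4}$:

```python
def f(x):
    # Strictly increasing on R, hence invertible
    return x**3 + x

def f_inv(y, lo=-10.0, hi=10.0):
    # Invert f by bisection on [lo, hi]
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

b = f(1.0)  # b = 2; f'(1) = 3 + 1 = 4
h = 1e-6
quotient = (f_inv(b + h) - f_inv(b - h)) / (2 * h)
print(quotient)  # ≈ 0.25 = 1 / f'(f^{-1}(2))
```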

quod erat demonstrandum

Remark. In the case ${\displaystyle f'(a)=0}$, the inverse function ${\displaystyle f^{-1}}$ is not differentiable at ${\displaystyle f(a)}$.

Indeed, if ${\displaystyle f^{-1}}$ were differentiable at ${\displaystyle b=f(a)}$, the chain rule would imply ${\displaystyle \left(f^{-1}\circ f\right)'(a)=(f^{-1})'(b)\cdot f'(a)}$. Since ${\displaystyle f^{-1}\circ f}$ is the identity function, the left-hand side equals ${\displaystyle 1}$, so ${\displaystyle (f^{-1})'(b)\cdot f'(a)=1\neq 0}$, which forces ${\displaystyle f'(a)\neq 0}$, a contradiction.

### Example

${\displaystyle f(x)=\arccos {x}}$, ${\displaystyle x\in [-1,1]}$.

The function ${\displaystyle g(y)=\cos {y}}$ is strictly decreasing on ${\displaystyle \left[0,\pi \right]}$ and maps this interval onto ${\displaystyle \left[-1,1\right]}$. By definition, the function ${\displaystyle f(x)=\arccos {x}}$ is the inverse of the restriction of ${\displaystyle g}$ to ${\displaystyle \left[0,\pi \right]}$. Notice that ${\displaystyle g'(0)=g'(\pi )=0}$ and ${\displaystyle g'(y)\neq 0}$ for ${\displaystyle y\in \left(0,\pi \right)}$. It follows that the function ${\displaystyle f}$ is differentiable on ${\displaystyle (-1,1)}$ but not differentiable at ${\displaystyle 1}$ and ${\displaystyle -1}$. Moreover, for any ${\displaystyle x\in \left(-1,1\right)}$,

${\displaystyle f'(x)={\frac {1}{g'(f(x))}}=-{\frac {1}{\sin {\arccos {x}}}}}$

Let ${\displaystyle y=\arccos {x}}$ (hence ${\displaystyle x=\cos {y}}$). We have ${\displaystyle \sin ^{2}{y}+\cos ^{2}{y}=1}$ by the Pythagorean identity. Besides, ${\displaystyle \sin {y}>0}$ since ${\displaystyle y\in (0,\pi )}$. Consequently, ${\displaystyle \sin {y}={\sqrt {1-\cos ^{2}{y}}}={\sqrt {1-x^{2}}}}$. Thus ${\displaystyle f'(x)=-{\frac {1}{\sqrt {1-x^{2}}}}}$.

quod erat demonstrandum
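A numeric spot-check of the arccosine formula (a sketch using `math.acos` from the Python standard library; `num_deriv` is an ad-hoc helper):

```python
import math

def num_deriv(f, x, h=1e-6):
    # Symmetric difference quotient approximating f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (-0.9, 0.0, 0.5):
    exact = -1.0 / math.sqrt(1.0 - x * x)  # the formula just derived
    print(x, num_deriv(math.acos, x), exact)
```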

Homework hint: use a similar method to derive the formulas for ${\displaystyle \arcsin {x}}$ and ${\displaystyle \arctan {x}}$, with the same identity (divide through by ${\displaystyle \cos ^{2}{y}}$ for ${\displaystyle \arctan {x}}$).

### Exponential and Logarithmic Functions

Theorem. The sequence ${\displaystyle x_{n}=\left(1+{\frac {1}{n}}\right)^{n}}$ for ${\displaystyle n\in \mathbb {N} }$ is increasing and bounded, hence convergent.

The limit is the number ${\displaystyle \mathrm {e} =2.718281828\ldots }$ (the digits are given by the number of letters in each word of "I'm forming a mnemonic to remember a constant in analysis").

Corollary. ${\displaystyle \lim _{x\to 0}\left(1+x\right)^{\frac {1}{x}}=\mathrm {e} }$.

Not proved here... (Ain't nobody got time for that!)
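Although the proof is skipped, the monotone convergence of ${\displaystyle x_{n}}$ is easy to observe (a sketch):

```python
def x_n(n):
    # n-th term of the sequence (1 + 1/n)^n
    return (1 + 1 / n) ** n

for n in (1, 10, 1000, 10**6):
    print(n, x_n(n))  # terms increase toward e = 2.718281828...
```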

For any ${\displaystyle a>0}$ with ${\displaystyle a\neq 1}$, the exponential function ${\displaystyle f(x)=a^{x}}$ is strictly monotone and continuous on ${\displaystyle \mathbb {R} }$. It maps ${\displaystyle \mathbb {R} }$ onto ${\displaystyle \left(0,\infty \right)}$. Therefore the inverse function ${\displaystyle g(y)=\log _{a}{y}}$ is strictly monotone and continuous on ${\displaystyle \left(0,\infty \right)}$. The natural logarithm ${\displaystyle \log _{e}{y}}$ is also denoted just ${\displaystyle \log {y}}$.

Since ${\displaystyle \left(1+h\right)^{\frac {1}{h}}\to \mathrm {e} }$ as ${\displaystyle h\to 0}$, it follows that ${\displaystyle h^{-1}\log {\left(1+h\right)}=\log {\left(1+h\right)^{\frac {1}{h}}}\to \log {\mathrm {e} }=1}$ as ${\displaystyle h\to 0}$. In other words, ${\displaystyle \left.\left(\log {y}\right)'\right|_{y=1}=1}$

#### Examples

${\displaystyle f(x)=\mathrm {e} ^{x}}$ for ${\displaystyle x\in \mathbb {R} }$.

${\displaystyle {\frac {f(x+h)-f(x)}{h}}={\frac {e^{x}(e^{h}-1)}{h}}}$ for all ${\displaystyle x,h\in \mathbb {R} }$. Therefore for any ${\displaystyle x\in \mathbb {R} }$, ${\displaystyle f'(x)=e^{x}\,\lim _{h\to 0}{\frac {e^{h}-1}{h}}=e^{x}\,f'(0)=e^{x}}$, since ${\displaystyle f'(0)=\lim _{h\to 0}{\frac {e^{h}-1}{h}}=1}$ (substitute ${\displaystyle t=e^{h}-1}$, so that ${\displaystyle h=\log {(1+t)}}$, and use ${\displaystyle \left.\left(\log {y}\right)'\right|_{y=1}=1}$).

${\displaystyle f(x)=a^{x}}$ for ${\displaystyle x\in \mathbb {R} }$, where ${\displaystyle a>0}$.

Equivalently, ${\displaystyle f(x)=\mathrm {e} ^{\log {a^{x}}}=\mathrm {e} ^{x\,\log {a}}}$. So ${\displaystyle f'(x)=\mathrm {e} ^{x\,\log {a}}\,\log {a}=a^{x}\,\log {a}}$.

${\displaystyle f(x)=\log {x}}$ for ${\displaystyle x\in \left(0,\infty \right)}$.

Since ${\displaystyle f}$ is the inverse of the function ${\displaystyle g(y)=\mathrm {e} ^{y}}$, we obtain ${\displaystyle f'(x)={\frac {1}{g'(\log {x})}}={\frac {1}{\mathrm {e} ^{\log {x}}}}={\frac {1}{x}}}$ for all ${\displaystyle x>0}$.
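All three derivative formulas above can be spot-checked with a symmetric difference quotient (a sketch; `num_deriv` is an ad-hoc helper and the base ${\displaystyle a=5}$ is an arbitrary choice):

```python
import math

def num_deriv(f, x, h=1e-6):
    # Symmetric difference quotient approximating f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x, a = 1.3, 5.0
print(num_deriv(math.exp, x), math.exp(x))               # (e^x)' = e^x
print(num_deriv(lambda t: a**t, x), a**x * math.log(a))  # (a^x)' = a^x log a
print(num_deriv(math.log, x), 1 / x)                     # (log x)' = 1/x
```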

### Power Rule: General Case

${\displaystyle \left(x^{\alpha }\right)'=\alpha \,x^{\alpha -1}}$ for all ${\displaystyle x>0}$ and ${\displaystyle \alpha \in \mathbb {R} }$.

Proof. Let us fix a number ${\displaystyle \alpha \in \mathbb {R} }$ and consider ${\displaystyle f(x)=x^{\alpha }}$ for ${\displaystyle x\in \left(0,\infty \right)}$. For any ${\displaystyle x>0}$, we obtain ${\displaystyle f(x)=\mathrm {e} ^{\log {x^{\alpha }}}=a^{\log {x}}}$, where ${\displaystyle a=\mathrm {e} ^{\alpha }}$. Hence ${\displaystyle f=h\circ g}$, where ${\displaystyle g(x)=\log {x}}$ for ${\displaystyle x>0}$ and ${\displaystyle h(y)=a^{y}}$ for ${\displaystyle y\in \mathbb {R} }$. By the chain rule, ${\displaystyle f'(x)=h'(g(x))\,g'(x)=a^{\log {x}}\,\log {a}\cdot {\frac {1}{x}}=x^{\alpha }\cdot {\frac {\alpha }{x}}=\alpha \,x^{\alpha -1}}$.

quod erat demonstrandum
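The general power rule covers irrational exponents as well; a numeric sketch with ${\displaystyle \alpha =\pi }$ (an arbitrary choice) and the usual ad-hoc `num_deriv` helper:

```python
import math

def num_deriv(f, x, h=1e-6):
    # Symmetric difference quotient approximating f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

alpha = math.pi  # any real exponent works for x > 0
for x in (0.5, 2.0, 7.3):
    print(x, num_deriv(lambda t: t**alpha, x), alpha * x**(alpha - 1))
```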