# Differentiation rules

This is a summary of differentiation rules, that is, rules for computing the derivative of a function in calculus.

## Elementary rules of differentiation

Unless otherwise stated, all functions are functions of real numbers (R) that return real values; although more generally, the formulae below apply wherever they are well defined[1][2] — including the case of complex numbers (C).[3]

### Constant term rule

For any value of ${\displaystyle c}$, where ${\displaystyle c\in \mathbb {R} }$, for any value of ${\displaystyle x\in \mathbb {R} }$, ${\displaystyle {\frac {d}{dx}}\left(c\right)=0}$.[4]

#### Proof

Let ${\displaystyle c\in \mathbb {R} }$ and ${\displaystyle x\in \mathbb {R} }$. We will prove from first principles that the derivative of a constant is zero.

For clarity, define the function ${\displaystyle f}$ by ${\displaystyle f(x)=c}$. We want to show that ${\displaystyle {\frac {d}{dx}}\left(c\right)=0}$. Since ${\displaystyle f(x)=c}$, we can rewrite the statement:

{\displaystyle {\begin{aligned}&~{\frac {d}{dx}}\left(c\right)\\=&~{\frac {d}{dx}}f(x)=~f'(x)\\\end{aligned}}}

This is just done for clarity of notation. This way, we can use the definition of the derivative to find ${\displaystyle f'(x)}$.

{\displaystyle {\begin{aligned}f'(x)&=\lim _{h\to 0}{\frac {f(x+h)-f(x)}{h}}\\&=\lim _{h\to 0}{\frac {(c)-(c)}{h}}\\&=\lim _{h\to 0}{\frac {0}{h}}\\&=\lim _{h\to 0}0\\&=0\end{aligned}}}

Thus, for any value of ${\displaystyle c\in \mathbb {R} }$ and any value of ${\displaystyle x\in \mathbb {R} }$, ${\displaystyle {\frac {d}{dx}}\left(c\right)=0}$.

### Differentiation is linear

For any functions ${\displaystyle f}$ and ${\displaystyle g}$ and any real numbers ${\displaystyle a}$ and ${\displaystyle b}$, the derivative of the function ${\displaystyle h(x)=af(x)+bg(x)}$ with respect to ${\displaystyle x}$ is: ${\displaystyle h'(x)=af'(x)+bg'(x).}$

In Leibniz's notation this is written as:

${\displaystyle {\frac {d(af+bg)}{dx}}=a{\frac {df}{dx}}+b{\frac {dg}{dx}}.}$

Special cases include:

• The constant factor rule
${\displaystyle (af)'=af'}$
• The sum rule
${\displaystyle (f+g)'=f'+g'}$
• The subtraction rule
${\displaystyle (f-g)'=f'-g'.}$
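As a quick numerical sanity check (a sketch, not part of the rule itself), linearity can be verified with a central-difference approximation; the choices f = sin, g = exp, the constants a, b, the sample point x0, and the helper `dcentral` below are all illustrative assumptions.

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: h(x) = 2 sin x + 3 e^x
a, b = 2.0, 3.0
x0 = 0.7
lhs = dcentral(lambda x: a * math.sin(x) + b * math.exp(x), x0)
rhs = a * math.cos(x0) + b * math.exp(x0)  # a f'(x0) + b g'(x0)
```

The two sides agree to within the truncation error of the difference quotient.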

### The product rule

For the functions f and g, the derivative of the function h(x) = f(x) g(x) with respect to x is

${\displaystyle h'(x)=(fg)'(x)=f'(x)g(x)+f(x)g'(x).}$
In Leibniz's notation this is written
${\displaystyle {\frac {d(fg)}{dx}}={\frac {df}{dx}}g+f{\frac {dg}{dx}}.}$
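The product rule can likewise be checked numerically; the functions f = sin, g = exp and the sample point are illustrative assumptions, not part of the rule.

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: h(x) = sin(x) * e^x
x0 = 0.5
lhs = dcentral(lambda x: math.sin(x) * math.exp(x), x0)
rhs = math.cos(x0) * math.exp(x0) + math.sin(x0) * math.exp(x0)  # f'g + fg'
```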

### The chain rule

The derivative of the function ${\displaystyle h(x)=f(g(x))}$ is

${\displaystyle h'(x)=f'(g(x))\cdot g'(x).}$

In Leibniz's notation, this is written as:

${\displaystyle {\frac {d}{dx}}h(x)=\left.{\frac {d}{dz}}f(z)\right|_{z=g(x)}\cdot {\frac {d}{dx}}g(x),}$
often abridged to
${\displaystyle {\frac {dh(x)}{dx}}={\frac {df(g(x))}{dg(x)}}\cdot {\frac {dg(x)}{dx}}.}$

Viewing the differential as a map ${\displaystyle {\text{D}}}$ acting on functions, this can be written more concisely as:

${\displaystyle [{\text{D}}(f\circ g)]_{x}=[{\text{D}}f]_{g(x)}\cdot [{\text{D}}g]_{x}\,.}$
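A numerical sketch of the chain rule, under the illustrative assumptions f = sin and g(x) = x²:

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: h(x) = sin(x^2), i.e. f = sin, g(x) = x^2
x0 = 0.9
lhs = dcentral(lambda x: math.sin(x * x), x0)
rhs = math.cos(x0 * x0) * (2 * x0)  # f'(g(x0)) * g'(x0)
```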

### The inverse function rule

If the function f has an inverse function g, meaning that ${\displaystyle g(f(x))=x}$ and ${\displaystyle f(g(y))=y,}$ then

${\displaystyle g'={\frac {1}{f'\circ g}}.}$

In Leibniz notation, this is written as

${\displaystyle {\frac {dx}{dy}}={\frac {1}{\frac {dy}{dx}}}.}$
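A sketch of the inverse function rule, assuming the illustrative pair f = exp with inverse g = ln: the rule predicts g'(y) = 1/f'(g(y)) = 1/y.

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: f = exp, g = ln
y0 = 2.5
lhs = dcentral(math.log, y0)        # g'(y0), computed numerically
rhs = 1.0 / math.exp(math.log(y0))  # 1 / f'(g(y0)) = 1 / y0
```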

## Power laws, polynomials, quotients, and reciprocals

### The polynomial or elementary power rule

If ${\displaystyle f(x)=x^{r}}$, for any real number ${\displaystyle r\neq 0,}$ then

${\displaystyle f'(x)=rx^{r-1}.}$

When ${\displaystyle r=1,}$ this becomes the special case that if ${\displaystyle f(x)=x,}$ then ${\displaystyle f'(x)=1.}$

Combining the power rule with the sum and constant multiple rules permits the computation of the derivative of any polynomial.
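This observation can be made concrete: applying the power rule term by term to a coefficient list differentiates any polynomial. The helper below is an illustrative sketch; the representation (coefficients in ascending powers) is an assumption.

```python
def poly_derivative(coeffs):
    """Differentiate a polynomial with coeffs[k] = coefficient of x**k,
    using d/dx(c * x**k) = c*k * x**(k-1) on each term."""
    return [k * c for k, c in enumerate(coeffs)][1:]

# p(x) = 5 + 3x + 2x^3  ->  p'(x) = 3 + 6x^2
dp = poly_derivative([5, 3, 0, 2])  # -> [3, 0, 6]
```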

### The reciprocal rule

The derivative of ${\displaystyle h(x)={\frac {1}{f(x)}}}$ for any (nonvanishing) function f is:

${\displaystyle h'(x)=-{\frac {f'(x)}{(f(x))^{2}}}}$ wherever f is non-zero.

In Leibniz's notation, this is written

${\displaystyle {\frac {d(1/f)}{dx}}=-{\frac {1}{f^{2}}}{\frac {df}{dx}}.}$

The reciprocal rule can be derived either from the quotient rule, or from the combination of power rule and chain rule.

### The quotient rule

If f and g are functions, then:

${\displaystyle \left({\frac {f}{g}}\right)'={\frac {f'g-g'f}{g^{2}}}\quad }$ wherever g is nonzero.

This can be derived from the product rule and the reciprocal rule.
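A numerical sketch of the quotient rule, under the illustrative assumptions f = sin and g = cos (so f/g = tan, whose derivative is sec²):

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: f = sin, g = cos, so f/g = tan
x0 = 0.4
lhs = dcentral(math.tan, x0)
# (f'g - g'f) / g^2 with f' = cos, g' = -sin
rhs = (math.cos(x0) * math.cos(x0) + math.sin(x0) * math.sin(x0)) / math.cos(x0) ** 2
```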

### Generalized power rule

The elementary power rule generalizes considerably. The most general power rule is the functional power rule: for any functions f and g,

${\displaystyle (f^{g})'=\left(e^{g\ln f}\right)'=f^{g}\left(f'{g \over f}+g'\ln f\right),\quad }$

wherever both sides are well defined.

#### Special cases

• If ${\textstyle f(x)=x^{a}\!}$, then ${\textstyle f'(x)=ax^{a-1}}$ when a is any non-zero real number and x is positive.
• The reciprocal rule may be derived as the special case where ${\textstyle g(x)=-1\!}$.
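A numerical sketch of the functional power rule, under the illustrative assumptions f(x) = x and g(x) = sin x, so h(x) = x^(sin x) with h' = h · (sin x / x + cos x · ln x):

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Hypothetical example: h(x) = x**sin(x), i.e. f(x) = x, g(x) = sin(x)
x0 = 1.7
lhs = dcentral(lambda x: x ** math.sin(x), x0)
# f^g * (f' g / f + g' ln f) with f' = 1, g' = cos
rhs = x0 ** math.sin(x0) * (math.sin(x0) / x0 + math.cos(x0) * math.log(x0))
```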

## Derivatives of exponential and logarithmic functions

${\displaystyle {\frac {d}{dx}}\left(c^{ax}\right)={ac^{ax}\ln c},\qquad c>0}$

The equation above is true for all c, but the derivative for ${\textstyle c<0}$ yields a complex number.

${\displaystyle {\frac {d}{dx}}\left(e^{ax}\right)=ae^{ax}}$
${\displaystyle {\frac {d}{dx}}\left(\log _{c}x\right)={1 \over x\ln c},\qquad c>0,\ c\neq 1}$

The equation above is also true for all c, but yields a complex number if ${\textstyle c<0\!}$.

${\displaystyle {\frac {d}{dx}}\left(\ln x\right)={1 \over x},\qquad x>0.}$
${\displaystyle {\frac {d}{dx}}\left(\ln |x|\right)={1 \over x},\qquad x\neq 0.}$
${\displaystyle {\frac {d}{dx}}\left(W(x)\right)={1 \over {x+e^{W(x)}}},\qquad x>-{1 \over e},}$ where ${\displaystyle W(x)}$ is the Lambert W function.
${\displaystyle {\frac {d}{dx}}\left(x^{x}\right)=x^{x}(1+\ln x).}$
${\displaystyle {\frac {d}{dx}}\left(f(x)^{g(x)}\right)=g(x)f(x)^{g(x)-1}{\frac {df}{dx}}+f(x)^{g(x)}\ln {(f(x))}{\frac {dg}{dx}},\qquad {\text{if }}f(x)>0,{\text{ and if }}{\frac {df}{dx}}{\text{ and }}{\frac {dg}{dx}}{\text{ exist.}}}$
${\displaystyle {\frac {d}{dx}}\left(f_{1}(x)^{f_{2}(x)^{\left(...\right)^{f_{n}(x)}}}\right)=\left[\sum \limits _{k=1}^{n}{\frac {\partial }{\partial x_{k}}}\left(f_{1}(x_{1})^{f_{2}(x_{2})^{\left(...\right)^{f_{n}(x_{n})}}}\right)\right]{\biggr \vert }_{x_{1}=x_{2}=...=x_{n}=x},\qquad {\text{if }}f_{i<n}>0{\text{ and }}{\frac {df_{i}}{dx}}{\text{ exists.}}}$

### Logarithmic derivatives

The logarithmic derivative is another way of stating the rule for differentiating the logarithm of a function (using the chain rule):

${\displaystyle (\ln f)'={\frac {f'}{f}}\quad }$ wherever f is positive.

Logarithmic differentiation is a technique which uses logarithms and their differentiation rules to simplify certain expressions before actually applying the derivative.

Logarithms can be used to remove exponents, convert products into sums, and convert division into subtraction — each of which may lead to a simplified expression for taking derivatives.
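For example, for h(x) = x^x one takes ln h = x ln x, so (ln h)' = 1 + ln x and h' = x^x (1 + ln x), in agreement with the formula given above. A numerical sketch (the sample point is an arbitrary assumption):

```python
import math

def dcentral(F, x, h=1e-6):
    """Central-difference approximation of F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

# ln h = x ln x  =>  h'/h = 1 + ln x  =>  h' = x**x * (1 + ln x)
x0 = 2.0
lhs = dcentral(lambda x: x ** x, x0)
rhs = x0 ** x0 * (1 + math.log(x0))
```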

## Derivatives of trigonometric functions

${\displaystyle (\sin x)'=\cos x={\frac {e^{ix}+e^{-ix}}{2}}}$
${\displaystyle (\arcsin x)'={1 \over {\sqrt {1-x^{2}}}}}$
${\displaystyle (\cos x)'=-\sin x={\frac {e^{-ix}-e^{ix}}{2i}}}$
${\displaystyle (\arccos x)'=-{1 \over {\sqrt {1-x^{2}}}}}$
${\displaystyle (\tan x)'=\sec ^{2}x={1 \over \cos ^{2}x}=1+\tan ^{2}x}$
${\displaystyle (\arctan x)'={1 \over 1+x^{2}}}$
${\displaystyle (\cot x)'=-\csc ^{2}x=-{1 \over \sin ^{2}x}=-1-\cot ^{2}x}$
${\displaystyle (\operatorname {arccot} x)'=-{1 \over 1+x^{2}}}$
${\displaystyle (\sec x)'=\sec {x}\tan {x}}$
${\displaystyle (\operatorname {arcsec} x)'={1 \over |x|{\sqrt {x^{2}-1}}}}$
${\displaystyle (\csc x)'=-\csc {x}\cot {x}}$
${\displaystyle (\operatorname {arccsc} x)'=-{1 \over |x|{\sqrt {x^{2}-1}}}}$

The derivatives above hold when the range of the inverse secant is ${\displaystyle [0,\pi ]\!}$ and the range of the inverse cosecant is ${\displaystyle \left[-{\frac {\pi }{2}},{\frac {\pi }{2}}\right]\!}$.

It is common to additionally define an inverse tangent function with two arguments, ${\displaystyle \arctan(y,x)\!}$. Its value lies in the range ${\displaystyle [-\pi ,\pi ]\!}$ and reflects the quadrant of the point ${\displaystyle (x,y)\!}$. For the first and fourth quadrant (i.e. ${\displaystyle x>0\!}$) one has ${\displaystyle \arctan(y,x>0)=\arctan(y/x)\!}$. Its partial derivatives are

 ${\displaystyle {\frac {\partial \arctan(y,x)}{\partial y}}={\frac {x}{x^{2}+y^{2}}}}$, and ${\displaystyle {\frac {\partial \arctan(y,x)}{\partial x}}={\frac {-y}{x^{2}+y^{2}}}.}$
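These partial derivatives can be checked against `math.atan2`, the standard-library form of the two-argument inverse tangent (a sketch; the sample point is an arbitrary assumption):

```python
import math

def dcentral(F, u, h=1e-6):
    """Central-difference approximation of F'(u)."""
    return (F(u + h) - F(u - h)) / (2 * h)

x0, y0 = 1.2, -0.7
r2 = x0 * x0 + y0 * y0
d_dy = dcentral(lambda y: math.atan2(y, x0), y0)  # should equal  x / (x^2 + y^2)
d_dx = dcentral(lambda x: math.atan2(y0, x), x0)  # should equal -y / (x^2 + y^2)
```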

## Derivatives of hyperbolic functions

${\displaystyle (\sinh x)'=\cosh x={\frac {e^{x}+e^{-x}}{2}}}$
${\displaystyle (\operatorname {arsinh} x)'={1 \over {\sqrt {1+x^{2}}}}}$
${\displaystyle (\cosh x)'=\sinh x={\frac {e^{x}-e^{-x}}{2}}}$
${\displaystyle (\operatorname {arcosh} x)'={\frac {1}{\sqrt {x^{2}-1}}}}$
${\displaystyle (\tanh x)'={\operatorname {sech} ^{2}x}={1 \over \cosh ^{2}x}=1-\tanh ^{2}x}$
${\displaystyle (\operatorname {artanh} x)'={1 \over 1-x^{2}}}$
${\displaystyle (\coth x)'=-\operatorname {csch} ^{2}x=-{1 \over \sinh ^{2}x}=1-\coth ^{2}x}$
${\displaystyle (\operatorname {arcoth} x)'={1 \over 1-x^{2}}}$
${\displaystyle (\operatorname {sech} x)'=-\operatorname {sech} {x}\tanh {x}}$
${\displaystyle (\operatorname {arsech} x)'=-{1 \over x{\sqrt {1-x^{2}}}}}$
${\displaystyle (\operatorname {csch} x)'=-\operatorname {csch} {x}\coth {x}}$
${\displaystyle (\operatorname {arcsch} x)'=-{1 \over |x|{\sqrt {1+x^{2}}}}}$

See Hyperbolic functions for restrictions on these derivatives.

## Derivatives of special functions

**Gamma function**
${\displaystyle \Gamma (x)=\int _{0}^{\infty }t^{x-1}e^{-t}\,dt}$
{\displaystyle {\begin{aligned}\Gamma '(x)&=\int _{0}^{\infty }t^{x-1}e^{-t}\ln t\,dt\\&=\Gamma (x)\left(\sum _{n=1}^{\infty }\left(\ln \left(1+{\dfrac {1}{n}}\right)-{\dfrac {1}{x+n}}\right)-{\dfrac {1}{x}}\right)\\&=\Gamma (x)\psi (x)\end{aligned}}}
with ${\displaystyle \psi (x)}$ being the digamma function, expressed by the parenthesized expression to the right of ${\displaystyle \Gamma (x)}$ in the line above.
**Riemann zeta function**
${\displaystyle \zeta (x)=\sum _{n=1}^{\infty }{\frac {1}{n^{x}}}}$
{\displaystyle {\begin{aligned}\zeta '(x)&=-\sum _{n=1}^{\infty }{\frac {\ln n}{n^{x}}}=-{\frac {\ln 2}{2^{x}}}-{\frac {\ln 3}{3^{x}}}-{\frac {\ln 4}{4^{x}}}-\cdots \\&=-\sum _{p{\text{ prime}}}{\frac {p^{-x}\ln p}{(1-p^{-x})^{2}}}\prod _{q{\text{ prime}},q\neq p}{\frac {1}{1-q^{-x}}}\end{aligned}}}

## Derivatives of integrals

Suppose that it is required to differentiate with respect to x the function

${\displaystyle F(x)=\int _{a(x)}^{b(x)}f(x,t)\,dt,}$

where the functions ${\displaystyle f(x,t)}$ and ${\displaystyle {\frac {\partial }{\partial x}}\,f(x,t)}$ are both continuous in both ${\displaystyle t}$ and ${\displaystyle x}$ in some region of the ${\displaystyle (t,x)}$ plane, including ${\displaystyle a(x)\leq t\leq b(x),}$ ${\displaystyle x_{0}\leq x\leq x_{1}}$, and the functions ${\displaystyle a(x)}$ and ${\displaystyle b(x)}$ are both continuous and both have continuous derivatives for ${\displaystyle x_{0}\leq x\leq x_{1}}$. Then for ${\displaystyle \,x_{0}\leq x\leq x_{1}}$:

${\displaystyle F'(x)=f(x,b(x))\,b'(x)-f(x,a(x))\,a'(x)+\int _{a(x)}^{b(x)}{\frac {\partial }{\partial x}}\,f(x,t)\;dt\,.}$

This formula is the general form of the Leibniz integral rule and can be derived using the fundamental theorem of calculus.
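The Leibniz integral rule can be checked numerically. In the sketch below, f(x, t) = sin(xt), a(x) = x², and b(x) = 2x are illustrative assumptions; `simpson` is a simple composite-Simpson quadrature helper.

```python
import math

def simpson(func, lo, hi, n=400):
    """Composite Simpson quadrature over [lo, hi]; n must be even."""
    h = (hi - lo) / n
    s = func(lo) + func(hi)
    for i in range(1, n):
        s += func(lo + i * h) * (4 if i % 2 else 2)
    return s * h / 3

# Hypothetical example: F(x) = integral of sin(x t) dt from a(x)=x^2 to b(x)=2x
f = lambda x, t: math.sin(x * t)
df_dx = lambda x, t: t * math.cos(x * t)      # partial derivative of f in x
a, da = lambda x: x * x, lambda x: 2 * x
b, db = lambda x: 2 * x, lambda x: 2.0

def F(x):
    return simpson(lambda t: f(x, t), a(x), b(x))

x0, h = 0.8, 1e-5
numeric = (F(x0 + h) - F(x0 - h)) / (2 * h)   # direct central difference of F
leibniz = (f(x0, b(x0)) * db(x0) - f(x0, a(x0)) * da(x0)
           + simpson(lambda t: df_dx(x0, t), a(x0), b(x0)))
```

The boundary terms account for the moving limits; the remaining integral differentiates under the integral sign.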

## Derivatives to nth order

Some rules exist for computing the n-th derivative of functions, where n is a positive integer. These include:

### Faà di Bruno's formula

If f and g are n-times differentiable, then

${\displaystyle {\frac {d^{n}}{dx^{n}}}[f(g(x))]=n!\sum _{\{k_{m}\}}f^{(r)}(g(x))\prod _{m=1}^{n}{\frac {1}{k_{m}!}}\left({\frac {g^{(m)}(x)}{m!}}\right)^{k_{m}}}$
where ${\textstyle r=\sum _{m=1}^{n}k_{m}}$ and the set ${\displaystyle \{k_{m}\}}$ consists of all non-negative integer solutions of the Diophantine equation ${\textstyle \sum _{m=1}^{n}mk_{m}=n}$.
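A sketch of the combinatorial sum: enumerate the solutions of the Diophantine equation, then evaluate each term of Faà di Bruno's formula. The check against the 4th derivative of exp(x²), whose closed form e^(x²)(16x⁴ + 48x² + 12) follows by repeated differentiation, is an illustrative assumption.

```python
import math
from math import factorial

def solutions(n):
    """All tuples (k_1, ..., k_n) of non-negative integers with sum(m*k_m) = n."""
    def rec(m, remaining):
        if m > n:
            if remaining == 0:
                yield ()
            return
        for k in range(remaining // m + 1):
            for tail in rec(m + 1, remaining - m * k):
                yield (k,) + tail
    return list(rec(1, n))

def faa_di_bruno(f_derivs, g_derivs, n):
    """f_derivs[r] = f^(r)(g(x)); g_derivs[m] = g^(m)(x) for m = 1..n."""
    total = 0.0
    for ks in solutions(n):
        r = sum(ks)
        term = f_derivs[r]
        for m, k in enumerate(ks, start=1):
            term *= (g_derivs[m] / factorial(m)) ** k / factorial(k)
        total += term
    return factorial(n) * total

# Hypothetical check: 4th derivative of exp(x**2) at x0 = 1
x0, n = 1.0, 4
f_derivs = [math.exp(x0 * x0)] * (n + 1)  # f = exp: every derivative is exp(g(x0))
g_derivs = [None, 2 * x0, 2.0, 0.0, 0.0]  # g = x**2: g', g'', g''', g''''
result = faa_di_bruno(f_derivs, g_derivs, n)
closed_form = math.exp(x0 * x0) * (16 * x0**4 + 48 * x0**2 + 12)
```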

### General Leibniz rule

If f and g are n-times differentiable, then

${\displaystyle {\frac {d^{n}}{dx^{n}}}[f(x)g(x)]=\sum _{k=0}^{n}{\binom {n}{k}}{\frac {d^{n-k}}{dx^{n-k}}}f(x){\frac {d^{k}}{dx^{k}}}g(x)}$
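A sketch of the general Leibniz rule, under the illustrative assumptions f(x) = x² and g(x) = eˣ with n = 3; the closed form (x² + 6x + 6)eˣ for the third derivative of x²eˣ follows by direct differentiation.

```python
import math
from math import comb

# Hypothetical example: third derivative of x**2 * exp(x) at x0
x0, n = 1.3, 3
f_derivs = [x0 * x0, 2 * x0, 2.0, 0.0]    # x^2 and its derivatives at x0
g_derivs = [math.exp(x0)] * (n + 1)        # every derivative of exp is exp
leibniz = sum(comb(n, k) * f_derivs[n - k] * g_derivs[k] for k in range(n + 1))
closed_form = (x0 * x0 + 6 * x0 + 6) * math.exp(x0)
```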

## References

1. ^ Calculus (5th edition), F. Ayres, E. Mendelson, Schaum's Outline Series, 2009, ISBN 978-0-07-150861-2.
2. ^ Advanced Calculus (3rd edition), R. Wrede, M.R. Spiegel, Schaum's Outline Series, 2010, ISBN 978-0-07-162366-7.
3. ^ Complex Variables, M.R. Spiegel, S. Lipschutz, J.J. Schiller, D. Spellman, Schaum's Outlines Series, McGraw Hill (USA), 2009, ISBN 978-0-07-161569-3.
4. ^ "Differentiation Rules". University of Waterloo - CEMC Open Courseware. Retrieved 3 May 2022.