Sum rule in integration

In calculus, the sum rule in integration states that the integral of a sum of two functions is equal to the sum of their integrals. Together with the constant factor rule, it is one part of the linearity of integration.

As with many properties of integrals in calculus, the sum rule applies both to definite integrals and indefinite integrals. For indefinite integrals, the sum rule states

${\displaystyle \int \left(f+g\right)\,dx=\int f\,dx+\int g\,dx}$

Application to indefinite integrals

For example, if one knows from the calculus of exponentials that an antiderivative of exp(x) is exp(x), and from the calculus of trigonometric functions that an antiderivative of cos(x) is sin(x), then:

${\displaystyle \int \left(e^{x}+\cos x\right)\,dx=\int e^{x}\,dx+\int \cos x\,dx=e^{x}+\sin x+C}$
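The identity above can be checked numerically. The sketch below is not part of the original text; the midpoint rule, interval [0, 1], and tolerance are my own choices. It approximates the definite integral of e^x + cos x and compares it with the antiderivative e^x + sin x evaluated at the endpoints:

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    """Approximate the definite integral of f over [a, b] with the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: math.exp(x) + math.cos(x)
numeric = midpoint_integral(f, 0.0, 1.0)
# Antiderivative e^x + sin x, evaluated at the endpoints 1 and 0:
exact = (math.e + math.sin(1.0)) - (1.0 + math.sin(0.0))
print(abs(numeric - exact) < 1e-8)  # prints True
```

The midpoint rule's error on this smooth integrand is far below the tolerance, so the numerical and antiderivative values agree to many digits.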

Some other general results come from this rule. For example:

${\displaystyle \int \left(u-v\right)\,dx}$
${\displaystyle =\int \left(u+\left(-v\right)\right)\,dx}$
${\displaystyle =\int u\,dx+\int \left(-v\right)\,dx}$
${\displaystyle =\int u\,dx+\left(-\int v\,dx\right)}$
${\displaystyle =\int u\,dx-\int v\,dx}$

The derivation above relies on the special case of the constant factor rule in integration with k = −1.

Thus, the sum rule might be written as:

${\displaystyle \int (u\pm v)\,dx=\int u\,dx\pm \int v\,dx}$
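As an illustration of the difference form (a worked example in the same spirit as the one above, not from the original), applying it to the earlier integrands gives:

```latex
\int \left(e^{x}-\cos x\right)\,dx
  = \int e^{x}\,dx-\int \cos x\,dx
  = e^{x}-\sin x+C
```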

Another basic application is that a finite sum and an integral sign can be interchanged. That is:

${\displaystyle \int \sum _{r=a}^{b}f\left(r,x\right)\,dx=\sum _{r=a}^{b}\int f\left(r,x\right)\,dx}$
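The interchange can be demonstrated numerically. In this sketch (my own construction, not from the original) the family is the hypothetical choice f(r, x) = x^r for r = 0, …, 3 over [0, 1], where the exact value either way is 1 + 1/2 + 1/3 + 1/4 = 25/12:

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    """Approximate the definite integral of f over [a, b] with the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Hypothetical family f(r, x) = x**r for r = 0..3, integrated over [0, 1].
integral_of_sum = midpoint_integral(lambda x: sum(x**r for r in range(4)), 0.0, 1.0)
sum_of_integrals = sum(midpoint_integral(lambda x, r=r: x**r, 0.0, 1.0) for r in range(4))
print(abs(integral_of_sum - sum_of_integrals) < 1e-9)  # prints True
```

Note that a quadrature rule of the form Σ wᵢ f(xᵢ) is itself linear in f, so the two computed numbers agree up to floating-point rounding; the mathematical identity they illustrate holds exactly.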

This is simply because:

${\displaystyle \int \sum _{r=a}^{b}f(r,x)\,dx=\int \left[f(a,x)+f(a+1,x)+\dots +f(b-1,x)+f(b,x)\right]\,dx}$
${\displaystyle =\int f(a,x)\,dx+\int f(a+1,x)\,dx+\dots +\int f(b-1,x)\,dx+\int f(b,x)\,dx}$
${\displaystyle =\sum _{r=a}^{b}\int f(r,x)\,dx}$

Application to definite integrals

Passing from the case of indefinite integrals to the case of integrals over an interval [a, b], we get exactly the same form of rule (the arbitrary constant of integration disappears):

${\displaystyle \int _{a}^{b}\left(f+g\right)\,dx=\int _{a}^{b}f\,dx+\int _{a}^{b}g\,dx}$
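A quick numerical sanity check of the definite-integral form (a sketch; the integrands x² and sin x, the interval [0, 2], and the midpoint rule are my own choices, not from the original):

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    """Approximate the definite integral of f over [a, b] with the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 0.0, 2.0
lhs = midpoint_integral(lambda x: x**2 + math.sin(x), a, b)                        # integral of the sum
rhs = midpoint_integral(lambda x: x**2, a, b) + midpoint_integral(math.sin, a, b)  # sum of the integrals
exact = b**3 / 3 + (1.0 - math.cos(b))  # from the antiderivatives x^3/3 and -cos x
print(abs(lhs - rhs) < 1e-9 and abs(lhs - exact) < 1e-7)  # prints True
```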

The proof of the rule

First note that, since integration is defined as antidifferentiation (the reverse process of differentiation), any differentiable function is an integral of its own derivative:

${\displaystyle u=\int {\frac {du}{dx}}\,dx}$
${\displaystyle v=\int {\frac {dv}{dx}}\,dx}$

Adding these two equations gives

${\displaystyle u+v=\int {\frac {du}{dx}}\,dx+\int {\frac {dv}{dx}}\,dx\quad {\mbox{(1)}}}$

Now take the sum rule in differentiation:

${\displaystyle {\frac {d}{dx}}\left(u+v\right)={\frac {du}{dx}}+{\frac {dv}{dx}}}$

Integrate both sides with respect to x:

${\displaystyle u+v=\int \left({\frac {du}{dx}}+{\frac {dv}{dx}}\right)\,dx\quad {\mbox{(2)}}}$

Comparing (1) and (2), we have:

${\displaystyle u+v=\int {\frac {du}{dx}}\,dx+\int {\frac {dv}{dx}}\,dx}$
${\displaystyle u+v=\int \left({\frac {du}{dx}}+{\frac {dv}{dx}}\right)\,dx}$

Therefore:

${\displaystyle \int \left({\frac {du}{dx}}+{\frac {dv}{dx}}\right)\,dx=\int {\frac {du}{dx}}\,dx+\int {\frac {dv}{dx}}\,dx}$

Now substitute

${\displaystyle f={\frac {du}{dx}},\qquad g={\frac {dv}{dx}}}$

so that ${\displaystyle u=\int f\,dx}$ and ${\displaystyle v=\int g\,dx}$. The equation above becomes

${\displaystyle \int \left(f+g\right)\,dx=\int f\,dx+\int g\,dx}$

which is the sum rule.