In mathematics, the indefinite sum operator (also known as the antidifference operator), denoted by $\sum_x$ or $\Delta^{-1}$,[1][2][3] is the linear operator inverse to the forward difference operator $\Delta$. It relates to the forward difference operator as the indefinite integral relates to the derivative. Thus
$$\Delta \sum_x f(x) = f(x)\,.$$
More explicitly, if $\sum_x f(x) = F(x)$, then
$$F(x+1) - F(x) = f(x)\,.$$
If $F(x)$ is a solution of this functional equation for a given $f(x)$, then so is $F(x) + C(x)$ for any periodic function $C(x)$ with period 1. Therefore, each indefinite sum actually represents a family of functions. However, the solution equal to its Newton series expansion is unique up to an additive constant $C$. This unique solution can be represented by the formal power series form of the antidifference operator:
$$\Delta^{-1} = \frac{1}{e^D - 1}$$
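As a quick check of the defining relation (an illustrative sketch, not from the source), the function $F(x) = x(x-1)/2$ is an antidifference of $f(x) = x$, and adding any period-1 function such as $\sin(2\pi x)$ yields another one:

```python
import math

def f(x):
    return x

def F(x):
    # candidate antidifference of f(x) = x
    return x * (x - 1) / 2

def F_shifted(x):
    # adding a period-1 function preserves the property F(x+1) - F(x) = f(x)
    return F(x) + math.sin(2 * math.pi * x)

for x in [0.0, 1.5, 2.0, 7.25]:
    assert math.isclose(F(x + 1) - F(x), f(x), abs_tol=1e-12)
    assert math.isclose(F_shifted(x + 1) - F_shifted(x), f(x), abs_tol=1e-9)
print("F(x+1) - F(x) = f(x) holds for both candidates")
```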
Fundamental theorem of discrete calculus
Indefinite sums can be used to calculate definite sums with the formula:[4]
$$\sum_{k=a}^{b} f(k) = \Delta^{-1} f(b+1) - \Delta^{-1} f(a)$$
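For instance (an illustrative sketch using the antidifference of $f(k) = k$ listed later in this article), the theorem reproduces the familiar triangular-number sum:

```python
def antidifference(k):
    # an indefinite sum of f(k) = k, with the constant term chosen as 0
    return k * (k - 1) // 2

def definite_sum(a, b):
    # fundamental theorem of discrete calculus:
    # sum_{k=a}^{b} f(k) = antidifference(b + 1) - antidifference(a)
    return antidifference(b + 1) - antidifference(a)

assert definite_sum(1, 100) == sum(range(1, 101))   # 5050
assert definite_sum(5, 9) == 5 + 6 + 7 + 8 + 9       # 35
print(definite_sum(1, 100))
```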
Definitions
$$\sum_x f(x) = \int_0^x f(t)\,dt + \sum_{k=1}^{\infty} \frac{c_k \Delta^{k-1} f(x)}{k!} + C$$
where
$$c_k = \int_0^1 \frac{\Gamma(x+1)}{\Gamma(x-k+1)}\,dx$$
are the Cauchy numbers of the first kind.[5]
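A short sketch (assuming SymPy is available) that evaluates this integral for the first few $k$; the integrand $\Gamma(x+1)/\Gamma(x-k+1)$ is just the falling factorial $x(x-1)\cdots(x-k+1)$:

```python
from sympy import symbols, integrate, ff

x = symbols('x')

def cauchy_number(k):
    # c_k = integral over [0, 1] of the falling factorial x(x-1)...(x-k+1)
    return integrate(ff(x, k), (x, 0, 1))

print([cauchy_number(k) for k in range(1, 5)])
# expected: [1/2, -1/6, 1/4, -19/30]
```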
$$\sum_x f(x) = \sum_{k=1}^{\infty} \binom{x}{k} \Delta^{k-1}[f](0) + C = \sum_{k=1}^{\infty} \frac{\Delta^{k-1}[f](0)}{k!} (x)_k + C$$
where
$$(x)_k = \frac{\Gamma(x+1)}{\Gamma(x-k+1)}$$
is the falling factorial.
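As a worked illustration of this formula (my own sketch, assuming SymPy), for $f(x) = x^2$ only the differences $\Delta f(0)$ and $\Delta^2 f(0)$ are nonzero, and the series terminates at the classical closed form $x(x-1)(2x-1)/6$:

```python
from sympy import symbols, binomial, ff, factorial, simplify, factor

x = symbols('x')
f = lambda t: t**2

def forward_difference_at_zero(order):
    # Δ^n[f](0) via the binomial expansion of the n-th forward difference
    return sum((-1)**(order - j) * binomial(order, j) * f(j) for j in range(order + 1))

# sum_x f(x) = Σ_{k>=1} (x)_k / k! · Δ^{k-1}[f](0) + C  (truncated: higher terms vanish)
F = sum(ff(x, k) / factorial(k) * forward_difference_at_zero(k - 1) for k in range(1, 5))
print(factor(F))                                 # factors to x*(x - 1)*(2*x - 1)/6
print(simplify(F.subs(x, x + 1) - F - f(x)))     # 0, so ΔF = f
```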
$$\sum_x f(x) = \sum_{n=1}^{\infty} \frac{f^{(n-1)}(0)}{n!} B_n(x) + C\,,$$
provided that the right-hand side of the equation converges.
If $\lim_{x \to +\infty} f(x) = 0$, then[6]
$$\sum_x f(x) = \sum_{n=0}^{\infty} \left( f(n) - f(n+x) \right) + C.$$
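A numerical sketch (my own illustration) with $f(x) = 2^{-x}$, which satisfies the decay condition; the truncated series and the closed-form antidifference $-2 \cdot 2^{-x}$ (from the $a^x$ entry in the list below, with $a = 1/2$) differ only by a constant, so their forward differences both reproduce $f$:

```python
def f(x):
    return 2.0 ** (-x)

def indefinite_sum_series(x, terms=200):
    # Σ_{n=0}^{∞} (f(n) - f(n + x)), truncated; valid since f(x) -> 0 as x -> +∞
    return sum(f(n) - f(n + x) for n in range(terms))

def closed_form(x):
    # antidifference of a^x is a^x / (a - 1); here a = 1/2 gives -2 * 2^{-x}
    return -2.0 * 2.0 ** (-x)

# both candidates differ only by a constant, so their forward differences agree with f
for x in [0.5, 1.0, 3.25]:
    d_series = indefinite_sum_series(x + 1) - indefinite_sum_series(x)
    d_closed = closed_form(x + 1) - closed_form(x)
    print(round(d_series, 12), round(d_closed, 12), round(f(x), 12))
```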
$$\sum_x f(x) = \int_0^x f(t)\,dt - \frac{1}{2} f(x) + \sum_{k=1}^{\infty} \frac{B_{2k}}{(2k)!} f^{(2k-1)}(x) + C$$
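Applied to $f(x) = x^2$ (an illustrative sketch, assuming SymPy), only the $k = 1$ Bernoulli term survives and the formula again yields $x(x-1)(2x-1)/6$:

```python
from sympy import symbols, integrate, bernoulli, factorial, diff, factor, simplify

x, t = symbols('x t')
f = x**2

# Σ_x f(x) = ∫_0^x f(t) dt - f(x)/2 + Σ_{k>=1} B_{2k}/(2k)! · f^{(2k-1)}(x) + C;
# for a quadratic f only the k = 1 term is nonzero
F = integrate(f.subs(x, t), (t, 0, x)) - f / 2 + bernoulli(2) / factorial(2) * diff(f, x)

print(factor(F))                              # x*(x - 1)*(2*x - 1)/6
print(simplify(F.subs(x, x + 1) - F - f))     # 0, so ΔF = f
```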
Choice of the constant term
Often the constant $C$ in an indefinite sum is fixed by the following condition.
Let
$$F(x) = \sum_x f(x) + C$$
Then the constant $C$ is fixed by the condition
$$\int_0^1 F(x)\,dx = 0$$
or
$$\int_1^2 F(x)\,dx = 0$$
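For example (an illustrative sketch, assuming SymPy), imposing the first condition on an indefinite sum of $f(x) = x$ fixes $C = 1/12$, i.e. $F(x) = B_2(x)/2$:

```python
from sympy import symbols, integrate, solve

x, C = symbols('x C')

# candidate indefinite sum of f(x) = x, with an undetermined constant term
F = x * (x - 1) / 2 + C

# fix C by requiring the integral of F over [0, 1] to vanish
C_value = solve(integrate(F, (x, 0, 1)), C)[0]
print(C_value)            # 1/12, giving F(x) = B_2(x)/2
```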
Alternatively, Ramanujan's sum can be used:
$$\sum_{x \ge 1}^{\Re} f(x) = -f(0) - F(0)$$
or at 1
$$\sum_{x \ge 1}^{\Re} f(x) = -F(1)$$
respectively.[7][8]
Summation by parts
Indefinite summation by parts:
$$\sum_x f(x) \Delta g(x) = f(x) g(x) - \sum_x \left( g(x) + \Delta g(x) \right) \Delta f(x)$$
$$\sum_x f(x) \Delta g(x) + \sum_x g(x) \Delta f(x) = f(x) g(x) - \sum_x \Delta f(x) \Delta g(x)$$
Definite summation by parts:
$$\sum_{i=a}^{b} f(i) \Delta g(i) = f(b+1) g(b+1) - f(a) g(a) - \sum_{i=a}^{b} g(i+1) \Delta f(i)$$
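A small numerical check of the definite rule (my own sketch) with $f(i) = i$ and $g(i) = 2^i$:

```python
def f(i): return i
def g(i): return 2 ** i

def delta(h, i):
    # forward difference Δh(i) = h(i+1) - h(i)
    return h(i + 1) - h(i)

a, b = 0, 5
lhs = sum(f(i) * delta(g, i) for i in range(a, b + 1))
rhs = f(b + 1) * g(b + 1) - f(a) * g(a) - sum(g(i + 1) * delta(f, i) for i in range(a, b + 1))
print(lhs, rhs)   # both sides equal 258
assert lhs == rhs
```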
Period rules
If $T$ is a period of the function $f(x)$, then
$$\sum_x f(Tx) = x f(Tx) + C$$
If $T$ is an antiperiod of the function $f(x)$, that is $f(x+T) = -f(x)$, then
$$\sum_x f(Tx) = -\frac{1}{2} f(Tx) + C$$
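Both rules can be checked numerically (an illustrative sketch) with $f = \sin$, which has period $2\pi$ and antiperiod $\pi$:

```python
import math

f = math.sin
T_period, T_antiperiod = 2 * math.pi, math.pi   # sin has period 2π and antiperiod π

def F_period(x):
    # Σ_x f(Tx) = x f(Tx) + C when T is a period of f
    return x * f(T_period * x)

def F_antiperiod(x):
    # Σ_x f(Tx) = -f(Tx)/2 + C when T is an antiperiod of f
    return -f(T_antiperiod * x) / 2

for x in [0.3, 1.7, 4.2]:
    assert math.isclose(F_period(x + 1) - F_period(x), f(T_period * x), abs_tol=1e-9)
    assert math.isclose(F_antiperiod(x + 1) - F_antiperiod(x), f(T_antiperiod * x), abs_tol=1e-9)
print("both period rules verified at sample points")
```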
Alternative usage
Some authors use the phrase "indefinite sum" to describe a sum in which the numerical value of the upper limit is not given, e.g.
$$\sum_{k=1}^{n} f(k)$$
In this case a closed-form expression $F(k)$ for the sum is a solution of
$$F(x+1) - F(x) = f(x+1)$$
which is called the telescoping equation.[9] It is the inverse of the backward difference operator $\nabla$.
It is related to the forward antidifference operator using the fundamental theorem of discrete calculus described earlier.
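As a small illustration (my own sketch), the closed form $F(n) = n(n+1)/2$ for $\sum_{k=1}^{n} k$ satisfies this telescoping equation with $f(k) = k$:

```python
def f(k):
    return k

def F(n):
    # closed form for Σ_{k=1}^{n} k; it satisfies the telescoping equation
    # F(x + 1) - F(x) = f(x + 1)
    return n * (n + 1) // 2

for n in range(0, 10):
    assert F(n + 1) - F(n) == f(n + 1)
    assert F(n) == sum(f(k) for k in range(1, n + 1))
print(F(100))   # 5050
```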
List of indefinite sums
This is a list of indefinite sums of various functions. Not every function has an indefinite sum that can be expressed in terms of elementary functions.
Antidifferences of rational functions
$$\sum_x a = ax + C$$
$$\sum_x x = \frac{x^2}{2} - \frac{x}{2} + C$$
$$\sum_x x^a = \frac{B_{a+1}(x)}{a+1} + C,\quad a \notin \mathbb{Z}^-$$
where $B_a(x) = -a\zeta(-a+1, x)$ denotes the Bernoulli polynomials generalized to real order, with $\zeta(s, x)$ the Hurwitz zeta function.
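A numerical sketch (assuming mpmath, whose two-argument zeta is the Hurwitz zeta function) checking that this expression really is an antidifference of $x^a$ for a non-integer exponent such as $a = 1/2$:

```python
from mpmath import mp, zeta, mpf

mp.dps = 30
a = mpf('0.5')          # a non-integer exponent, not a negative integer

def B(order, x):
    # generalized Bernoulli polynomial B_order(x) = -order * zeta(1 - order, x)
    return -order * zeta(1 - order, x)

def F(x):
    # candidate antidifference of x^a: B_{a+1}(x) / (a + 1)
    return B(a + 1, x) / (a + 1)

x = mpf('2.3')
print(F(x + 1) - F(x))   # should equal x**a
print(x ** a)
```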
$$\sum_x x^a = \frac{(-1)^{a-1} \psi^{(-a-1)}(x)}{\Gamma(-a)} + C,\quad a \in \mathbb{Z}^-$$
where $\psi^{(n)}(x)$ is the polygamma function.
$$\sum_x \frac{1}{x} = \psi(x) + C$$
where $\psi(x)$ is the digamma function.
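Combined with the fundamental theorem above, this gives the harmonic numbers in closed form, $\sum_{k=1}^{n} 1/k = \psi(n+1) - \psi(1)$; a sketch assuming mpmath:

```python
from mpmath import mp, digamma, euler

mp.dps = 30
n = 10

# fundamental theorem with Δ^{-1}(1/x) = ψ(x):
# Σ_{k=1}^{n} 1/k = ψ(n + 1) - ψ(1) = ψ(n + 1) + γ
harmonic_direct = sum(mp.mpf(1) / k for k in range(1, n + 1))
harmonic_digamma = digamma(n + 1) - digamma(1)
print(harmonic_direct)
print(harmonic_digamma)
print(digamma(n + 1) + euler)   # same value, since ψ(1) = -γ
```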
Antidifferences of exponential functions
$$\sum_x a^x = \frac{a^x}{a-1} + C,\quad a \neq 1$$
In particular,
$$\sum_x 2^x = 2^x + C$$
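Via the fundamental theorem, this antidifference recovers the usual geometric-series formula (an illustrative sketch; it assumes $a \neq 1$):

```python
def geometric_sum(a, m, n):
    # Σ_{k=m}^{n} a^k via the fundamental theorem with Δ^{-1} a^x = a^x / (a - 1)
    antidifference = lambda x: a ** x / (a - 1)
    return antidifference(n + 1) - antidifference(m)

print(geometric_sum(3, 0, 4))               # 121.0 = 1 + 3 + 9 + 27 + 81
print(sum(3 ** k for k in range(0, 5)))     # direct check: 121
```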
Antidifferences of logarithmic functions
$$\sum_x \log_b x = \log_b \Gamma(x) + C$$
$$\sum_x \log_b ax = \log_b \left( a^{x-1} \Gamma(x) \right) + C$$
Antidifferences of hyperbolic functions
$$\sum_x \sinh ax = \frac{1}{2} \operatorname{csch}\left(\frac{a}{2}\right) \cosh\left(\frac{a}{2} - ax\right) + C$$
$$\sum_x \cosh ax = \frac{1}{2} \coth\left(\frac{a}{2}\right) \sinh ax - \frac{1}{2} \cosh ax + C$$
$$\sum_x \tanh ax = \frac{1}{a} \psi_{e^a}\left(x - \frac{i\pi}{2a}\right) + \frac{1}{a} \psi_{e^a}\left(x + \frac{i\pi}{2a}\right) - x + C$$
where $\psi_q(x)$ is the q-digamma function.
Antidifferences of trigonometric functions
$$\sum_x \sin ax = -\frac{1}{2} \csc\left(\frac{a}{2}\right) \cos\left(\frac{a}{2} - ax\right) + C\,,\quad a \neq n\pi$$
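A numerical sketch (my own illustration) checking this formula at a sample point and using it, via the fundamental theorem, to evaluate a finite sum of sines:

```python
import math

a = 0.7   # any a that is not a multiple of π

def F(x):
    # candidate antidifference of sin(a x): -csc(a/2) cos(a/2 - a x) / 2
    return -0.5 * (1 / math.sin(a / 2)) * math.cos(a / 2 - a * x)

# ΔF(x) = sin(a x) at a sample point
x = 2.4
print(F(x + 1) - F(x), math.sin(a * x))

# definite sum via the fundamental theorem: Σ_{k=0}^{n-1} sin(a k) = F(n) - F(0)
n = 25
print(F(n) - F(0), sum(math.sin(a * k) for k in range(0, n)))
```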
$$\sum_x \cos ax = \frac{1}{2} \cot\left(\frac{a}{2}\right) \sin ax - \frac{1}{2} \cos ax + C\,,\quad a \neq n\pi$$
$$\sum_x \sin^2 ax = \frac{x}{2} + \frac{1}{4} \csc(a) \sin(a - 2ax) + C\,,\quad a \neq \frac{n\pi}{2}$$
$$\sum_x \cos^2 ax = \frac{x}{2} - \frac{1}{4} \csc(a) \sin(a - 2ax) + C\,,\quad a \neq \frac{n\pi}{2}$$
$$\sum_x \tan ax = ix - \frac{1}{a} \psi_{e^{2ia}}\left(x - \frac{\pi}{2a}\right) + C\,,\quad a \neq \frac{n\pi}{2}$$
where $\psi_q(x)$ is the q-digamma function.
$$\sum_x \tan x = ix - \psi_{e^{2i}}\left(x + \frac{\pi}{2}\right) + C = -\sum_{k=1}^{\infty} \left( \psi\left(k\pi - \frac{\pi}{2} + 1 - z\right) + \psi\left(k\pi - \frac{\pi}{2} + z\right) - \psi\left(k\pi - \frac{\pi}{2} + 1\right) - \psi\left(k\pi - \frac{\pi}{2}\right) \right) + C$$
$$\sum_x \cot ax = -ix - \frac{i \psi_{e^{2ia}}(x)}{a} + C\,,\quad a \neq \frac{n\pi}{2}$$
Antidifferences of inverse hyperbolic functions
$$\sum_x \operatorname{artanh} ax = \frac{1}{2} \ln\left( \frac{\Gamma\left(x + \frac{1}{a}\right)}{\Gamma\left(x - \frac{1}{a}\right)} \right) + C$$
Antidifferences of inverse trigonometric functions
$$\sum_x \arctan ax = \frac{i}{2} \ln\left( \frac{\Gamma\left(x + \frac{i}{a}\right)}{\Gamma\left(x - \frac{i}{a}\right)} \right) + C$$
Antidifferences of special functions
$$\sum_x \psi(x) = (x-1)\psi(x) - x + C$$
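A quick numerical check of this entry (an illustrative sketch, assuming mpmath):

```python
from mpmath import mp, digamma

mp.dps = 30

def F(x):
    # candidate antidifference of the digamma function: (x - 1)ψ(x) - x
    return (x - 1) * digamma(x) - x

x = mp.mpf('3.7')
print(F(x + 1) - F(x))   # should equal ψ(x)
print(digamma(x))
```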
$$\sum_x \Gamma(x) = (-1)^{x+1} \Gamma(x) \frac{\Gamma(1-x, -1)}{e} + C$$
where $\Gamma(s, x)$ is the incomplete gamma function.
$$\sum_x (x)_a = \frac{(x)_{a+1}}{a+1} + C$$
where $(x)_a$ is the falling factorial.
$$\sum_x \operatorname{sexp}_a(x) = \ln_a \frac{\left(\operatorname{sexp}_a(x)\right)'}{(\ln a)^x} + C$$
(see super-exponential function )
References
1. Indefinite Sum at PlanetMath.
2. Yiu-Kwong Man. On Computing Closed Forms for Indefinite Summations. J. Symbolic Computation (1993), 16, 355–376.
3. "If Y is a function whose first difference is the function y, then Y is called an indefinite sum of y and denoted Δ−1y." Samuel Goldberg, Introduction to Difference Equations.
4. "Handbook of Discrete and Combinatorial Mathematics", Kenneth H. Rosen, John G. Michaels, CRC Press, 1999, ISBN 0-8493-0149-1.
5. Bernoulli numbers of the second kind on MathWorld.
6. Markus Müller. How to Add a Non-Integer Number of Terms, and How to Produce Unusual Infinite Summations (note that he uses a slightly different definition of the fractional sum, i.e. the inverse of the backward difference, hence 1 as the lower limit in his formula).
7. Bruce C. Berndt, Ramanujan's Notebooks, Ramanujan's Theory of Divergent Series, Chapter 6, Springer-Verlag (ed.), (1939), pp. 133–149.
8. Éric Delabaere, Ramanujan's Summation, Algorithms Seminar 2001–2002, F. Chyzak (ed.), INRIA, (2003), pp. 83–88.
9. Manuel Kauers, Algorithms for Nonlinear Higher Order Difference Equations.
Further reading
"Difference Equations: An Introduction with Applications", Walter G. Kelley, Allan C. Peterson, Academic Press, 2001, ISBN 0-12-403330-X
Markus Müller. How to Add a Non-Integer Number of Terms, and How to Produce Unusual Infinite Summations
Markus Mueller, Dierk Schleicher. Fractional Sums and Euler-like Identities
S. P. Polyakov. Indefinite summation of rational functions with additional minimization of the summable part. Programmirovanie, 2008, Vol. 34, No. 2.
"Finite-Difference Equations And Simulations", Francis B. Hildebrand, Prenctice-Hall, 1968