# Campbell's theorem (probability)

In probability theory and statistics, Campbell's theorem or the Campbell-Hardy theorem can refer to a particular equation or set of results relating the expectation of a function summed over a point process to an integral involving the intensity measure of the point process, which allows for the calculation of the expected value and variance of the random sum. One version[1] of the theorem relates specifically to the Poisson point process and gives a method for calculating moments as well as Laplace functionals of the process.

Another result by the name of Campbell's theorem,[2] but also known as Campbell's formula,[3]:28 entails an integral equation for the aforementioned sum over a general point process, not necessarily a Poisson point process.[3] There also exist equations involving moment measures and factorial moment measures that are considered versions of Campbell's formula. All these results are employed in probability and statistics, with particular importance in the related fields of point processes,[4] stochastic geometry,[2] continuum percolation theory[5] and spatial statistics.[3][6]

The theorem's name stems from the work[7][8] by Norman R. Campbell on thermionic noise, also known as shot noise, in vacuum tubes,[4][9] which was partly inspired by the work of Ernest Rutherford and Hans Geiger on alpha particle detection, where the Poisson point process arose as a solution to a family of differential equations by Harry Bateman.[9] In his work, Campbell presents the moments and generating functions of the random sum of a Poisson process on the real line, but remarks that the main mathematical argument was due to G. H. Hardy, which has inspired the result to be sometimes called the Campbell-Hardy theorem.[9][10]

## Background

For a point process ${N}$ defined on (d-dimensional) Euclidean space $\textbf{R}^d$ ,[a] Campbell's theorem offers a way to calculate expectations of a function $f$ (with range in the real line R) defined also on $\textbf{R}^d$ and summed over ${N}$, namely:

$E[ \sum_{x\in {N}}f(x)]$,

where $E$ denotes the expectation and set notation is used such that ${N}$ is considered as a random set (see Point process notation). For a point process ${N}$, Campbell's theorem relates the above expectation with the intensity measure Λ. In relation to a Borel set B the intensity measure of ${N}$ is defined as:

$\Lambda(B)=E[ {N}(B) ]$,

where the measure notation is used such that ${N}$ is considered a random counting measure. The quantity Λ(B) can be interpreted as the average number of points of ${N}$ located in the set B.
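The interpretation of the intensity measure can be illustrated with a small simulation. The following is a minimal sketch, not from the source: it assumes a homogeneous Poisson process on the unit square with an illustrative rate $\lambda = 5$, so that $\Lambda(B) = \lambda |B|$ for any Borel set $B$, and checks this against the empirical average count.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0          # intensity (points per unit area); an assumed illustrative value
n_trials = 20000

# Homogeneous Poisson process on the unit square [0,1]^2: the number of
# points falling in a Borel set B is Poisson with mean lam * |B|.
# Take B = [0, 0.5] x [0, 0.5], so |B| = 0.25 and Lambda(B) = lam * 0.25 = 1.25.
counts = []
for _ in range(n_trials):
    n = rng.poisson(lam)                      # total number of points in the unit square
    pts = rng.uniform(0.0, 1.0, size=(n, 2))  # given the count, locations are uniform
    in_B = np.all(pts < 0.5, axis=1)          # which points land in B
    counts.append(in_B.sum())

print(np.mean(counts))   # empirical estimate of Lambda(B), close to 1.25
```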

## Campbell's theorem: Poisson point process

One version of Campbell's theorem[1] says that for a Poisson point process ${N}$ and a measurable function $f: \textbf{R}^d\rightarrow \textbf{R}$, the random sum

$\Sigma=\sum_{x\in {N}}f(x)$

is absolutely convergent with probability one if and only if the integral

$\int_{ \textbf{R}^d} \min(|f(x)|,1)\Lambda (dx) < \infty.$

Provided that this integral is finite, the theorem further asserts that for any complex value $\theta$ the equation

$E(e^{\theta\Sigma})=\textrm{exp} \left(\int_{\textbf{R}^d} [e^{\theta f(x)}-1]\Lambda (dx)\right),$

holds if the integral on the right-hand side converges, which is the case for purely imaginary $\theta$. Moreover

$E(\Sigma)=\int_{\textbf{R}^d} f(x)\Lambda (dx),$

and if this integral converges, then

$\text{Var}(\Sigma)=\int_{\textbf{R}^d} f(x)^2\Lambda (dx),$

where $\text{Var}(\Sigma)$ denotes the variance of the random sum $\Sigma$.

From this theorem some expectation results for the Poisson point process follow directly, including its Laplace functional.[1][b]
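The mean and variance formulas above can be checked numerically. The following sketch (with an assumed illustrative rate $\lambda = 10$ on the unit interval and test function $f(x) = e^{-x}$, neither taken from the source) simulates the random sum $\Sigma$ and compares its empirical mean and variance with $\int_0^1 f(x)\,\lambda\,dx$ and $\int_0^1 f(x)^2\,\lambda\,dx$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 10.0                 # intensity on [0, 1]; an assumed value for illustration
f = lambda x: np.exp(-x)   # an assumed test function

sums = []
for _ in range(50000):
    n = rng.poisson(lam)                 # number of points of the process on [0, 1]
    x = rng.uniform(0.0, 1.0, size=n)    # their locations
    sums.append(f(x).sum())              # the random sum Sigma
sums = np.array(sums)

mean_theory = lam * (1 - np.exp(-1.0))        # lam * int_0^1 e^{-x} dx
var_theory = lam * (1 - np.exp(-2.0)) / 2.0   # lam * int_0^1 e^{-2x} dx
print(sums.mean(), mean_theory)
print(sums.var(), var_theory)
```

The empirical mean and variance should agree with the two integrals up to Monte Carlo error.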

## Campbell's theorem: general point process

A related result for a general (not necessarily simple) point process ${N}$ with intensity measure:

$\Lambda (B)= E[{N}(B)] ,$

is known as Campbell's formula[3] or Campbell's theorem,[2][12] which gives a method for calculating expectations of sums of measurable functions $f$ with range in the real line. More specifically, for a point process ${N}$ and a measurable function $f: \textbf{R}^d\rightarrow \textbf{R}$, the expectation of the sum of $f$ over the point process is given by the equation:

$E\left[\sum_{x\in {N}}f(x)\right]=\int_{\textbf{R}^d} f(x)\Lambda (dx),$

where if one side of the equation is finite, then so is the other side.[13] This equation is essentially an application of Fubini's theorem[2] and coincides with the aforementioned Poisson case, but holds for a much wider class of point processes, simple or not.[3] Depending on the integral notation,[c] this integral may also be written as:[13]

$E\left[\sum_{x\in {N}}f(x)\right]=\int_{\textbf{R}^d} fd\Lambda .$

If the intensity measure $\Lambda$ of a point process ${N}$ has a density $\lambda(x)$, then Campbell's formula becomes:

$E\left[\sum_{x\in {N}}f(x)\right]= \int_{\textbf{R}^d} f(x)\lambda(x)dx$

### Stationary point process

For a stationary point process ${N}$ with constant density $\lambda>0$, Campbell's theorem or formula reduces to a volume integral:

$E\left[\sum_{x\in {N}}f(x)\right]=\lambda \int_{\textbf{R}^d} f(x)dx$

This equation naturally holds for the homogeneous Poisson point process, which is an example of a stationary point process.[2]
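Because Campbell's formula is not restricted to Poisson processes, it can be checked on a non-Poisson stationary process. The following sketch uses a jittered grid, one uniform point per cell of $[0,1)$ split into $m$ cells, which is a simple stationary (non-Poisson) point process on the unit torus with density $\lambda = m$; the grid size $m = 8$ and test function $f(x) = \sin(\pi x)$ are illustrative assumptions, not from the source.

```python
import numpy as np

rng = np.random.default_rng(2)
m = 8                              # one point per cell, so density lam = m on [0, 1)
f = lambda x: np.sin(np.pi * x)    # an assumed test function; int_0^1 f dx = 2/pi

sums = []
for _ in range(20000):
    # Jittered grid: one uniform point in each cell [k/m, (k+1)/m).
    x = (np.arange(m) + rng.uniform(size=m)) / m
    sums.append(f(x).sum())

lam_times_integral = m * 2.0 / np.pi   # lam * int_0^1 sin(pi x) dx
print(np.mean(sums), lam_times_integral)
```

Even though the counts here are deterministic (exactly $m$ points per realisation, so the process is far from Poisson), the empirical mean of the sum still matches $\lambda \int f(x)\,dx$, as Campbell's formula predicts.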

## Applications

### Laplace functional of the Poisson point process

For a Poisson point process ${N}$ with intensity measure $\Lambda$, the Laplace functional is a consequence of Campbell's theorem[1] and is given by:[11]

$L_{{N}}(f) := E\bigl[ e^{ -\sum_{x \in N} f(x) } \bigr] =\exp \Bigl[-\int_{\textbf{R}^d}(1-e^{ -f(x)})\Lambda(dx) \Bigr],$

which for the homogeneous case is:

$L_{{N}}(f)=\exp\Bigl[-\lambda\int_{\textbf{R}^d}(1-e^{ -f(x)})dx \Bigr].$
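The Laplace functional $E[e^{-\sum_{x\in N} f(x)}]$ of a homogeneous Poisson process can also be verified by simulation. The sketch below assumes an illustrative rate $\lambda = 3$ on the unit interval and the nonnegative test function $f(x) = x$ (both choices are assumptions made for the example), and compares the Monte Carlo estimate with $\exp[-\lambda\int_0^1(1-e^{-x})\,dx] = \exp(-\lambda e^{-1})$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 3.0          # intensity on [0, 1]; an assumed value
f = lambda x: x    # a simple nonnegative test function

vals = []
for _ in range(100000):
    n = rng.poisson(lam)
    x = rng.uniform(0.0, 1.0, size=n)
    vals.append(np.exp(-f(x).sum()))   # one sample of exp(-sum_{x in N} f(x))

# Theory: exp(-lam * int_0^1 (1 - e^{-x}) dx) = exp(-lam * e^{-1})
theory = np.exp(-lam * np.exp(-1.0))
print(np.mean(vals), theory)
```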

### Neuroscience

The total input current to a neuron is the sum of many inputs with similar time courses. When a Poisson approximation is used, the mean and variance of the current are given by Campbell's theorem. Since the synaptic time course is typically known, the theorem can be applied to infer the input rate.

An extension to higher moments was given by Rice.[14]

Another common extension is to consider a sum with independent and identically distributed random amplitudes $a_x$,

$\Sigma=\sum_{x\in {N}} a_x f(x).$

In this case the cumulants $\kappa_i$ of $\Sigma$ equal

$\kappa_i= \lambda \overline{a^i} \int f(x)^i dx,$

where $\overline{a^i}$ are the raw moments of the distribution of $a$.
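The first two cumulants (mean and variance) of this amplitude-weighted sum can be checked numerically. The sketch below is an illustration with assumed parameters, not from the source: rate $\lambda = 10$ on the unit interval, kernel $f(x) = e^{-x}$, and exponential amplitudes with mean 1 (so $\overline{a} = 1$ and $\overline{a^2} = 2$).

```python
import numpy as np

rng = np.random.default_rng(4)
lam = 10.0                 # input rate; an assumed value
f = lambda x: np.exp(-x)   # an assumed kernel on [0, 1]

sums = []
for _ in range(100000):
    n = rng.poisson(lam)
    x = rng.uniform(0.0, 1.0, size=n)
    a = rng.exponential(1.0, size=n)   # i.i.d. amplitudes: E[a] = 1, E[a^2] = 2
    sums.append((a * f(x)).sum())
sums = np.array(sums)

int_f = 1 - np.exp(-1.0)              # int_0^1 e^{-x} dx
int_f2 = (1 - np.exp(-2.0)) / 2.0     # int_0^1 e^{-2x} dx
print(sums.mean(), lam * 1.0 * int_f)   # kappa_1 = lam * E[a]   * int f
print(sums.var(), lam * 2.0 * int_f2)   # kappa_2 = lam * E[a^2] * int f^2
```

Note how the random amplitudes leave the mean unchanged (since $\overline{a} = 1$) but double the variance relative to unit amplitudes (since $\overline{a^2} = 2$).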

## Notes

1. ^ It can be defined on a more general mathematical space than Euclidean space, but often this space is used for models.[4]
2. ^ Kingman[1] calls it a "characteristic functional" but Daley and Vere-Jones[4] and others call it a "Laplace functional",[2][11] reserving the term "characteristic functional" for when $\theta$ is imaginary.
3. ^ As discussed in Chapter 1 of Stoyan, Kendall and Mecke,[2] which applies to all other integrals presented here and elsewhere due to varying integral notation.

## References

1. Kingman, John (1993). Poisson Processes. Oxford Science Publications. p. 28. ISBN 0-19-853693-3.
2. D. Stoyan, W. S. Kendall, J. Mecke, and L. Ruschendorf. Stochastic geometry and its applications, volume 2. Wiley Chichester, 1995.
3. Baddeley, A.; Barany, I.; Schneider, R.; Weil, W. (2007). "Spatial Point Processes and their Applications". Stochastic Geometry. Lecture Notes in Mathematics 1892. p. 1. doi:10.1007/978-3-540-38175-4_1. ISBN 978-3-540-38174-7.
4. Daley, D. J.; Vere-Jones, D. (2003). "An Introduction to the Theory of Point Processes". Probability and its Applications. doi:10.1007/b97277. ISBN 0-387-95541-0.
5. R. Meester and R. Roy. Continuum percolation, volume 119 of Cambridge tracts in mathematics, 1996.
6. Moller, J.; Plenge Waagepetersen, R. (2003). "Statistical Inference and Simulation for Spatial Point Processes". C&H/CRC Monographs on Statistics & Applied Probability 100. doi:10.1201/9780203496930. ISBN 978-1-58488-265-7.
7. Campbell, N. (1909). "The study of discontinuous phenomena". Proc. Cambr. Phil. Soc. 15: 117–136.
8. Campbell, N. (1910). "Discontinuities in light emission". Proc. Cambr. Phil. Soc. 15: 310–328.
9. Stirzaker, David (2000). "Advice to Hedgehogs, or, Constants Can Vary". The Mathematical Gazette 84 (500): 197–210. JSTOR 3621649.
10. Grimmett, G. and Stirzaker, D. (2001). Probability and Random Processes. Oxford University Press. p. 290.
11. Baccelli, F. O. (2009). "Stochastic Geometry and Wireless Networks: Volume I Theory". Foundations and Trends in Networking 3 (3–4): 249–449. doi:10.1561/1300000006.
12. Daley, D. J.; Vere-Jones, D. (2008). "An Introduction to the Theory of Point Processes". Probability and Its Applications. doi:10.1007/978-0-387-49835-5. ISBN 978-0-387-21337-8.
13. A. Baddeley. A crash course in stochastic geometry. In Stochastic Geometry: Likelihood and Computation, eds. O. E. Barndorff-Nielsen, W. S. Kendall, H. N. N. van Lieshout (London: Chapman and Hall), pages 1–35, 1999.
14. S. O. Rice (1944). "Mathematical analysis of random noise". Bell Syst. Tech. J. 24. Reprinted in Selected Papers on Noise and Random Processes, N. Wax (ed.), Dover, 1954.