A radial basis function (RBF) is a real-valued function ${\textstyle \phi }$ whose value depends only on the distance from the origin, so that ${\textstyle \phi \left(\mathbf {x} \right)=\phi \left(\left\|\mathbf {x} \right\|\right)}$; or alternatively on the distance from some other point ${\textstyle \mathbf {c} }$, called a center, so that ${\textstyle \phi \left(\mathbf {x} ,\mathbf {c} \right)=\phi \left(\left\|\mathbf {x} -\mathbf {c} \right\|\right)}$. Any function ${\textstyle \phi }$ that satisfies the property ${\textstyle \phi \left(\mathbf {x} \right)=\phi \left(\left\|\mathbf {x} \right\|\right)}$ is a radial function. The norm is usually Euclidean distance, although other distance functions are also possible.

Sums of radial basis functions are typically used to approximate given functions. This approximation process can also be interpreted as a simple kind of neural network; this was the context in which they originally surfaced, in work by David Broomhead and David Lowe in 1988,[1][2] which stemmed from Michael J. D. Powell's seminal research from 1977.[3][4][5] RBFs are also used as a kernel in support vector classification.[6] Mature implementations of the technique are available,[7] and radial basis functions have been exploited in a variety of engineering applications.[8]

## Types

Commonly used types of radial basis functions include (writing ${\textstyle r=\left\|\mathbf {x} -\mathbf {x} _{i}\right\|}$ and using ${\textstyle \varepsilon }$ to indicate the inverse of a critical radius):

• Gaussian:
${\displaystyle \phi \left(r\right)=e^{-\left(\varepsilon r\right)^{2}}}$
• Multiquadric:
${\displaystyle \phi \left(r\right)={\sqrt {1+\left(\varepsilon r\right)^{2}}}}$
• Inverse quadratic:
${\displaystyle \phi \left(r\right)={\dfrac {1}{1+\left(\varepsilon r\right)^{2}}}}$
• Inverse multiquadric:
${\displaystyle \phi \left(r\right)={\dfrac {1}{\sqrt {1+\left(\varepsilon r\right)^{2}}}}}$
• Polyharmonic spline:
${\displaystyle {\begin{aligned}\phi \left(r\right)&=r^{k},&k&=1,3,5,\dotsc \\\phi \left(r\right)&=r^{k}\ln \left(r\right),&k&=2,4,6,\dotsc \end{aligned}}}$
• Thin plate spline (a special polyharmonic spline):
${\displaystyle \phi \left(r\right)=r^{2}\ln \left(r\right)}$
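The basis functions listed above translate directly into code. The following sketch (function names are illustrative, not from any particular library) evaluates each type for a radius ${\textstyle r\geq 0}$ and shape parameter ${\textstyle \varepsilon }$:

```python
import numpy as np

def gaussian(r, eps):
    return np.exp(-(eps * r) ** 2)

def multiquadric(r, eps):
    return np.sqrt(1 + (eps * r) ** 2)

def inverse_quadratic(r, eps):
    return 1.0 / (1 + (eps * r) ** 2)

def inverse_multiquadric(r, eps):
    return 1.0 / np.sqrt(1 + (eps * r) ** 2)

def polyharmonic(r, k):
    # r^k for odd k; r^k * ln(r) for even k, using the limit value 0 at r = 0
    if k % 2 == 1:
        return r ** k
    return np.where(r > 0, r ** k * np.log(np.maximum(r, 1e-300)), 0.0)
```

Note that all four ε-parameterized forms equal 1 at ${\textstyle r=0}$, while the even-degree polyharmonic splines need the limit value 0 there, since ${\textstyle \ln \left(r\right)}$ is undefined at the origin.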

## Approximation

Radial basis functions are typically used to build up function approximations of the form

${\displaystyle y\left(\mathbf {x} \right)=\sum _{i=1}^{N}w_{i}\,\phi \left(\left\|\mathbf {x} -\mathbf {x} _{i}\right\|\right),}$

where the approximating function ${\textstyle y\left(\mathbf {x} \right)}$ is represented as a sum of ${\displaystyle N}$ radial basis functions, each associated with a different center ${\textstyle \mathbf {x} _{i}}$, and weighted by an appropriate coefficient ${\textstyle w_{i}}$. The weights ${\textstyle w_{i}}$ can be estimated using the matrix methods of linear least squares, because the approximating function is linear in the weights ${\textstyle w_{i}}$.
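Because the model is linear in the weights, the fit reduces to a single least-squares solve on the design matrix ${\textstyle \Phi _{ji}=\phi \left(\left\|\mathbf {x} _{j}-\mathbf {x} _{i}\right\|\right)}$. A minimal sketch, assuming Gaussian basis functions and one-dimensional inputs for brevity (names are illustrative):

```python
import numpy as np

def fit_rbf_weights(x, y, centers, eps=1.0):
    # Design matrix: Phi[j, i] = phi(|x_j - centers_i|)
    r = np.abs(x[:, None] - centers[None, :])
    Phi = np.exp(-(eps * r) ** 2)
    # Solve min_w ||Phi w - y||_2 by linear least squares
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def rbf_eval(x, centers, w, eps=1.0):
    r = np.abs(x[:, None] - centers[None, :])
    return np.exp(-(eps * r) ** 2) @ w
```

When the centers coincide with the data points, the Gaussian design matrix is symmetric positive definite for distinct points, so the least-squares solution interpolates the data exactly.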

Approximation schemes of this kind have been particularly used[citation needed] in time series prediction, in control of nonlinear systems exhibiting sufficiently simple chaotic behaviour, and in 3D reconstruction in computer graphics (for example, hierarchical RBF and Pose Space Deformation).

## RBF Network

Figure: Two unnormalized Gaussian radial basis functions in one input dimension, with basis function centers located at ${\textstyle x_{1}=0.75}$ and ${\textstyle x_{2}=3.25}$.

The sum

${\displaystyle y\left(\mathbf {x} \right)=\sum _{i=1}^{N}w_{i}\,\phi \left(\left\|\mathbf {x} -\mathbf {x} _{i}\right\|\right),}$
can also be interpreted as a rather simple single-layer type of artificial neural network called a radial basis function network, with the radial basis functions taking on the role of the activation functions of the network. It can be shown that any continuous function on a compact interval can in principle be interpolated with arbitrary accuracy by a sum of this form, if a sufficiently large number ${\textstyle N}$ of radial basis functions is used.

The approximant ${\textstyle y\left(\mathbf {x} \right)}$ is differentiable with respect to the weights ${\textstyle w_{i}}$. The weights could thus be learned using any of the standard iterative methods for neural networks.
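As a sketch of such iterative learning, the weights can be fitted by gradient descent on the mean squared error (an illustrative setup; since the model is linear in the weights, the loss is convex and a modest step size converges):

```python
import numpy as np

def train_weights(x, y, centers, eps=1.0, lr=0.1, steps=500):
    # Gaussian design matrix: Phi[j, i] = phi(|x_j - centers_i|)
    r = np.abs(x[:, None] - centers[None, :])
    Phi = np.exp(-(eps * r) ** 2)
    w = np.zeros(len(centers))
    for _ in range(steps):
        resid = Phi @ w - y
        # Gradient of (1/2) * mean squared error with respect to w
        grad = Phi.T @ resid / len(x)
        w -= lr * grad
    return w
```

For a linear-in-weights model like this, gradient descent and linear least squares converge to the same solution; iterative training becomes essential only when the centers or shape parameters are also learned.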

Using radial basis functions in this manner yields a reasonable interpolation approach provided that the fitting set has been chosen such that it covers the entire range systematically (equidistant data points are ideal). However, without a polynomial term that is orthogonal to the radial basis functions, estimates outside the fitting set tend to perform poorly.
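The polynomial augmentation mentioned above is typically implemented by appending a low-degree polynomial block to the interpolation system together with moment conditions on the weights (the orthogonality constraints). A one-dimensional sketch with the cubic polyharmonic basis ${\textstyle \phi \left(r\right)=r^{3}}$ and a linear polynomial (illustrative names):

```python
import numpy as np

def fit_rbf_poly(x, y):
    # Interpolate with phi(r) = r^3 plus a polynomial a + b*x,
    # under the moment conditions sum_i w_i = 0 and sum_i w_i x_i = 0.
    n = len(x)
    r = np.abs(x[:, None] - x[None, :])
    Phi = r ** 3
    P = np.column_stack([np.ones(n), x])          # polynomial block
    A = np.block([[Phi, P], [P.T, np.zeros((2, 2))]])
    rhs = np.concatenate([y, np.zeros(2)])
    sol = np.linalg.solve(A, rhs)
    return sol[:n], sol[n:]                       # RBF weights, poly coeffs

def eval_rbf_poly(xq, x, w, c):
    r = np.abs(xq[:, None] - x[None, :])
    return (r ** 3) @ w + c[0] + c[1] * xq
```

With the polynomial term present, data that is itself linear is reproduced exactly by the polynomial part (the RBF weights vanish), so linear trends extrapolate correctly outside the fitting set instead of decaying or blowing up.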

## References

1. ^ Radial Basis Function networks Archived 2014-04-23 at the Wayback Machine.
2. ^ Broomhead, David H.; Lowe, David (1988). "Multivariable Functional Interpolation and Adaptive Networks" (PDF). Complex Systems. 2: 321–355. Archived from the original (PDF) on 2014-07-14.
3. ^ Michael J. D. Powell (1977). "Restart procedures for the conjugate gradient method" (PDF). Mathematical Programming. Springer. 12 (1): 241–254. doi:10.1007/bf01593790.
4. ^ Sahin, Ferat (1997). A Radial Basis Function Approach to a Color Image Classification Problem in a Real Time Industrial Application (PDF) (M.Sc.). Virginia Tech. p. 26. Radial basis functions were first introduced by Powell to solve the real multivariate interpolation problem.
5. ^ Broomhead & Lowe 1988, p. 347: "We would like to thank Professor M.J.D. Powell at the Department of Applied Mathematics and Theoretical Physics at Cambridge University for providing the initial stimulus for this work."
6. ^ VanderPlas, Jake (6 May 2015). "Introduction to Support Vector Machines". O'Reilly. Retrieved 14 May 2015.
7. ^ Buhmann, Martin Dietrich (2003). Radial basis functions : theory and implementations. Cambridge University Press. ISBN 0511040202. OCLC 56352083.
8. ^ Biancolini, Marco Evangelos (2018). Fast radial basis functions for engineering applications. Springer International Publishing. ISBN 9783319750118. OCLC 1030746230.