Curve fitting
Curve fitting is the process of constructing a curve that best fits a series of data points, possibly subject to other constraints. This section is an introduction to both interpolation, where an exact fit to the constraints is expected, and regression analysis, which allows for an approximate fit by minimizing the difference between the data points and the curve. Both are sometimes used for extrapolation beyond the range of the original data.
Fitting lines and polynomial curves to data points
Let's start with a first degree polynomial equation:

y = ax + b
This is a line with slope a. We know that a line will connect any two points. So, a first degree polynomial equation is an exact fit through any two points.
If we increase the order of the equation to a second degree polynomial, we get:

y = ax^2 + bx + c
This will exactly fit three points.
If we increase the order of the equation to a third degree polynomial, we get:

y = ax^3 + bx^2 + cx + d
This will exactly fit four points.
A more general statement would be to say it will exactly fit four constraints. Each constraint can be a point, angle, or curvature (which is the reciprocal of the radius of an osculating circle). Angle and curvature constraints are most often added to the ends of a curve, and in such cases are called end conditions. Identical end conditions are frequently used to ensure a smooth transition between polynomial curves contained within a single spline. Higher-order constraints, such as "the change in the rate of curvature", could also be added. This, for example, would be useful in highway cloverleaf design to understand the forces applied to a car, as it follows the cloverleaf, and to set reasonable speed limits, accordingly.
Bearing this in mind, the first degree polynomial equation could also be an exact fit for a single point and an angle while the third degree polynomial equation could also be an exact fit for two points, an angle constraint, and a curvature constraint. Many other combinations of constraints are possible for these and for higher order polynomial equations.
If we have more than n + 1 constraints (n being the degree of the polynomial), we can still run the polynomial curve through those constraints. An exact fit to all the constraints is not certain (but might happen, for example, in the case of a first degree polynomial exactly fitting three collinear points). In general, however, some method is then needed to evaluate each approximation. The least squares method is one way to compare the deviations.
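To make the two situations concrete, here is a minimal sketch (assuming NumPy is available; the data values are made up for illustration): a degree-3 polynomial fitted exactly to four points, and a degree-1 polynomial fitted approximately, by least squares, to six points.

```python
import numpy as np

# Four points determine a cubic exactly: fitting a degree-3 polynomial
# to four (x, y) pairs leaves no residual.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 0.0, 5.0])
exact_cubic = np.polyfit(x, y, deg=3)   # coefficients, highest degree first

# With more constraints than coefficients, an exact fit is generally
# impossible; polyfit then returns the least-squares solution, which
# minimizes the sum of squared vertical deviations from the data.
x_many = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y_many = np.array([0.1, 0.9, 2.1, 2.9, 4.2, 4.8])
slope, intercept = np.polyfit(x_many, y_many, deg=1)

print(exact_cubic)
print(slope, intercept)
```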
Now, you might wonder why we would ever want to get an approximate fit when we could just increase the degree of the polynomial equation and get an exact match. There are several reasons:
- Even if an exact match exists, it does not necessarily follow that we can find it. Depending on the algorithm used, we may encounter a divergent case, where the exact fit cannot be calculated, or finding the solution might take too much computer time. Either way, we might end up having to accept an approximate solution.
- We may actually prefer the effect of averaging out questionable data points in a sample, rather than distorting the curve to fit them exactly.
- High order polynomials can be highly oscillatory. If we run a curve through two points A and B, we would expect the curve to run somewhat near the midpoint of A and B as well. This may not happen with high-order polynomial curves; they may even take on values of very large positive or negative magnitude. With low-order polynomials, the curve is more likely to fall near the midpoint (it is even guaranteed to pass exactly through the midpoint with a first degree polynomial). A numerical sketch of this behavior follows this list.
- Low-order polynomials tend to be smooth, while high order polynomial curves tend to be "lumpy". To define this more precisely, the maximum number of ogee/inflection points possible in a polynomial curve is n - 2, where n is the order of the polynomial equation. An inflection point is a location on the curve where the curvature changes sign, switching from a positive to a negative radius of curvature; we can also say this is where the curve transitions from "holding water" to "shedding water". Note that it is only possible, not guaranteed, that high order polynomials will be lumpy; they could also be smooth, but there is no guarantee of this, unlike with low order polynomial curves. A fifteenth degree polynomial could have, at most, thirteen inflection points, but could also have twelve, eleven, or any number down to zero.
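The oscillation described above can be demonstrated numerically. The sketch below (assuming NumPy is available; the sampled function, 1/(1 + 25x^2), is a classic example and the node count is an arbitrary choice) compares a degree-10 polynomial that passes through all eleven sample points with a degree-2 least-squares fit of the same data; the high-degree curve can swing far from the samples between the points it interpolates.

```python
import numpy as np

# Eleven equally spaced samples of a smooth, bounded function.
x = np.linspace(-1.0, 1.0, 11)
y = 1.0 / (1.0 + 25.0 * x**2)

# Degree-10 polynomial: passes through all eleven points exactly...
p10 = np.polyfit(x, y, deg=10)
# ...but between the sample points it can oscillate well beyond the data range.
x_fine = np.linspace(-1.0, 1.0, 201)
print("max value of degree-10 fit:", np.polyval(p10, x_fine).max())

# A low-degree fit cannot match every point, but it stays close to the data.
p2 = np.polyfit(x, y, deg=2)
print("max value of degree-2 fit:", np.polyval(p2, x_fine).max())
```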
Now that we have talked about using a degree too low for an exact fit, let's also discuss what happens if the degree of the polynomial curve is higher than needed for an exact fit. This is bad for all the reasons listed previously for high order polynomials, but also leads to a case where there are an infinite number of solutions. For example, a first degree polynomial (a line) constrained by only a single point, instead of the usual two, would give us an infinite number of solutions. This brings up the problem of how to compare and choose just one solution, which can be a problem for software and for humans, as well. For this reason, it is usually best to choose as low a degree as possible for an exact match on all constraints, and perhaps an even lower degree, if an approximate fit is acceptable.
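The underdetermined case can also be made concrete. In the sketch below (assuming NumPy; the single point (2, 3) is hypothetical), a line y = ax + b is constrained by only one point, giving one equation in two unknowns; a least-squares solver can only return one representative of the infinite solution set, here the minimum-norm one.

```python
import numpy as np

# One constraint: the line must pass through the single point (2, 3).
# In matrix form: [x 1] @ [a, b] = y, i.e. one equation, two unknowns.
design = np.array([[2.0, 1.0]])
target = np.array([3.0])

# lstsq picks the minimum-norm solution out of the infinite solution set;
# any other choice (e.g. a horizontal line through the point) fits equally well.
(a, b), *_ = np.linalg.lstsq(design, target, rcond=None)
print("one of infinitely many fitting lines: y = %.3f x + %.3f" % (a, b))
```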
For more details, see the polynomial interpolation article.
Fitting other curves to data points
Other types of curves, such as conic sections (circular, elliptical, parabolic, and hyperbolic arcs) or trigonometric functions (such as sine and cosine), may also be used in certain cases. For example, trajectories of objects under the influence of gravity follow a parabolic path when air resistance is ignored, so matching trajectory data points to a parabolic curve makes sense. Tides follow sinusoidal patterns, so tidal data points should be matched to a sine wave, or to the sum of two sine waves of different periods if the effects of the Moon and the Sun are both considered.
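A minimal sketch of such a sinusoidal fit, assuming SciPy is available and using made-up hourly water-level readings rather than real tidal data, is shown below; the tide_model function and its parameter names are illustrative choices, not a standard API.

```python
import numpy as np
from scipy.optimize import curve_fit

# Model: a single sine wave with unknown amplitude, period, phase and offset.
def tide_model(t, amplitude, period, phase, offset):
    return amplitude * np.sin(2.0 * np.pi * t / period + phase) + offset

# Hypothetical hourly water-level readings (metres); real tidal data would
# come from a gauge.
t = np.arange(0.0, 48.0, 1.0)
rng = np.random.default_rng(0)
levels = 1.2 * np.sin(2.0 * np.pi * t / 12.4 + 0.3) + 2.0 + rng.normal(0.0, 0.05, t.size)

# Nonlinear least squares; the initial guess p0 matters for convergence.
params, covariance = curve_fit(tide_model, t, levels, p0=[1.0, 12.0, 0.0, 2.0])
print("fitted amplitude, period, phase, offset:", params)
```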
Application to surfaces
Note that while this discussion was in terms of 2D curves, much of this logic also extends to 3D surfaces, each patch of which is defined by a net of curves in two parametric directions, typically called u and v. A surface may be composed of one or more surface patches in each direction.
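As a rough sketch of the same idea in one more dimension (assuming NumPy; the scattered sample points are made up), the snippet below fits an explicit surface z = a + bx + cy + dxy to scattered data by linear least squares. This is not a parametric patch in u and v, but it shows how the curve-fitting machinery carries over to surfaces.

```python
import numpy as np

# Scattered sample points on a surface (hypothetical values).
x = np.array([0.0, 1.0, 2.0, 0.5, 1.5, 2.5, 1.0, 2.0])
y = np.array([0.0, 0.5, 1.0, 2.0, 1.5, 0.5, 2.5, 2.0])
z = np.array([1.0, 1.8, 3.1, 2.9, 3.2, 2.7, 4.1, 4.6])

# Fit z = a + b*x + c*y + d*x*y (a bilinear surface) by linear least squares.
design = np.column_stack([np.ones_like(x), x, y, x * y])
coeffs, residuals, rank, _ = np.linalg.lstsq(design, z, rcond=None)
print("a, b, c, d:", coeffs)
```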
For more details, see the computer representation of surfaces article.
External links
Implementations
- NLINLS Curve fitting (nonlinear least squares) using a Differential Evolution optimizer.
- ALGLIB Linear least squares in C#, C++, Visual Basic, Pascal. BSD license.
- GNU Scientific Library Linear/non-linear least squares fitting in C. GPL license.
- levmar Non-linear least squares in C/C++ (possibly constrained) with interfaces for Matlab, Perl and Python. GPL license.
- Python Equations Linear/non-linear least squares curve and surface fitting in Python. BSD license.
- T-SQL implementation
Software
- Fityk - curve-fitting software under the GPL license.
- Matlab SUrrogate MOdeling Toolbox - SUMO Toolbox - Matlab code for Surrogate Model Regression
Online calculators, applications and demos
- SoftIntegration.com "Fit a set of data points to a linear combination of specified functions" (sic, see site header)
- Zunzun.com Online curve and surface fitting application
- Interactive curve fitting using Least Squares with Weights on savetman.com
- xuru.org Online curve fitting and regression tools
- Curve Fitting by Theodore Gray, The Wolfram Demonstrations Project.
Online textbooks
- online curve-fitting textbook from GraphPad Software
Commercial/Shareware
- LAB Fit Curve Fitting Software 2D and 3D fitting with Finder
- TableCurve2D and TableCurve3D by Systat automate curve fitting.
- Curve Expert (shareware) fits functions to data (limited to one dependent and one independent variable).
- GOSA software solves global optimization problems, from linear regression with one variable to nonlinear fitting with several independent variables.
- NLREG performs linear and nonlinear regression analysis, surface and curve fitting.
- Origin and OriginPro linear and nonlinear regression analysis, fit comparison, curve and surface fitting, peak analysis wizard including advanced peak fitting.
- IDBS XLfit curve fitting in Microsoft Excel