# Gauss–Newton algorithm

In mathematics, the Gauss–Newton algorithm is used to solve nonlinear least squares problems. Due to Carl Friedrich Gauss, it is a modification of Newton's method that avoids the computation of second derivatives.

## The problem

Given m functions f1, ..., fm of n parameters p1, ..., pn, with m ≥ n, we want to minimize the sum

$S(p)=\sum _{i=1}^{m}(f_{i}(p))^{2}.$

Here, p stands for the vector (p1, ..., pn).
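As a concrete illustration (the straight-line model and the data points here are hypothetical, not from the text), the residuals of a line fit give one such sum of squares:

```python
import numpy as np

# Hypothetical example: fitting y ≈ a*x + b to three data points.
# The parameters are p = (a, b), and each residual is
# f_i(p) = a*x_i + b - y_i, so S(p) is their sum of squares.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0])

def S(p):
    residuals = p[0] * x + p[1] - y   # f_i(p), i = 1..m (here m = 3, n = 2)
    return float(np.sum(residuals ** 2))
```

The data were chosen to lie exactly on the line y = 2x + 1, so S vanishes at p = (2, 1) and is positive elsewhere.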

## The algorithm

The Gauss–Newton algorithm is an iterative procedure: the user has to provide an initial guess for the parameter vector p, which we will call p0.

Subsequent guesses pk for the parameter vector are then produced by the recurrence relation

$p^{k+1}=p^{k}-{\Big (}J_{f}(p^{k})^{\top }J_{f}(p^{k}){\Big )}^{-1}J_{f}(p^{k})^{\top }f(p^{k}),$

where f = (f1, ..., fm) and Jf(p) denotes the Jacobian of f at p (note that Jf is not square).

The matrix inverse is never computed explicitly in practice. Instead, we use

$p^{k+1}=p^{k}+\delta ^{k},\,$

and we compute the update δk by solving the linear system

$J_{f}(p^{k})^{\top }J_{f}(p^{k})\,\delta ^{k}=-J_{f}(p^{k})^{\top }f(p^{k}).$

A good implementation of the Gauss–Newton algorithm also employs a line search: instead of the above formula for pk+1, we use

$p^{k+1}=p^{k}+\alpha ^{k}\,\delta ^{k},$

where the step length αk is chosen to (at least approximately) minimize S along the direction δk.
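Putting the pieces together, here is a minimal Python sketch of the iteration with a simple backtracking line search. The exponential model, the made-up data, the halving rule for αk, and the fixed iteration count are illustrative assumptions, not part of the text:

```python
import numpy as np

def gauss_newton(f, jac, p0, n_iter=50):
    """Minimize S(p) = sum(f(p)**2) by Gauss-Newton with backtracking.

    f(p) returns the residual vector (length m); jac(p) its m-by-n Jacobian.
    """
    S = lambda p: float(np.sum(f(p) ** 2))
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        J, r = jac(p), f(p)
        # Solve J^T J delta = -J^T r without forming the inverse explicitly:
        # lstsq returns the least-squares solution of J delta = -r, which
        # satisfies exactly these normal equations.
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        # Simple backtracking line search: halve alpha until S decreases.
        alpha = 1.0
        while alpha > 1e-8 and S(p + alpha * delta) >= S(p):
            alpha *= 0.5
        p = p + alpha * delta
    return p

# Illustrative fit of y = a*exp(b*x) to made-up data with true a=2, b=1.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 2.0 * np.exp(1.0 * x)
f = lambda p: p[0] * np.exp(p[1] * x) - y
jac = lambda p: np.column_stack([np.exp(p[1] * x),
                                 p[0] * x * np.exp(p[1] * x)])
p = gauss_newton(f, jac, [1.0, 0.5])
```

Since the data are generated exactly by the model, the residuals vanish at the minimum and the iteration recovers a ≈ 2, b ≈ 1 from the rough starting guess.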

## Other algorithms

The recurrence relation for Newton's method for minimizing a function S is

$p^{k+1}=p^{k}-[H(S)(p^{k})]^{-1}J_{S}(p^{k}),\,$

where $J_{S}$ and $H(S)$ denote the gradient (S is scalar, so its Jacobian is the transpose of its gradient) and the Hessian of S, respectively. Using Newton's method for our function

$S(p)=\sum _{i=1}^{m}(f_{i}(p))^{2}$

yields the recurrence relation

$p^{k+1}=p^{k}-\left(J_{f}(p^{k})^{\top }J_{f}(p^{k})+\sum _{i=1}^{m}f_{i}(p^{k})\,H(f_{i})(p^{k})\right)^{-1}J_{f}(p^{k})^{\top }f(p^{k}).$

We can conclude that the Gauss–Newton method is the same as Newton's method with the $\sum f_{i}\,H(f_{i})$ term ignored.
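This comparison can be checked by differentiating S directly. Its gradient and Hessian in terms of f are

$\nabla S(p)=2\,J_{f}(p)^{\top }f(p),\qquad H(S)(p)=2\,J_{f}(p)^{\top }J_{f}(p)+2\sum _{i=1}^{m}f_{i}(p)\,H(f_{i})(p).$

Substituting these into Newton's update $p^{k+1}=p^{k}-[H(S)(p^{k})]^{-1}\nabla S(p^{k})$, the factors of 2 cancel; dropping the $\sum f_{i}\,H(f_{i})$ term, which is small when the residuals $f_{i}$ are small or the $f_{i}$ are nearly linear, leaves exactly the Gauss–Newton step.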

Other algorithms for solving nonlinear least squares problems include the Levenberg–Marquardt algorithm and gradient descent.
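For comparison, Levenberg–Marquardt damps the Gauss–Newton linear system with a multiple of the identity. A minimal sketch of one damped step (the function name and the plain-identity damping are illustrative choices; practical implementations also adapt the damping parameter between iterations):

```python
import numpy as np

def lm_step(J, r, lam):
    """One Levenberg-Marquardt-style step: solve
    (J^T J + lam * I) delta = -J^T r.
    With lam = 0 this reduces to the Gauss-Newton system; as lam grows,
    delta turns toward the (scaled) gradient-descent direction -J^T r."""
    n = J.shape[1]
    A = J.T @ J + lam * np.eye(n)
    return np.linalg.solve(A, -J.T @ r)
```

The damping makes the system matrix positive definite even when $J^{\top}J$ is singular or ill-conditioned, which is the practical motivation for preferring it over the plain Gauss–Newton step far from the solution.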