# Backtracking line search


In (unconstrained) optimization, backtracking is a line search strategy used to compute how far one should move along a given search direction.

## Motivation

Usually it is undesirable to minimize the function $\displaystyle \phi(\alpha)$ exactly in the generic line search algorithm. One way to minimize $\displaystyle \phi$ inexactly is to find an $\displaystyle \alpha_k$ that gives a sufficient decrease in the objective function $f:\mathbb R^n\to\mathbb R$ (assumed smooth), in the sense that the sufficient-decrease condition, also known as the Armijo or first Wolfe condition, holds. This condition, when used appropriately as part of a backtracking line search, is enough to generate an acceptable step length. (It is not sufficient on its own to ensure that a reasonable value is generated, since every sufficiently small $\displaystyle \alpha$ satisfies it. To avoid the selection of steps that are too short, line search methods usually impose the additional curvature condition, the second Wolfe condition.)
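In the generic line search setting, $\phi(\alpha) = f(x_k + \alpha p_k)$, where $x_k$ is the current iterate and $p_k$ a descent direction. With this notation the two conditions mentioned above can be written as:

```latex
% Sufficient decrease (Armijo, first Wolfe) condition:
f(x_k + \alpha p_k) \le f(x_k) + c_1\,\alpha\,\nabla f(x_k)^{\mathsf T} p_k

% Curvature (second Wolfe) condition:
\nabla f(x_k + \alpha p_k)^{\mathsf T} p_k \ge c_2\,\nabla f(x_k)^{\mathsf T} p_k

% with constants 0 < c_1 < c_2 < 1; a common choice is c_1 = 10^{-4}.
```

Backtracking enforces only the first condition; it avoids overly short steps by starting from a relatively large trial step and shrinking it only as much as necessary.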

## Algorithm

i) Set iteration counter $\scriptstyle j\,=\,0$. Make an initial guess $\scriptstyle \alpha_0\,>\,0$ and choose some $\scriptstyle \tau\,\in\,(0,1).\,$
ii) Until $\scriptstyle \alpha_j\,$ satisfies the Wolfe condition, set
$\alpha_{j+1}=\tau\alpha_j,\,$
$j=j+1.\,$
iii) Return $\scriptstyle \alpha=\alpha_j.\,$

In other words, reduce the step length geometrically from $\scriptstyle \alpha_0$, by the factor $\scriptstyle\tau\,$, until the Wolfe condition holds.
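The steps above can be sketched in Python. The parameter names and defaults here (e.g. `c = 1e-4`, `tau = 0.5`) are illustrative choices, not fixed by the algorithm; the loop checks only the sufficient-decrease part of the Wolfe conditions, which is what backtracking requires.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, tau=0.5, c=1e-4):
    """Shrink the trial step geometrically (alpha_{j+1} = tau * alpha_j)
    until the sufficient-decrease condition
        f(x + alpha*p) <= f(x) + c * alpha * grad_f(x)^T p
    holds. p is assumed to be a descent direction at x."""
    fx = f(x)
    slope = np.dot(grad_f(x), p)  # directional derivative phi'(0); negative for a descent direction
    alpha = alpha0
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= tau  # backtrack
    return alpha

# Example: f(x) = x^T x from x = (1, 1) along the steepest-descent direction.
f = lambda x: np.dot(x, x)
grad = lambda x: 2.0 * x
x = np.array([1.0, 1.0])
p = -grad(x)
alpha = backtracking_line_search(f, grad, x, p)
```

For smooth $f$ and a descent direction, the loop terminates because every sufficiently small $\alpha$ satisfies the condition; the returned step is the first member of the geometric sequence $\alpha_0, \tau\alpha_0, \tau^2\alpha_0, \dots$ that does.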
