# Proximal operator

In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function $f$ from a Hilbert space ${\mathcal {X}}$ to $(-\infty ,+\infty ]$, and is defined by:

$\operatorname {prox} _{f}(v)=\arg \min _{x\in {\mathcal {X}}}\left(f(x)+{\frac {1}{2}}\|x-v\|_{2}^{2}\right).$

For any function in this class, the objective on the right-hand side is strongly convex, so its minimizer exists and is unique, making the proximal operator well-defined. The ${\text{prox}}$ of a function enjoys several useful properties for optimization, enumerated below. All of these properties require $f$ to be proper (i.e. not identically $+\infty$ and never taking the value $-\infty$), convex, and lower semi-continuous.
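As a concrete sketch of this definition, consider $f(x)=\lambda |x|$, whose proximal operator has the well-known closed form of soft-thresholding, $\operatorname{prox}_{f}(v)=\operatorname{sign}(v)\max(|v|-\lambda ,0)$. The helper names below (`prox_l1`, `prox_numeric`) are illustrative, and the brute-force grid check assumes NumPy:

```python
import numpy as np

def prox_l1(v, lam=1.0):
    # Closed-form prox of f(x) = lam*|x| (soft-thresholding):
    # argmin_x lam*|x| + 0.5*(x - v)**2.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_numeric(f, v, grid):
    # Brute-force prox on a fine grid, used only to check the closed form.
    objective = f(grid) + 0.5 * (grid - v) ** 2
    return grid[np.argmin(objective)]

grid = np.linspace(-5.0, 5.0, 200001)  # spacing 5e-5
for v in [-3.0, -0.5, 0.0, 0.7, 2.5]:
    closed = prox_l1(v, lam=1.0)
    brute = prox_numeric(np.abs, v, grid)
    assert abs(closed - brute) < 1e-3  # agree up to grid resolution
```

Note that the strictly convex quadratic term $\tfrac{1}{2}\|x-v\|_{2}^{2}$ is what forces the minimizer to be unique even though $|x|$ alone is not strictly convex.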

The proximal operator is firmly non-expansive, meaning that $(\forall (x,y)\in {\mathcal {X}}^{2})\quad \|{\text{prox}}_{f}x-{\text{prox}}_{f}y\|^{2}\leq \langle x-y\ |\ {\text{prox}}_{f}x-{\text{prox}}_{f}y\rangle$ . Fixed points of ${\text{prox}}_{f}$ are exactly the minimizers of $f$ : $\{x\in {\mathcal {X}}\ |\ {\text{prox}}_{f}x=x\}=\arg \min f$ .
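Both properties can be checked numerically for the soft-thresholding operator, which is the prox of $f(x)=\lambda \|x\|_{1}$; the function name `prox_l1` below is illustrative, and NumPy is assumed:

```python
import numpy as np

def prox_l1(v, lam=1.0):
    # Componentwise soft-thresholding: prox of f(x) = lam*||x||_1.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    px, py = prox_l1(x), prox_l1(y)
    # Firm non-expansiveness: ||prox x - prox y||^2 <= <x - y | prox x - prox y>.
    assert np.dot(px - py, px - py) <= np.dot(x - y, px - py) + 1e-12

# Fixed point: the minimizer x = 0 of f(x) = lam*||x||_1 satisfies prox_f(x) = x.
assert np.allclose(prox_l1(np.zeros(5)), np.zeros(5))
```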

The proximal point iteration converges globally to a minimizer: if $\arg \min f\neq \varnothing$ , then for any initial point $x_{0}\in {\mathcal {X}}$ , the recursion $(\forall n\in \mathbb {N} )\quad x_{n+1}={\text{prox}}_{f}x_{n}$ yields convergence $x_{n}\to x\in \arg \min f$ as $n\to +\infty$ . This convergence may only be weak if ${\mathcal {X}}$ is infinite-dimensional.
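The recursion above can be sketched for $f(x)=\lambda |x-a|$, whose prox is soft-thresholding shifted to the minimizer $a$; the iteration then moves toward $a$ by at most $\lambda$ per step and stops there. The function name `prox_abs_shift` and the chosen constants are illustrative, and NumPy is assumed:

```python
import numpy as np

def prox_abs_shift(v, a=3.0, lam=1.0):
    # prox of f(x) = lam*|x - a|: shift to the origin, soft-threshold, shift back.
    u = v - a
    return a + np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

x = 10.0  # arbitrary initial point x_0
for n in range(20):
    x = prox_abs_shift(x)  # x_{n+1} = prox_f(x_n)

print(x)  # -> 3.0, the unique minimizer of |x - 3|
```

In finite dimensions (as here) the convergence is in norm; the caveat about weak convergence only matters in infinite-dimensional Hilbert spaces.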

The proximal operator is frequently used in optimization algorithms for non-differentiable problems, such as total variation denoising.

If $f$ is the 0-$\infty$ indicator function of a nonempty, closed, convex set, then $f$ is proper, convex, and lower semi-continuous, and ${\text{prox}}_{f}$ is the orthogonal projector onto that set.
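For a box $[l,u]^{n}$, this orthogonal projection is simply componentwise clipping: the quadratic term in the prox definition is minimized by moving each coordinate of $v$ to the nearest point of the interval. A minimal sketch, assuming NumPy, with the illustrative name `prox_box_indicator`:

```python
import numpy as np

def prox_box_indicator(v, lo=-1.0, hi=1.0):
    # prox of the 0-inf indicator of the box [lo, hi]^n:
    # the Euclidean projection onto the box, i.e. componentwise clipping.
    return np.clip(v, lo, hi)

v = np.array([-2.5, 0.3, 1.7])
print(prox_box_indicator(v))  # each component clipped into [-1, 1]
```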