Khintchine inequality

In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Roman alphabet, is a theorem from probability that is also frequently used in analysis. Heuristically, it says that if we pick $N$ complex numbers $x_1,\dots,x_N \in\mathbb{C}$ and add them together, each multiplied by a random sign $\pm 1$, then the modulus of the resulting sum is, on average, not too far from $\sqrt{|x_1|^{2}+\cdots + |x_N|^{2}}$.

Statement of theorem

Let $\{\epsilon_{n}\}_{n=1}^{N}$ be i.i.d. random variables with $P(\epsilon_n=\pm1)=\frac12$ for every $n=1,\ldots,N$, i.e., a sequence with the Rademacher distribution. Let $0<p<\infty$ and let $x_1,\ldots,x_N\in \mathbb{C}$. Then

$A_p \left( \sum_{n=1}^{N}|x_{n}|^{2} \right)^{\frac{1}{2}} \leq \left(\mathbb{E}\Big|\sum_{n=1}^{N}\epsilon_{n}x_{n}\Big|^{p} \right)^{1/p} \leq B_p \left(\sum_{n=1}^{N}|x_{n}|^{2}\right)^{\frac{1}{2}}$

for some constants $A_p,B_p>0$ depending only on $p$ (see Expected value for notation). The sharp values of the constants $A_p,B_p$ were found by Haagerup (Ref. 2; see Ref. 3 for a simpler proof).
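For small $N$ the moment on the left-hand side can be computed exactly by enumerating all $2^N$ sign patterns, which gives a quick sanity check of the statement. The sketch below does this for a real vector; the helper name `khintchine_moment` is made up for illustration. Note that for $p=2$ the moment equals the $\ell^2$ norm exactly, since the Rademacher signs are orthogonal, so the ratio of the two sides is $1$; for other $p$ the ratio stays between $A_p$ and $B_p$.

```python
import itertools
import math

def khintchine_moment(x, p):
    """Return (E|sum_n eps_n x_n|^p)^(1/p), averaging exactly over
    all 2^N choices of uniform independent signs eps_n = +-1."""
    N = len(x)
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=N):
        s = sum(e * xn for e, xn in zip(signs, x))
        total += abs(s) ** p
    return (total / 2 ** N) ** (1.0 / p)

# Example coefficients (real, for simplicity); their l2 norm is 13.
x = [3.0, 4.0, 12.0]
l2 = math.sqrt(sum(xn ** 2 for xn in x))

# The ratio moment / l2-norm lies in [A_p, B_p]; it equals 1 at p = 2.
for p in (1, 2, 4):
    m = khintchine_moment(x, p)
    print(p, round(m, 4), round(m / l2, 4))
```

Running this shows the ratio dipping below $1$ for $p=1$ and rising above $1$ for $p=4$, while remaining of the same order as the $\ell^2$ norm, as the inequality predicts.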

Uses in analysis

The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let $T$ be a linear operator between two Lp spaces $L^p(X,\mu)$ and $L^p(Y,\nu)$, $1\leq p<\infty$, with bounded norm $\|T\|<\infty$, then one can use Khintchine's inequality to show that

$\left\|\left(\sum_{n=1}^{N}|Tf_n|^{2} \right)^{\frac{1}{2}}\right\|_{L^p(Y,\nu)}\leq C_p\left\|\left(\sum_{n=1}^{N}|f_{n}|^{2}\right)^{\frac{1}{2}}\right\|_{L^p(X,\mu)}$

for some constant $C_p>0$ depending only on $p$ and $\|T\|$.
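A sketch of the standard argument makes the dependence of the constant explicit: apply the lower Khintchine bound pointwise in $y\in Y$, use linearity of $T$ to write $\sum_{n}\epsilon_{n}Tf_{n}=T\big(\sum_{n}\epsilon_{n}f_{n}\big)$, and then apply the boundedness of $T$ followed by the upper Khintchine bound pointwise in $x\in X$:

$\int_Y \left(\sum_{n=1}^{N}|Tf_n|^{2}\right)^{\frac{p}{2}} d\nu \leq A_p^{-p}\,\mathbb{E}\int_Y\Big|\sum_{n=1}^{N}\epsilon_n Tf_n\Big|^{p} d\nu = A_p^{-p}\,\mathbb{E}\,\Big\|T\Big(\sum_{n=1}^{N}\epsilon_n f_n\Big)\Big\|_{L^p(Y,\nu)}^{p}$

$\leq A_p^{-p}\|T\|^{p}\,\mathbb{E}\int_X\Big|\sum_{n=1}^{N}\epsilon_n f_n\Big|^{p} d\mu \leq \left(\frac{B_p\|T\|}{A_p}\right)^{p}\int_X\left(\sum_{n=1}^{N}|f_n|^{2}\right)^{\frac{p}{2}} d\mu,$

so the inequality holds with $C_p = B_p\|T\|/A_p$.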