# Zyablov bound

In coding theory, the Zyablov bound is a lower bound on the rate ${\displaystyle R}$ and relative distance ${\displaystyle \delta }$ of concatenated codes.

## Statement of the bound

For every ${\displaystyle \delta }$ and every ${\displaystyle \varepsilon >0}$, there exists a concatenated code of relative distance at least ${\displaystyle \delta }$ whose rate ${\displaystyle {\mathcal {R}}}$ satisfies the following bound:

${\displaystyle {\mathcal {R}}\geqslant \max \limits _{0\leqslant r\leqslant 1-H_{q}(\delta +\varepsilon )}r\left(1-{\delta \over {H_{q}^{-1}(1-r)-\varepsilon }}\right)}$

where ${\displaystyle r}$ is the rate of the inner code ${\displaystyle C_{in}}$.
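As an illustration (not part of the original statement), the bound can be evaluated numerically for binary codes (${\displaystyle q=2}$), taking ${\displaystyle \varepsilon \to 0}$; the inverse of the entropy function is computed by bisection:

```python
import math

def h2(x):
    # binary entropy function H_2(x), with H_2(0) = H_2(1) = 0 by convention
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def h2_inv(y, tol=1e-9):
    # inverse of H_2 on [0, 1/2], found by bisection (H_2 is increasing there)
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def zyablov_rate(delta, steps=2000):
    # maximize r * (1 - delta / H_2^{-1}(1 - r)) over 0 < r < 1 - H_2(delta),
    # by a simple grid search; epsilon is taken to 0 in this sketch
    r_max = 1 - h2(delta)
    best = 0.0
    for i in range(1, steps):
        r = r_max * i / steps
        d_in = h2_inv(1 - r)
        if d_in > delta:
            best = max(best, r * (1 - delta / d_in))
    return best
```

The grid search is coarse but enough to see the qualitative behavior: the achievable rate is positive for every ${\displaystyle 0<\delta <1/2}$ and decreases as ${\displaystyle \delta }$ grows.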

## Description

Let ${\displaystyle C_{out}}$ be the outer code, ${\displaystyle C_{in}}$ be the inner code.

Suppose ${\displaystyle C_{out}}$ meets the Singleton bound with rate ${\displaystyle R}$, i.e. ${\displaystyle C_{out}}$ has relative distance ${\displaystyle \delta _{out}>1-R.}$ For ${\displaystyle C_{out}\circ C_{in}}$ to be an asymptotically good code, ${\displaystyle C_{in}}$ must also be asymptotically good, i.e. ${\displaystyle C_{in}}$ must have rate ${\displaystyle r>0}$ and relative distance ${\displaystyle \delta _{in}>0}$.

Suppose ${\displaystyle C_{in}}$ meets the Gilbert–Varshamov bound with rate ${\displaystyle r}$ and thus with relative distance

${\displaystyle \delta _{in}\geqslant H_{q}^{-1}(1-r)-\varepsilon ,\qquad \varepsilon >0,}$

then ${\displaystyle C_{out}\circ C_{in}}$ has rate ${\displaystyle rR}$ and relative distance ${\displaystyle \delta =(1-R)\left(H_{q}^{-1}(1-r)-\varepsilon \right).}$
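A small numeric sketch of this composition, with illustrative parameters ${\displaystyle R=r=1/2}$ and ${\displaystyle q=2}$ (chosen here for illustration, not taken from the source):

```python
import math

def h2(x):
    # binary entropy function H_2(x), with H_2(0) = H_2(1) = 0 by convention
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def h2_inv(y, tol=1e-12):
    # inverse of H_2 on [0, 1/2] by bisection (H_2 is increasing there)
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

R, r = 0.5, 0.5                  # outer (Singleton) and inner (GV) rates, illustrative
eps = 1e-6
delta_in = h2_inv(1 - r) - eps   # inner relative distance guaranteed by the GV bound
rate = r * R                     # rate of the concatenated code
delta = (1 - R) * delta_in       # relative distance of the concatenated code
```

With these values the concatenated code has rate ${\displaystyle 1/4}$ and relative distance roughly ${\displaystyle 0.055}$.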

Expressing ${\displaystyle R}$ as a function of ${\displaystyle \delta }$ and ${\displaystyle r}$,

${\displaystyle R=1-{\frac {\delta }{H_{q}^{-1}(1-r)-\varepsilon }}.}$

Then, optimizing over the choice of ${\displaystyle r}$, we get that the rate of the concatenated error correction code satisfies

${\displaystyle {\mathcal {R}}\geqslant \max \limits _{0\leqslant r\leqslant {1-H_{q}(\delta +\varepsilon )}}r\left(1-{\delta \over {H_{q}^{-1}(1-r)-\varepsilon }}\right)}$

This lower bound is called the Zyablov bound (the constraint ${\displaystyle r<1-H_{q}(\delta +\varepsilon )}$ is necessary to ensure ${\displaystyle {\mathcal {R}}>0}$). See Figure 2 for a plot of this bound.

Note that the Zyablov bound implies that for every ${\displaystyle \delta }$ with ${\displaystyle 0<\delta <1-1/q}$ (so that ${\displaystyle 1-H_{q}(\delta +\varepsilon )>0}$ for small enough ${\displaystyle \varepsilon }$), there exists a (concatenated) code with rate ${\displaystyle {\mathcal {R}}>0.}$

## Remarks

We can construct a code that achieves the Zyablov bound in polynomial time. In particular, we can construct explicit asymptotically good codes (over some alphabets) in polynomial time.

Linear codes help complete the proof of the above statement, since a linear code has a polynomial-size representation (its generator matrix). Let ${\displaystyle C_{out}}$ be an ${\displaystyle [N,K]_{Q}}$ Reed–Solomon error correction code with ${\displaystyle N=Q-1}$ (the evaluation points being ${\displaystyle \mathbb {F} _{Q}^{*}}$), where ${\displaystyle Q=q^{k}}$; then ${\displaystyle k=\Theta (\log N)}$.

We need to construct an inner code ${\displaystyle C_{in}}$ that lies on the Gilbert–Varshamov bound. This can be done in one of two ways:

1. Perform an exhaustive search over all generator matrices until the required property is satisfied for ${\displaystyle C_{in}}$. This works because Varshamov's bound guarantees that some linear code lies on the Gilbert–Varshamov bound; the search takes ${\displaystyle q^{O(kn)}}$ time. Using ${\displaystyle k=rn}$ we get ${\displaystyle q^{O(kn)}=q^{O(k^{2})}=N^{O(\log N)}}$, which is upper bounded by ${\displaystyle (nN)^{O(\log nN)}}$, a quasi-polynomial time bound.
2. Construct ${\displaystyle C_{in}}$ in ${\displaystyle q^{O(n)}}$ time, for ${\displaystyle (nN)^{O(1)}}$ time overall. This can be achieved by applying the method of conditional expectations to the proof that a random linear code lies on the bound with high probability.
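To make the exhaustive-search option concrete, here is a toy sketch over the binary alphabet with tiny, illustrative parameters (the real construction searches for the Gilbert–Varshamov distance matching the chosen inner rate; the parameters and function names below are for illustration only):

```python
from itertools import product

def min_distance(G, n, k):
    # minimum Hamming weight over all nonzero codewords of the binary
    # linear code generated by the k x n matrix G (rows are bit tuples)
    best = n
    for msg in product([0, 1], repeat=k):
        if not any(msg):
            continue  # skip the zero message
        word = [0] * n
        for bit, row in zip(msg, G):
            if bit:
                word = [a ^ b for a, b in zip(word, row)]
        best = min(best, sum(word))
    return best

def exhaustive_search(n, k, d):
    # brute-force search over all k x n binary generator matrices for a
    # code of minimum distance >= d -- the q^{O(kn)} procedure above,
    # feasible only for toy parameters
    rows = list(product([0, 1], repeat=n))
    for G in product(rows, repeat=k):
        if min_distance(G, n, k) >= d:
            return G
    return None
```

For instance, `exhaustive_search(5, 2, 3)` finds a generator matrix of a ${\displaystyle [5,2,3]_{2}}$ code, while `exhaustive_search(4, 2, 4)` returns `None`, consistent with the Singleton bound ${\displaystyle d\leqslant n-k+1}$.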

Thus we can construct a code that achieves the Zyablov bound in polynomial time.