# Generator matrix

In coding theory, a generator matrix is a matrix whose rows form a basis for a linear code. The codewords are all of the linear combinations of the rows of this matrix, that is, the linear code is the row space of its generator matrix.

## Terminology

If G is a matrix, it generates the codewords of a linear code C by

${\displaystyle w=sG}$

where w is a codeword of the linear code C, and s is any input vector. Both w and s are assumed to be row vectors.[1] A generator matrix for a linear ${\displaystyle [n,k,d]_{q}}$-code has format ${\displaystyle k\times n}$, where n is the length of a codeword, k is the number of information symbols (the dimension of C as a vector subspace), d is the minimum distance of the code, and q is the size of the finite field, that is, the number of symbols in the alphabet (thus, q = 2 indicates a binary code, etc.). The number of redundant symbols is denoted by ${\displaystyle r=n-k}$.
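The encoding w = sG can be sketched in a few lines of Python over GF(2). The matrix below is one common standard-form generator matrix for the binary [7,4] Hamming code, chosen purely as an illustration; any k × n matrix over GF(2) works the same way.

```python
# Encode a message s as w = sG over GF(2).
# G is a standard-form generator matrix for the binary [7,4] Hamming code.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(s, G):
    """Row vector times matrix over GF(2): w = sG (arithmetic mod 2)."""
    n = len(G[0])
    return [sum(s[i] * G[i][j] for i in range(len(G))) % 2 for j in range(n)]

s = [1, 0, 1, 1]      # k = 4 information bits
w = encode(s, G)      # n = 7 codeword bits
print(w)              # → [1, 0, 1, 1, 0, 1, 0]
```

Because G is in standard form, the first k = 4 coordinates of the codeword reproduce the message s itself, with the remaining n − k = 3 coordinates carrying redundancy.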

The standard form for a generator matrix is,[2]

${\displaystyle G={\begin{bmatrix}I_{k}|P\end{bmatrix}}}$,

where ${\displaystyle I_{k}}$ is the ${\displaystyle k\times k}$ identity matrix and P is a ${\displaystyle k\times (n-k)}$ matrix. When the generator matrix is in standard form, the code C is systematic in its first k coordinate positions.[3]

A generator matrix can be used to construct the parity check matrix for a code (and vice versa). If the generator matrix G is in standard form, ${\displaystyle G={\begin{bmatrix}I_{k}|P\end{bmatrix}}}$, then the parity check matrix for C is[4]

${\displaystyle H={\begin{bmatrix}-P^{\top }|I_{n-k}\end{bmatrix}}}$,

where ${\displaystyle P^{\top }}$ is the transpose of the matrix ${\displaystyle P}$. This is a consequence of the fact that a parity check matrix of ${\displaystyle C}$ is a generator matrix of the dual code ${\displaystyle C^{\perp }}$.

G is a ${\displaystyle k\times n}$ matrix, while H is an ${\displaystyle (n-k)\times n}$ matrix.
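The construction of H from a standard-form G can be sketched as follows, assuming the binary field GF(2), where −Pᵀ equals Pᵀ so no sign change is needed. The example matrix is the same [7,4] Hamming-code generator used above as an illustration.

```python
# Build the parity check matrix H = [-P^T | I_{n-k}] from a standard-form
# generator matrix G = [I_k | P], then verify that G H^T = 0 over GF(2).
# Over GF(2), -P^T = P^T, so the sign is immaterial.

def parity_check_from_standard(G):
    k, n = len(G), len(G[0])
    r = n - k
    P = [row[k:] for row in G]                            # the k x (n-k) block P
    Pt = [[P[i][j] for i in range(k)] for j in range(r)]  # transpose of P
    I = [[1 if a == b else 0 for b in range(r)] for a in range(r)]
    return [Pt[j] + I[j] for j in range(r)]               # H = [P^T | I_{n-k}]

G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = parity_check_from_standard(G)

# Every row of G is orthogonal to every row of H, i.e. G H^T = 0 mod 2.
assert all(sum(g[j] * h[j] for j in range(len(g))) % 2 == 0
           for g in G for h in H)
```

The final assertion is exactly the statement that H is a parity check matrix for the code generated by G: every codeword (row combination of G) satisfies all n − k parity checks.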

## Equivalent codes

Codes C1 and C2 are equivalent (denoted C1 ~ C2) if one code can be obtained from the other via the following two transformations:[5]

1. arbitrarily permute the components, and
2. independently scale any component by a non-zero element.

Equivalent codes have the same minimum distance.

The generator matrices of equivalent codes can be obtained from one another via the following elementary operations:[6]

1. permute rows
2. scale rows by a nonzero scalar
3. add rows to other rows
4. permute columns, and
5. scale columns by a nonzero scalar.

Thus, we can perform Gaussian elimination on G. Indeed, this allows us to assume that the generator matrix is in standard form. More precisely, for any matrix G we can find an invertible matrix U (together with a column permutation, if the first k columns of G are linearly dependent) such that ${\displaystyle UG={\begin{bmatrix}I_{k}|P\end{bmatrix}}}$, where G and ${\displaystyle {\begin{bmatrix}I_{k}|P\end{bmatrix}}}$ generate equivalent codes.
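The reduction to standard form can be sketched with Gaussian elimination over GF(2). This minimal version uses only row operations (swap and add), which preserve the code exactly; a column permutation would additionally be needed whenever the first k columns of the input are linearly dependent, and that case is not handled here.

```python
# Row-reduce a generator matrix over GF(2) toward standard form [I_k | P].
# Assumes the first k columns of G are linearly independent; otherwise a
# column permutation (giving an equivalent, not equal, code) is required.

def to_standard_form(G):
    G = [row[:] for row in G]   # work on a copy
    k = len(G)
    for col in range(k):
        # find a pivot row with a 1 in this column and move it into place
        pivot = next(r for r in range(col, k) if G[r][col] == 1)
        G[col], G[pivot] = G[pivot], G[col]
        # clear the column in every other row (row addition = XOR over GF(2))
        for r in range(k):
            if r != col and G[r][col] == 1:
                G[r] = [(a + b) % 2 for a, b in zip(G[r], G[col])]
    return G

# An illustrative 3 x 7 generator matrix not yet in standard form.
G = [
    [0, 1, 1, 1, 0, 1, 1],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 0, 0, 1, 0, 1, 1],
]
S = to_standard_form(G)
print(S)   # first three columns now form the identity matrix
```

Since only row operations are used, S has exactly the same row space as G: both matrices generate the same code, not merely an equivalent one.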

## Notes

1. ^ MacKay, David J.C. (2003). Information Theory, Inference, and Learning Algorithms (PDF). Cambridge University Press. p. 9. ISBN 9780521642989. Because the Hamming code is a linear code, it can be written compactly in terms of matrices as follows. The transmitted codeword ${\displaystyle \mathbf {t} }$ is obtained from the source sequence ${\displaystyle \mathbf {s} }$ by a linear operation,

${\displaystyle \mathbf {t} =\mathbf {G} ^{\intercal }\mathbf {s} }$

where ${\displaystyle \mathbf {G} }$ is the generator matrix of the code... I have assumed that ${\displaystyle \mathbf {s} }$ and ${\displaystyle \mathbf {t} }$ are column vectors. If instead they are row vectors, then this equation is replaced by

${\displaystyle \mathbf {t} =\mathbf {sG} }$

... I find it easier to relate to the right-multiplication (...) than the left-multiplication (...). Many coding theory texts use the left-multiplying conventions (...), however. ...The rows of the generator matrix can be viewed as defining the basis vectors.
2. ^ Ling & Xing 2004, p. 52
3. ^ Roman 1992, p. 198
4. ^ Roman 1992, p. 200
5. ^ Pless 1998, p. 8
6. ^ Welsh 1988, pp. 54–55