In linear algebra, an alternant matrix is a matrix formed by applying a finite list of functions pointwise to a fixed column of inputs. An alternant determinant is the determinant of a square alternant matrix.
Generally, if $f_1, f_2, \ldots, f_n$ are functions from a set $X$ to a field $F$, and $\alpha_1, \alpha_2, \ldots, \alpha_m \in X$, then the alternant matrix has size $m \times n$ and is defined by

$$M = \begin{bmatrix}
f_1(\alpha_1) & f_2(\alpha_1) & \cdots & f_n(\alpha_1) \\
f_1(\alpha_2) & f_2(\alpha_2) & \cdots & f_n(\alpha_2) \\
\vdots & \vdots & \ddots & \vdots \\
f_1(\alpha_m) & f_2(\alpha_m) & \cdots & f_n(\alpha_m)
\end{bmatrix}$$

or, more compactly, $M_{ij} = f_j(\alpha_i)$. (Some authors use the transpose of the above matrix.) Examples of alternant matrices include Vandermonde matrices, for which $f_j(\alpha) = \alpha^{j-1}$, and Moore matrices, for which $f_j(\alpha) = \alpha^{q^{j-1}}$.
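To make the definition concrete, here is a minimal sketch in Python using SymPy; the helper name `alternant` is our own, not a library function. It builds $M_{ij} = f_j(\alpha_i)$ from a list of functions and a list of points, and recovers a symbolic Vandermonde matrix as the special case $f_j(\alpha) = \alpha^{j-1}$.

```python
import sympy as sp

def alternant(funcs, alphas):
    """Alternant matrix M with M[i, j] = funcs[j](alphas[i])."""
    return sp.Matrix(len(alphas), len(funcs),
                     lambda i, j: funcs[j](alphas[i]))

# Vandermonde as the special case f_j(a) = a**(j-1) (1-indexed j).
a1, a2, a3 = sp.symbols('a1 a2 a3')
monomials = [lambda a, k=k: a**k for k in range(3)]
V = alternant(monomials, [a1, a2, a3])
print(V)  # Matrix([[1, a1, a1**2], [1, a2, a2**2], [1, a3, a3**2]])
```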
Properties
The alternant can be used to check the linear independence of the functions $f_1, f_2, \ldots, f_n$ in function space. For example, let $f_1(x) = \sin x$, $f_2(x) = \cos x$, and choose $(\alpha_1, \alpha_2) = (0, \pi/2)$. Then the alternant is the matrix $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ and the alternant determinant is $-1 \neq 0$. Therefore $M$ is invertible and the vectors $\{\sin x, \cos x\}$ form a basis for their spanning set: in particular, $\sin x$ and $\cos x$ are linearly independent.
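The determinant computation can be verified directly; a short SymPy sketch, assuming the same functions and sample points as above:

```python
import sympy as sp

funcs = [sp.sin, sp.cos]
alphas = [0, sp.pi / 2]

M = sp.Matrix(2, 2, lambda i, j: funcs[j](alphas[i]))
print(M)        # Matrix([[0, 1], [1, 0]])
print(M.det())  # -1: nonzero, so the alternant is invertible
```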
Linear dependence of the columns of an alternant does not imply that the functions are linearly dependent in function space. For example, let $f_1(x) = \sin x$, $f_2(x) = \cos x$, and choose $(\alpha_1, \alpha_2) = (0, \pi)$. Then the alternant is $\begin{bmatrix} 0 & 1 \\ 0 & -1 \end{bmatrix}$ and the alternant determinant is 0, but we have already seen that $\sin x$ and $\cos x$ are linearly independent.
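Changing only the second sample point from $\pi/2$ to $\pi$ in the same sketch exhibits the singular alternant:

```python
import sympy as sp

funcs = [sp.sin, sp.cos]
alphas = [0, sp.pi]  # second sample point changed from pi/2 to pi

M = sp.Matrix(2, 2, lambda i, j: funcs[j](alphas[i]))
print(M)        # Matrix([[0, 1], [0, -1]])
print(M.det())  # 0, although sin and cos are linearly independent
```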
Despite this, the alternant can be used to find a linear dependence if it is already known that one exists. For example, we know from the theory of partial fractions that there are real numbers $A$ and $B$ for which

$$\frac{A}{x+1} + \frac{B}{x+2} = \frac{1}{(x+1)(x+2)}.$$

Choosing $f_1(x) = \frac{1}{x+1}$, $f_2(x) = \frac{1}{x+2}$, $f_3(x) = \frac{1}{(x+1)(x+2)}$, and $(\alpha_1, \alpha_2, \alpha_3) = (1, 2, 3)$, we obtain the alternant

$$M = \begin{bmatrix} 1/2 & 1/3 & 1/6 \\ 1/3 & 1/4 & 1/12 \\ 1/4 & 1/5 & 1/20 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{bmatrix}.$$

Therefore, $(1, -1, -1)$ is in the nullspace of the matrix: that is, $f_1 - f_2 - f_3 = 0$. Moving $f_3$ to the other side of the equation gives the partial fraction decomposition $\frac{1}{(x+1)(x+2)} = \frac{1}{x+1} - \frac{1}{x+2}$, that is, $A = 1$ and $B = -1$.
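The row reduction and nullspace can be reproduced symbolically; a minimal SymPy sketch under the same choices of $f_j$ and $\alpha_i$:

```python
import sympy as sp

x = sp.symbols('x')
f = [1/(x + 1), 1/(x + 2), 1/((x + 1)*(x + 2))]
alphas = [1, 2, 3]

# Alternant M[i, j] = f[j] evaluated at alphas[i].
M = sp.Matrix(3, 3, lambda i, j: f[j].subs(x, alphas[i]))
print(M.rref()[0])   # Matrix([[1, 0, 1], [0, 1, -1], [0, 0, 0]])
print(M.nullspace()) # [Matrix([[-1], [1], [1]])]: -f1 + f2 + f3 = 0
```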
If $m = n$ and $\alpha_i = \alpha_j$ for some $i \neq j$, then the alternant determinant is zero (as a row is repeated).
If $m = n$ and the functions $f_1, \ldots, f_n$ are all polynomials, then $(\alpha_j - \alpha_i)$ divides the alternant determinant for all $1 \leq i < j \leq n$. In particular, if $V$ is a Vandermonde matrix, then $\det V = \prod_{1 \leq i < j \leq n} (\alpha_j - \alpha_i)$ divides such polynomial alternant determinants. The ratio $\det M / \det V$ is therefore a polynomial in $\alpha_1, \ldots, \alpha_n$ called the bialternant. The Schur polynomial $s_{(\lambda_1, \ldots, \lambda_n)}$ is classically defined as the bialternant of the polynomials $f_j(x) = x^{\lambda_j + n - j}$.
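As a worked illustration, here is a small SymPy sketch of the bialternant; the partition $(2, 1)$ in two variables is our own choice of example. It forms the two alternant determinants of the classical definition and cancels them to the Schur polynomial $s_{(2,1)}(x, y) = x^2 y + x y^2$:

```python
import sympy as sp

x, y = sp.symbols('x y')
alphas = [x, y]
lam = [2, 1]  # the partition lambda = (2, 1)
n = 2

# Numerator: alternant of f_j(t) = t**(lam_j + n - j), 1-indexed j.
num = sp.Matrix(n, n, lambda i, j: alphas[i]**(lam[j] + n - (j + 1)))
# Denominator: Vandermonde alternant with columns t**(n - j); the
# column ordering only affects the sign, which cancels in the ratio.
den = sp.Matrix(n, n, lambda i, j: alphas[i]**(n - (j + 1)))

schur = sp.cancel(num.det() / den.det())
print(schur)  # x**2*y + x*y**2, the Schur polynomial s_(2,1)(x, y)
```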