Bézout matrix

From Wikipedia, the free encyclopedia

In mathematics, a Bézout matrix (or Bézoutian or Bezoutiant) is a special square matrix associated with two polynomials, introduced by James Joseph Sylvester (1853) and Arthur Cayley (1857) and named after Étienne Bézout. Bézoutian may also refer to the determinant of this matrix, which is equal to the resultant of the two polynomials. Bézout matrices are sometimes used to test the stability of a given polynomial.

Definition

Let f(z) and g(z) be two complex polynomials of degree at most n,

    f(z) = \sum_{i=0}^{n} u_i z^i, \qquad g(z) = \sum_{i=0}^{n} v_i z^i.

(Note that any coefficient u_i or v_i could be zero.) The Bézout matrix of order n associated with the polynomials f and g is

    B_n(f, g) = (b_{ij})_{i,j = 0, \dots, n-1},

where the entries result from the identity

    \frac{f(x)\,g(y) - f(y)\,g(x)}{x - y} = \sum_{i,j=0}^{n-1} b_{ij}\, x^i y^j.

It is in \mathbb{C}^{n \times n}, and the entries of that matrix are such that if we let m_{ij} = \min\{i,\, n - 1 - j\} for each i, j, then:

    b_{ij} = \sum_{k=0}^{m_{ij}} \left( u_{j+k+1} v_{i-k} - u_{i-k} v_{j+k+1} \right).

To each Bézout matrix, one can associate the following bilinear form, called the Bézoutian:

    \operatorname{Bez} \colon \mathbb{C}^n \times \mathbb{C}^n \to \mathbb{C}, \qquad \operatorname{Bez}(x, y) = x^{\mathsf{T}} B_n(f, g)\, y.
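The entry formula and the generating identity above can be checked against each other numerically. The following Python sketch (function names are ours) builds B_n(f, g) from the coefficient formula and verifies that f(x)g(y) − f(y)g(x) = (x − y) Σ b_ij x^i y^j at a few sample points:

```python
def bezout_matrix(u, v, n):
    """Bezout matrix B_n(f, g) for f = sum u[i] z^i, g = sum v[i] z^i.

    u and v are coefficient lists of length n + 1 (lowest degree first;
    pad with zeros if deg f or deg g is below n).
    """
    B = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            m = min(i, n - 1 - j)  # m_ij = min{i, n - 1 - j}
            B[i][j] = sum(u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
                          for k in range(m + 1))
    return B

def poly_eval(coeffs, x):
    return sum(c * x ** k for k, c in enumerate(coeffs))

# f(z) = 3z^3 - z and g(z) = 5z^2 + 1, taken with n = 3.
u = [0, -1, 0, 3]
v = [1, 0, 5, 0]
B = bezout_matrix(u, v, 3)

# Verify the defining identity at a few integer points with x != y.
for x, y in [(2, 5), (-1, 4), (3, 7)]:
    lhs = poly_eval(u, x) * poly_eval(v, y) - poly_eval(u, y) * poly_eval(v, x)
    rhs = (x - y) * sum(B[i][j] * x ** i * y ** j
                        for i in range(3) for j in range(3))
    assert lhs == rhs
```

Since the identity holds for all x and y, agreement at enough sample points is a strong sanity check on the entry formula.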

Examples

  • For n = 3, we have for any polynomials f and g of degree (at most) 3:

    B_3(f, g) = \begin{pmatrix} u_1 v_0 - u_0 v_1 & u_2 v_0 - u_0 v_2 & u_3 v_0 - u_0 v_3 \\ u_2 v_0 - u_0 v_2 & u_2 v_1 - u_1 v_2 + u_3 v_0 - u_0 v_3 & u_3 v_1 - u_1 v_3 \\ u_3 v_0 - u_0 v_3 & u_3 v_1 - u_1 v_3 & u_3 v_2 - u_2 v_3 \end{pmatrix}.

  • Let f(x) = 3x^3 - x and g(x) = 5x^2 + 1 be two polynomials. Then, taking n = 4:

    B_4(f, g) = \begin{pmatrix} -1 & 0 & 3 & 0 \\ 0 & 8 & 0 & 0 \\ 3 & 0 & 15 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.

The last row and column are all zero as f and g have degree strictly less than n (which equals 4). The other zero entries are because for each i, either u_i or v_i is zero.
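The second example can be reproduced directly from the entry formula; a short Python sketch (the helper name is ours):

```python
def bezout_matrix(u, v, n):
    """Bezout matrix B_n(f, g) from coefficient lists u, v of length n + 1."""
    return [[sum(u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
                 for k in range(min(i, n - 1 - j) + 1))
             for j in range(n)]
            for i in range(n)]

# f(x) = 3x^3 - x and g(x) = 5x^2 + 1, padded to length n + 1 = 5.
u = [0, -1, 0, 3, 0]   # coefficients of f, lowest degree first
v = [1, 0, 5, 0, 0]    # coefficients of g

B = bezout_matrix(u, v, 4)
for row in B:
    print(row)
# The last row and column vanish because deg f and deg g are below 4.
```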

Properties

  • B_n(f, g) is symmetric (as a matrix);
  • B_n(f, g) = -B_n(g, f);
  • B_n(f, f) = 0;
  • B_n(f, g) is bilinear in (f, g);
  • B_n(f, g) is in \mathbb{R}^{n \times n} if f and g have real coefficients;
  • B_n(f, g) with n = max(deg f, deg g) is nonsingular if and only if f and g have no common roots;
  • B_n(f, g) with n = max(deg f, deg g) has determinant which is the resultant of f and g.
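The common-root criterion can be exercised numerically. A minimal Python sketch (function names are ours) compares a pair of polynomials with a shared root against a coprime pair, using a naive cofactor determinant:

```python
def bezout_matrix(u, v, n):
    """Bezout matrix B_n(f, g) from coefficient lists of length n + 1."""
    return [[sum(u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
                 for k in range(min(i, n - 1 - j) + 1))
             for j in range(n)]
            for i in range(n)]

def det(M):
    """Determinant by cofactor expansion (fine for small matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

# f = (x - 1)(x - 2) = x^2 - 3x + 2 and g = (x - 1)(x + 1) = x^2 - 1
# share the root x = 1, so their Bezout matrix is singular ...
shared = det(bezout_matrix([2, -3, 1], [-1, 0, 1], 2))

# ... while f and h = x^2 + 1 have no common root.
coprime = det(bezout_matrix([2, -3, 1], [1, 0, 1], 2))

print(shared, coprime)  # shared is 0, coprime is nonzero
```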

Applications

An important application of Bézout matrices can be found in control theory. To see this, let f(z) be a complex polynomial of degree n and denote by q and p the real polynomials such that f(iy) = q(y) + ip(y) (where y is real). We also denote by r the rank and by σ the signature of B_n(p, q). Then, we have the following statements:

  • f(z) has n − r roots in common with its conjugate polynomial (the polynomial obtained by conjugating the coefficients of f);
  • the remaining r roots of f(z) are located in such a way that:
    • (r + σ)/2 of them lie in the open left half-plane, and
    • (r − σ)/2 lie in the open right half-plane;
  • f is Hurwitz stable if and only if B_n(p, q) is positive definite.
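For a real polynomial this stability test is easy to sketch in Python; the code below (all names are ours) splits f(iy) into q(y) + ip(y), builds the Bézout matrix of the pair, and checks positive definiteness with Sylvester's criterion on the leading principal minors:

```python
def bezout_matrix(u, v, n):
    """Bezout matrix B_n(f, g) from coefficient lists of length n + 1."""
    return [[sum(u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
                 for k in range(min(i, n - 1 - j) + 1))
             for j in range(n)]
            for i in range(n)]

def det(M):
    """Determinant by cofactor expansion (fine for small matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

def is_hurwitz(coeffs):
    """Hurwitz stability test for a real polynomial f (lowest degree first).

    Writes f(iy) = q(y) + i p(y), builds B_n(p, q), and checks positive
    definiteness via Sylvester's criterion (leading principal minors > 0).
    """
    n = len(coeffs) - 1
    # a_k (iy)^k = a_k i^k y^k: split each i^k into real and imaginary parts.
    z = [a * 1j ** k for k, a in enumerate(coeffs)]
    q = [w.real for w in z]
    p = [w.imag for w in z]
    B = bezout_matrix(p, q, n)
    return all(det([row[:m] for row in B[:m]]) > 0 for m in range(1, n + 1))

print(is_hurwitz([2, 3, 1]))   # z^2 + 3z + 2 = (z + 1)(z + 2): stable
print(is_hurwitz([2, -3, 1]))  # z^2 - 3z + 2 = (z - 1)(z - 2): unstable
```

The cofactor determinant keeps the sketch dependency-free; for larger degrees one would check definiteness with an eigenvalue or Cholesky routine instead.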

The third statement gives a necessary and sufficient condition for stability. Moreover, the first statement exhibits some similarities with a result concerning Sylvester matrices, while the second can be related to the Routh–Hurwitz theorem.

References

  • Cayley, Arthur (1857), "Note sur la methode d'elimination de Bezout", J. Reine Angew. Math., 53: 366–367
  • Kreĭn, M. G.; Naĭmark, M. A. (1981) [1936], "The method of symmetric and Hermitian forms in the theory of the separation of the roots of algebraic equations", Linear and Multilinear Algebra, 10 (4): 265–308, doi:10.1080/03081088108817420, ISSN 0308-1087, MR 0638124
  • Pan, Victor; Bini, Dario (1994). Polynomial and matrix computations. Basel, Switzerland: Birkhäuser. ISBN 0-8176-3786-9.
  • Pritchard, Anthony J.; Hinrichsen, Diederich (2005). Mathematical systems theory I: modelling, state space analysis, stability and robustness. Berlin: Springer. ISBN 3-540-44125-5.
  • Sylvester, James Joseph (1853), "On a Theory of the Syzygetic Relations of Two Rational Integral Functions, Comprising an Application to the Theory of Sturm's Functions, and That of the Greatest Algebraical Common Measure", Philosophical Transactions of the Royal Society of London, The Royal Society, 143: 407–548, doi:10.1098/rstl.1853.0018, ISSN 0080-4614, JSTOR 108572