Richardson's theorem

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Jason Davies at 00:16, 4 November 2014.

In mathematics, Richardson's theorem establishes a limit on the extent to which an algorithm can decide whether certain mathematical expressions are equal. It states that for a certain fairly natural class of expressions, it is undecidable whether a particular expression E satisfies the equation E = 0, and similarly undecidable whether the functions defined by expressions E and F are everywhere equal. It was proved in 1968 by computer scientist Daniel Richardson of the University of Bath.

Specifically, the class of expressions for which the theorem holds is that generated by rational numbers, the number π, the number log 2, the variable x, the operations of addition, subtraction, multiplication, composition, and the sin, exp, and abs functions.
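As an illustration (not part of the article), this class already contains expressions whose vanishing is far from obvious: since π and the rational 1/2 are available and composition is allowed, sin(x + π·(1/2)) = cos(x) lies in the class, so E(x) = sin(x)² + sin(x + π/2)² − 1 is identically zero. The sketch below samples E numerically; the values are tiny, but no finite amount of sampling can by itself prove the identity, which is the kind of difficulty the theorem makes precise.

```python
import math

# Hedged illustration: an expression built only from the primitives of
# Richardson's class -- rationals, pi, the variable x, +, -, *, composition,
# and sin -- that is identically zero by the Pythagorean identity,
# since sin(x + pi/2) = cos(x).
def E(x):
    return math.sin(x) ** 2 + math.sin(x + math.pi * 0.5) ** 2 - 1

# Sampling shows values near zero, but sampling alone decides nothing.
samples = [E(0.1 * k) for k in range(100)]
print(max(abs(v) for v in samples))
```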

For some classes of expressions (generated by other primitives than in Richardson's theorem) there exist algorithms that can determine whether an expression is zero.[1]
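One decidable case can be sketched concretely (a minimal illustration, not the algorithm of reference [1]): for polynomial expressions in x with rational coefficients, zero-equivalence is decided by expanding into a canonical form, here a hypothetical `{degree: coefficient}` dictionary, and checking that every coefficient vanishes.

```python
from fractions import Fraction

def poly_add(p, q):
    """Add two polynomials represented as {degree: coefficient} dicts."""
    r = dict(p)
    for d, c in q.items():
        r[d] = r.get(d, Fraction(0)) + c
    return {d: c for d, c in r.items() if c != 0}

def poly_mul(p, q):
    """Multiply two polynomials represented as {degree: coefficient} dicts."""
    r = {}
    for d1, c1 in p.items():
        for d2, c2 in q.items():
            r[d1 + d2] = r.get(d1 + d2, Fraction(0)) + c1 * c2
    return {d: c for d, c in r.items() if c != 0}

def is_zero(p):
    """A polynomial is identically zero iff its canonical form has no terms."""
    return len(p) == 0

# (x + 1)(x - 1) - (x^2 - 1) normalizes to the zero polynomial.
x = {1: Fraction(1)}
one = {0: Fraction(1)}
minus_one = {0: Fraction(-1)}
lhs = poly_mul(poly_add(x, one), poly_add(x, minus_one))
rhs = poly_add({2: Fraction(1)}, minus_one)
diff = poly_add(lhs, poly_mul(minus_one, rhs))
print(is_zero(diff))
```

Richardson's theorem says no such canonical-form strategy can be extended to the full class with π, sin, exp, and abs.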

Statement of the theorem

Richardson's theorem can be stated as follows.[2] Let E be a set of real functions such that if A(x), B(x) ∈ E, then A(x) ± B(x), A(x)B(x), A(B(x)) ∈ E. The rational numbers are contained in E as constant functions. Then for expressions A(x) in E,

  • if log 2, π, e^x, sin x ∈ E, then the problem of deciding whether A(x) ≥ 0 for all x is unsolvable;
  • if furthermore |x| ∈ E, then the problem of deciding whether A(x) = 0 for all x is unsolvable.

If furthermore there is a function B(x) ∈ E without an antiderivative in E, then the integration problem is unsolvable. Example: e^(ax²) has an elementary antiderivative if and only if a = 0 in the elementary functions.
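The standard fact underlying the "a = 0" condition can be made explicit (the e^(ax²) integrand here is a reconstruction from that condition, stated for concreteness):

```latex
% For a = 0 the integrand is 1 and the antiderivative is elementary:
\int e^{0 \cdot x^{2}} \, dx = x + C .
% For a \neq 0 the antiderivative is not elementary; e.g. for a = -1,
\int e^{-x^{2}} \, dx = \frac{\sqrt{\pi}}{2}\,\operatorname{erf}(x) + C ,
% and erf is not an elementary function.
```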

References

  1. ^ Richardson, Dan; Fitch, John. "The identity problem for elementary functions and constants" (PDF).
  2. ^ "Tensor Computer Algebra" (PDF). December 22, 2008.[dead link]
