Borel's lemma

In mathematics, Borel's lemma, named after Émile Borel, is an important result used in the theory of asymptotic expansions and partial differential equations.

Statement

Suppose U is an open set in the Euclidean space R^n, and suppose that f_0, f_1, f_2, … is a sequence of smooth functions on U.

If I is any open interval in R containing 0 (possibly I = R), then there exists a smooth function F(t, x) defined on I×U such that

\[
  \frac{\partial^k F}{\partial t^k}(0, x) \;=\; f_k(x)
\]

for every k ≥ 0 and x in U.
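The article's Proof section is not part of this excerpt; the following is a minimal sketch of the standard construction, in which the cutoff ψ and the scales ε_k are notation introduced here for illustration rather than taken from this revision. Each Taylor-type term is damped by a rescaled bump function:

\[
  F(t, x) \;=\; \sum_{k=0}^{\infty} \psi\!\left(\frac{t}{\varepsilon_k}\right)\,
                \frac{f_k(x)}{k!}\, t^k ,
\]

where ψ is a smooth function with ψ ≡ 1 on [−1, 1] and ψ ≡ 0 outside [−2, 2], and ε_k ∈ (0, 1] decrease to 0. Since ψ(t/ε_k) ≡ 1 near t = 0, differentiating the series term by term at t = 0 gives ∂^k F/∂t^k (0, x) = f_k(x); choosing each ε_k small enough that the k-th summand and all of its derivatives of order less than k are bounded by 2^{−k} on compact subsets makes the series converge in C^∞, which justifies the term-by-term differentiation.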

This article incorporates material from Borel lemma on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.