# Method of Fluxions

*French cover of Newton's Method of Fluxions*

*Method of Fluxions*[1] is a book by Isaac Newton, completed in 1671 and published in 1736. "Fluxion" is Newton's term for the instantaneous rate of change (the derivative) of a quantity, and "fluent" his term for the changing quantity itself. He originally developed the method at Woolsthorpe Manor during the closure of Cambridge in the plague years of 1665 to 1667, but chose not to make his findings known (similarly, the work that eventually became the *Philosophiae Naturalis Principia Mathematica* was developed at this time and remained hidden in Newton's notes for many years). Gottfried Leibniz developed his form of calculus independently around 1673, seven years after Newton had worked out the basis of differential calculus, as seen in surviving documents such as "the method of fluxions and fluents…" from 1666. Leibniz, however, published his differential calculus in 1684, nine years before Newton first published part of his fluxional calculus in 1693.[2] The calculus notation in use today is mostly Leibniz's, although Newton's dot notation $\dot{x}$ for derivatives with respect to time remains in common use in mechanics and circuit analysis.
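To make the notational contrast concrete, the same time derivatives of a fluent $x(t)$ can be written side by side in Newton's dot notation and in Leibniz's differential notation (the correspondence below is a standard illustration, not an equation from the book itself):

```latex
% Newton's fluxion notation vs. Leibniz's differential notation
% for a fluent x(t):
\dot{x} = \frac{dx}{dt},
\qquad
\ddot{x} = \frac{d^2x}{dt^2}
% e.g. for the fluent x(t) = t^2, the fluxion is \dot{x} = 2t.
```

Leibniz's form makes the variable of differentiation explicit, which is one reason it prevailed; Newton's dots are more compact when, as in mechanics, time is understood to be the independent variable.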

Newton's *Method of Fluxions* was formally published only posthumously. After Leibniz published his calculus, however, a bitter priority dispute erupted between the two mathematicians over who had developed calculus first, and Newton no longer concealed his knowledge of fluxions.

## Newton's development of analysis

For a period of time encompassing Newton's working life, the discipline of analysis was a subject of controversy in the mathematical community. Although analytic techniques provided solutions to long-standing problems, including problems of quadrature and the finding of tangents, the proofs of these solutions were not known to be reducible to the synthetic rules of Euclidean geometry. Instead, analysts were often forced to invoke infinitesimal, or "infinitely small," quantities to justify their algebraic manipulations. Some of Newton's mathematical contemporaries, such as Isaac Barrow, were highly skeptical of such techniques, which had no clear geometric interpretation. Although in his early work Newton also used infinitesimals in his derivations without justifying them, he later developed something akin to the modern definition of limits in order to justify his work.[3]
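The contrast between the infinitesimal arguments that troubled Newton's contemporaries and his later limit-like justification can be sketched with the derivative of $x^2$. Newton wrote the small increment as $o$; the infinitesimal argument simply discards the "vanishing" term $o$ at the end, while the modern limit definition makes that step rigorous (the equations below are an illustrative reconstruction in modern notation, not a quotation from Newton):

```latex
% Infinitesimal-style computation with Newton's increment o:
\frac{(x+o)^2 - x^2}{o} = \frac{2xo + o^2}{o} = 2x + o
\quad\longrightarrow\quad 2x \ \text{as } o \to 0.

% The modern limit definition that justifies discarding o:
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```

The objection of critics such as Barrow (and later Berkeley) was that $o$ is treated as nonzero when dividing by it, yet as zero when it is dropped from $2x + o$; the limit concept resolves this by never setting $o$ to zero at all.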