# Fluxion

The fluxion of a "fluent" (a time-varying quantity, or function) is its instantaneous rate of change, or gradient, at a given point.[1] Fluxions were introduced by Isaac Newton to describe his form of a time derivative (a derivative with respect to time). Newton introduced the concept in 1665 and detailed it in his mathematical treatise, Method of Fluxions.[2] Fluxions and fluents made up Newton's early calculus.[3]

## History

Fluxions were central to the Leibniz–Newton calculus controversy. When Newton sent a letter to Gottfried Wilhelm Leibniz explaining them, he concealed the key statement in a cipher to protect his priority without disclosing the method. He wrote:[4]

> I cannot proceed with the explanations of the fluxions now, I have preferred to conceal it thus: 6accdæ13eff7i319n4o4qrr4s8t12vz

The seemingly meaningless string was in fact an enciphered Latin phrase, meaning: "Given an equation that consists of any number of flowing quantities, to find the fluxions: and vice versa".[5]

## Example

If the fluent ${\displaystyle y}$ is defined as ${\displaystyle y=t^{2}}$ (where ${\displaystyle t}$ is time), then the fluxion (derivative) at ${\displaystyle t=2}$ is:

${\displaystyle {\dot {y}}={\frac {\Delta y}{\Delta t}}={\frac {(2+o)^{2}-2^{2}}{(2+o)-2}}={\frac {4+4o+o^{2}-4}{2+o-2}}={\frac {4o+o^{2}}{o}}=4+o}$

Here ${\displaystyle o}$ is an infinitely small amount of time[6] and according to Newton, we can now ignore it because of its infinite smallness.[7] He justified the use of ${\displaystyle o}$ as a non-zero quantity by stating that fluxions were a consequence of movement by an object.
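Newton's difference quotient can also be checked numerically. The following is a minimal Python sketch of the calculation above; the function name and the choice of sample increments are illustrative, not part of Newton's method:

```python
def difference_quotient(t, o):
    """Newton-style quotient ((t + o)^2 - t^2) / o for the fluent y = t^2.

    Algebraically this simplifies to 2t + o, matching the article's 4 + o
    at t = 2.
    """
    return ((t + o) ** 2 - t ** 2) / o

# As the increment o shrinks, the quotient approaches the fluxion 2t = 4
# at t = 2 -- the value Newton obtains by discarding the "infinitely small" o.
for o in (1.0, 0.1, 0.001, 1e-6):
    print(f"o = {o:<8} quotient = {difference_quotient(2.0, o)}")
```

Running this shows the quotient tending toward 4, which is why Newton felt justified in dropping the leftover term ${\displaystyle o}$.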

## Criticism

Bishop George Berkeley, a prominent philosopher of the time, attacked Newton's fluxions in his essay The Analyst, published in 1734.[8] Berkeley rejected their rigor because of the use of the infinitesimal ${\displaystyle o}$: he argued that it could not simply be ignored, and that if it were zero, the quotient would involve division by zero. Berkeley mocked fluxions as "ghosts of departed quantities", a phrase that unsettled mathematicians of the time and contributed to the eventual disuse of infinitesimals in calculus.

Towards the end of his life, Newton revised his interpretation of ${\displaystyle o}$: rather than treating it as infinitely small, he defined it as a quantity approaching zero, using a definition similar to the modern concept of a limit.[9] He believed this put fluxions back on sound footing. By this time, however, Leibniz's derivative (and his notation) had largely replaced Newton's fluxions and fluents, and it remains in use today.