Mathematical notation
Revision as of 06:49, 19 October 2008
Mathematical notation is used in mathematics, and throughout the physical sciences, engineering, and economics. The complexity of such notation ranges from relatively simple symbolic representations, such as the numbers 1 and 2 and the function symbols sin and +, to conceptual symbols such as lim and dy/dx, to equations and variables.
Definition
A mathematical notation is a writing system (in fact, a formal language) used for recording concepts in mathematics.
- The notation uses symbols or symbolic expressions which are intended to have a precise semantic meaning.
- In the history of mathematics, these symbols have denoted numbers, shapes, patterns, and change. The notation can also include symbols for parts of the conventional discourse between mathematicians, when viewing mathematics as a language.
The media used for writing are recounted below, but common materials currently include paper and pencil, board and chalk (or dry-erase marker), and electronic media. Systematic adherence to mathematical concepts is a fundamental feature of mathematical notation. (See also some related concepts: Topic (linguistics), Logical argument, Cogency, Mathematical logic, Model theory, and Major themes in mathematics.)
Expressions
A mathematical expression is a sequence of symbols which can be evaluated. For example, if the symbols represent numbers, an expression is evaluated according to a conventional order of operations: any expressions within parentheses are calculated first, followed by exponents and roots, then multiplications and divisions, and finally additions and subtractions, working from left to right. In a programming language, these rules are implemented by the compiler. For more on expression evaluation, see the computer science topics eager evaluation, lazy evaluation, and evaluation operator.
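The conventional order of operations, and the contrast between eager and short-circuit (lazy) evaluation, can be illustrated in a language such as Python (a sketch for illustration, not part of the article itself):

```python
# Python's parser applies the conventional precedence described above:
# parentheses, then exponents, then multiplication/division,
# then addition/subtraction, left to right.
value = 2 + 3 * 4 ** 2        # parsed as 2 + (3 * (4 ** 2))
print(value)                  # 50

# Parentheses override the default grouping:
print((2 + 3) * 4 ** 2)       # 80

# Lazy evaluation in miniature: the right operand of `and` is only
# evaluated when the left operand does not already decide the result.
def noisy():
    print("evaluated")
    return True

result = False and noisy()    # noisy() is never called here
print(result)                 # False
```

The short-circuit behaviour of `and`/`or` is a small instance of lazy evaluation; languages such as Haskell extend the same idea to all expressions.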
Precise semantic meaning
Modern mathematics needs to be precise, because ambiguous notations do not allow formal proofs. Unfortunately the notation in common use is quite ambiguous. For example, |x| may mean the absolute value of x, the cardinality of x, the determinant of x, and so on. The notation sin⁻¹x may mean to take the arcsine of x, to take the reciprocal of the sine of x, or to multiply s times i times n⁻¹ times x. Ambiguity is particularly common in calculus, where notation such as dy/dx is not normally meant to be interpreted as multiplication and division involving a variable d.
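Programming languages resolve this kind of ambiguity by forcing each reading to be written distinctly. As an illustrative sketch, the two plausible readings of sin⁻¹x look like this in Python:

```python
import math

x = 0.5

# The inverse function reading: arcsin(x)
arcsine = math.asin(x)

# The multiplicative-inverse reading: 1/sin(x)
reciprocal = 1 / math.sin(x)

print(arcsine)     # ~0.5236, i.e. pi/6
print(reciprocal)  # ~2.0858
```

Because the two expressions cannot be confused, a compiler or interpreter never faces the ambiguity that a human reader of sin⁻¹x does.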
Suppose that we have statements, denoted by some formal sequence of symbols, about some objects (for example, numbers, shapes, patterns). Until the statements can be shown to be valid, their meaning is not yet resolved. While reasoning, we might let the symbols refer to those denoted objects, perhaps in a model. The semantics of that object has a heuristic side and a deductive side. In either case, we might want to know the properties of that object, which we might then list in an intensional definition.
Those properties might then be expressed by some well-known and agreed-upon symbols from a table of mathematical symbols. This mathematical notation might include annotation such as
- "All x", "No x", "There is an x" (or its equivalent, "Some x"), "A set", "A function"
- "A mapping from the real numbers to the complex numbers"
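The quantifier and mapping annotations above have direct analogues in programming. A small Python sketch (the function f is a hypothetical example, not from the article):

```python
import cmath

xs = [2, 4, 6, 8]

# "All x" corresponds to all(), "There is an x" / "Some x" to any().
print(all(x % 2 == 0 for x in xs))   # "All x are even" -> True
print(any(x > 7 for x in xs))        # "There is an x greater than 7" -> True
print(any(x < 0 for x in xs))        # "Some x is negative" -> False

# "A mapping from the real numbers to the complex numbers":
def f(x: float) -> complex:
    return cmath.sqrt(x)             # defined for negative reals too

print(f(-4.0))                       # 2j
```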
History
Counting
It is believed that a mathematical notation was first developed at least 50,000 years ago in order to assist with counting.[citation needed] Early mathematical ideas for counting were represented by collections of rocks, sticks, bone, clay, stone, wood carvings, and knotted ropes. The tally stick is a timeless way of counting. Perhaps the oldest known mathematical texts are those of ancient Sumer. The Census Quipu of the Andes and the Ishango Bone from Africa both used the tally mark method of accounting for numerical concepts.
The development of zero as a number is one of the most important developments in early mathematics. It was used as a placeholder by the Babylonians and Greek Egyptians, and then as an integer by the Mayans, Indians and Arabs. (See The history of zero for more information.)
Geometry becomes analytic
The mathematical viewpoints in geometry did not lend themselves well to counting. The natural numbers, their relationship to fractions, and the identification of continuous quantities actually took millennia to take form, and even longer to allow for the development of notation. It was not until the invention of analytic geometry by René Descartes that geometry became more subject to a numerical notation. Some symbolic shortcuts for mathematical concepts came to be used in the publication of geometric proofs. Moreover, the power and authority of geometry's theorem and proof structure greatly influenced non-geometric treatises, Isaac Newton's Principia Mathematica, for example.
Counting is mechanized
After the rise of Boolean algebra and the development of positional notation, it became possible to mechanize simple circuits for counting, first by mechanical means, such as gears and rods, using rotation and translation to represent changes of state, then by electrical means, using changes in voltage and current to represent the analogs of quantity. Today, computers use standard circuits to both store and change quantities, which represent not only numbers but pictures, sound, motion, and control.
Modern notation
The 18th and 19th centuries saw the creation and standardization of mathematical notation as used today. Euler was responsible for many of the notations in use today: the use of a, b, c for constants and x, y, z for unknowns, e for the base of the natural logarithm, sigma (Σ) for summation, i for the imaginary unit, and the functional notation f(x). He also popularized the use of π for Archimedes' constant (due to William Jones' proposal for the use of π in this way, based on the earlier notation of William Oughtred). Many fields of mathematics bear the imprint of their creators for notation: the differential operator is due to Leibniz[1], the cardinal infinities to Georg Cantor (in addition to the lemniscate (∞) of John Wallis), the congruence symbol (≡) to Gauss, and so forth.
Computerized notation
The rise of expression evaluators such as calculators and slide rules was only part of what was required to mathematize civilization. Today, keyboard-based notations are used for the e-mail of mathematical expressions, the Internet shorthand notation. The wide use of programming languages, which teach their users the need for rigor in the statement of a mathematical expression (or else the compiler will not accept the formula), is also contributing toward a more mathematical viewpoint across all walks of life.
For some people, computerized visualizations have been a boon to comprehending mathematics that mere symbolic notation could not provide. They can benefit from the wide availability of devices, which offer more graphical, visual, aural, and tactile feedback.
Ideographic notation
In the history of writing, ideographic symbols arose first, as more-or-less direct renderings of some concrete item. This has come full circle with the rise of computer visualization systems, which can be applied to abstract visualizations as well, such as for rendering some projections of a Calabi-Yau manifold.
Examples of abstract visualization which properly belong to the mathematical imagination can be found, for example in computer graphics. The need for such models abounds, for example, when the measures for the subject of study are actually random variables and not really ordinary mathematical functions.
Non-Latin-based mathematical notation
Modern Arabic mathematical notation is based mostly on the Arabic alphabet and is used widely in the Arab world, especially in pre-university levels of education.
See also
- Abuse of notation
- Begriffsschrift
- History of mathematical notation
- ISO 31-11
- Notation in probability
- Rendering mathematical formulas in Wikipedia
- Scientific notation
- Table of mathematical symbols
- Typographical conventions in mathematical formulae
Notes
- Florian Cajori, A History of Mathematical Notations (1929), 2 volumes. ISBN 0-486-67766-4
External links
- Earliest Uses of Various Mathematical Symbols
- Mathematical ASCII Notation: how to type math notation in any text editor.
- Mathematics as a Language at cut-the-knot
- Stephen Wolfram: Mathematical Notation: Past and Future. October 2000. Transcript of a keynote address presented at MathML and Math on the Web: MathML International Conference.