Mathematical analysis


Mathematical analysis is a branch of mathematics that includes the theories of differentiation, integration, measure, limits, infinite series,[1] and analytic functions. These theories are usually studied in the context of real and complex numbers and functions. Analysis evolved from calculus, which involves the elementary concepts and techniques of analysis. Analysis may be distinguished from geometry; however, it can be applied to any space of mathematical objects that has a definition of nearness (a topological space) or specific distances between objects (a metric space).

History

Early results in analysis were implicitly present in the early days of ancient Greek mathematics. For instance, an infinite geometric sum is implicit in Zeno's paradox of the dichotomy.[2] Later, Greek mathematicians such as Eudoxus and Archimedes made more explicit, but informal, use of the concepts of limits and convergence when they used the method of exhaustion to compute the area and volume of regions and solids.[3] In India, the 12th-century mathematician Bhāskara II gave examples of the derivative and used what is now known as Rolle's theorem.

In the 14th century, Madhava of Sangamagrama developed infinite series expansions, like the power series and the Taylor series, of functions such as sine, cosine, tangent and arctangent. Alongside his development of the Taylor series of the trigonometric functions, he also estimated the magnitude of the error terms created by truncating these series and gave a rational approximation of an infinite series. His followers at the Kerala school of astronomy and mathematics further expanded his works, up to the 16th century.
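Madhava's approach can be illustrated in modern notation. The sketch below (Python, purely illustrative; the function name is invented here) computes partial sums of the sine series and the alternating-series bound on the truncation error, which is given by the first omitted term.

```python
import math

def sin_partial_sum(x, n_terms):
    # Partial sum of the Taylor series sin(x) = sum_k (-1)**k * x**(2k+1) / (2k+1)!
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

x = 1.0
approx = sin_partial_sum(x, 5)          # terms up to x**9 / 9!
bound = x ** 11 / math.factorial(11)    # first omitted term bounds the error
error = abs(approx - math.sin(x))
```

For an alternating series with terms decreasing in magnitude, the truncation error is always at most the first omitted term, which is the kind of error estimate attributed to Madhava.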

In Europe, during the second half of the 17th century, Newton and Leibniz independently developed infinitesimal calculus, which grew, with the stimulus of applied work that continued through the 18th century, into analysis topics such as the calculus of variations, ordinary and partial differential equations, Fourier analysis, and generating functions. During this period, calculus techniques were applied to approximate discrete problems by continuous ones.

In the 18th century, Euler introduced the notion of mathematical function.[4] Real analysis began to emerge as an independent subject when Bernard Bolzano introduced the modern definition of continuity in 1816,[5] but Bolzano's work did not become widely known until the 1870s. In 1821, Cauchy began to put calculus on a firm logical foundation by rejecting the principle of the generality of algebra widely used in earlier work, particularly by Euler. Instead, Cauchy formulated calculus in terms of geometric ideas and infinitesimals. Thus, his definition of continuity required an infinitesimal change in x to correspond to an infinitesimal change in y. He also introduced the concept of the Cauchy sequence, and started the formal theory of complex analysis. Poisson, Liouville, Fourier and others studied partial differential equations and harmonic analysis. The contributions of these mathematicians and others, such as Weierstrass, led to the (ε, δ)-definition of limit, thus founding the modern field of mathematical analysis.

In the middle of the 19th century Riemann introduced his theory of integration. The last third of the century saw the arithmetization of analysis by Weierstrass, who thought that geometric reasoning was inherently misleading, and introduced the "epsilon-delta" definition of limit. Then, mathematicians started worrying that they were assuming the existence of a continuum of real numbers without proof. Dedekind then constructed the real numbers by Dedekind cuts, in which irrational numbers are formally defined and serve to fill the "gaps" between rational numbers, thereby creating a complete set: the continuum of real numbers, which had already been developed by Simon Stevin in terms of decimal expansions. Around that time, the attempts to refine the theorems of Riemann integration led to the study of the "size" of the set of discontinuities of real functions.

Also, "monsters" (nowhere continuous functions, continuous but nowhere differentiable functions, space-filling curves) began to be investigated. In this context, Jordan developed his theory of measure, Cantor developed what is now called naive set theory, and Baire proved the Baire category theorem. In the early 20th century, calculus was formalized using an axiomatic set theory. Lebesgue solved the problem of measure, and Hilbert introduced Hilbert spaces to solve integral equations. The idea of normed vector space was in the air, and in the 1920s Banach created functional analysis.

Main branches

Real analysis

Real analysis (traditionally, the theory of functions of a real variable) is a branch of mathematical analysis dealing with the real numbers and real-valued functions of a real variable. In particular, it deals with the analytic properties of real functions and sequences, including convergence and limits of sequences of real numbers, the calculus of the real numbers, and continuity, smoothness and related properties of real-valued functions.
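The ε–N definition of the limit of a sequence can be made concrete with a small sketch (Python, illustrative; the helper name is invented here). For the sequence a_n = (2n + 1)/n, which converges to 2, the distance to the limit is exactly 1/n, so any index N greater than 1/ε witnesses convergence.

```python
import math

def witness_N(eps):
    # For a_n = (2n + 1) / n we have |a_n - 2| = 1/n, so any N > 1/eps works.
    return math.floor(1 / eps) + 1

eps = 1e-3
N = witness_N(eps)
# Every term from index N onward lies within eps of the limit 2.
tail_ok = all(abs((2 * n + 1) / n - 2) < eps for n in range(N, N + 1000))
```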

Complex analysis

Complex analysis, traditionally known as the theory of functions of a complex variable, is the branch of mathematical analysis that investigates functions of complex numbers. It is useful in many branches of mathematics, including algebraic geometry, number theory, and applied mathematics, as well as in physics, including hydrodynamics, thermodynamics, mechanical engineering and electrical engineering.

Complex analysis is particularly concerned with the analytic functions of complex variables (or, more generally, meromorphic functions). Because the separate real and imaginary parts of any analytic function must satisfy Laplace's equation, complex analysis is widely applicable to two-dimensional problems in physics.
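This harmonicity can be checked numerically. The sketch below (Python, illustrative) applies a five-point finite-difference Laplacian to the real part of the analytic function f(z) = z³ and finds that it vanishes up to rounding error.

```python
def u(x, y):
    # Real part of the analytic function f(z) = z**3, i.e. x**3 - 3*x*y**2
    return (complex(x, y) ** 3).real

def laplacian(f, x, y, h=1e-3):
    # Five-point finite-difference approximation of f_xx + f_yy
    return (f(x + h, y) + f(x - h, y)
            + f(x, y + h) + f(x, y - h) - 4.0 * f(x, y)) / h ** 2

# Harmonic: the Laplacian of u vanishes (up to floating-point rounding).
residual = laplacian(u, 0.7, -1.3)
```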

Functional analysis

Functional analysis is a branch of mathematical analysis, the core of which is formed by the study of vector spaces endowed with some kind of limit-related structure (e.g. inner product, norm, topology, etc.) and the linear operators acting upon these spaces and respecting these structures in a suitable sense. The historical roots of functional analysis lie in the study of spaces of functions and the formulation of properties of transformations of functions such as the Fourier transform as transformations defining continuous, unitary etc. operators between function spaces. This point of view turned out to be particularly useful for the study of differential and integral equations.

Differential equations

A differential equation is a mathematical equation for an unknown function of one or several variables that relates the values of the function itself and its derivatives of various orders. Differential equations play a prominent role in engineering, physics, economics, biology, and other disciplines.

Differential equations arise in many areas of science and technology, specifically whenever a deterministic relation involving some continuously varying quantities (modeled by functions) and their rates of change in space and/or time (expressed as derivatives) is known or postulated. This is illustrated in classical mechanics, where the motion of a body is described by its position and velocity as time varies. Newton's laws allow one (given the position, velocity, acceleration and various forces acting on the body) to express these variables dynamically as a differential equation for the unknown position of the body as a function of time. In some cases, this differential equation (called an equation of motion) may be solved explicitly.
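As a concrete sketch (Python, illustrative): for free fall, Newton's second law gives the equation of motion x'' = −g, whose explicit solution from rest is x(t) = −gt²/2. A forward-Euler discretization recovers this solution to first order in the step size.

```python
g = 9.81          # gravitational acceleration (m/s^2)
dt = 1e-3         # time step (s)
x, v = 0.0, 0.0   # initial position and velocity

# Forward Euler for x'' = -g, written as the first-order system x' = v, v' = -g
for _ in range(1000):          # integrate to t = 1 s
    x += v * dt
    v += -g * dt

exact = -0.5 * g * 1.0 ** 2    # explicit solution x(t) = -g * t**2 / 2 at t = 1
```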

Measure theory

A measure on a set is a systematic way to assign a number to each suitable subset of that set, intuitively interpreted as its size. In this sense, a measure is a generalization of the concepts of length, area, and volume. A particularly important example is the Lebesgue measure on a Euclidean space, which assigns the conventional length, area, and volume of Euclidean geometry to suitable subsets of the $n$-dimensional Euclidean space $\mathbb{R}^n$. For instance, the Lebesgue measure of the interval $\left[0, 1\right]$ in the real numbers is its length in the everyday sense of the word – specifically, 1.

Technically, a measure is a function that assigns a non-negative real number or +∞ to (certain) subsets of a set $X$ (see Definition below). It must assign 0 to the empty set and be (countably) additive: the measure of a "large" subset that can be decomposed into a finite (or countable) number of "smaller" disjoint subsets is the sum of the measures of the "smaller" subsets. In general, if one wants to associate a consistent size to each subset of a given set while satisfying the other axioms of a measure, one only finds trivial examples like the counting measure. This problem was resolved by defining measure only on a sub-collection of all subsets; the so-called measurable subsets, which are required to form a $\sigma$-algebra. This means that countable unions, countable intersections and complements of measurable subsets are measurable. Non-measurable sets in a Euclidean space, on which the Lebesgue measure cannot be defined consistently, are necessarily complicated in the sense of being badly mixed up with their complement. Indeed, their existence is a non-trivial consequence of the axiom of choice.
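Countable additivity can be made concrete with a small sketch (Python, illustrative). The half-open intervals $I_k = [1 - 2^{-k}, 1 - 2^{-(k+1)})$ are pairwise disjoint and their union is $[0, 1)$, so the sum of their lengths must converge to the Lebesgue measure of $[0, 1)$, namely 1.

```python
from fractions import Fraction

# Lengths of the disjoint intervals I_k = [1 - 2**-k, 1 - 2**-(k+1)), k = 0..29
lengths = [Fraction(1, 2 ** (k + 1)) for k in range(30)]

partial = sum(lengths)   # exactly 1 - 2**-30 with exact rational arithmetic
gap = 1 - partial        # shrinks to 0 as more intervals are included
```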

Numerical analysis

Numerical analysis is the study of algorithms that use numerical approximation (as opposed to general symbolic manipulations) for the problems of mathematical analysis (as distinguished from discrete mathematics).

Modern numerical analysis does not seek exact answers, because exact answers are often impossible to obtain in practice. Instead, much of numerical analysis is concerned with obtaining approximate solutions while maintaining reasonable bounds on errors.
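A standard illustration (Python, illustrative; the function name is invented here) is Newton's method for √2: the exact value is irrational, so no finite computation produces it, but the iteration x → (x + 2/x)/2 yields approximations with an explicitly checkable residual bound.

```python
def newton_sqrt(a, tol=1e-12):
    # Newton iteration for the positive root of x**2 - a = 0
    x = a if a > 1 else 1.0
    while abs(x * x - a) > tol:
        x = 0.5 * (x + a / x)
    return x

root = newton_sqrt(2.0)   # approximately 1.41421356..., residual below tol
```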

Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century, the life sciences and even the arts have adopted elements of scientific computations. Ordinary differential equations appear in celestial mechanics (planets, stars and galaxies); numerical linear algebra is important for data analysis; stochastic differential equations and Markov chains are essential in simulating living cells for medicine and biology.

Other topics in mathematical analysis

Classical analysis

Classical analysis would normally be understood as any work not using functional analysis techniques, and is sometimes also called hard analysis; it also naturally refers to the more traditional topics. The study of differential equations is now shared with other fields such as dynamical systems theory, though the overlap with conventional analysis is large.

Applied analytical techniques

Techniques from analysis are also found in other areas of science and engineering.

Topological spaces, metric spaces

The motivation for studying mathematical analysis in the wider context of topological or metric spaces is threefold:

1. The same basic techniques have proved applicable to a wider class of problems (e.g., the study of function spaces).
2. A greater understanding of analysis in more abstract spaces frequently proves to be directly applicable to classical problems. For example, in Fourier analysis, functions are expressed in terms of a certain infinite sum of trigonometric functions. Thus Fourier analysis might be used to decompose a sound into a unique combination of pure tones of various pitches. The "weights", or coefficients, of the terms in the Fourier expansion of a function can be thought of as components of a vector in an infinite dimensional space known as a Hilbert space. Study of functions defined in this more general setting thus provides a convenient method of deriving results about the way functions vary in space as well as time or, in more mathematical terms, partial differential equations, where this technique is known as separation of variables.
3. The conditions needed to prove the particular result are stated more explicitly. The analyst then becomes more aware exactly what aspect of the assumption is needed to prove the theorem.
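The decomposition into pure tones described above can be sketched directly (Python, illustrative; the function names are invented here): projecting a signal onto sin(2πnt) over one period recovers the "weight" of each tone.

```python
import math

def sine_coefficient(f, n, samples=4096):
    # Approximate b_n = 2 * integral over [0, 1] of f(t) * sin(2*pi*n*t) dt
    # by an equispaced Riemann sum (very accurate here, since f is periodic).
    total = sum(f(k / samples) * math.sin(2 * math.pi * n * k / samples)
                for k in range(samples))
    return 2.0 * total / samples

def signal(t):
    # A tone of weight 1 at frequency 1 plus a tone of weight 0.5 at frequency 3
    return math.sin(2 * math.pi * t) + 0.5 * math.sin(6 * math.pi * t)
```

Projecting onto each frequency recovers the weights 1 and 0.5, and (near) zero at frequencies absent from the signal.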

Calculus of finite differences, discrete calculus or discrete analysis

As the above section on topological spaces makes clear, analysis is not only about continuity in the traditional sense of real numbers. Analysis is fundamentally about functions, the spaces that the functions act on, and the function spaces that the functions themselves are members of. A discrete function f(n) is usually called a sequence a(n). A sequence could be a finite sequence from some data source or an infinite sequence from a discrete dynamical system. A discrete function could be defined explicitly by a list, by a formula for f(n), or implicitly by a recurrence relation or difference equation. A difference equation is the discrete equivalent of a differential equation and can be used to approximate the latter or studied in its own right. Every question and method about differential equations has a discrete equivalent for difference equations. For instance, where there are integral transforms in harmonic analysis for studying continuous functions or analog signals, there are discrete transforms for discrete functions or digital signals. As well as the discrete metric, there are more general discrete or finite metric spaces and finite topological spaces.
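As a minimal sketch of this correspondence (Python, illustrative): the difference equation y[n+1] = (1 + h)·y[n] is the discrete counterpart of the differential equation y' = y, and for a small step h its solution after 1/h steps approximates the exponential e, with error of order h.

```python
import math

h = 1e-4
y = 1.0
# The difference equation y[n+1] = (1 + h) * y[n] discretizes y' = y, y(0) = 1;
# iterating 1/h times advances the solution to t = 1.
for _ in range(int(round(1 / h))):
    y *= 1 + h

error = abs(y - math.e)   # of order h, shrinking as the step is refined
```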