In mathematics, differential refers to several related notions^{[1]} arising from the early days of calculus, later put on a rigorous footing, such as infinitesimal differences and the derivatives of functions.^{[2]}
The term is used in various branches of mathematics such as calculus, differential geometry, algebraic geometry and algebraic topology.
The term differential is used nonrigorously in calculus to refer to an infinitesimal ("infinitely small") change in some varying quantity. For example, if x is a variable, then a change in the value of x is often denoted Δx (pronounced delta x). The differential dx represents an infinitely small change in the variable x. The idea of an infinitely small or infinitely slow change is, intuitively, extremely useful, and there are a number of ways to make the notion mathematically precise.
Using calculus, it is possible to relate the infinitely small changes of various variables to each other mathematically using derivatives. If y is a function of x, then the differential dy of y is related to dx by the formula dy = (dy/dx) dx, where dy/dx denotes the derivative of y with respect to x. This formula summarizes the intuitive idea that the derivative of y with respect to x is the limit of the ratio of differences Δy/Δx as Δx becomes infinitesimal.
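The relation dy = (dy/dx) dx can be checked numerically. The following sketch (an illustrative example, not from the original text; the function y = x^{2} is chosen arbitrarily) shows that the differential approximation tracks the actual change Δy ever more closely as dx shrinks:

```python
# For y = x^2, the differential is dy = 2x * dx.  The gap between dy and
# the actual change delta_y = f(x + dx) - f(x) shrinks like dx^2.

def f(x):
    return x ** 2

def df(x, dx):
    """Differential of f at x: the derivative 2x times dx."""
    return 2 * x * dx

x = 3.0
for dx in (0.1, 0.01, 0.001):
    delta_y = f(x + dx) - f(x)   # actual change in y
    dy = df(x, dx)               # linear (differential) approximation
    print(dx, delta_y, dy, delta_y - dy)
```

Here the error delta_y − dy equals dx^{2} exactly, which is why the linear approximation is called first-order.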
Infinitesimal quantities played a significant role in the development of calculus. Archimedes used them, even though he did not believe that arguments involving infinitesimals were rigorous.^{[3]} Isaac Newton referred to them as fluxions. However, it was Gottfried Leibniz who coined the term differentials for infinitesimal quantities and introduced the notation for them which is still used today.
In Leibniz's notation, if x is a variable quantity, then dx denotes an infinitesimal change in the variable x. Thus, if y is a function of x, then the derivative of y with respect to x is often denoted dy/dx, which would otherwise be denoted (in the notation of Newton or Lagrange) ẏ or y′. The use of differentials in this form attracted much criticism, for instance in the famous pamphlet The Analyst by Bishop Berkeley. Nevertheless, the notation has remained popular because it suggests strongly the idea that the derivative of y at x is its instantaneous rate of change (the slope of the graph's tangent line), which may be obtained by taking the limit of the ratio Δy/Δx as Δx becomes arbitrarily small. Differentials are also compatible with dimensional analysis, where a differential such as dx has the same dimensions as the variable x.
Calculus evolved into a distinct branch of mathematics during the 17th century CE, although there were antecedents going back to antiquity. The presentations of, e.g., Newton and Leibniz were marked by non-rigorous definitions of terms like differential, fluent and "infinitely small". While many of the arguments in Bishop Berkeley's 1734 The Analyst are theological in nature, modern mathematicians acknowledge the validity of his argument against "the Ghosts of departed Quantities"; however, the modern approaches do not have the same technical issues. Despite the lack of rigor, immense progress was made in the 17th and 18th centuries. In the 19th century, Cauchy and others gradually developed the epsilon-delta approach to continuity, limits and derivatives, giving a solid conceptual foundation for calculus.
In the 20th century, several new concepts in, e.g., multivariable calculus, differential geometry, seemed to encapsulate the intent of the old terms, especially differential; both differential and infinitesimal are used with new, more rigorous, meanings.
Differentials are also used in the notation for integrals because an integral can be regarded as an infinite sum of infinitesimal quantities: the area under a graph is obtained by subdividing the graph into infinitely thin strips and summing their areas. In an expression such as ∫ f(x) dx, the integral sign (a modified long s) denotes the infinite sum, f(x) denotes the "height" of a thin strip, and the differential dx denotes its infinitely thin width.
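The strip picture can be made concrete with a Riemann sum (an illustrative sketch, not from the original text; the integrand x^{2} on [0, 1] is chosen for the example):

```python
# The integral of f over [a, b] as a limit of sums f(x) * dx over
# strips of width dx: as n grows, dx shrinks and the sum converges.

def riemann_sum(f, a, b, n):
    """Approximate the integral of f on [a, b] using n thin strips
    (midpoint rule: each strip is sampled at its centre)."""
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) * dx for i in range(n))

# The integral of x^2 on [0, 1] is exactly 1/3.
approx = riemann_sum(lambda x: x ** 2, 0.0, 1.0, 100_000)
print(approx)   # close to 1/3
```

Refining the subdivision (larger n, thinner strips) drives the sum toward the exact area, which is the intuition the dx notation records.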
There are several approaches for making the notion of differentials mathematically precise.
These approaches are very different from each other, but they have in common the idea of being quantitative, i.e., saying not just that a differential is infinitely small, but how small it is.
There is a simple way to make precise sense of differentials, first used on the real line, by regarding them as linear maps. It can be used on R, on R^{n}, on a Hilbert space, on a Banach space, or more generally, on a topological vector space. The case of the real line is the easiest to explain. This type of differential is also known as a covariant vector or cotangent vector, depending on context.
Suppose f(x) is a real-valued function on R. We can reinterpret the variable x in f(x) as being a function rather than a number, namely the identity map on the real line, which takes a real number p to itself: x(p) = p. Then f(x) is the composite of f with x, whose value at p is f(x(p)) = f(p). The differential df (which of course depends on f) is then a function whose value at p (usually denoted df_{p}) is not a number, but a linear map from R to R. Since a linear map from R to R is given by a 1 × 1 matrix, it is essentially the same thing as a number, but the change in the point of view allows us to think of df_{p} as an infinitesimal and compare it with the standard infinitesimal dx_{p}, which is again just the identity map from R to R (a 1 × 1 matrix with entry 1). The identity map has the property that if ε is very small, then dx_{p}(ε) is very small, which enables us to regard it as infinitesimal. The differential df_{p} has the same property, because it is just a multiple of dx_{p}, and this multiple is the derivative f′(p) by definition. We therefore obtain that df_{p} = f′(p) dx_{p}, and hence df = f′ dx. Thus we recover the idea that f′ is the ratio of the differentials df and dx.
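The linear-map viewpoint can be mirrored directly in code (an illustrative sketch; the function f(x) = x^{3} and its closed-form derivative are assumptions made for the example): dx_{p} is the identity map on R, and df_{p} is the map "multiply by f′(p)".

```python
# At each point p, dx_p is the identity map on R, and df_p is the
# linear map eps -> f'(p) * eps, i.e. df_p = f'(p) * dx_p.

def f(x):
    return x ** 3

def f_prime(x):            # derivative of f, known in closed form here
    return 3 * x ** 2

def dx_p(eps):
    """The standard infinitesimal: the identity map on R."""
    return eps

def df_p(p):
    """The differential of f at p, returned as a linear map R -> R."""
    return lambda eps: f_prime(p) * dx_p(eps)

p, eps = 2.0, 0.001
print(df_p(p)(eps))          # the linear map applied to a small eps
print(f(p + eps) - f(p))     # the actual change, close to the above
```

The two printed values agree to first order, recovering df_{p} as the best linear approximation to the change in f near p.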
This would just be a trick were it not for the fact that it captures the idea of the derivative of f at p as the best linear approximation to f at p, and that it has many generalizations.
If f is a function from R^{n} to R, then we say that f is differentiable^{[8]} at p ∈ R^{n} if there is a linear map df_{p} from R^{n} to R such that for any ε > 0, there is a neighbourhood N of p such that for x ∈ N, |f(x) − f(p) − df_{p}(x − p)| < ε |x − p|.
We can now use the same trick as in the one-dimensional case and think of the expression f(x_{1}, x_{2}, ..., x_{n}) as the composite of f with the standard coordinates x_{1}, x_{2}, ..., x_{n} on R^{n} (so that x_{j}(p) is the j-th component of p ∈ R^{n}). Then the differentials (dx_{1})_{p}, (dx_{2})_{p}, ..., (dx_{n})_{p} at a point p form a basis for the vector space of linear maps from R^{n} to R and therefore, if f is differentiable at p, we can write df_{p} as a linear combination of these basis elements: df_{p} = Σ_{j=1}^{n} D_{j}f(p) (dx_{j})_{p}.
The coefficients D_{j}f(p) are (by definition) the partial derivatives of f at p with respect to x_{1}, x_{2}, ..., x_{n}. Hence, if f is differentiable on all of R^{n}, we can write, more concisely: df = (∂f/∂x_{1}) dx_{1} + (∂f/∂x_{2}) dx_{2} + ... + (∂f/∂x_{n}) dx_{n}.
In the one-dimensional case this becomes df = f′ dx as before.
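The multivariable formula can also be checked numerically (an illustrative sketch; the function f(x, y) = x^{2}y and its partial derivatives are assumptions chosen for the example):

```python
# For f(x, y) = x^2 * y, the differential is
#   df = (∂f/∂x) dx + (∂f/∂y) dy = 2*x*y dx + x^2 dy,
# which approximates the actual change in f for small dx, dy.

def f(x, y):
    return x ** 2 * y

def df(x, y, dx, dy):
    """Differential of f: sum of partial derivatives times differentials."""
    return 2 * x * y * dx + x ** 2 * dy

x, y, dx, dy = 1.0, 2.0, 1e-3, -2e-3
actual = f(x + dx, y + dy) - f(x, y)   # true change in f
linear = df(x, y, dx, dy)              # differential approximation
print(actual, linear)                  # agree to second order in (dx, dy)
```

As in the one-dimensional case, the discrepancy is quadratic in the increments, so it vanishes faster than the increments themselves.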
This idea generalizes straightforwardly to functions from R^{n} to R^{m}. Furthermore, it has the decisive advantage over other definitions of the derivative that it is invariant under changes of coordinates. This means that the same idea can be used to define the differential of smooth maps between smooth manifolds.
Aside: Note that the existence of all the partial derivatives of f(x) at x is a necessary condition for the existence of a differential at x. However it is not a sufficient condition. For counterexamples, see Gateaux derivative.
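A classic counterexample (an illustration of the point above, not taken from the original text) is f(x, y) = xy/(x^{2} + y^{2}) with f(0, 0) = 0: both partial derivatives exist at the origin, yet f is not even continuous there, so no differential df_{0} can exist.

```python
# f vanishes identically on both axes, so both partials at the origin
# are 0; but along the diagonal f(t, t) = 1/2 for every t != 0, so f
# is discontinuous at the origin and hence not differentiable there.

def f(x, y):
    if x == 0 and y == 0:
        return 0.0
    return x * y / (x ** 2 + y ** 2)

print([f(t, 0.0) for t in (0.1, 0.01, 0.001)])   # all 0.0 (along x-axis)
print([f(t, t) for t in (0.1, 0.01, 0.001)])     # all 0.5 (along diagonal)
```

Since the function takes the value 1/2 arbitrarily close to the origin while being 0 there, no linear map can approximate it near 0.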
The same procedure works on a vector space with enough additional structure to reasonably talk about continuity. The most concrete case is a Hilbert space, also known as a complete inner product space, where the inner product and its associated norm define a suitable concept of distance. The same procedure works for a Banach space, also known as a complete normed vector space. However, for a more general topological vector space, some of the details are more abstract because there is no concept of distance.
For the important case of finite dimension, any inner product space is a Hilbert space, any normed vector space is a Banach space and any topological vector space is complete. As a result, you can define a coordinate system from an arbitrary basis and use the same technique as for R^{n}.
This approach works on any differentiable manifold. If U and V are open sets containing p, and f : U → R and g : V → R are continuous, then f is equivalent to g at p, denoted f ∼_{p} g, if and only if there is an open set W ⊆ U ∩ V containing p such that f(x) = g(x) for every x in W. The germ of f at p, denoted [f]_{p}, is the set of all real continuous functions equivalent to f at p; if f is smooth at p then [f]_{p} is a smooth germ. If U_{1} and U_{2} are open sets containing p, f_{1} : U_{1} → R and f_{2} : U_{2} → R are smooth, and r is a real number, then

r [f_{1}]_{p} = [r f_{1}]_{p}
[f_{1}]_{p} + [f_{2}]_{p} = [f_{1} + f_{2}]_{p}
[f_{1}]_{p} [f_{2}]_{p} = [f_{1} f_{2}]_{p}
This shows that the germs at p form an algebra.
Define I_{p} to be the set of all smooth germs vanishing at p (this set is an ideal) and I_{p}^{2} to be the product of ideals I_{p} I_{p}. Then a differential at p (cotangent vector at p) is an element of I_{p}/I_{p}^{2}. The differential of a smooth function f at p, denoted df_{p}, is the equivalence class of [f − f(p)]_{p} in I_{p}/I_{p}^{2}.
A similar approach is to define differential equivalence of first order in terms of derivatives in an arbitrary coordinate patch. Then the differential of f at p is the set of all functions differentially equivalent to f − f(p) at p.
In algebraic geometry, differentials and other infinitesimal notions are handled in a very explicit way by accepting that the coordinate ring or structure sheaf of a space may contain nilpotent elements. The simplest example is the ring of dual numbers R[ε], where ε^{2} = 0.
This can be motivated by the algebro-geometric point of view on the derivative of a function f from R to R at a point p. For this, note first that f − f(p) belongs to the ideal I_{p} of functions on R which vanish at p. If the derivative of f vanishes at p, then f − f(p) belongs to the square I_{p}^{2} of this ideal. Hence the derivative of f at p may be captured by the equivalence class [f − f(p)] in the quotient space I_{p}/I_{p}^{2}, and the 1-jet of f (which encodes its value and its first derivative) is the equivalence class of f in the space of all functions modulo I_{p}^{2}. Algebraic geometers regard this equivalence class as the restriction of f to a thickened version of the point p whose coordinate ring is not R (which is the quotient space of functions on R modulo I_{p}) but R[ε], which is the quotient space of functions on R modulo I_{p}^{2}. Such a thickened point is a simple example of a scheme.^{[5]}
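The dual numbers R[ε] can be implemented in a few lines (a minimal sketch, not from the original text; the class and function here are hypothetical names for illustration). Because ε^{2} = 0, evaluating a polynomial at x + ε yields f(x) + f′(x)ε, so the ε-coefficient is exactly the derivative:

```python
# A minimal model of the dual numbers R[eps], with eps^2 = 0.
# Arithmetic on a + b*eps drops every eps^2 term (nilpotency).

class Dual:
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b      # represents a + b*eps

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a1 + b1 eps)(a2 + b2 eps) = a1 a2 + (a1 b2 + b1 a2) eps,
        # since the eps^2 term vanishes.
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

eps = Dual(0.0, 1.0)
assert (eps * eps).a == 0.0 and (eps * eps).b == 0.0   # eps^2 = 0

def f(x):                      # f(x) = x^3, so f'(x) = 3x^2
    return x * x * x

y = f(Dual(2.0) + eps)         # evaluate f at the thickened point 2 + eps
print(y.a, y.b)                # value f(2) = 8.0, derivative f'(2) = 12.0
```

This is the same mechanism used in forward-mode automatic differentiation: restricting f to the "thickened point" p + ε records its 1-jet.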
Differentials are also important in algebraic geometry, and there are several related notions.
A fifth approach to infinitesimals is the method of synthetic differential geometry^{[9]} or smooth infinitesimal analysis.^{[10]} This is closely related to the algebraic-geometric approach, except that the infinitesimals are more implicit and intuitive. The main idea of this approach is to replace the category of sets with another category of smoothly varying sets which is a topos. In this category, one can define the real numbers, smooth functions, and so on, but the real numbers automatically contain nilpotent infinitesimals, so these do not need to be introduced by hand as in the algebraic-geometric approach. However the logic in this new category is not identical to the familiar logic of the category of sets: in particular, the law of the excluded middle does not hold. This means that set-theoretic mathematical arguments only extend to smooth infinitesimal analysis if they are constructive (e.g., do not use proof by contradiction). Some^{[who?]} regard this disadvantage as a positive thing, since it forces one to find constructive arguments wherever they are available.
The final approach to infinitesimals again involves extending the real numbers, but in a less drastic way. In the nonstandard analysis approach there are no nilpotent infinitesimals, only invertible ones, which may be viewed as the reciprocals of infinitely large numbers.^{[7]} Such extensions of the real numbers may be constructed explicitly using equivalence classes of sequences of real numbers, so that, for example, the sequence (1, 1/2, 1/3, ..., 1/n, ...) represents an infinitesimal. The first-order logic of this new set of hyperreal numbers is the same as the logic for the usual real numbers, but the completeness axiom (which involves second-order logic) does not hold. Nevertheless, this suffices to develop an elementary and quite intuitive approach to calculus using infinitesimals; see the transfer principle.
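A toy sketch of the sequence picture (illustrative only; a genuine construction needs an ultrafilter to decide every comparison, which is glossed over here): the sequence 1, 1/2, 1/3, ... counts as infinitesimal because its terms eventually stay below every standard ε > 0.

```python
# Represent the candidate infinitesimal by the sequence (1/n).  Since the
# sequence is decreasing, it suffices to find one index past which a term
# is below eps; every later term is then below eps too.

def seq(n):
    return 1.0 / n                    # the representative sequence (1/n)

for eps in (0.1, 0.001, 1e-6):
    n0 = int(1 / eps) + 2             # an index safely past 1/eps
    print(eps, seq(n0) < eps)         # True for each standard eps
```

No single term is infinitely small; it is the whole equivalence class of the sequence that behaves as one infinitesimal hyperreal.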
The notion of a differential motivates several concepts in differential geometry (and differential topology).
The term differential has also been adopted in homological algebra and algebraic topology, because of the role the exterior derivative plays in de Rham cohomology: in a cochain complex the maps (or coboundary operators) d_{i} are often called differentials. Dually, the boundary operators in a chain complex are sometimes called codifferentials.
The properties of the differential also motivate the algebraic notions of a derivation and a differential algebra.
The word differential has several related meanings in mathematics. In the most common context, it means "related to derivatives." So, for example, the portion of calculus dealing with taking derivatives (i.e., differentiation) is known as differential calculus.
The word "differential" also has a more technical meaning in the theory of differential k-forms, where a differential is a so-called one-form.