Differential of a function

Summary
In calculus, the differential represents the principal part of the change in a function y = f(x) with respect to changes in the independent variable. The differential dy is defined by

dy = f'(x) dx,

where f'(x) is the derivative of f with respect to x, and dx is an additional real variable (so that dy is a function of x and dx). The notation is such that the equation

dy = (dy/dx) dx

holds, where the derivative is represented in the Leibniz notation dy/dx, and this is consistent with regarding the derivative as the quotient of the differentials. One also writes

df(x) = f'(x) dx.

The precise meaning of the variables dy and dx depends on the context of the application and the required level of mathematical rigor. The domain of these variables may take on a particular geometrical significance if the differential is regarded as a particular differential form, or analytical significance if the differential is regarded as a linear approximation to the increment of a function. Traditionally, the variables dx and dy are considered to be very small (infinitesimal), and this interpretation is made rigorous in non-standard analysis.

The differential was first introduced via an intuitive or heuristic definition by Isaac Newton and furthered by Gottfried Leibniz, who thought of the differential dy as an infinitely small (or infinitesimal) change in the value y of the function, corresponding to an infinitely small change dx in the function's argument x. For that reason, the instantaneous rate of change of y with respect to x, which is the value of the derivative of the function, is denoted by the fraction dy/dx in what is called the Leibniz notation for derivatives. The quotient dy/dx is not infinitely small; rather it is a real number. The use of infinitesimals in this form was widely criticized, for instance by the famous pamphlet The Analyst by Bishop Berkeley. Augustin-Louis Cauchy (1823) defined the differential without appeal to the atomism of Leibniz's infinitesimals.
Instead, Cauchy, following d'Alembert, inverted the logical order of Leibniz and his successors: the derivative itself became the fundamental object, defined as a limit of difference quotients, and the differentials were then defined in terms of it.
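The modern, derivative-first definition can be checked numerically. The following is a minimal sketch (the function f(x) = x², the point x, and the step dx are illustrative choices, not from the source): the differential dy = f'(x) dx captures the linear part of the actual increment Δy = f(x + dx) − f(x).

```python
# Minimal sketch: the differential dy = f'(x) dx as the linear
# approximation to the increment of f(x) = x**2.

def f(x):
    return x ** 2

def f_prime(x):
    # Derivative of f: f'(x) = 2x
    return 2 * x

x, dx = 3.0, 0.01
dy = f_prime(x) * dx           # differential: linear part of the change
delta_y = f(x + dx) - f(x)     # actual increment of f

# dy is 0.06; delta_y exceeds it by the higher-order term dx**2 = 0.0001
# (up to floating-point rounding).
```

Shrinking dx makes the relative gap between dy and Δy vanish, which is exactly the sense in which dy is the "principal part" of the change.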
Related concepts (16)
Notation for differentiation
In differential calculus, there is no single uniform notation for differentiation. Instead, various notations for the derivative of a function or variable have been proposed by various mathematicians. The usefulness of each notation varies with the context, and it is sometimes advantageous to use more than one notation in a given context. The most common notations for differentiation (and its opposite operation, the antidifferentiation or indefinite integration) are listed below.
Function of several real variables
In mathematical analysis and its applications, a function of several real variables or real multivariate function is a function with more than one argument, with all arguments being real variables. This concept extends the idea of a function of a real variable to several variables. The "input" variables take real values, while the "output", also called the "value of the function", may be real or complex.
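For such functions the differential generalizes to the total differential, df = (∂f/∂x) dx + (∂f/∂y) dy. A minimal numerical sketch (the function f(x, y) = x²y, the point, and the steps are illustrative choices, not from the source):

```python
# Minimal sketch: total differential of a function of two real variables,
# df = (∂f/∂x) dx + (∂f/∂y) dy, for f(x, y) = x**2 * y.

def f(x, y):
    return x ** 2 * y

def df(x, y, dx, dy):
    # Partial derivatives of f: ∂f/∂x = 2xy, ∂f/∂y = x**2
    return 2 * x * y * dx + x ** 2 * dy

x, y, dx, dy = 2.0, 1.0, 0.01, -0.02
lin = df(x, y, dx, dy)                # linear part of the change
actual = f(x + dx, y + dy) - f(x, y)  # actual increment of f

# lin agrees with actual up to terms of higher order in dx and dy.
```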
Differential (mathematics)
In mathematics, differential refers to several related notions derived from the early days of calculus, put on a rigorous footing, such as infinitesimal differences and the derivatives of functions. The term is used in various branches of mathematics such as calculus, differential geometry, algebraic geometry and algebraic topology. The term differential is used nonrigorously in calculus to refer to an infinitesimal ("infinitely small") change in some varying quantity.