Cusp (singularity)
In mathematics, a cusp, sometimes called a spinode in old texts, is a point on a curve where a moving point must reverse direction. A typical example is given in the figure. A cusp is thus a type of singular point of a curve. For a plane curve defined by an analytic, parametric equation x = f(t), y = g(t), a cusp is a point where both derivatives of f and g are zero, and the directional derivative, in the direction of the tangent, changes sign (the direction of the tangent is the direction of the slope lim g'(t)/f'(t)).
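As a concrete illustration, the following small sympy sketch checks the cusp conditions for the semicubical parabola; the parametrization t ↦ (t², t³) is an illustrative choice, not taken from the text.

    import sympy as sp

    t = sp.symbols('t')
    # Semicubical parabola: a standard example of a curve with a cusp at the origin.
    f = t**2   # x(t)
    g = t**3   # y(t)

    fp, gp = sp.diff(f, t), sp.diff(g, t)
    print(fp.subs(t, 0), gp.subs(t, 0))      # both derivatives vanish at t = 0: candidate cusp
    print(sp.limit(gp / fp, t, 0))           # slope of the tangent at the cusp: 3t/2 -> 0
    print(sp.sign(fp.subs(t, -1)), sp.sign(fp.subs(t, 1)))  # x'(t) changes sign: the moving point reverses direction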
Arc length
Arc length is the distance between two points along a section of a curve. Determining the length of an irregular arc segment by approximating the arc segment as connected (straight) line segments is also called curve rectification. For a rectifiable curve these polygonal approximations do not become arbitrarily large (so the curve has a finite length). If a curve can be parameterized as an injective and continuously differentiable function (i.e., the derivative is a continuous function), then the curve is rectifiable (i.e., it has a finite length).
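A minimal numerical sketch of rectification, assuming the illustrative curve y = x² on [0, 1] (not from the text): the length of the chain of straight segments increases toward the true arc length (about 1.4789) as the segments shrink.

    import numpy as np

    # Approximate the length of y = x**2 on [0, 1] by connected straight segments.
    def polygonal_length(n):
        x = np.linspace(0.0, 1.0, n + 1)
        y = x**2
        return np.sum(np.hypot(np.diff(x), np.diff(y)))

    for n in (4, 16, 64, 256):
        print(n, polygonal_length(n))  # converges toward the true arc length ~ 1.4789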
Gradient
In vector calculus, the gradient of a scalar-valued differentiable function of several variables is the vector field (or vector-valued function) whose value at a point is the "direction and rate of fastest increase". If the gradient of a function is non-zero at a point p, the direction of the gradient is the direction in which the function increases most quickly from p, and the magnitude of the gradient is the rate of increase in that direction, the greatest absolute directional derivative.
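The following sympy sketch computes a gradient and evaluates it at a point; the scalar field x² + 3y² and the point (1, 1) are illustrative choices, not from the text.

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**2 + 3*y**2                      # an illustrative scalar field
    grad = [sp.diff(f, x), sp.diff(f, y)]
    print(grad)                            # [2*x, 6*y]: the gradient vector field

    p = {x: 1, y: 1}
    g = sp.Matrix([c.subs(p) for c in grad])
    print(g, g.norm())                     # direction of fastest increase at (1, 1) and its rate |grad f| = sqrt(40)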
Implicit function theorem
In multivariable calculus, the implicit function theorem is a tool that allows relations to be converted to functions of several real variables. It does so by representing the relation as the graph of a function. There may not be a single function whose graph can represent the entire relation, but there may be such a function on a restriction of the domain of the relation. The implicit function theorem gives a sufficient condition to ensure that there is such a function. More precisely, given a system of m equations fi(x1, ..., xn, y1, ..., ym) = 0, i = 1, ..., m, the theorem states that, under a mild condition on the partial derivatives at a point, the m variables yi are differentiable functions of the xj in some neighbourhood of the point.
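A minimal sketch of the idea, assuming the standard unit-circle relation x² + y² = 1 (my illustrative choice, not from the text): near the point (0, 1) the partial derivative with respect to y is nonzero, so the relation is locally the graph of a function y of x.

    import sympy as sp

    x, y = sp.symbols('x y')
    F = x**2 + y**2 - 1                        # the unit circle as a relation F(x, y) = 0

    # Condition of the theorem: the partial derivative with respect to y is nonzero at the point.
    print(sp.diff(F, y).subs({x: 0, y: 1}))    # 2 != 0, so y is locally a function of x near (0, 1)

    print(sp.solve(F, y))                      # [-sqrt(1 - x**2), sqrt(1 - x**2)]; near (0, 1) the graph is y = sqrt(1 - x**2)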
Bernoulli trial
In the theory of probability and statistics, a Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted. It is named after Jacob Bernoulli, a 17th-century Swiss mathematician, who analyzed them in his Ars Conjectandi (1713). The mathematical formalisation of the Bernoulli trial is known as the Bernoulli process.
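A minimal simulation sketch of repeated Bernoulli trials with a fixed success probability; the value of p and the number of trials are illustrative choices, not from the text.

    import random

    # Each trial succeeds with the same probability p, independently of the others.
    p, n = 0.3, 100_000
    successes = sum(1 for _ in range(n) if random.random() < p)
    print(successes / n)   # empirical success frequency, close to p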
Divergent series
In mathematics, a divergent series is an infinite series that is not convergent, meaning that the infinite sequence of the partial sums of the series does not have a finite limit. If a series converges, the individual terms of the series must approach zero. Thus any series in which the individual terms do not approach zero diverges. However, convergence is a stronger condition: not all series whose terms approach zero converge. A counterexample is the harmonic series 1 + 1/2 + 1/3 + 1/4 + ⋯. The divergence of the harmonic series was proven by the medieval mathematician Nicole Oresme.
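The partial sums of the harmonic series grow without bound (roughly like ln n) even though the terms 1/n tend to zero, which is what makes it the standard counterexample; the cut-off points in the sketch below are illustrative choices.

    import math

    # Print partial sums of the harmonic series at powers of ten, alongside ln n.
    s, next_n = 0.0, 1
    for n in range(1, 10**6 + 1):
        s += 1.0 / n
        if n == next_n:
            print(n, s, math.log(n))   # the partial sums keep growing: the series diverges
            next_n *= 10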
Differential equation
In mathematics, a differential equation is an equation that relates one or more unknown functions and their derivatives. In applications, the functions generally represent physical quantities, the derivatives represent their rates of change, and the differential equation defines a relationship between the two. Such relations are common; therefore, differential equations play a prominent role in many disciplines including engineering, physics, economics, and biology.
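As a minimal example of an equation relating an unknown function to its derivative, the exponential-decay equation dy/dt = -k·y is solved symbolically below; the equation itself is an illustrative choice, not from the text.

    import sympy as sp

    t, k = sp.symbols('t k', positive=True)
    y = sp.Function('y')
    # Solve the differential equation dy/dt = -k*y for the unknown function y(t).
    sol = sp.dsolve(sp.Eq(y(t).diff(t), -k * y(t)), y(t))
    print(sol)   # y(t) = C1*exp(-k*t): the solution is determined up to an initial condition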
Coin flipping
Coin flipping, coin tossing, or heads or tails is the practice of throwing a coin in the air and checking which side is showing when it lands, in order to choose between two alternatives, heads or tails, sometimes used to resolve a dispute between two parties. It is a form of sortition which inherently has two possible outcomes. The party who calls the side that is facing up when the coin lands wins. Coin flipping was known to the Romans as navia aut caput ("ship or head"), as some coins had a ship on one side and the head of the emperor on the other.
Fermat's theorem (stationary points)
In mathematics, Fermat's theorem (also known as the interior extremum theorem) is a method to find local maxima and minima of differentiable functions on open sets by showing that every local extremum of the function is a stationary point (the function's derivative is zero at that point). Fermat's theorem is a theorem in real analysis, named after Pierre de Fermat. By using Fermat's theorem, the potential extrema of a function f, with derivative f', are found by solving the equation f'(x) = 0.
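A small sketch of the procedure, assuming the illustrative function f(x) = x³ - 3x (not from the text): the candidates for local extrema are exactly the solutions of f'(x) = 0.

    import sympy as sp

    x = sp.symbols('x')
    f = x**3 - 3*x
    critical = sp.solve(sp.diff(f, x), x)
    print(critical)                                           # [-1, 1]: the stationary points
    print([sp.diff(f, x, 2).subs(x, c) for c in critical])    # -6 (local max at -1), 6 (local min at 1)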
Integral test for convergence
In mathematics, the integral test for convergence is a method used to test infinite series of monotonic terms for convergence. It was developed by Colin Maclaurin and Augustin-Louis Cauchy and is sometimes known as the Maclaurin–Cauchy test. Consider an integer N and a function f defined on the unbounded interval [N, ∞), on which it is monotone decreasing. Then the infinite series ∑_{n=N}^∞ f(n) converges to a real number if and only if the improper integral ∫_N^∞ f(x) dx is finite. In particular, if the integral diverges, then the series diverges as well.
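A minimal numerical check of the test, assuming the illustrative choice f(n) = 1/n² (not from the text): f is positive and decreasing on [1, ∞), the integral of 1/x² from 1 to ∞ equals 1 (finite), so the test predicts that the series converges (its sum is π²/6).

    import math

    # Partial sum of 1/n**2; the truncation point is an illustrative choice.
    partial = sum(1.0 / n**2 for n in range(1, 100_001))
    print(partial, math.pi**2 / 6)   # the partial sum approaches the finite limit predicted by the test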