Integration by parts
In calculus, and more generally in mathematical analysis, integration by parts or partial integration is a process that finds the integral of a product of functions in terms of the integral of the product of their derivative and antiderivative. It is frequently used to transform the antiderivative of a product of functions into an antiderivative for which a solution can be more easily found. The rule can be thought of as an integral version of the product rule of differentiation.
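For reference (a standard statement, not quoted from the passage above), with u and v differentiable functions whose derivatives are continuous on the interval of integration, the rule reads

\[
\int u \, dv = uv - \int v \, du,
\qquad\text{or equivalently}\qquad
\int_a^b u(x)\, v'(x)\, dx = \Big[\, u(x)\, v(x) \,\Big]_a^b - \int_a^b u'(x)\, v(x)\, dx .
\]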
Calculus
Calculus is the mathematical study of continuous change, in the same way that geometry is the study of shape, and algebra is the study of generalizations of arithmetic operations. It has two major branches, differential calculus and integral calculus; the former concerns instantaneous rates of change, and the slopes of curves, while the latter concerns accumulation of quantities, and areas under or between curves.
Infinite product
In mathematics, for a sequence of complex numbers a_1, a_2, a_3, ... the infinite product is defined to be the limit of the partial products a_1 a_2 ... a_n as n increases without bound. The product is said to converge when the limit exists and is not zero. Otherwise the product is said to diverge. A limit of zero is treated specially in order to obtain results analogous to those for infinite sums. Some sources allow convergence to 0 if there are only a finite number of zero factors and the product of the non-zero factors is non-zero, but for simplicity we will not allow that here.
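Restated in symbols (a standard formulation of the definition above):

\[
\prod_{n=1}^{\infty} a_n \;=\; \lim_{N \to \infty} \prod_{n=1}^{N} a_n ,
\]

and the product converges exactly when this limit exists and is nonzero.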
Bernoulli number
In mathematics, the Bernoulli numbers B_n are a sequence of rational numbers which occur frequently in analysis. The Bernoulli numbers appear in (and can be defined by) the Taylor series expansions of the tangent and hyperbolic tangent functions, in Faulhaber's formula for the sum of m-th powers of the first n positive integers, in the Euler–Maclaurin formula, and in expressions for certain values of the Riemann zeta function. The first few values are listed below.
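An equivalent and common definition (not the tangent expansion mentioned above, but the exponential generating function, with the convention B_1 = −1/2) is

\[
\frac{x}{e^{x}-1} \;=\; \sum_{n=0}^{\infty} B_n \frac{x^{n}}{n!}, \qquad |x| < 2\pi ,
\]

which gives B_0 = 1, B_1 = −1/2, B_2 = 1/6, B_3 = 0, B_4 = −1/30, B_5 = 0, B_6 = 1/42, and B_n = 0 for every odd n ≥ 3.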
Summation
In mathematics, summation is the addition of a sequence of any kind of numbers, called addends or summands; the result is their sum or total. Besides numbers, other types of values can be summed as well: functions, vectors, matrices, polynomials and, in general, elements of any type of mathematical objects on which an operation denoted "+" is defined. Summations of infinite sequences are called series. They involve the concept of limit, and are not considered in this article.
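As a small illustration of the notation (not part of the original passage), the capital-sigma form of a finite sum and one classic closed form:

\[
\sum_{i=1}^{n} i = 1 + 2 + \cdots + n = \frac{n(n+1)}{2},
\]

so, for example, the sum of the first 100 positive integers is 100·101/2 = 5050.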
Principal branch
In mathematics, a principal branch is a function which selects one branch ("slice") of a multi-valued function. Most often, this applies to functions defined on the complex plane. Principal branches are used in the definition of many inverse trigonometric functions, such as the selection either to define arcsin: [−1, +1] → [−π/2, π/2] or to define arccos: [−1, +1] → [0, π]. A more familiar principal branch function, limited to real numbers, is that of a positive real number raised to the power of 1/2. For example, take the relation y = x^(1/2), where x is any positive real number.
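To make the square-root example concrete: for a positive real x, the relation y^2 = x has two solutions, y = +√x and y = −√x, and the principal branch is the conventional choice of the non-negative one,

\[
x^{1/2} := \sqrt{x} \;\ge\; 0, \qquad x > 0 .
\]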
Geometric series
In mathematics, a geometric series is the sum of an infinite number of terms that have a constant ratio between successive terms. For example, the series 1/2 + 1/4 + 1/8 + ⋯ is geometric, because each successive term can be obtained by multiplying the previous term by 1/2. In general, a geometric series is written as a + ar + ar² + ar³ + ⋯, where a is the coefficient of each term and r is the common ratio between adjacent terms.
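For completeness, when the common ratio satisfies |r| < 1 the series converges, and its sum has the standard closed form

\[
\sum_{k=0}^{\infty} a r^{k} = \frac{a}{1-r}, \qquad |r| < 1 ,
\]

so the example above sums to (1/2)/(1 − 1/2) = 1.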
Exsecant
The exsecant (exsec, exs) and excosecant (excosec, excsc, exc) are trigonometric functions defined in terms of the secant and cosecant functions. They used to be important in fields such as surveying, railway engineering, civil engineering, astronomy, and spherical trigonometry and could help improve accuracy, but are rarely used today except to simplify some calculations.
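For reference, the standard definitions in terms of the secant and cosecant are

\[
\operatorname{exsec}\theta = \sec\theta - 1, \qquad \operatorname{excsc}\theta = \csc\theta - 1 .
\]

For θ near zero, sec θ is very close to 1, so computing sec θ − 1 by subtraction from tabulated secants loses significant digits; tabulating exsec θ directly avoided that cancellation, which is the kind of accuracy gain alluded to above.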
Ordinary differential equation
In mathematics, an ordinary differential equation (ODE) is a differential equation (DE) dependent on only a single independent variable. As with any other DE, its unknowns consist of one or more functions, and the equation involves the derivatives of those functions. The term "ordinary" is used in contrast with partial differential equations, which may be with respect to more than one independent variable. A linear differential equation is a differential equation that is defined by a linear polynomial in the unknown function and its derivatives, that is, an equation of the form a_0(x)y + a_1(x)y′ + a_2(x)y″ + ⋯ + a_n(x)y^(n) + b(x) = 0, where a_0(x), ..., a_n(x) and b(x) are arbitrary differentiable functions that do not need to be linear, and y′, ..., y^(n) are the successive derivatives of the unknown function y of the variable x.
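As a minimal concrete example (chosen here for illustration, not taken from the text), the first-order linear ODE

\[
y'(x) = k\, y(x)
\]

with constant k has the general solution y(x) = C e^{kx}, where the constant C is fixed by an initial condition such as the value y(0).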
Fundamental theorem of algebra
The fundamental theorem of algebra, also known as d'Alembert's theorem, or the d'Alembert–Gauss theorem, states that every non-constant single-variable polynomial with complex coefficients has at least one complex root. This includes polynomials with real coefficients, since every real number is a complex number with its imaginary part equal to zero. Equivalently (by definition), the theorem states that the field of complex numbers is algebraically closed.
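A small illustration: the polynomial z² + 1 has no real roots, yet over the complex numbers it factors completely,

\[
z^{2} + 1 = (z - i)(z + i),
\]

with roots ±i, as the theorem guarantees.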