Absolute convergence

In mathematics, an infinite series of numbers is said to converge absolutely (or to be absolutely convergent) if the sum of the absolute values of the summands is finite. More precisely, a real or complex series $\sum_{n=0}^{\infty} a_n$ is said to converge absolutely if $\sum_{n=0}^{\infty} |a_n| = L$ for some real number $L$. Similarly, an improper integral of a function, $\int_0^\infty f(x)\,dx$, is said to converge absolutely if the integral of the absolute value of the integrand is finite; that is, if $\int_0^\infty |f(x)|\,dx = L$. Absolute convergence is important for the study of infinite series because its definition is strong enough to guarantee properties of finite sums that not all convergent series possess: a convergent series that is not absolutely convergent is called conditionally convergent, while absolutely convergent series behave "nicely".
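For example, the alternating series $\sum_{n\ge 1} (-1)^{n+1}/n^2$ converges absolutely, because the sum of the absolute values is the Basel series, which is finite:

\[
\sum_{n=1}^{\infty} \left| \frac{(-1)^{n+1}}{n^2} \right| = \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6} < \infty.
\]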
Harmonic series (mathematics)

In mathematics, the harmonic series is the infinite series formed by summing all positive unit fractions:

\[
\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots.
\]

The first $n$ terms of the series sum to approximately $\ln n + \gamma$, where $\ln$ is the natural logarithm and $\gamma \approx 0.577$ is the Euler–Mascheroni constant. Because the logarithm has arbitrarily large values, the harmonic series does not have a finite limit: it is a divergent series. Its divergence was proven in the 14th century by Nicole Oresme using a precursor to the Cauchy condensation test for the convergence of infinite series.
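Oresme's divergence argument groups the terms into blocks of lengths 1, 2, 4, 8, ..., each of which sums to more than $\tfrac{1}{2}$:

\[
1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \cdots \;>\; 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots,
\]

so the partial sums exceed every finite bound.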
Monotone convergence theorem

In the mathematical field of real analysis, the monotone convergence theorem is any of a number of related theorems proving the convergence of monotonic sequences (sequences that are non-decreasing or non-increasing) that are also bounded. Informally, the theorems state that if a sequence is increasing and bounded above, then it converges to its supremum (least upper bound); in the same way, if a sequence is decreasing and bounded below, it converges to its infimum.
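For instance, the sequence $a_n = 1 - 1/n$ is increasing and bounded above by $1$, so the theorem guarantees that it converges to its supremum:

\[
a_n = 1 - \frac{1}{n} \quad\Longrightarrow\quad \lim_{n\to\infty} a_n = \sup_{n\ge 1} a_n = 1.
\]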
Root test

In mathematics, the root test is a criterion for the convergence (a convergence test) of an infinite series. It depends on the quantity

\[
\limsup_{n\to\infty} \sqrt[n]{|a_n|},
\]

where $a_n$ are the terms of the series, and states that the series converges absolutely if this quantity is less than one, but diverges if it is greater than one. It is particularly useful in connection with power series. The root test was developed first by Augustin-Louis Cauchy, who published it in his textbook Cours d'analyse (1821). Thus, it is sometimes known as the Cauchy root test or Cauchy's radical test.
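For example, applied to the series $\sum_{n\ge 1} n/2^n$, the test gives

\[
\lim_{n\to\infty} \sqrt[n]{\frac{n}{2^n}} = \lim_{n\to\infty} \frac{n^{1/n}}{2} = \frac{1}{2} < 1,
\]

since $n^{1/n} \to 1$; the series therefore converges absolutely.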
Divergent series

In mathematics, a divergent series is an infinite series that is not convergent, meaning that the infinite sequence of the partial sums of the series does not have a finite limit. If a series converges, the individual terms of the series must approach zero. Thus any series in which the individual terms do not approach zero diverges. However, convergence is a stronger condition: not all series whose terms approach zero converge. A counterexample is the harmonic series

\[
1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots = \sum_{n=1}^{\infty} \frac{1}{n}.
\]

The divergence of the harmonic series was proven by the medieval mathematician Nicole Oresme.
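An example of the first kind of divergence, where the terms themselves fail to approach zero, is Grandi's series: its partial sums alternate between $1$ and $0$ and so have no limit:

\[
\sum_{n=0}^{\infty} (-1)^n = 1 - 1 + 1 - 1 + \cdots.
\]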
Binary logarithm

In mathematics, the binary logarithm ($\log_2 n$) is the power to which the number 2 must be raised to obtain the value $n$. That is, for any real number $x$,

\[
x = \log_2 n \quad\Longleftrightarrow\quad 2^x = n.
\]

For example, the binary logarithm of 1 is 0, the binary logarithm of 2 is 1, the binary logarithm of 4 is 2, and the binary logarithm of 32 is 5. The binary logarithm is the logarithm to the base 2 and is the inverse function of the power of two function. As well as $\log_2$, an alternative notation for the binary logarithm is $\operatorname{lb}$ (the notation preferred by ISO 31-11 and ISO 80000-2).
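In practice, a binary logarithm can be computed from logarithms to any other base via the change-of-base formula, for example

\[
\log_2 n = \frac{\ln n}{\ln 2}, \qquad \log_2 100 = \frac{\ln 100}{\ln 2} \approx 6.644.
\]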
Conditional convergence

In mathematics, a series or integral is said to be conditionally convergent if it converges, but it does not converge absolutely. More precisely, a series of real numbers $\sum_{n=0}^{\infty} a_n$ is said to converge conditionally if $\lim_{m\to\infty} \sum_{n=0}^{m} a_n$ exists (as a finite real number, i.e. not $\infty$ or $-\infty$), but $\sum_{n=0}^{\infty} |a_n| = \infty$. A classic example is the alternating harmonic series given by

\[
1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n},
\]

which converges to $\ln 2$, but is not absolutely convergent (see Harmonic series). Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any value at all, including $\infty$ or $-\infty$; see Riemann series theorem.
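A classical rearrangement illustrates Riemann's theorem: taking two positive terms of the alternating harmonic series for each negative one changes the sum,

\[
1 + \frac{1}{3} - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} + \cdots = \frac{3}{2} \ln 2.
\]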
Hyperbolic functions

In mathematics, hyperbolic functions are analogues of the ordinary trigonometric functions, but defined using the hyperbola rather than the circle. Just as the points $(\cos t, \sin t)$ form a circle with a unit radius, the points $(\cosh t, \sinh t)$ form the right half of the unit hyperbola. Also, just as the derivatives of $\sin(t)$ and $\cos(t)$ are $\cos(t)$ and $-\sin(t)$ respectively, the derivatives of $\sinh(t)$ and $\cosh(t)$ are $\cosh(t)$ and $+\sinh(t)$ respectively. Hyperbolic functions occur in the calculations of angles and distances in hyperbolic geometry.
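In terms of the exponential function, the two basic hyperbolic functions are

\[
\sinh t = \frac{e^t - e^{-t}}{2}, \qquad \cosh t = \frac{e^t + e^{-t}}{2},
\]

and they satisfy $\cosh^2 t - \sinh^2 t = 1$, the hyperbolic counterpart of the circular identity $\cos^2 t + \sin^2 t = 1$.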
Power of two

A power of two is a number of the form $2^n$ where $n$ is an integer, that is, the result of exponentiation with number two as the base and integer $n$ as the exponent. In a context where only integers are considered, $n$ is restricted to non-negative values, so the powers of two are 1, 2, and 2 multiplied by itself some number of times. The first ten powers of 2 for non-negative values of $n$ are: 1, 2, 4, 8, 16, 32, 64, 128, 256, 512. Because two is the base of the binary numeral system, powers of two are common in computer science.
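One reason powers of two pervade computing: a group of $n$ bits can represent exactly $2^n$ distinct values, so, for example, an eight-bit byte has

\[
2^8 = 2 \cdot 2 \cdot 2 \cdot 2 \cdot 2 \cdot 2 \cdot 2 \cdot 2 = 256
\]

possible values.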
Power series

In mathematics, a power series (in one variable) is an infinite series of the form

\[
\sum_{n=0}^{\infty} a_n (x - c)^n = a_0 + a_1 (x - c) + a_2 (x - c)^2 + \cdots,
\]

where $a_n$ represents the coefficient of the $n$th term and $c$ is a constant. Power series are useful in mathematical analysis, where they arise as Taylor series of infinitely differentiable functions. In fact, Borel's theorem implies that every power series is the Taylor series of some smooth function. In many situations, $c$ (the center of the series) is equal to zero, for instance when considering a Maclaurin series.
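For example, the geometric series is the power series with every coefficient $a_n = 1$ and center $c = 0$; it converges whenever $|x| < 1$:

\[
\sum_{n=0}^{\infty} x^n = \frac{1}{1 - x}, \qquad |x| < 1.
\]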