In mathematics, a limit is the value that a function (or sequence) approaches as the input (or index) approaches some value. Limits are essential to calculus and mathematical analysis, and are used to define continuity, derivatives, and integrals.
The concept of a limit of a sequence is further generalized to the concept of a limit of a topological net, and is closely related to limit and direct limit in category theory.
In formulas, a limit of a function is usually written as

\[ \lim_{x \to c} f(x) = L \]

(although a few authors use "Lt" instead of "lim")
and is read as "the limit of f of x as x approaches c equals L". The fact that a function f approaches the limit L as x approaches c is sometimes denoted by a right arrow (→), as in

\[ f(x) \to L \ \text{as} \ x \to c, \]

which reads "f of x tends to L as x tends to c".
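For example, taking f(x) = x² and c = 2,

\[ \lim_{x \to 2} x^2 = 4, \]

since x² can be made as close to 4 as desired by taking x sufficiently close to 2. The value of f at c itself plays no role: the well-known limit \(\lim_{x \to 0} \tfrac{\sin x}{x} = 1\) holds even though \(\tfrac{\sin x}{x}\) is undefined at x = 0.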
Grégoire de Saint-Vincent gave the first definition of the limit (terminus) of a geometric series in his work Opus Geometricum (1647): "The terminus of a progression is the end of the series, which no progression can reach, even if it is continued to infinity, but which it can approach more closely than any given segment."
The modern definition of a limit goes back to Bernard Bolzano who, in 1817, developed the basics of the epsilon-delta technique to define continuous functions. However, his work remained unknown to other mathematicians until thirty years after his death.
Augustin-Louis Cauchy in 1821, followed by Karl Weierstrass, formalized the definition of the limit of a function, which became known as the (ε, δ)-definition of limit.
The modern notation of placing the arrow below the limit symbol is due to G. H. Hardy, who introduced it in his book A Course of Pure Mathematics in 1908.
Limit of a sequence
The expression 0.999... should be interpreted as the limit of the sequence 0.9, 0.99, 0.999, and so on. This sequence can be rigorously shown to have the limit 1, and therefore this expression is meaningfully interpreted as having the value 1.
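One way to make this precise: the n-th term of the sequence is \(a_n = 1 - 10^{-n}\), so its distance from 1 is

\[ |a_n - 1| = 10^{-n}, \]

which can be made smaller than any given ε > 0 by taking n large enough; hence \(\lim_{n \to \infty} a_n = 1\).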
Formally, suppose \(a_1, a_2, \ldots\) is a sequence of real numbers.
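A real number L is called the limit of this sequence if, for every real number ε > 0, there exists a natural number N such that for all n > N,

\[ |a_n - L| < \varepsilon, \]

which is written \(\lim_{n \to \infty} a_n = L\) and read "the limit of a_n as n approaches infinity equals L".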