Automatic summarization is the process of shortening a set of data computationally, to create a subset (a summary) that represents the most important or relevant information within the original content. Artificial intelligence algorithms are commonly developed and employed to achieve this, specialized for different types of data. Text summarization is usually implemented by natural language processing methods, designed to locate the most informative sentences in a given document.
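The extractive approach described above can be sketched in a few lines of code. The snippet below is an illustrative heuristic rather than any particular system's method (the function name, tokenizer, and scoring rule are assumptions for the example): it scores each sentence by the document-wide frequency of its words and returns the top-scoring sentences in their original order.

    # Illustrative frequency-based extractive summarizer (a sketch, not a
    # production system): "informative" sentences are approximated as those
    # whose words occur most often in the document as a whole.
    import re
    from collections import Counter

    def summarize(text: str, num_sentences: int = 2) -> str:
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"[a-z]+", text.lower()))
        # Score each sentence by summing the frequencies of its words.
        scored = [(sum(freq[w] for w in re.findall(r"[a-z]+", s.lower())), i, s)
                  for i, s in enumerate(sentences)]
        # Keep the highest-scoring sentences, then restore document order.
        top = sorted(sorted(scored, reverse=True)[:num_sentences], key=lambda t: t[1])
        return " ".join(s for _, _, s in top)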
In knot theory, a branch of topology, a Brunnian link is a nontrivial link that becomes a set of trivial unlinked circles if any one component is removed. In other words, cutting any loop frees all the other loops (so that no two loops can be directly linked). The name Brunnian refers to Hermann Brunn, whose 1892 article Über Verkettung included examples of such links. The best-known and simplest possible Brunnian link is the Borromean rings, a link of three unknots.
Natural language generation (NLG) is a software process that produces natural language output. A widely cited survey of NLG methods describes NLG as "the subfield of artificial intelligence and computational linguistics that is concerned with the construction of computer systems that can produce understandable texts in English or other human languages from some underlying non-linguistic representation of information". While it is widely agreed that the output of any NLG process is text, there is some disagreement about whether the inputs of an NLG system need to be non-linguistic.
In topology, knot theory is the study of mathematical knots. While inspired by knots which appear in daily life, such as those in shoelaces and rope, a mathematical knot differs in that the ends are joined so it cannot be undone, the simplest knot being a ring (or "unknot"). In mathematical language, a knot is an embedding of a circle in 3-dimensional Euclidean space, ℝ³. Two mathematical knots are equivalent if one can be transformed into the other via a deformation of ℝ³ upon itself (known as an ambient isotopy); these transformations correspond to manipulations of a knotted string that do not involve cutting it or passing it through itself.
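In symbols, and purely for concreteness, the definition just given can be written as follows (a standard formulation; the letters K and H are chosen only for this statement): a knot is an embedding $K\colon S^1 \to \mathbb{R}^3$, and two knots $K_0$ and $K_1$ are equivalent if there is an ambient isotopy, that is, a continuous map

    $$H\colon \mathbb{R}^3 \times [0,1] \to \mathbb{R}^3$$

such that each $H_t = H(\cdot,t)$ is a homeomorphism, $H_0$ is the identity, and $H_1 \circ K_0 = K_1$.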
In mathematics and theoretical computer science, a constant-recursive sequence is an infinite sequence of numbers where each number in the sequence is equal to a fixed linear combination of one or more of its immediate predecessors. A constant-recursive sequence is also known as a linear recurrence sequence, a linear-recursive sequence, a linear-recurrent sequence, a C-finite sequence, or a solution to a linear recurrence with constant coefficients.
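Written out, the defining relation takes the following form (with the Fibonacci sequence as the standard example):

    $$s_n = c_1 s_{n-1} + c_2 s_{n-2} + \cdots + c_d s_{n-d} \qquad \text{for all } n \ge d,$$

where $c_1, \dots, c_d$ are fixed constants with $c_d \ne 0$. For instance, the Fibonacci sequence satisfies $F_n = F_{n-1} + F_{n-2}$, so it is constant-recursive of order $d = 2$ with $c_1 = c_2 = 1$.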
In mathematics, the Alexander polynomial is a knot invariant which assigns a polynomial with integer coefficients to each knot type. James Waddell Alexander II discovered this, the first knot polynomial, in 1923. In 1969, John Conway showed that a version of this polynomial, now called the Alexander–Conway polynomial, could be computed using a skein relation, although its significance was not realized until the discovery of the Jones polynomial in 1984.
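For reference, Conway's skein relation can be stated in its standard normalization, where $L_+$, $L_-$, and $L_0$ denote links that differ only at one crossing (positive, negative, or smoothed, respectively):

    $$\nabla(L_+) - \nabla(L_-) = z\,\nabla(L_0), \qquad \nabla(\text{unknot}) = 1,$$

and the substitution $z = t^{1/2} - t^{-1/2}$ relates the Conway polynomial $\nabla$ to the Alexander polynomial.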
Floer homologyIn mathematics, Floer homology is a tool for studying symplectic geometry and low-dimensional topology. Floer homology is a novel invariant that arises as an infinite-dimensional analogue of finite-dimensional Morse homology. Andreas Floer introduced the first version of Floer homology, now called Lagrangian Floer homology, in his proof of the Arnold conjecture in symplectic geometry. Floer also developed a closely related theory for Lagrangian submanifolds of a symplectic manifold.
In physics, canonical quantum gravity is an attempt to quantize the canonical formulation of general relativity (or canonical gravity). It is a Hamiltonian formulation of Einstein's general theory of relativity. The basic theory was outlined by Bryce DeWitt in a seminal 1967 paper, and was based on earlier work by Peter G. Bergmann using the so-called canonical quantization techniques for constrained Hamiltonian systems invented by Paul Dirac. Dirac's approach allows the quantization of systems that include gauge symmetries using Hamiltonian techniques in a fixed gauge choice.
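Schematically, and only as an orientation to the idea, Dirac's prescription promotes the first-class constraints to operators that must annihilate physical states; applied to the Hamiltonian constraint of general relativity this leads to the Wheeler–DeWitt equation:

    $$\hat{C}_a\,|\psi\rangle = 0 \quad\Longrightarrow\quad \hat{\mathcal{H}}(x)\,\Psi[h_{ij}] = 0,$$

where $\hat{\mathcal{H}}$ is the Hamiltonian constraint and $\Psi[h_{ij}]$ is a wave functional of the spatial 3-metric.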
In computer science and mathematical logic, a proof assistant or interactive theorem prover is a software tool to assist with the development of formal proofs by human-machine collaboration. This involves some sort of interactive proof editor, or other interface, with which a human can guide the search for proofs, the details of which are stored in, and some steps provided by, a computer. A recent effort within this field is making these tools use artificial intelligence to automate the formalization of ordinary mathematics.
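As a small illustration of the interactive style, the sketch below uses Lean, one widely used proof assistant; it assumes Lean 4, in which Nat.add_comm is a core-library lemma, and the theorem name is chosen only for this example.

    -- The user states a goal; the proof assistant checks that the supplied
    -- term (here an appeal to the library lemma Nat.add_comm) proves it.
    theorem add_comm_example (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b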
Physical properties of materials and systems can often be categorized as being either intensive or extensive, according to how the property changes when the size (or extent) of the system changes. According to IUPAC, an intensive quantity is one whose magnitude is independent of the size of the system, whereas an extensive quantity is one whose magnitude is additive for subsystems. The terms "intensive and extensive quantities" were introduced into physics by German writer Georg Helm in 1898, and by American physicist and chemist Richard C. Tolman.
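A brief worked example makes the distinction concrete: if a homogeneous system at temperature $T$ is divided into two subsystems, the extensive quantities add while the intensive ones are unchanged,

    $$V = V_1 + V_2, \qquad m = m_1 + m_2, \qquad T_1 = T_2 = T, \qquad \rho = \frac{m}{V} = \frac{m_1}{V_1} = \frac{m_2}{V_2},$$

so volume and mass are extensive, whereas temperature and density (a ratio of two extensive quantities) are intensive.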