Universality (dynamical systems)
In statistical mechanics, universality is the observation that there are properties for a large class of systems that are independent of the dynamical details of the system. Systems display universality in a scaling limit, when a large number of interacting parts come together. The modern meaning of the term was introduced by Leo Kadanoff in the 1960s, but a simpler version of the concept was already implicit in the van der Waals equation and in the earlier Landau theory of phase transitions, which did not incorporate scaling correctly.
Paper machine
A paper machine (or paper-making machine) is an industrial machine which is used in the pulp and paper industry to create paper in large quantities at high speed. Modern paper-making machines are based on the principles of the Fourdrinier machine, which uses a moving woven mesh to create a continuous paper web by filtering out the fibres held in a paper stock and producing a continuously moving wet mat of fibre. This is dried in the machine to produce a strong paper web.
Scale invariance
In physics, mathematics and statistics, scale invariance is a feature of objects or laws that do not change if scales of length, energy, or other variables are multiplied by a common factor, and thus represent a universality. The technical term for this transformation is a dilatation (also known as dilation). Dilatations can form part of a larger conformal symmetry. In mathematics, scale invariance usually refers to an invariance of individual functions or curves.
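For a concrete instance of the mathematical usage (stated in standard notation, where the exponent Δ is often called the scaling dimension), a function f is scale-invariant under the rescaling x → λx when

```latex
f(\lambda x) = \lambda^{\Delta} f(x)
```

so that rescaling the argument only rescales the function by an overall factor; for example, every monomial f(x) = x^n satisfies this with Δ = n.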
Rationality
Rationality is the quality of being guided by or based on reasons. In this regard, a person acts rationally if they have a good reason for what they do, and a belief is rational if it is based on strong evidence. This quality can apply to an ability, as in rational animal, to a psychological process, like reasoning, to mental states, such as beliefs and intentions, or to persons who possess these other forms of rationality.
Scene graph
A scene graph is a general data structure commonly used by vector-based graphics editing applications and modern computer games, which arranges the logical and often spatial representation of a graphical scene. It is a collection of nodes in a graph or tree structure. A tree node may have many children but only a single parent, with the effect of a parent applied to all its child nodes; an operation performed on a group automatically propagates its effect to all of its members.
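As a minimal sketch of this parent-to-child propagation (the class and method names below are hypothetical, not those of any particular engine), consider nodes that each carry only a translation relative to their parent:

```python
# Minimal scene-graph sketch: each node stores a 2D offset relative to its
# parent, and the parent's transform propagates to every descendant.

class SceneNode:
    def __init__(self, name, offset=(0.0, 0.0)):
        self.name = name
        self.offset = offset          # transform relative to the parent
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def world_positions(self, parent_pos=(0.0, 0.0)):
        # The parent's effect (here, a translation) applies to all children.
        x = parent_pos[0] + self.offset[0]
        y = parent_pos[1] + self.offset[1]
        yield self.name, (x, y)
        for child in self.children:
            yield from child.world_positions((x, y))

root = SceneNode("world")
car = root.add(SceneNode("car", (10.0, 0.0)))
car.add(SceneNode("wheel", (1.0, -0.5)))   # moves whenever "car" moves

for name, pos in root.world_positions():
    print(name, pos)
```

Moving the "car" node alone repositions the wheel too, which is exactly the group-propagation behaviour described above.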
Recrystallization (metallurgy)
In materials science, recrystallization is a process by which deformed grains are replaced by a new set of defect-free grains that nucleate and grow until the original grains have been entirely consumed. Recrystallization is usually accompanied by a reduction in the strength and hardness of a material and a simultaneous increase in the ductility. Thus, the process may be introduced as a deliberate step in metals processing or may be an undesirable byproduct of another processing step.
Computational geometry
Computational geometry is a branch of computer science devoted to the study of algorithms which can be stated in terms of geometry. Some purely geometrical problems arise out of the study of computational geometric algorithms, and such problems are also considered to be part of computational geometry. While modern computational geometry is a recent development, it is one of the oldest fields of computing with a history stretching back to antiquity.
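To illustrate what an algorithm "stated in terms of geometry" looks like, here is a sketch of the orientation test, one of the field's basic primitives (the helper name is my own); predicates like this underpin classic algorithms such as convex-hull construction:

```python
# Orientation test: do three points make a left turn, a right turn,
# or lie on a common line? The answer is the sign of a 2x2 cross product.

def orientation(p, q, r):
    """Return >0 for a counter-clockwise turn p->q->r, <0 for clockwise,
    and 0 if the three points are collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

print(orientation((0, 0), (1, 0), (1, 1)))  # positive: left turn
print(orientation((0, 0), (1, 0), (2, 0)))  # zero: collinear
```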
Simplex
In geometry, a simplex (plural: simplexes or simplices) is a generalization of the notion of a triangle or tetrahedron to arbitrary dimensions. The simplex is so-named because it represents the simplest possible polytope in any given dimension. For example, a 0-dimensional simplex is a point, a 1-dimensional simplex is a line segment, a 2-dimensional simplex is a triangle, a 3-dimensional simplex is a tetrahedron, and a 4-dimensional simplex is a 5-cell. Specifically, a k-simplex is a k-dimensional polytope which is the convex hull of its k + 1 vertices.
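Stated formally (this is the standard definition, included here for concreteness): if the k + 1 points u_0, …, u_k are affinely independent, the k-simplex they determine is the set of all their convex combinations,

```latex
C = \left\{ \theta_0 u_0 + \cdots + \theta_k u_k \;\middle|\; \sum_{i=0}^{k} \theta_i = 1 \text{ and } \theta_i \ge 0 \text{ for } i = 0, \ldots, k \right\}.
```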
Scheme (programming language)
Scheme is a dialect of the Lisp family of programming languages. Scheme was created during the 1970s at the MIT Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) and released by its developers, Guy L. Steele and Gerald Jay Sussman, via a series of memos now known as the Lambda Papers. It was the first dialect of Lisp to choose lexical scope and the first to require implementations to perform tail-call optimization, giving stronger support for functional programming and associated techniques such as recursive algorithms.
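To make the lexical-scope claim concrete, here is a small closure example; it is written in Python rather than Scheme, for consistency with the other sketches in this document (Python is also lexically scoped, though unlike Scheme it does not guarantee tail-call optimization):

```python
# Lexical scope: a name is resolved by where it appears in the source text,
# so an inner function can capture and keep the enclosing function's variables.

def make_counter():
    count = 0                  # lives in the enclosing lexical scope
    def increment():
        nonlocal count         # resolved by position in the code, not by caller
        count += 1
        return count
    return increment

counter = make_counter()
print(counter(), counter())    # 1 2 -- the closure retained its environment
```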
Sample size determinationSample size determination is the act of choosing the number of observations or replicates to include in a statistical sample. The sample size is an important feature of any empirical study in which the goal is to make inferences about a population from a sample. In practice, the sample size used in a study is usually determined based on the cost, time, or convenience of collecting the data, and the need for it to offer sufficient statistical power.
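As a hedged sketch of one common power-based calculation (not the only method used in practice): the normal-approximation formula for a two-sided one-sample test of a mean gives n = ((z_{1−α/2} + z_{1−β})·σ/δ)², where σ is the outcome's standard deviation and δ the smallest effect worth detecting. All parameter values below are illustrative assumptions.

```python
# Power-based sample size via the normal approximation for a two-sided
# one-sample test of a mean: n = ((z_{1-a/2} + z_{1-b}) * sigma / delta)^2.

from math import ceil
from statistics import NormalDist

def sample_size(alpha=0.05, power=0.80, sigma=1.0, delta=0.5):
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for two-sided alpha
    z_beta = NormalDist().inv_cdf(power)           # quantile for the desired power
    n = ((z_alpha + z_beta) * sigma / delta) ** 2
    return ceil(n)                                 # round up to whole observations

# Detecting a 0.5-standard-deviation shift with 80% power at the 5% level:
print(sample_size())  # 32 observations
```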