Lightness
Lightness is a visual perception of the luminance of an object. It is often judged relative to a similarly lit object. In colorimetry and color appearance models, lightness is a prediction of how an illuminated color will appear to a standard observer. While luminance is a linear measurement of light, lightness is a linear prediction of the human perception of that light. This distinction is meaningful because human vision's lightness perception is non-linear relative to light.
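For instance, the CIE 1976 lightness L* used in CIELAB predicts lightness from relative luminance with a roughly cube-root curve. A minimal Python sketch of that standard formula (the function name and white-point default are illustrative):

```python
def cie_lightness(Y: float, Yn: float = 1.0) -> float:
    """CIE 1976 lightness L* from relative luminance Y.

    The near-cube-root curve illustrates how perceived lightness is
    non-linear in luminance; Yn is the white-point luminance.
    """
    t = Y / Yn
    delta = 6 / 29
    f = t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29
    return 116 * f - 16

print(cie_lightness(1.0))   # 100.0: the reference white
print(cie_lightness(0.5))   # ~76.1: half the luminance, far more than half the lightness
print(cie_lightness(0.18))  # ~49.5: ~18% luminance appears roughly "middle grey"
```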
Space complexity
The space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input. It is the memory required by an algorithm until it executes completely. This includes the memory space used by its inputs, called input space, and any other (auxiliary) memory it uses during execution, which is called auxiliary space. Similar to time complexity, space complexity is often expressed asymptotically in big O notation, such as O(n), O(n log n), O(2^n), etc.
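To make the input/auxiliary distinction concrete, a small Python sketch (function names are illustrative): both routines consume O(n) input space for the list itself, but they differ in auxiliary space.

```python
def reverse_copy(xs):
    """O(n) auxiliary space: allocates a second list as large as the input."""
    return [xs[i] for i in range(len(xs) - 1, -1, -1)]

def reverse_in_place(xs):
    """O(1) auxiliary space: only two index variables beyond the input."""
    i, j = 0, len(xs) - 1
    while i < j:
        xs[i], xs[j] = xs[j], xs[i]
        i, j = i + 1, j - 1
    return xs

print(reverse_copy([1, 2, 3, 4]))      # [4, 3, 2, 1]
print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]
```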
Communication complexity
In theoretical computer science, communication complexity studies the amount of communication required to solve a problem when the input to the problem is distributed among two or more parties. The study of communication complexity was first introduced by Andrew Yao in 1979, while studying the problem of computation distributed among several machines. The problem is usually stated as follows: two parties (traditionally called Alice and Bob) each receive a (potentially different) n-bit string, x and y respectively.
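As an illustration of the model, the EQUALITY problem (does x = y?) costs n bits for any deterministic protocol, yet a randomized fingerprinting protocol needs only O(log n) bits. A Python sketch under standard textbook choices (the names and the n² prime bound are illustrative, not from the source):

```python
import random

def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

def equality_protocol(x: str, y: str) -> bool:
    """One-round randomized protocol for EQUALITY on n-bit strings.

    Alice sends a random prime p <= n^2 and x mod p: O(log n) bits total.
    One-sided error: if x != y, then |x - y| < 2^n has at most n prime
    divisors, so a random prime below n^2 exposes the difference with
    probability 1 - O((log n) / n).
    """
    n = len(x)
    p = random.choice(primes_up_to(max(n * n, 50)))
    return int(y, 2) % p == int(x, 2) % p  # Bob compares fingerprints

print(equality_protocol("1011" * 8, "1011" * 8))  # True (always)
print(equality_protocol("1011" * 8, "1010" * 8))  # False with high probability
```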
Transformation (function)
In mathematics, a transformation is a function f, usually with some geometrical underpinning, that maps a set X to itself, i.e. f: X → X. Examples include linear transformations of vector spaces and geometric transformations, which include projective transformations, affine transformations, and specific affine transformations, such as rotations, reflections and translations.
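A small Python sketch (names illustrative): a rotation is a transformation of the plane, f: ℝ² → ℝ², and composing two transformations of a set yields another transformation of that set.

```python
import math

def rotation(theta):
    """Rotation by angle theta: a transformation f: R^2 -> R^2."""
    c, s = math.cos(theta), math.sin(theta)
    return lambda p: (c * p[0] - s * p[1], s * p[0] + c * p[1])

def compose(f, g):
    """Transformations of the same set compose to another such transformation."""
    return lambda p: f(g(p))

quarter = rotation(math.pi / 2)
half = compose(quarter, quarter)
print(quarter((1.0, 0.0)))  # ~(0.0, 1.0)
print(half((1.0, 0.0)))     # ~(-1.0, 0.0)
```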
Kolmogorov complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy.
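Kolmogorov complexity itself is uncomputable, but any off-the-shelf compressor yields an upper bound relative to a fixed description language (the compressed data plus a constant-sized decompressor). A hedged Python sketch using zlib (the function name is illustrative):

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Bytes of zlib-compressed output: an upper bound (up to an additive
    constant for the decompressor) on Kolmogorov complexity in a
    zlib-based description language."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 5_000        # highly patterned: a short description exists
random_ = os.urandom(10_000)   # random bytes: essentially incompressible
print(complexity_upper_bound(regular))  # tens of bytes
print(complexity_upper_bound(random_))  # close to 10,000
```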
Strong perfect graph theorem
In graph theory, the strong perfect graph theorem is a forbidden graph characterization of the perfect graphs as being exactly the graphs that have neither odd holes (odd-length induced cycles of length at least 5) nor odd antiholes (complements of odd holes). It was conjectured by Claude Berge in 1961. A proof by Maria Chudnovsky, Neil Robertson, Paul Seymour, and Robin Thomas was announced in 2002 and published by them in 2006.
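The characterization suggests a direct (exponential-time) perfection test for small graphs: check the graph and its complement for induced odd cycles of length at least 5. A brute-force Python sketch (all names illustrative):

```python
from itertools import combinations

def is_induced_cycle(adj, subset):
    """True if `subset` induces a chordless cycle in the graph `adj`
    (a list of neighbour sets)."""
    sub = set(subset)
    if any(len(adj[v] & sub) != 2 for v in sub):
        return False  # every vertex needs exactly two neighbours inside
    seen, stack = set(), [next(iter(sub))]
    while stack:      # degree 2 everywhere + connected => a single cycle
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend((adj[v] & sub) - seen)
    return seen == sub

def complement(adj):
    n = len(adj)
    return [set(range(n)) - adj[v] - {v} for v in range(n)]

def has_odd_hole(adj):
    n = len(adj)
    return any(is_induced_cycle(adj, s)
               for k in range(5, n + 1, 2)
               for s in combinations(range(n), k))

def is_perfect(adj):
    """The theorem's criterion: no odd hole and no odd antihole."""
    return not has_odd_hole(adj) and not has_odd_hole(complement(adj))

# C5 is its own complement and the smallest odd hole, so it is not perfect:
c5 = [{1, 4}, {0, 2}, {1, 3}, {2, 4}, {0, 3}]
print(is_perfect(c5))  # False
# C4, an even cycle, is perfect:
c4 = [{1, 3}, {0, 2}, {1, 3}, {0, 2}]
print(is_perfect(c4))  # True
```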
Geometric transformation
In mathematics, a geometric transformation is any bijection of a set to itself (or to another such set) with some salient geometrical underpinning. More specifically, it is a function whose domain and range are sets of points — most often both ℝ² or both ℝ³ — such that the function is bijective so that its inverse exists. The study of geometry may be approached by the study of these transformations. Geometric transformations can be classified by the dimension of their operand sets (thus distinguishing between, say, planar transformations and spatial transformations).
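A minimal Python sketch of the bijectivity requirement (names illustrative): a planar shear is a geometric transformation of ℝ², and its inverse is simply the shear in the opposite direction.

```python
def shear(k):
    """Planar shear (x, y) -> (x + k*y, y): a bijection of R^2."""
    return lambda p: (p[0] + k * p[1], p[1])

f, f_inv = shear(2.0), shear(-2.0)  # the inverse exists because f is bijective
p = (1.0, 3.0)
assert f_inv(f(p)) == p             # the inverse undoes the transformation
```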
Transparency (behavior)
As an ethic that spans science, engineering, business, and the humanities, transparency is operating in such a way that it is easy for others to see what actions are performed. Transparency implies openness, communication, and accountability. Transparency is practiced in companies, organizations, administrations, and communities. For example, in a business relation, fees are clarified at the outset by a transparent agent, so there are no surprises later. This is opposed to keeping this information hidden, which is "non-transparent".
Desargues graph
In the mathematical field of graph theory, the Desargues graph is a distance-transitive, cubic graph with 20 vertices and 30 edges. It is named after Girard Desargues, arises from several different combinatorial constructions, has a high level of symmetry, is the only known non-planar cubic partial cube, and has been applied in chemical databases. The name "Desargues graph" has also been used to refer to a ten-vertex graph, the complement of the Petersen graph, which can also be formed as the bipartite half of the 20-vertex Desargues graph.
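One such combinatorial construction realizes the Desargues graph as the generalized Petersen graph GP(10, 3): an outer 10-cycle, an inner star polygon {10/3}, and spokes joining them. A Python sketch that builds it and confirms the counts (names illustrative):

```python
from collections import Counter

def desargues_edges():
    """Desargues graph as the generalized Petersen graph GP(10, 3)."""
    edges = set()
    for i in range(10):
        edges.add(frozenset({("u", i), ("u", (i + 1) % 10)}))  # outer 10-cycle
        edges.add(frozenset({("v", i), ("v", (i + 3) % 10)}))  # inner {10/3} star
        edges.add(frozenset({("u", i), ("v", i)}))             # spokes
    return edges

E = desargues_edges()
degree = Counter(v for e in E for v in e)
print(len(degree), len(E))   # 20 vertices, 30 edges
print(set(degree.values()))  # {3}: the graph is cubic
```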
Randomized algorithm
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of randomness determined by the random bits; thus either the running time or the output (or both) are random variables.
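A classic example is randomized quicksort, a Las Vegas algorithm: the output is always the correctly sorted list, while the running time is the random variable, with O(n log n) expected time for every input. A Python sketch (names illustrative):

```python
import random

def randomized_quicksort(xs):
    """Quicksort with a uniformly random pivot.

    The random pivot choice makes the expected O(n log n) running time
    hold for every input; no fixed input is consistently a worst case.
    """
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)
    return (randomized_quicksort([x for x in xs if x < pivot])
            + [x for x in xs if x == pivot]
            + randomized_quicksort([x for x in xs if x > pivot]))

print(randomized_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```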