Entropy production
Entropy production (or generation) is the amount of entropy produced during a heat process; it is used to evaluate the efficiency of the process. Entropy is produced in irreversible processes. The importance of avoiding irreversible processes (hence reducing the entropy production) was recognized as early as 1824 by Carnot. In 1865 Rudolf Clausius expanded his previous work from 1854 on the concept of "unkompensierte Verwandlungen" (uncompensated transformations), which, in modern nomenclature, would be called entropy production.
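In modern notation (a standard textbook formulation, not Clausius' original symbols), this idea is expressed by an entropy balance for a closed system, separating the entropy exchanged with the surroundings from the entropy produced internally:

    \[
      \mathrm{d}S = \frac{\delta Q}{T} + \delta S_{\mathrm{i}}, \qquad \delta S_{\mathrm{i}} \ge 0,
    \]

where the production term \(\delta S_{\mathrm{i}}\) vanishes only for reversible processes; Clausius' uncompensated transformations correspond to \(\delta S_{\mathrm{i}} > 0\).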
Principle of maximum entropy
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with the largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information). Another way of stating this: take precisely stated prior data or testable information about a probability distribution function, and consider the set of all trial probability distributions that would encode the prior data. According to this principle, the distribution with maximal information entropy is the best choice.
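As a minimal computational sketch of this procedure (the die example and all values below are illustrative assumptions, following Jaynes' well-known dice problem, not part of the text above): suppose the only testable information about a six-sided die is that its mean roll is 4.5. One can search the trial distributions encoding this constraint for the one with the largest entropy numerically:

    import numpy as np
    from scipy.optimize import minimize

    values = np.arange(1, 7)              # faces of the die
    target_mean = 4.5                     # the testable information

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)        # guard against log(0)
        return np.sum(p * np.log(p))      # minimizing this maximizes entropy

    constraints = [
        {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},           # normalization
        {"type": "eq", "fun": lambda p: p @ values - target_mean},  # prior data
    ]
    result = minimize(neg_entropy, np.full(6, 1 / 6),
                      bounds=[(0.0, 1.0)] * 6, constraints=constraints)
    print(np.round(result.x, 4))          # weights rise toward the higher faces

The optimizer recovers the expected answer: probabilities increase smoothly toward the higher faces, the least biased distribution consistent with the stated mean.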
Ensemble (mathematical physics)
In physics, specifically statistical mechanics, an ensemble (also statistical ensemble) is an idealization consisting of a large number of virtual copies (sometimes infinitely many) of a system, considered all at once, each of which represents a possible state that the real system might be in. In other words, a statistical ensemble is a set of systems of particles used in statistical mechanics to describe a single system. The concept of an ensemble was introduced by J. Willard Gibbs in 1902.
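In standard modern notation (not Gibbs' original), the link between the ensemble and the single real system is made through ensemble averages: if the ensemble assigns probability \(p_i\) to microstate \(i\), a macroscopic observable \(A\) of the real system is identified with

    \[
      \langle A \rangle = \sum_i p_i \, A_i .
    \]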
Reversible process (thermodynamics)
In thermodynamics, a reversible process is a process, involving a system and its surroundings, whose direction can be reversed by infinitesimal changes in some properties of the surroundings, such as pressure or temperature. Throughout an entire reversible process, the system is in thermodynamic equilibrium, both physical and chemical, and nearly in pressure and temperature equilibrium with its surroundings. This prevents unbalanced forces and acceleration of moving system boundaries, which in turn avoids friction and other dissipation.
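A standard worked example (assuming an ideal gas, which the text above does not specify): in a reversible isothermal expansion from volume \(V_1\) to \(V_2\) at temperature \(T\), the system gains entropy while the surroundings, supplying the heat \(Q_{\mathrm{rev}}\) at the same temperature, lose exactly as much:

    \[
      \Delta S_{\mathrm{sys}} = nR \ln \frac{V_2}{V_1}, \qquad
      \Delta S_{\mathrm{surr}} = -\frac{Q_{\mathrm{rev}}}{T} = -nR \ln \frac{V_2}{V_1},
    \]

so the total entropy change is zero, the hallmark of reversibility.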
Numerical cognition
Numerical cognition is a subdiscipline of cognitive science that studies the cognitive, developmental and neural bases of numbers and mathematics. As with many cognitive science endeavors, this is a highly interdisciplinary topic, and includes researchers in cognitive psychology, developmental psychology, neuroscience and cognitive linguistics. This discipline, although it may interact with questions in the philosophy of mathematics, is primarily concerned with empirical questions.
Differential entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive the defining formula; he simply assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP).
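A minimal numerical sketch (assuming the standard definition \(h(X) = -\int f(x)\ln f(x)\,\mathrm{d}x\), which the paragraph above alludes to but does not state): unlike discrete Shannon entropy, differential entropy can be negative, one symptom of it not being the correct continuous analogue.

    from scipy.stats import norm, uniform

    # Differential entropy of a uniform distribution on [0, 1/2]:
    # h = ln(1/2) < 0, which is impossible for discrete Shannon entropy.
    print(uniform(loc=0.0, scale=0.5).entropy())   # about -0.693 nats
    # For comparison, a standard normal has h = 0.5 * ln(2*pi*e):
    print(norm().entropy())                        # about 1.419 nats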
John von Neumann
John von Neumann (/vɒn ˈnɔɪmən/; Hungarian: Neumann János Lajos [ˈnɒjmɒn ˈjaːnoʃ ˈlɒjoʃ]; December 28, 1903 – February 8, 1957) was a Hungarian-American mathematician, physicist, computer scientist, engineer and polymath. He was regarded as having perhaps the widest coverage of any mathematician of his time and was said to have been "the last representative of the great mathematicians who were equally at home in both pure and applied mathematics". He integrated pure and applied sciences.
Objections to evolution
Objections to evolution have been raised since evolutionary ideas came to prominence in the 19th century. When Charles Darwin published his 1859 book On the Origin of Species, his theory of evolution (the idea that species arose through descent with modification from a single common ancestor in a process driven by natural selection) initially met opposition from scientists with different theories, but eventually came to receive overwhelming acceptance in the scientific community.
Min-entropy
The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways.
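A minimal sketch comparing min-entropy with Shannon entropy on a hypothetical nonuniform distribution (the numbers below are illustrative, not from the text above):

    import numpy as np

    p = np.array([0.5, 0.25, 0.125, 0.125])   # hypothetical distribution
    min_entropy = -np.log2(p.max())            # H_min = -log2(max_i p_i)
    shannon = -np.sum(p * np.log2(p))          # Shannon entropy, for comparison
    print(min_entropy, shannon)                # 1.0 bits vs 1.75 bits

For a uniform distribution the two values coincide, as the paragraph notes for the Rényi family generally; for nonuniform distributions the min-entropy is the smallest, most conservative measure.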
Magnetic resonance angiography
Magnetic resonance angiography (MRA) is a group of techniques based on magnetic resonance imaging (MRI) to image blood vessels. Magnetic resonance angiography is used to generate images of arteries (and less commonly veins) in order to evaluate them for stenosis (abnormal narrowing), occlusions, aneurysms (vessel wall dilatations, at risk of rupture) or other abnormalities. MRA is often used to evaluate the arteries of the neck and brain, the thoracic and abdominal aorta, the renal arteries, and the legs (the latter exam is often referred to as a "run-off").