Free entropy
A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, free entropy means something quite different: it is a generalization of entropy defined in the subject of free probability.
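As a minimal sketch of the relation mentioned above (canonical-ensemble notation supplied here, not taken from the source): the Massieu potential Φ corresponds to the Helmholtz free energy F, and the Planck potential Ξ to the Gibbs free energy G,

\[
\Phi \;=\; S - \frac{U}{T} \;=\; -\frac{F}{T} \;=\; k_\mathrm{B} \ln Z ,
\qquad
\Xi \;=\; S - \frac{U + pV}{T} \;=\; -\frac{G}{T},
\]

so the free entropy of the canonical ensemble is, up to the factor k_B, the logarithm of its partition function Z.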
Heat exchanger
A heat exchanger is a system used to transfer heat between a source and a working fluid. Heat exchangers are used in both cooling and heating processes. The fluids may be separated by a solid wall to prevent mixing or they may be in direct contact. They are widely used in space heating, refrigeration, air conditioning, power stations, chemical plants, petrochemical plants, petroleum refineries, natural-gas processing, and sewage treatment.
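As a rough illustration of how such heat transfer is commonly quantified (a standard textbook rating relation, not stated in the source), the heat duty of an exchanger is often estimated as

\[
\dot{Q} \;=\; U\,A\,\Delta T_{\mathrm{lm}},
\]

where U is the overall heat-transfer coefficient, A the heat-transfer area, and ΔT_lm the log-mean temperature difference between the two streams.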
Ecological debt
Ecological debt refers to the accumulated debt seen by some campaigners as owed by the Global North to Global South countries, due to the net sum of historical environmental injustice, especially through resource exploitation, habitat degradation, and pollution by waste discharge. The concept was coined by non-governmental organizations from the Global South in the 1990s, and its definition has varied over the years through several attempts at greater specification.
Carnot heat engine
A Carnot heat engine is a theoretical heat engine that operates on the Carnot cycle. The basic model for this engine was developed by Nicolas Léonard Sadi Carnot in 1824. The Carnot engine model was graphically expanded by Benoît Paul Émile Clapeyron in 1834 and mathematically explored by Rudolf Clausius in 1857, work that led to the fundamental thermodynamic concept of entropy. The Carnot engine is the most efficient heat engine that is theoretically possible.
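As a brief worked relation (a standard result, stated here for illustration): the efficiency of a Carnot engine operating between absolute reservoir temperatures T_H and T_C is

\[
\eta_{\text{Carnot}} \;=\; 1 - \frac{T_C}{T_H},
\]

so, for example, reservoirs at 500 K and 300 K bound any heat engine's efficiency at 1 − 300/500 = 40%.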
History of entropy
The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost.
Binary entropy function
In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability p of one of two values. It is a special case of H(X), the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable X that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive. If Pr(X = 1) = p, then Pr(X = 0) = 1 − p and the entropy of X (in shannons) is given by H(X) = H_b(p) = −p log₂ p − (1 − p) log₂(1 − p), where 0 log₂ 0 is taken to be 0. The logarithms in this formula are usually taken to the base 2.
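A minimal Python sketch of the function defined above (the helper name binary_entropy is ours, not from the source):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in shannons (base 2), with 0*log2(0) taken as 0."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability in [0, 1]")
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin (p = 0.5) is maximally uncertain: exactly 1 shannon.
print(binary_entropy(0.5))   # 1.0
# A heavily biased coin carries much less uncertainty per trial.
print(binary_entropy(0.11))  # ~0.5
```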
Negative relationship
In statistics, there is a negative relationship or inverse relationship between two variables if higher values of one variable tend to be associated with lower values of the other. A negative relationship between two variables usually implies that the correlation between them is negative, or (what is in some contexts equivalent) that the slope in a corresponding graph is negative. A negative correlation between variables is also called anticorrelation or inverse correlation.
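A small Python sketch of this idea (synthetic data, variable names ours): when y tends to fall as x rises, the sample Pearson correlation comes out negative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
# y decreases as x increases, plus noise: a negative (inverse) relationship.
y = -2.0 * x + rng.normal(scale=0.5, size=1000)

r = np.corrcoef(x, y)[0, 1]  # Pearson correlation coefficient
print(r)                     # close to -1: a strong anticorrelation
```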
Global Footprint Network
The Global Footprint Network was founded in 2003 and is an independent think tank originally based in the United States, Belgium and Switzerland. It was established as a charitable not-for-profit organization in each of those three countries. Its aim is to develop and promote tools for advancing sustainability, including the ecological footprint and biocapacity, which measure the amount of resources we use and how much we have. These tools aim at bringing ecological limits to the center of decision-making.
Entropy in thermodynamics and information theory
The mathematical expressions for thermodynamic entropy in the statistical-thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to those of the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s. The defining expression for entropy in statistical mechanics is of the form S = −k_B Σᵢ pᵢ ln pᵢ, where pᵢ is the probability of the microstate i taken from an equilibrium ensemble and k_B is Boltzmann's constant.
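For comparison (a standard juxtaposition, with notation supplied here): Shannon's information entropy for a discrete distribution has the same structure as the Gibbs expression,

\[
H \;=\; -\sum_i p_i \log_2 p_i ,
\qquad
S \;=\; -k_\mathrm{B} \sum_i p_i \ln p_i ,
\]

so the two differ only in the base of the logarithm and the overall factor k_B, one nat of information corresponding to k_B joules per kelvin of thermodynamic entropy.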
Gibbs paradox
In statistical mechanics, a semi-classical derivation of entropy that does not take into account the indistinguishability of particles yields an expression for entropy which is not extensive (is not proportional to the amount of substance in question). This leads to a paradox known as the Gibbs paradox, after Josiah Willard Gibbs, who proposed this thought experiment in 1874‒1875. Taken at face value, the paradox would allow the entropy of closed systems to decrease, violating the second law of thermodynamics.
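A minimal sketch of the commonly cited resolution (notation assumed here, not in the source): dividing the N-particle partition function by N! to account for indistinguishability,

\[
Z_N \;\to\; \frac{Z_N}{N!},
\qquad
S \;\to\; S - k_\mathrm{B}\ln N! \;\approx\; S - k_\mathrm{B}\,N(\ln N - 1),
\]

removes the non-extensive terms and makes the entropy proportional to the amount of substance, resolving the paradox for classical ideal gases.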