Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

where $\sum$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base $e$ gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
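As a concrete illustration of the formula above, here is a minimal Python sketch; the function name `shannon_entropy` and the example distributions are our illustration, not from the source.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy -sum p(x) log p(x) of a discrete distribution.

    `probs` is a sequence of probabilities summing to 1; zero-probability
    outcomes contribute nothing, by the usual 0 log 0 = 0 convention.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))           # 1.0 (bits)
print(shannon_entropy([0.9, 0.1]))           # ~0.469 (bits)
print(shannon_entropy([0.5, 0.5], math.e))   # ~0.693 (nats)
```

Changing the `base` argument switches between the bit, nat, and hartley units mentioned above.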
Entropy (statistical thermodynamics)
The concept of entropy was first developed by the German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by the Austrian physicist Ludwig Boltzmann, who established a new field of physics providing the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.
Spin–spin relaxation
In physics, spin–spin relaxation is the mechanism by which Mxy, the transverse component of the magnetization vector, exponentially decays towards its equilibrium value in nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI). It is characterized by the spin–spin relaxation time, a time constant known as T2 that describes the signal decay. It is named in contrast to T1, the spin–lattice relaxation time.
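The exponential decay described here is conventionally written as follows (a standard textbook form, stated for concreteness rather than quoted from the source):

$$M_{xy}(t) = M_{xy}(0)\, e^{-t/T_2},$$

so T2 is the time after which the transverse magnetization has fallen to $1/e$, about 37%, of its initial value.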
Spin–lattice relaxation
During nuclear magnetic resonance observations, spin–lattice relaxation is the mechanism by which the longitudinal component of the total nuclear magnetic moment vector (parallel to the constant magnetic field) exponentially relaxes from a higher energy, non-equilibrium state to thermodynamic equilibrium with its surroundings (the "lattice"). It is characterized by the spin–lattice relaxation time, a time constant known as T1.
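In the same standard notation as above (again a textbook form, not quoted from the source), the longitudinal recovery obeys

$$M_z(t) = M_{z,\mathrm{eq}} - \left[M_{z,\mathrm{eq}} - M_z(0)\right] e^{-t/T_1},$$

which, for a system starting at $M_z(0) = 0$ (e.g. immediately after a 90° pulse), reduces to $M_z(t) = M_{z,\mathrm{eq}}\left(1 - e^{-t/T_1}\right)$.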
Nuclear magnetic resonance
Nuclear magnetic resonance (NMR) is a physical phenomenon in which nuclei in a strong constant magnetic field are perturbed by a weak oscillating magnetic field (in the near field) and respond by producing an electromagnetic signal with a frequency characteristic of the magnetic field at the nucleus. This process occurs near resonance, when the oscillation frequency matches the intrinsic frequency of the nuclei, which depends on the strength of the static magnetic field, the chemical environment, and the magnetic properties of the isotope involved; in practical applications, these frequencies fall in the radio-frequency range.
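The intrinsic (Larmor) frequency mentioned above is set by the standard relation (stated here for concreteness, not quoted from the source)

$$\omega_0 = \gamma B_0,$$

where $\gamma$ is the gyromagnetic ratio of the nucleus and $B_0$ the static field strength; for the ¹H nucleus, $\gamma/2\pi \approx 42.58$ MHz/T.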
Rényi entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity.
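For a distribution with probabilities $p_1, \dots, p_n$, the Rényi entropy of order $\alpha$ is given by the standard definition (stated here for concreteness):

$$H_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^{n} p_i^{\alpha} \right), \qquad \alpha \ge 0, \ \alpha \ne 1,$$

which recovers the special cases named above: Hartley entropy as $\alpha \to 0$, Shannon entropy in the limit $\alpha \to 1$, collision entropy at $\alpha = 2$, and min-entropy as $\alpha \to \infty$.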
Nuclear magnetic resonance spectroscopy
Nuclear magnetic resonance spectroscopy, most commonly known as NMR spectroscopy or magnetic resonance spectroscopy (MRS), is a spectroscopic technique for observing local magnetic fields around atomic nuclei. It is based on measuring the absorption of electromagnetic radiation in the radio-frequency region, from roughly 4 to 900 MHz. Absorption of radio waves in the presence of a magnetic field is accompanied by a special type of nuclear transition, and for this reason such spectroscopy is known as nuclear magnetic resonance spectroscopy.
Entropy (classical thermodynamics)
In classical thermodynamics, entropy ($S$) is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy.
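Clausius's definition can be made concrete with the standard differential form (stated here for reference, not quoted from the source):

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T},$$

where $\delta Q_{\mathrm{rev}}$ is an infinitesimal amount of heat transferred reversibly and $T$ is the absolute temperature at which the transfer occurs.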
Boltzmann's entropy formula
In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy $S$, also written as $S_\mathrm{B}$, of an ideal gas to the multiplicity (commonly denoted as $\Omega$ or $W$), the number of real microstates corresponding to the gas's macrostate:

$$S = k_\mathrm{B} \ln W,$$

where $k_\mathrm{B}$ is the Boltzmann constant (also written as simply $k$), equal to 1.380649 × 10⁻²³ J/K, and $\ln$ is the natural logarithm function (also written as $\log$).
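A small Python sketch makes the formula tangible; the two-state toy system and the function name `boltzmann_entropy` are our illustration, not from the source.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(multiplicity):
    """Boltzmann entropy S = k_B ln W for a macrostate with W microstates."""
    return K_B * math.log(multiplicity)

# Toy macrostate: N two-state particles with exactly n in the "up" state
# has multiplicity W = C(N, n), the binomial coefficient.
N, n = 100, 50
W = math.comb(N, n)
print(f"W = {W:.3e} microstates")             # ~1.009e+29
print(f"S = {boltzmann_entropy(W):.3e} J/K")  # ~9.22e-22 J/K
```

Even with ~10²⁹ microstates, the entropy is tiny in SI units, which reflects the small magnitude of the Boltzmann constant.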