Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Entropy (statistical thermodynamics)
The concept of entropy was first developed by the German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by the Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.
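That linkage is usually summarized by Boltzmann's entropy formula. As a brief sketch, with k_B the Boltzmann constant and \Omega the number of microstates compatible with a given macrostate (standard notation, not taken from the text above):

S = k_B \ln \Omega

A macrostate that can be realized by more microstates therefore has a higher entropy, which is the statistical reading of "disorder".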
Negative temperature
Certain systems can achieve negative thermodynamic temperature; that is, their temperature can be expressed as a negative quantity on the Kelvin or Rankine scales. This should be distinguished from temperatures expressed as negative numbers on the non-thermodynamic Celsius or Fahrenheit scales, which are nevertheless higher than absolute zero. The absolute temperature (Kelvin) scale can be understood loosely as a measure of average kinetic energy. Usually, system temperatures are positive.
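How a negative value can arise is easiest to see from the statistical definition of temperature, sketched here with S the entropy and E the internal energy (standard notation, not introduced above):

\frac{1}{T} = \left( \frac{\partial S}{\partial E} \right)_{N,V}

In a system with a bounded energy spectrum, such as a population-inverted collection of spins, adding energy can reduce the number of accessible microstates, making the derivative, and hence the temperature, negative.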
Liquid
A liquid is a nearly incompressible fluid that conforms to the shape of its container but retains a nearly constant volume independent of pressure. It is one of the four fundamental states of matter (the others being solid, gas, and plasma), and is the only state with a definite volume but no fixed shape. The density of a liquid is usually close to that of a solid, and much higher than that of a gas. Therefore, liquid and solid are both termed condensed matter.
Entropy (classical thermodynamics)
In classical thermodynamics, entropy (symbol S) is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy.
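Clausius's definition is commonly written for a reversible transfer of heat; as a sketch, with \delta Q_{\mathrm{rev}} an infinitesimal quantity of heat exchanged reversibly at absolute temperature T:

dS = \frac{\delta Q_{\mathrm{rev}}}{T}

For an irreversible process the Clausius inequality dS \ge \delta Q / T holds instead, which expresses the irreversibility mentioned above.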
Specific heat capacity
In thermodynamics, the specific heat capacity (symbol c) of a substance is the heat capacity of a sample of the substance divided by the mass of the sample, also sometimes referred to as massic heat capacity. Informally, it is the amount of heat that must be added to one unit of mass of the substance in order to cause an increase of one unit in temperature. The SI unit of specific heat capacity is joule per kelvin per kilogram, J⋅kg⁻¹⋅K⁻¹.
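The informal description corresponds to the relation between the heat Q needed to raise a mass m through a temperature difference \Delta T:

Q = m\,c\,\Delta T

As an illustrative example, using the commonly quoted value c ≈ 4184 J⋅kg⁻¹⋅K⁻¹ for liquid water, warming 1 kg of water by 1 K requires roughly 4184 J, about 4.2 kJ.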
Third law of thermodynamics
The third law of thermodynamics states that the entropy of a closed system at thermodynamic equilibrium approaches a constant value when its temperature approaches absolute zero. This constant value cannot depend on any other parameters characterizing the system, such as pressure or applied magnetic field. At absolute zero (zero kelvins) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy.
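In Boltzmann's statistical form this can be sketched as follows, with \Omega_0 denoting the degeneracy (number of microstates) of the ground state (standard notation, not used above):

S(T \to 0) = k_B \ln \Omega_0

For a perfect crystal with a unique ground state, \Omega_0 = 1 and the entropy approaches zero; a degenerate ground state leaves a residual entropy instead.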
Boltzmann constant
The Boltzmann constant (k_B or k) is the proportionality factor that relates the average relative thermal energy of particles in a gas with the thermodynamic temperature of the gas. It occurs in the definitions of the kelvin and the gas constant, and in Planck's law of black-body radiation and Boltzmann's entropy formula, and is used in calculating thermal noise in resistors. The Boltzmann constant has dimensions of energy divided by temperature, the same as entropy. It is named after the Austrian scientist Ludwig Boltzmann.
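For orientation, the constant has had an exact defined value since the 2019 redefinition of the SI base units, and it sets the characteristic thermal energy scale at a given temperature (the figures below are standard values, quoted only as an illustration):

k_B = 1.380649 \times 10^{-23} \ \mathrm{J/K}, \qquad k_B T \approx 4.14 \times 10^{-21} \ \mathrm{J} \approx 25.9 \ \mathrm{meV} \ \text{at} \ T = 300 \ \mathrm{K}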
Entropy (energy dispersal)
In thermodynamics, the interpretation of entropy as a measure of energy dispersal has been developed against the background of the traditional view, introduced by Ludwig Boltzmann, of entropy as a quantitative measure of disorder. The energy dispersal approach avoids the ambiguous term 'disorder'. An early advocate of the energy dispersal conception was Edward A. Guggenheim in 1949, who used the word 'spread'. In this alternative approach, entropy is a measure of energy dispersal or spread at a specific temperature.
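A minimal worked illustration of the dispersal picture: when a quantity of heat q flows spontaneously from a hot body at temperature T_h to a cold body at T_c (symbols chosen here for illustration), the total entropy change is

\Delta S = \frac{q}{T_c} - \frac{q}{T_h} > 0 \quad \text{for } T_h > T_c

so the spontaneous spreading of energy from hot to cold is exactly the direction in which entropy increases.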
Viscosity
The viscosity of a fluid is a measure of its resistance to deformation at a given rate. For liquids, it corresponds to the informal concept of "thickness": for example, syrup has a higher viscosity than water. Viscosity is defined scientifically as a force multiplied by a time divided by an area. Thus its SI units are newton-seconds per square metre, or pascal-seconds. Viscosity quantifies the internal frictional force between adjacent layers of fluid that are in relative motion.
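The dimensional statement above follows from Newton's law of viscosity; as a sketch, with \tau the shear stress (force per area), \mu the dynamic viscosity, and du/dy the velocity gradient between adjacent layers:

\tau = \mu \frac{du}{dy}

Solving for \mu gives a stress divided by a rate (an inverse time), i.e. Pa / s⁻¹ = Pa⋅s = N⋅s/m², which is the "force multiplied by a time divided by an area" stated above.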