Entropy

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Entropy (classical thermodynamics)

In classical thermodynamics, entropy (denoted S) is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy.
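As a supplementary illustration (not part of the excerpt above), the Clausius definition relates an infinitesimal entropy change in a reversible process to the heat exchanged at absolute temperature T:

\[ \mathrm{d}S = \frac{\delta Q_{\text{rev}}}{T} \]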
Entropy as an arrow of time

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Thus, entropy measurement is a way of distinguishing the past from the future. In thermodynamic systems that are not isolated, local entropy can decrease over time, accompanied by a compensating entropy increase in the surroundings; examples include objects undergoing cooling, living systems, and the formation of typical crystals.
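Stated compactly (added here for reference), the second law for an isolated system says that the total entropy change between two states satisfies

\[ \Delta S \geq 0, \]

with equality holding only for idealized reversible processes.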
Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable X, which takes values in the alphabet \mathcal{X} and is distributed according to p : \mathcal{X} \to [0, 1], the entropy is

\[ H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x), \]

where \sum denotes the sum over the variable's possible values. The choice of base for \log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
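A minimal computational sketch of this definition, assuming the distribution is given as a plain Python mapping from outcomes to probabilities (the function name and example values below are illustrative, not taken from the excerpt):

import math

# Illustrative helper: H(X) = -sum over x of p(x) * log_base p(x),
# skipping zero-probability outcomes (they contribute nothing to the sum).
def shannon_entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs.values() if p > 0)

# A fair coin carries exactly 1 bit of entropy; a biased coin is less "surprising".
print(shannon_entropy({"heads": 0.5, "tails": 0.5}))  # 1.0
print(shannon_entropy({"heads": 0.9, "tails": 0.1}))  # ~0.469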
Semiconductor

A semiconductor is a material which has an electrical conductivity value falling between that of a conductor, such as copper, and an insulator, such as glass. Its resistivity falls as its temperature rises; metals behave in the opposite way. Its conducting properties may be altered in useful ways by introducing impurities ("doping") into the crystal structure. When two differently doped regions exist in the same crystal, a semiconductor junction is created.
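One standard way to see why resistivity falls as temperature rises (added here as a sketch, not derived in the excerpt) is the intrinsic carrier concentration of an undoped semiconductor with band gap E_g, which grows rapidly with temperature:

\[ n_i \propto T^{3/2} \exp\!\left(-\frac{E_g}{2 k_B T}\right) \]

More thermally excited carriers mean higher conductivity and hence lower resistivity.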
Fermi level

The Fermi level of a solid-state body is the thermodynamic work required to add one electron to the body. It is a thermodynamic quantity usually denoted by \mu or E_F for brevity. The Fermi level does not include the work required to remove the electron from wherever it came from. A precise understanding of the Fermi level (how it relates to electronic band structure in determining electronic properties, and how it relates to the voltage and flow of charge in an electronic circuit) is essential to an understanding of solid-state physics.
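For a quantitative illustration (added here, not part of the excerpt), the Fermi level \mu appears directly in the Fermi–Dirac distribution, which gives the occupation probability of a single-particle state of energy E at temperature T:

\[ f(E) = \frac{1}{e^{(E-\mu)/k_B T} + 1} \]

At E = \mu the occupation probability is exactly 1/2.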
Boltzmann's entropy formula

In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy S, also written as S_B, of an ideal gas to the multiplicity (commonly denoted as \Omega or W), the number of real microstates corresponding to the gas's macrostate:

\[ S = k_B \ln W, \]

where k_B is the Boltzmann constant (also written as simply k), equal to 1.380649 × 10^−23 J/K, and \ln is the natural logarithm function (also written as \log_e).
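As a small worked example (added here, not in the excerpt), a macrostate realized by W = 2 equally likely microstates carries an entropy of

\[ S = k_B \ln 2 \approx (1.380649 \times 10^{-23}\ \text{J/K}) \times 0.693 \approx 9.57 \times 10^{-24}\ \text{J/K}. \]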
Heavy fermion material

In solid-state physics, heavy fermion materials are a specific type of intermetallic compound, containing elements with 4f or 5f electrons in unfilled electron bands. Electrons are one type of fermion, and when they are found in such materials, they are sometimes referred to as heavy electrons. Heavy fermion materials have a low-temperature specific heat whose linear term is up to 1000 times larger than the value expected from the free electron model.
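For context on the "linear term" mentioned above (a standard result added here, not stated in the excerpt), the low-temperature specific heat of a metal is commonly written as the sum of an electronic and a lattice contribution,

\[ C(T) = \gamma T + \beta T^3, \]

where the Sommerfeld coefficient \gamma of the linear electronic term is the quantity that is strongly enhanced in heavy fermion compounds.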
Electric-field screening

In physics, screening is the damping of electric fields caused by the presence of mobile charge carriers. It is an important part of the behavior of charge-carrying fluids, such as ionized gases (classical plasmas), electrolytes, and charge carriers in electronic conductors (semiconductors, metals). In a fluid with a given permittivity \varepsilon, composed of electrically charged constituent particles, each pair of particles (with charges q_1 and q_2) interacts through the Coulomb force as

\[ \mathbf{F} = \frac{q_1 q_2}{4\pi\varepsilon} \frac{\hat{\mathbf{r}}}{|\mathbf{r}|^2}, \]

where the vector \mathbf{r} is the relative position between the charges.
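As an illustrative sketch of what screening does (a standard result added here, not part of the excerpt), in the Debye picture the electrostatic potential of a point charge q in such a fluid is exponentially damped beyond a characteristic screening length \lambda_D:

\[ \varphi(r) = \frac{q}{4\pi\varepsilon r}\, e^{-r/\lambda_D} \]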
Electrical resistivity and conductivity

Electrical resistivity (also called volume resistivity or specific electrical resistance) is a fundamental specific property of a material that measures its electrical resistance or how strongly it resists electric current. A low resistivity indicates a material that readily allows electric current. Resistivity is commonly represented by the Greek letter ρ (rho). The SI unit of electrical resistivity is the ohm-metre (Ω⋅m).
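As a brief quantitative note (added here, not in the excerpt), for a uniform specimen of length \ell, cross-sectional area A, and measured resistance R, the resistivity and its reciprocal, the conductivity \sigma, are

\[ \rho = R\,\frac{A}{\ell}, \qquad \sigma = \frac{1}{\rho}. \]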