Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
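As a brief orientation (these standard definitions are not part of the passage above), the statistical-mechanical and information-theoretic forms that connect these fields can be written as:

\[ S = k_{\mathrm{B}} \ln W \qquad \text{(Boltzmann entropy of a macrostate with } W \text{ microstates)} \]
\[ H(X) = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon entropy of a discrete distribution, in bits)} \]

Both expressions grow with the number of equally likely configurations available to the system, which is the sense in which entropy quantifies disorder or uncertainty.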
Valence and conduction bands
In solid-state physics, the valence band and conduction band are the bands closest to the Fermi level, and thus determine the electrical conductivity of the solid. In nonmetals, the valence band is the highest range of electron energies in which electrons are normally present at absolute zero temperature, while the conduction band is the lowest range of vacant electronic states. On a graph of the electronic band structure of a semiconducting material, the valence band is located below the Fermi level, while the conduction band is located above it.
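As an illustrative sketch (not drawn from the passage above), the Fermi-Dirac distribution shows why states below the Fermi level are essentially full and states above it essentially empty; the band-gap and temperature values used here are assumed, silicon-like numbers:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def fermi_dirac(energy_ev, fermi_level_ev, temperature_k):
    """Occupation probability of a single-electron state at the given energy."""
    if temperature_k == 0:
        # At absolute zero, states below the Fermi level are full, those above are empty.
        return 1.0 if energy_ev < fermi_level_ev else 0.0
    return 1.0 / (1.0 + math.exp((energy_ev - fermi_level_ev) / (K_B_EV * temperature_k)))

# Assumed, silicon-like numbers: ~1.1 eV gap with the Fermi level near mid-gap.
E_VALENCE_TOP = 0.0        # eV, top of the valence band
E_CONDUCTION_BOTTOM = 1.1  # eV, bottom of the conduction band
E_FERMI = 0.55             # eV, mid-gap

for temperature in (0, 300):
    occupied = fermi_dirac(E_VALENCE_TOP, E_FERMI, temperature)
    vacant = fermi_dirac(E_CONDUCTION_BOTTOM, E_FERMI, temperature)
    print(f"T = {temperature:>3} K: valence-band top occupancy = {occupied:.3g}, "
          f"conduction-band bottom occupancy = {vacant:.3g}")
```

With these assumed values, the conduction-band occupancy at 300 K is only of order 10^-10, which is why an undoped semiconductor conducts far more poorly than a metal.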
Electron hole
In physics, chemistry, and electronic engineering, an electron hole (often simply called a hole) is a quasiparticle denoting the lack of an electron at a position where one could exist in an atom or atomic lattice. Since in a normal atom or crystal lattice the negative charge of the electrons is balanced by the positive charge of the atomic nuclei, the absence of an electron leaves a net positive charge at the hole's location. Holes in a metal or semiconductor crystal lattice can move through the lattice as electrons can, and act similarly to positively charged particles.
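One standard way to express how holes carry current alongside electrons (a textbook relation, not taken from the passage above) is the conductivity of a semiconductor:

\[ \sigma = q\,(n\,\mu_n + p\,\mu_p) \]

where q is the elementary charge, n and p are the electron and hole concentrations, and \mu_n and \mu_p are their respective mobilities; the hole term enters with the same sign because a moving hole transports positive charge.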
Heat capacity ratio
In thermal physics and thermodynamics, the heat capacity ratio, also known as the adiabatic index, the ratio of specific heats, or Laplace's coefficient, is the ratio of the heat capacity at constant pressure (C_P) to the heat capacity at constant volume (C_V). It is sometimes also known as the isentropic expansion factor and is denoted by γ (gamma) for an ideal gas or κ (kappa), the isentropic exponent, for a real gas. The symbol γ is used by aerospace and chemical engineers.
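Written out (as a standard textbook relation, not given in the passage above), the definition and its usual ideal-gas consequences are:

\[ \gamma = \frac{C_P}{C_V} \]

For an ideal gas, Mayer's relation C_P = C_V + nR gives, in molar terms, γ = 1 + R/C_{V,\mathrm{m}}, so γ = 5/3 for a monatomic gas and γ ≈ 7/5 for a diatomic gas near room temperature.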
Rényi entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity.
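For orientation (the formula is standard but not given in the passage above), the Rényi entropy of order α of a discrete distribution p = (p_1, ..., p_n) is

\[ H_\alpha(X) = \frac{1}{1-\alpha} \log\!\left( \sum_{i=1}^{n} p_i^{\alpha} \right), \qquad \alpha \ge 0,\ \alpha \ne 1, \]

with the special cases it generalizes recovered as particular orders or limits: α → 0 gives the Hartley (max-)entropy of the support, α → 1 the Shannon entropy, α = 2 the collision entropy, and α → ∞ the min-entropy, −log max_i p_i.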