Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Heat capacity
Heat capacity or thermal capacity is a physical property of matter, defined as the amount of heat that must be supplied to an object to produce a unit change in its temperature. The SI unit of heat capacity is the joule per kelvin (J/K). Heat capacity is an extensive property. The corresponding intensive property is the specific heat capacity, found by dividing the heat capacity of an object by its mass. Dividing the heat capacity by the amount of substance in moles yields the molar heat capacity.
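These relationships can be summarized compactly. The following display uses standard notation not quoted in the passage: Q is the heat supplied, ΔT the resulting temperature change, m the mass, and n the amount of substance.

\[
C = \frac{Q}{\Delta T}, \qquad c = \frac{C}{m}, \qquad C_{\mathrm{m}} = \frac{C}{n}
\]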
Entropy (statistical thermodynamics)
The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.
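Boltzmann's statistical formulation is conventionally written as follows (the standard form, not stated explicitly in the passage), where k_B is the Boltzmann constant and W is the number of microstates consistent with the macroscopic state:

\[
S = k_{\mathrm{B}} \ln W
\]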
Amphibole
Amphibole (/ˈæmfəboʊl/) is a group of inosilicate minerals, forming prism or needle-like crystals, composed of double-chain SiO4 tetrahedra linked at the vertices and generally containing ions of iron and/or magnesium in their structures. Its IMA symbol is Amp. Amphiboles can be green, black, colorless, white, yellow, blue, or brown. The International Mineralogical Association currently classifies amphiboles as a mineral supergroup, within which are two groups and several subgroups.
Heat
In thermodynamics, heat is the thermal energy transferred between systems due to a temperature difference. In colloquial use, heat sometimes refers to thermal energy itself. An example of formal vs. informal usage: a metal bar may be described as "conducting heat" from its hot end to its cold end, but if the metal bar is considered a thermodynamic system, then the energy flowing within the bar is called internal energy, not heat.
Specific heat capacity
In thermodynamics, the specific heat capacity (symbol c) of a substance is the heat capacity of a sample of the substance divided by the mass of the sample; it is also sometimes referred to as massic heat capacity. Informally, it is the amount of heat that must be added to one unit of mass of the substance in order to cause an increase of one unit in temperature. The SI unit of specific heat capacity is the joule per kelvin per kilogram, J⋅kg⁻¹⋅K⁻¹.
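As an illustration of the informal definition, here is a minimal sketch in Python; the helper name heat_required and the example numbers are illustrative rather than taken from the passage, and the specific heat of liquid water is the commonly tabulated value of roughly 4184 J⋅kg⁻¹⋅K⁻¹.

def heat_required(mass_kg: float, specific_heat: float, delta_t_k: float) -> float:
    """Heat Q (in joules) needed to raise mass_kg by delta_t_k, i.e. Q = m * c * dT."""
    return mass_kg * specific_heat * delta_t_k

# Example: warming 1 kg of water by 1 K takes roughly 4184 J.
c_water = 4184.0  # J/(kg*K), approximate tabulated value for liquid water
print(heat_required(1.0, c_water, 1.0))  # -> 4184.0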
Third law of thermodynamics
The third law of thermodynamics states that the entropy of a closed system at thermodynamic equilibrium approaches a constant value as its temperature approaches absolute zero. This constant value cannot depend on any other parameters characterizing the system, such as pressure or applied magnetic field. At absolute zero (zero kelvins) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy.
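In the statistical picture, the limiting value can be written in terms of the ground-state degeneracy Ω₀ (a standard relation, not quoted in the passage); for a single non-degenerate ground state, Ω₀ = 1 and the entropy approaches zero:

\[
\lim_{T \to 0} S = k_{\mathrm{B}} \ln \Omega_{0}, \qquad \Omega_{0} = 1 \;\Rightarrow\; S \to 0
\]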
Heat capacity ratio
In thermal physics and thermodynamics, the heat capacity ratio, also known as the adiabatic index, the ratio of specific heats, or Laplace's coefficient, is the ratio of the heat capacity at constant pressure (CP) to the heat capacity at constant volume (CV). It is sometimes also known as the isentropic expansion factor and is denoted by γ (gamma) for an ideal gas or κ (kappa), the isentropic exponent, for a real gas. The symbol γ is used by aerospace and chemical engineers.
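Written out, the definition is as follows; the monatomic ideal-gas values are shown as a well-known illustrative example, not taken from the passage:

\[
\gamma = \frac{C_P}{C_V}, \qquad \text{e.g. monatomic ideal gas: } C_V = \tfrac{3}{2} n R,\; C_P = \tfrac{5}{2} n R,\; \gamma = \tfrac{5}{3}
\]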
Entropy (classical thermodynamics)
In classical thermodynamics, entropy is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy.
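Clausius's definition is conventionally expressed as the following differential relation (the standard form, not stated in the passage), where δQ_rev is the heat exchanged reversibly at absolute temperature T:

\[
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\]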
Calorimetry
In chemistry and thermodynamics, calorimetry is the science or act of measuring changes in the state variables of a body for the purpose of deriving the heat transfer associated with changes of its state due, for example, to chemical reactions, physical changes, or phase transitions under specified constraints. Calorimetry is performed with a calorimeter. Scottish physician and scientist Joseph Black, who was the first to recognize the distinction between heat and temperature, is said to be the founder of the science of calorimetry.
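A minimal Python sketch of the kind of calculation a calorimetry measurement supports, assuming a simple mixing experiment with no heat loss to the surroundings; the function name specific_heat_from_mixing and the masses and temperatures in the example are hypothetical, and the water value is the commonly tabulated ~4184 J⋅kg⁻¹⋅K⁻¹.

def specific_heat_from_mixing(m_sample, t_sample, m_water, t_water, t_final,
                              c_water=4184.0):
    """Specific heat of a sample from the heat balance:
    heat lost by the sample = heat gained by the water (no losses assumed)."""
    q_water = m_water * c_water * (t_final - t_water)   # heat gained by water, J
    return q_water / (m_sample * (t_sample - t_final))  # J/(kg*K)

# Hypothetical example: 0.5 kg of metal at 100 degC dropped into 1.0 kg of water
# at 20 degC, with the mixture settling at 25 degC.
print(specific_heat_from_mixing(0.5, 100.0, 1.0, 20.0, 25.0))  # ~ 557.9 J/(kg*K)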