Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
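The statistical-physics and information-theoretic faces of the concept are usually written as follows; these are the standard textbook forms, quoted here for orientation rather than taken from the summary above:

```latex
% Boltzmann entropy (statistical physics): k_B is Boltzmann's constant,
% W the number of microstates compatible with the macrostate.
S = k_B \ln W

% Shannon entropy (information theory): p_i is the probability of outcome i.
H = -\sum_i p_i \log p_i
```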
Negentropy
In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book What is Life?; Léon Brillouin later shortened the phrase to negentropy. In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy, a term that may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics.
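The "distance to normality" reading has a standard formalization, stated here for context: negentropy is the entropy gap between a Gaussian with the same mean and covariance as x and x itself. Since the Gaussian maximizes differential entropy for a fixed covariance, the gap is nonnegative and vanishes exactly when x is Gaussian:

```latex
% Negentropy J of a random variable x: H is differential entropy,
% x_G a Gaussian with the same mean and covariance as x.
J(x) = H(x_G) - H(x) \ge 0
```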
Negative temperature
Certain systems can achieve negative thermodynamic temperature; that is, their temperature can be expressed as a negative quantity on the Kelvin or Rankine scales. This should be distinguished from temperatures expressed as negative numbers on the non-thermodynamic Celsius or Fahrenheit scales, which are nevertheless higher than absolute zero. The absolute temperature (Kelvin) scale can be understood loosely as a measure of average kinetic energy, and system temperatures are usually positive.
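The loose "average kinetic energy" picture does not explain how T can be negative; the thermodynamic definition does. Temperature is defined through how entropy S changes with internal energy E, so a system whose entropy decreases as energy is added (possible when the energy spectrum is bounded above, as in certain spin systems) has T < 0. This is the standard definition, stated here for illustration:

```latex
% Thermodynamic definition of temperature: S is entropy, E internal energy.
\frac{1}{T} = \frac{\partial S}{\partial E}
% If \partial S / \partial E < 0 (entropy falls as energy rises), then T < 0.
```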
Molecular design software
Molecular design software is software for molecular modeling that provides special support for developing molecular models de novo. In contrast to the usual molecular modeling programs, such as those for molecular dynamics and quantum chemistry, such software directly supports the aspects related to constructing molecular models, including:
- molecular graphics
- interactive molecular drawing and conformational editing
- building polymeric molecules, crystals, and solvated systems
- partial charges development
- g…
Coordinate time
In the theory of relativity, it is convenient to express results in terms of a spacetime coordinate system relative to an implied observer. In many (but not all) coordinate systems, an event is specified by one time coordinate and three spatial coordinates. The time specified by the time coordinate is referred to as coordinate time, to distinguish it from proper time.
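The distinction is easiest to see in special relativity with inertial coordinates, where proper time τ along a worldline is related to coordinate time t by the following standard result, stated here for illustration:

```latex
% Proper time increment d\tau along a worldline, in inertial coordinates (t, x, y, z):
d\tau^2 = dt^2 - \frac{dx^2 + dy^2 + dz^2}{c^2}
% For a clock at rest in these coordinates (dx = dy = dz = 0), d\tau = dt.
```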
Entropy (classical thermodynamics)
In classical thermodynamics, entropy (denoted S) is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to describe how much of a system's internal energy is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, even though they would not violate the conservation of energy.
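Clausius's definition makes the direction of spontaneous change quantitative: for a reversible transfer of heat at absolute temperature T, the entropy change is given by the standard classical relation below.

```latex
% Clausius definition: entropy change for a reversible heat transfer
% \delta Q_{rev} at absolute temperature T.
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
% The second law then states dS \ge 0 for an isolated system, which is
% what rules out certain processes even when energy is conserved.
```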
Parallel (geometry)
In geometry, parallel lines are coplanar infinite straight lines that do not intersect at any point. Parallel planes are planes in the same three-dimensional space that never meet. Parallel curves are curves that do not touch or intersect each other and keep a fixed minimum distance. In three-dimensional Euclidean space, a line and a plane that do not share a point are also said to be parallel; two noncoplanar lines, by contrast, are called skew lines. Parallel lines are the subject of Euclid's parallel postulate.
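To make the parallel/skew distinction concrete, here is a minimal sketch (a hypothetical helper, not from the text above) that classifies two 3-D lines, each given by a point and a direction vector:

```python
# Classify two 3-D lines, each given as a point p and a direction d.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def classify_lines(p1, d1, p2, d2, tol=1e-9):
    """Return 'coincident', 'parallel', 'intersecting', or 'skew'."""
    n = cross(d1, d2)                              # normal to both directions
    w = (p2[0]-p1[0], p2[1]-p1[1], p2[2]-p1[2])    # offset between the lines
    if dot(n, n) < tol:                            # directions are parallel
        # Coincident if the offset w is also parallel to d1.
        m = cross(w, d1)
        return 'coincident' if dot(m, m) < tol else 'parallel'
    # Non-parallel lines intersect iff the points and directions are coplanar.
    return 'intersecting' if abs(dot(w, n)) < tol else 'skew'

# Example: the x-axis and a line parallel to the y-axis shifted in z are skew.
print(classify_lines((0, 0, 0), (1, 0, 0), (0, 0, 1), (0, 1, 0)))  # 'skew'
```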
Surface energy
In surface science, surface free energy (also interfacial free energy or surface energy) quantifies the disruption of intermolecular bonds that occurs when a surface is created. In solid-state physics, surfaces must be intrinsically less energetically favorable than the bulk of the material (the atoms on the surface have more energy than the atoms in the bulk); otherwise there would be a driving force for surfaces to be created, removing the bulk of the material (see sublimation).
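A common way to make this quantitative, stated here as a standard estimate rather than from the summary above: cleaving a solid creates two new surfaces, so the surface energy γ can be estimated from the reversible work of cleavage W per cross-sectional area A.

```latex
% Surface energy \gamma estimated from cleavage: cutting a bulk sample
% of cross-section A costs work W and creates two new surfaces.
\gamma = \frac{W}{2A}
```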
Force field (chemistry)
In the context of chemistry and molecular modelling, a force field is a computational method used to estimate the forces between atoms within molecules and also between molecules. More precisely, the force field refers to the functional form and parameter sets used to calculate the potential energy of a system of atoms or coarse-grained particles in molecular mechanics, molecular dynamics, or Monte Carlo simulations. The parameters for a chosen energy function may be derived from experiments in physics and chemistry, calculations in quantum mechanics, or both.
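As a sketch of what "functional form plus parameter set" means, here is a toy potential energy function with harmonic bond terms and Lennard-Jones nonbonded terms; the parameter values are illustrative placeholders, not taken from any published force field:

```python
import math

# Toy force-field energy: harmonic bonds + Lennard-Jones nonbonded pairs.
# All parameter values below are illustrative placeholders.

def bond_energy(r, r0, k):
    """Harmonic bond stretch: E = k * (r - r0)^2."""
    return k * (r - r0) ** 2

def lj_energy(r, epsilon, sigma):
    """Lennard-Jones 12-6 nonbonded term."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

def total_energy(coords, bonds, pairs):
    """coords: list of (x, y, z); bonds: (i, j, r0, k); pairs: (i, j, eps, sigma)."""
    e = 0.0
    for i, j, r0, k in bonds:
        e += bond_energy(math.dist(coords[i], coords[j]), r0, k)
    for i, j, eps, sigma in pairs:
        e += lj_energy(math.dist(coords[i], coords[j]), eps, sigma)
    return e

# Three atoms: one bonded pair plus one nonbonded pair.
coords = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 3.5, 0.0)]
bonds = [(0, 1, 1.09, 340.0)]   # r0 in angstroms, k in kcal/mol/A^2
pairs = [(0, 2, 0.1, 3.4)]      # eps in kcal/mol, sigma in angstroms
print(total_energy(coords, bonds, pairs))
```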
Differential entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. Shannon did not actually derive this formula, however; he merely assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP).
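The formula in question, for a random variable X with density f, is the standard definition below; it is displayed here because the summary refers to it without showing it:

```latex
% Differential entropy of a continuous random variable X with density f.
h(X) = -\int f(x) \log f(x) \, dx
% Unlike discrete entropy, h(X) can be negative and is not invariant under
% a change of variables, which is why it is not the true continuous
% analogue of discrete entropy.
```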