Market distortion
In neoclassical economics, a market distortion is any event in which a market reaches a market clearing price for an item that is substantially different from the price that a market would achieve while operating under conditions of perfect competition and state enforcement of legal contracts and the ownership of private property. A distortion is "any departure from the ideal of perfect competition that therefore interferes with economic agents maximizing social welfare when they maximize their own".
Laffer curve
In economics, the Laffer curve illustrates a theoretical relationship between rates of taxation and the resulting levels of the government's tax revenue. The Laffer curve assumes that no tax revenue is raised at the extreme tax rates of 0% and 100%, meaning that there is a tax rate between 0% and 100% that maximizes government tax revenue. The shape of the curve is a function of taxable income elasticity, i.e., how taxable income changes in response to changes in the rate of taxation.
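The revenue-maximizing interior rate can be illustrated with a deliberately simple toy model. The linear-shrinkage assumption below is hypothetical, chosen only to produce a curve that is zero at both extremes; it is not an empirical claim about any real tax base.

```python
# Illustrative toy model of a Laffer curve (not an empirical estimate).
# Assume the tax base shrinks linearly as the rate rises, so that
# revenue(t) = t * base * (1 - t): zero revenue at t = 0 and t = 1.

def revenue(rate, base=100.0):
    """Tax revenue under a hypothetical linear-shrinkage model of the base."""
    return rate * base * (1.0 - rate)

# Scan rates from 0% to 100% and find the revenue-maximizing rate.
rates = [i / 100 for i in range(101)]
best = max(rates, key=revenue)
print(best, revenue(best))  # peak at the interior rate 0.5 in this toy model
```

Any model in which the base falls as the rate rises yields some interior maximum; where that peak lies depends entirely on the assumed elasticity.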
Total fertility rate
The total fertility rate (TFR) of a population is the average number of children that would be born to a woman over her lifetime if (1) she were to experience the exact current age-specific fertility rates (ASFRs) through her lifetime, and (2) she were to live from birth until the end of her reproductive life. It is obtained by summing the single-year age-specific rates at a given time. The total fertility rate varies widely across the world, from 0.78 in South Korea to 6.73 in Niger.
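The summation described above can be sketched directly. The ASFR values below are hypothetical, expressed as births per woman per year for a handful of single years of age; a real table would cover the full reproductive span (roughly ages 15 to 49).

```python
# Sketch of computing a total fertility rate from single-year
# age-specific fertility rates (ASFRs), per the definition above.

def total_fertility_rate(asfr_by_age):
    """TFR = sum of the single-year age-specific fertility rates."""
    return sum(asfr_by_age.values())

# Hypothetical ASFRs (births per woman per year) for a few ages only.
asfr = {20: 0.05, 25: 0.10, 30: 0.12, 35: 0.08, 40: 0.03}
print(total_fertility_rate(asfr))  # about 0.38 children per woman over these ages
```

Note that when rates are published for five-year age groups rather than single years, each group rate must be multiplied by five before summing.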
Cross-entropy
In information theory, the cross-entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution q, rather than the true distribution p. The cross-entropy of the distribution q relative to a distribution p over a given set is defined as follows: H(p, q) = −E_p[log q], where E_p[·] is the expected value operator with respect to the distribution p.
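For discrete distributions the expectation becomes a sum, H(p, q) = −Σ_x p(x) log₂ q(x). A minimal sketch with hypothetical distributions:

```python
import math

# Discrete cross-entropy in bits: H(p, q) = -sum_x p(x) * log2(q(x)).
# This is the average code length when events occur with probabilities p
# but the code is optimized for the estimated distribution q.

def cross_entropy(p, q):
    # Terms with p(x) = 0 contribute nothing and are skipped.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]   # true distribution (hypothetical)
q = [0.25, 0.5, 0.25]   # estimated distribution used for the code
print(cross_entropy(p, p))  # 1.5 bits: cross-entropy of p with itself is H(p)
print(cross_entropy(p, q))  # 1.75 bits: always >= H(p); the gap is the KL divergence
```

The inequality H(p, q) ≥ H(p), with equality only when q = p, is why cross-entropy is widely used as a loss function for fitting estimated distributions.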
Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Phase (matter)
In the physical sciences, a phase is a region of material that is chemically uniform, physically distinct, and (often) mechanically separable. In a system consisting of ice and water in a glass jar, the ice cubes are one phase, the water is a second phase, and the humid air is a third phase over the ice and water. The glass of the jar is another separate phase. More precisely, a phase is a region of space (a thermodynamic system), throughout which all physical properties of a material are essentially uniform.
Tube sound
Tube sound (or valve sound) is the characteristic sound associated with a vacuum tube amplifier (valve amplifier in British English), a vacuum tube-based audio amplifier. At first, the concept of tube sound did not exist, because practically all electronic amplification of audio signals was done with vacuum tubes and other comparable methods were not known or used. After the introduction of solid-state amplifiers, tube sound appeared as the logical complement of transistor sound, which had some negative connotations due to crossover distortion in early transistor amplifiers.
Phase diagram
A phase diagram in physical chemistry, engineering, mineralogy, and materials science is a type of chart used to show conditions (pressure, temperature, volume, etc.) at which thermodynamically distinct phases (such as solid, liquid, or gaseous states) occur and coexist at equilibrium. Common components of a phase diagram are lines of equilibrium or phase boundaries, which refer to lines that mark conditions under which multiple phases can coexist at equilibrium. Phase transitions occur along lines of equilibrium.
Rate of natural increase
In demography, the rate of natural increase (RNI), also known as natural population change, is defined as the birth rate minus the death rate of a particular population over a particular time period. It is typically expressed either as a number per 1,000 individuals in the population or as a percentage. RNI can be either positive or negative. It contrasts with total population change by ignoring net migration. The RNI gives demographers insight into how a region's population is evolving, and these analyses can inform government attempts to shape RNI.
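The arithmetic of the definition is simple enough to show directly. The crude birth and death rates below are hypothetical figures, expressed per 1,000 population per year:

```python
# Rate of natural increase: crude birth rate minus crude death rate,
# deliberately ignoring net migration (per the definition above).

def rate_of_natural_increase(birth_rate, death_rate):
    """RNI per 1,000 population; divide by 10 to express as a percentage."""
    return birth_rate - death_rate

births_per_1000 = 14.0   # hypothetical crude birth rate
deaths_per_1000 = 9.0    # hypothetical crude death rate
rni = rate_of_natural_increase(births_per_1000, deaths_per_1000)
print(rni)  # 5.0 per 1,000, i.e. a 0.5% natural increase
```

A population with more deaths than births would simply produce a negative RNI, even if immigration kept its total population growing.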
Entropy (statistical thermodynamics)
The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in the 1870s by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.