Siegel's lemma
In mathematics, specifically in transcendental number theory and Diophantine approximation, Siegel's lemma refers to bounds on the solutions of linear equations obtained by the construction of auxiliary functions. The existence of such auxiliary polynomials was proven by Axel Thue; Thue's proof used Dirichlet's box principle. Carl Ludwig Siegel published his lemma in 1929. It is a pure existence theorem for a system of linear equations. Siegel's lemma has been refined in recent years to produce sharper bounds on the estimates given by the lemma.
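The basic form of the lemma, in one commonly cited statement (not quoted in the text above, and shown with the classical bound rather than any of the later refinements), can be summarized as follows.

```latex
% Basic Siegel's lemma (classical bound; later refinements sharpen it).
% Given integers a_{ij}, not all zero, with |a_{ij}| <= B and N > M,
% the homogeneous system below has a "small" nonzero integer solution.
\[
  \sum_{j=1}^{N} a_{ij} x_j = 0 \qquad (1 \le i \le M),
\]
\[
  \exists\, (x_1,\dots,x_N) \in \mathbb{Z}^N \setminus \{0\}
  \quad\text{with}\quad
  \max_{1 \le j \le N} |x_j| \le (N B)^{\,M/(N-M)}.
\]
```

The point of the exponent M/(N−M) is that the more unknowns exceed the number of equations, the smaller a nontrivial integer solution can be guaranteed to be.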
Efficiency (statistics)
In statistics, efficiency is a measure of the quality of an estimator, of an experimental design, or of a hypothesis testing procedure. Essentially, a more efficient estimator needs fewer observations than a less efficient one to achieve a given level of performance. An efficient estimator is characterized by having the smallest possible variance (for an unbiased estimator, one that attains the Cramér–Rao bound), indicating that the deviation between the estimated value and the "true" value is small in the L2-norm (mean-squared) sense.
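A minimal Monte Carlo sketch of the idea, comparing two estimators of the centre of a normal distribution (the sample mean, which attains the Cramér–Rao bound here, and the sample median, whose asymptotic relative efficiency is 2/π ≈ 0.637); the sample sizes and variable names are illustrative.

```python
import numpy as np

# Compare the variance of the sample mean and sample median as
# estimators of the mean of a standard normal distribution.
rng = np.random.default_rng(0)
n, trials = 100, 20_000
samples = rng.normal(loc=0.0, scale=1.0, size=(trials, n))

var_mean = samples.mean(axis=1).var()          # close to 1/n (Cramer-Rao bound)
var_median = np.median(samples, axis=1).var()  # close to pi/(2n), i.e. larger

print(f"variance of sample mean   : {var_mean:.5f}")
print(f"variance of sample median : {var_median:.5f}")
print(f"relative efficiency of the median: {var_mean / var_median:.3f}  (theory: 2/pi ~ 0.637)")
```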
Linear programming relaxation
In mathematics, the relaxation of a (mixed) integer linear program is the problem that arises by removing the integrality constraint of each variable. For example, in a 0–1 integer program, all constraints are of the form x_i ∈ {0, 1}. The relaxation of the original integer program instead uses a collection of linear constraints 0 ≤ x_i ≤ 1. The resulting relaxation is a linear program, hence the name.
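A minimal sketch of a relaxation in code, assuming SciPy is available; the vertex-cover instance (a triangle graph) and the variable names are illustrative. The 0–1 program asks for a minimum vertex cover; the relaxation replaces x_i ∈ {0, 1} with 0 ≤ x_i ≤ 1 and can return fractional solutions with a smaller objective value.

```python
from scipy.optimize import linprog

# Vertex cover on a triangle: minimize x1 + x2 + x3
# subject to x_i + x_j >= 1 for each of the three edges.
c = [1, 1, 1]                      # objective: total (fractional) cover size
A_ub = [[-1, -1,  0],              # -(x1 + x2) <= -1  <=>  x1 + x2 >= 1
        [ 0, -1, -1],              # edge (2, 3)
        [-1,  0, -1]]              # edge (1, 3)
b_ub = [-1, -1, -1]
bounds = [(0, 1)] * 3              # the relaxed integrality constraints

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, res.fun)              # fractional optimum [0.5 0.5 0.5], value 1.5;
                                   # the best 0-1 cover of a triangle has value 2
```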
Moment magnitude scale
The moment magnitude scale (MMS; denoted explicitly with Mw, and generally implied with use of a single M for magnitude) is a measure of an earthquake's magnitude ("size" or strength) based on its seismic moment. It was defined in a 1979 paper by Thomas C. Hanks and Hiroo Kanamori. Similar to the local magnitude/Richter scale (ML) defined by Charles Francis Richter in 1935, it uses a logarithmic scale; small earthquakes have approximately the same magnitudes on both scales.
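The relation between seismic moment and Mw is not quoted above; a minimal sketch using one standard form of the Hanks–Kanamori relation (with the constant 9.1 for M0 in newton-metres; slightly different constants appear in the literature), and with illustrative input values:

```python
import math

def moment_magnitude(seismic_moment_nm: float) -> float:
    """Moment magnitude Mw from seismic moment M0 in newton-metres,
    using one standard form of the Hanks-Kanamori relation:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# Each unit of Mw corresponds to roughly a 32-fold increase in seismic moment.
print(moment_magnitude(4.0e22))   # ~9.0, a great earthquake
print(moment_magnitude(1.0e17))   # ~5.3, a moderate earthquake
```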
Richter magnitude scale
The Richter scale (/ˈrɪktər/), also called the Richter magnitude scale, Richter's magnitude scale, and the Gutenberg–Richter scale, is a measure of the strength of earthquakes, developed by Charles Francis Richter and presented in his landmark 1935 paper, where he called it the "magnitude scale". This was later revised and renamed the local magnitude scale, denoted as ML.
Order of magnitude
An order of magnitude is an approximation of the logarithm of a value relative to some contextually understood reference value, usually 10, interpreted as the base of the logarithm and the representative of values of magnitude one. Logarithmic distributions are common in nature, and considering the order of magnitude of values sampled from such a distribution can be more intuitive. When the reference value is 10, the order of magnitude can be understood as the number of digits in the base-10 representation of the value.
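A minimal sketch of one common convention (taking the floor of the base-10 logarithm); other conventions round to the nearest power of the reference value instead. The function name is illustrative.

```python
import math

def order_of_magnitude(value: float, base: float = 10.0) -> int:
    """Order of magnitude of a positive value under the floor convention:
    the largest integer k with base**k <= value."""
    return math.floor(math.log(value, base))

for v in (3, 300, 4_000_000, 0.02):
    print(v, "->", order_of_magnitude(v))
# 3 -> 0, 300 -> 2, 4000000 -> 6, 0.02 -> -2
```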
Apparent magnitude
Apparent magnitude (m) is a measure of the brightness of a star or other astronomical object. An object's apparent magnitude depends on its intrinsic luminosity, its distance, and any extinction of the object's light caused by interstellar dust along the line of sight to the observer. The word magnitude in astronomy, unless stated otherwise, usually refers to a celestial object's apparent magnitude. The magnitude scale dates back to the ancient Roman astronomer Claudius Ptolemy, whose star catalog listed stars from 1st magnitude (brightest) to 6th magnitude (dimmest).
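The modern scale is defined so that a difference of 5 magnitudes corresponds to a brightness factor of exactly 100 (the Pogson ratio); this standard relation is not spelled out in the paragraph above. A minimal sketch, with illustrative example values:

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Flux ratio F1/F2 implied by apparent magnitudes m1 and m2,
    using the standard relation m1 - m2 = -2.5 * log10(F1 / F2).
    Lower magnitude means brighter, so m1 < m2 gives a ratio > 1."""
    return 10 ** (-0.4 * (m1 - m2))

# A 1st-magnitude star is exactly 100x brighter than a 6th-magnitude star:
print(brightness_ratio(1.0, 6.0))    # 100.0
# Sirius (m ~ -1.46) vs. a star at the m = 6 naked-eye limit:
print(brightness_ratio(-1.46, 6.0))  # ~960x brighter
```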