Propagation of uncertainty

In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors, more specifically random errors) on the uncertainty of a function based on them. When the variables are the values of experimental measurements, they have uncertainties due to measurement limitations (e.g., instrument precision) which propagate due to the combination of variables in the function. The uncertainty u can be expressed in a number of ways. It may be defined by the absolute error Δx.
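As a minimal sketch of how such propagation works in practice, the snippet below applies the standard first-order formula for independent random errors to a hypothetical rectangle-area measurement; the lengths and uncertainties are illustrative values, not taken from the text.

```python
import math

# Hypothetical example: area A = L * W of a measured rectangle.
# For independent random errors, first-order propagation gives
#   u_A = sqrt((dA/dL * u_L)**2 + (dA/dW * u_W)**2)
#       = sqrt((W * u_L)**2 + (L * u_W)**2)

L, u_L = 10.0, 0.1   # length and its absolute uncertainty
W, u_W = 5.0, 0.2    # width and its absolute uncertainty

A = L * W                                           # 50.0
u_A = math.sqrt((W * u_L) ** 2 + (L * u_W) ** 2)    # sqrt(0.25 + 4.0)

print(A, u_A)
```

Note that the width term dominates here: a small relative error on one factor can still drive most of the combined uncertainty.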
Nuclear cross section

The nuclear cross section of a nucleus is used to describe the probability that a nuclear reaction will occur. The concept of a nuclear cross section can be quantified physically in terms of "characteristic area" where a larger area means a larger probability of interaction. The standard unit for measuring a nuclear cross section (denoted as σ) is the barn, which is equal to 10⁻²⁸ m², 10⁻²⁴ cm², or 100 fm².
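To make the "characteristic area" picture concrete, this sketch estimates a thin-target interaction probability P ≈ n·σ·x from a cross section given in barns. The number density, cross section, and thickness are assumed illustrative values, not figures from the text.

```python
BARN_M2 = 1e-28    # 1 barn in square metres
BARN_CM2 = 1e-24   # 1 barn in square centimetres

# Thin-target approximation: P ≈ n * sigma * x, valid while P << 1,
# where n is the number density of target nuclei (cm^-3),
# sigma the cross section (cm^2), and x the target thickness (cm).
n = 8.5e22               # nuclei per cm^3 (roughly solid density; illustrative)
sigma = 2.0 * BARN_CM2   # a hypothetical 2-barn cross section
x = 0.1                  # cm

P = n * sigma * x
print(P)   # interaction probability per incident particle
```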
Absolute magnitude

Absolute magnitude (M) is a measure of the luminosity of a celestial object on an inverse logarithmic astronomical magnitude scale. An object's absolute magnitude is defined to be equal to the apparent magnitude that the object would have if it were viewed from a distance of exactly 10 parsecs (about 32.6 light-years), without extinction (or dimming) of its light due to absorption by interstellar matter and cosmic dust. By hypothetically placing all objects at a standard reference distance from the observer, their luminosities can be directly compared among each other on a magnitude scale.
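The definition above corresponds to the distance-modulus relation M = m − 5·log₁₀(d / 10 pc). A minimal sketch, using Sirius's well-known apparent magnitude and distance as illustrative inputs:

```python
import math

def absolute_magnitude(m, d_pc):
    """Distance-modulus relation: M = m - 5*log10(d / 10 pc)."""
    return m - 5 * math.log10(d_pc / 10.0)

# Illustrative: Sirius has apparent magnitude -1.46 at roughly 2.64 pc,
# giving an absolute magnitude of about +1.43.
print(absolute_magnitude(-1.46, 2.64))
```

Because Sirius is closer than 10 pc, its absolute magnitude is fainter (numerically larger) than its apparent magnitude, as the formula requires.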
Data

In common usage and statistics, data (US: /ˈdætə/; UK: /ˈdeɪtə/) is a collection of discrete or continuous values that convey information, describing quantity, quality, fact, statistics, other basic units of meaning, or simply sequences of symbols that may be further interpreted formally. A datum is an individual value in a collection of data. Data is usually organized into structures such as tables that provide additional context and meaning, and which may themselves be used as data in larger structures.
Majorana equation

In physics, the Majorana equation is a relativistic wave equation. It is named after the Italian physicist Ettore Majorana, who proposed it in 1937 as a means of describing fermions that are their own antiparticle. Particles corresponding to this equation are termed Majorana particles, although that term now has a more expansive meaning, referring to any (possibly non-relativistic) fermionic particle that is its own anti-particle (and is therefore electrically neutral).
Surface brightness

In astronomy, surface brightness (SB) quantifies the apparent brightness or flux density per unit angular area of a spatially extended object such as a galaxy or nebula, or of the night sky background. An object's surface brightness depends on its surface luminosity density, i.e., its luminosity emitted per unit surface area. In visible and infrared astronomy, surface brightness is often quoted on a magnitude scale, in magnitudes per square arcsecond (MPSAS) in a particular filter band or photometric system.
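A mean surface brightness in magnitudes per square arcsecond can be sketched as S = m + 2.5·log₁₀(A), where m is the object's integrated magnitude and A its angular area in arcsec². The galaxy values below are hypothetical, chosen only to illustrate the relation:

```python
import math

def surface_brightness(m_total, area_arcsec2):
    """Mean SB in mag/arcsec^2: S = m + 2.5*log10(area)."""
    return m_total + 2.5 * math.log10(area_arcsec2)

# Hypothetical galaxy: integrated magnitude 10 spread over 100 arcsec^2
# -> mean surface brightness of 15 mag/arcsec^2.
print(surface_brightness(10.0, 100.0))
```

The sign convention follows the magnitude scale: spreading the same light over a larger area makes each square arcsecond fainter, so S increases with A.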
Data management

Data management comprises all disciplines related to handling data as a valuable resource. The concept of data management arose in the 1980s as technology moved from sequential processing (first punched cards, then magnetic tape) to random access storage. Since it was now possible to store a discrete fact and quickly access it using random access disk technology, those suggesting that data management was more important than business process management used arguments such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems."
Belle experiment

The Belle experiment was a particle physics experiment conducted by the Belle Collaboration, an international collaboration of more than 400 physicists and engineers, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan. The experiment ran from 1999 to 2010. The Belle detector was located at the collision point of the asymmetric-energy electron–positron collider, KEKB.
Experimental uncertainty analysis

Experimental uncertainty analysis is a technique that analyses a derived quantity, based on the uncertainties in the experimentally measured quantities that are used in some form of mathematical relationship ("model") to calculate that derived quantity. The model used to convert the measurements into the derived quantity is usually based on fundamental principles of a science or engineering discipline. The uncertainty has two components, namely, bias (related to accuracy) and the unavoidable random variation that occurs when making repeated measurements (related to precision).
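The two components named above can be sketched numerically: with repeated readings of a quantity whose true value is known, the offset of the sample mean estimates the bias (accuracy), while the sample standard deviation estimates the random variation (precision). The readings and true value below are hypothetical.

```python
import statistics

# Hypothetical repeated readings of a quantity whose true value is 100.0.
readings = [100.3, 100.1, 100.4, 100.2, 100.5]
true_value = 100.0

mean = statistics.mean(readings)        # best estimate of the quantity
bias = mean - true_value                # systematic component (accuracy)
spread = statistics.stdev(readings)     # random component (precision)
sem = spread / len(readings) ** 0.5     # standard error of the mean

print(bias, spread, sem)
```

Averaging more readings shrinks the standard error of the mean but leaves the bias untouched, which is why the two components must be treated separately.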
Hamiltonian (quantum mechanics)

In quantum mechanics, the Hamiltonian of a system is an operator corresponding to the total energy of that system, including both kinetic energy and potential energy. Its spectrum, the system's energy spectrum or its set of energy eigenvalues, is the set of possible outcomes obtainable from a measurement of the system's total energy. Due to its close relation to the energy spectrum and time-evolution of a system, it is of fundamental importance in most formulations of quantum theory.
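The spectrum-as-eigenvalues idea can be sketched with the simplest non-trivial case, a two-level system whose Hamiltonian is a 2×2 real symmetric matrix; its two eigenvalues are the only possible energy-measurement outcomes. The matrix entries below are hypothetical.

```python
import math

# Two-level system (illustrative values): H = [[e1, v], [v, e2]].
# Its energy spectrum is the set of eigenvalues of H, which for a
# 2x2 real symmetric matrix are
#   E± = (e1 + e2)/2 ± sqrt(((e1 - e2)/2)**2 + v**2)

e1, e2, v = 1.0, -1.0, 0.5   # diagonal energies and coupling (hypothetical)

avg = (e1 + e2) / 2
gap = math.sqrt(((e1 - e2) / 2) ** 2 + v ** 2)
E_minus, E_plus = avg - gap, avg + gap

print(E_minus, E_plus)   # the two possible energy-measurement outcomes
```

The off-diagonal coupling v pushes the two levels apart (level repulsion): for v = 0 the spectrum is just {e1, e2}, and any non-zero v widens the gap.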