Data management comprises all disciplines related to handling data as a valuable resource. The concept of data management arose in the 1980s as technology moved from sequential processing (first punched cards, then magnetic tape) to random-access storage. Since it was now possible to store a discrete fact and quickly access it using random-access disk technology, those suggesting that data management was more important than business process management used arguments such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems."
Electron capture (K-electron capture, also K-capture, or L-electron capture, L-capture) is a process in which the proton-rich nucleus of an electrically neutral atom absorbs an inner atomic electron, usually from the K or L electron shell. The process changes a nuclear proton to a neutron and simultaneously causes the emission of an electron neutrino: proton + electron → neutron + electron neutrino, or, written as a nuclear reaction equation, ${}^{0}_{-1}\mathrm{e} + {}^{1}_{1}\mathrm{p} \rightarrow {}^{1}_{0}\mathrm{n} + {}^{0}_{0}\nu_{e}$. Since this single emitted neutrino carries the entire decay energy, it is emitted with a single characteristic energy.
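The neutrino is monoenergetic because the final state is effectively two-body with a fixed decay energy. As a sketch, the decay energy in terms of atomic masses M follows a standard textbook relation not given in the excerpt (electron binding energy and nuclear recoil are neglected here):

```latex
% Decay energy of electron capture, using atomic masses M(A, Z)
Q_{\mathrm{EC}} = \left[\, M(A, Z) - M(A, Z-1) \,\right] c^{2}
% With only the neutrino and the recoiling nucleus in the final state,
% and recoil neglected, the neutrino takes essentially all of it:
E_{\nu} \approx Q_{\mathrm{EC}}
```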
Data science is an interdisciplinary academic field that uses statistics, scientific computing, scientific methods, processes, algorithms and systems to extract or extrapolate knowledge and insights from noisy, structured, and unstructured data. Data science also integrates domain knowledge from the underlying application domain (e.g., natural sciences, information technology, and medicine). Data science is multifaceted and can be described as a science, a research paradigm, a research method, a discipline, a workflow, and a profession.
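As a toy illustration of the kind of extraction this describes, here is a minimal Python sketch that recovers a hidden trend from noisy synthetic data (everything in it is invented for illustration):

```python
import numpy as np

# Synthetic noisy observations around a hidden linear trend y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(scale=2.0, size=x.size)

# A simple statistical model recovers the underlying relationship from the noise
slope, intercept = np.polyfit(x, y, deg=1)
print(f"estimated trend: y = {slope:.2f} x + {intercept:.2f}")  # close to 2x + 1
```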
Big data primarily refers to data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many entries (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Though the term is sometimes used loosely, partly because it lacks a formal definition, the interpretation that seems to best describe big data is the one associated with a large body of information that could not be comprehended when used only in smaller amounts.
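The row/column trade-off can be made concrete with multiple testing on pure noise: with enough attributes, a fixed significance threshold alone produces spurious "discoveries". A minimal sketch using synthetic data and scipy (the sizes and the 5% threshold are illustrative assumptions):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_rows, n_cols = 100, 1000          # few entries, many attributes
data = rng.normal(size=(n_rows, n_cols))
outcome = rng.normal(size=n_rows)   # pure noise: no real effect exists

# Test every column against the outcome at the conventional 5% level
p_values = [pearsonr(data[:, j], outcome)[1] for j in range(n_cols)]
false_discoveries = sum(p < 0.05 for p in p_values)
print(f"{false_discoveries} of {n_cols} columns look 'significant'")  # ~50 expected
```

With 1,000 noise columns tested at the 5% level, roughly 50 look significant by chance alone, which is why high-dimensional analyses apply corrections such as false-discovery-rate control.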
Absolute magnitude (M) is a measure of the luminosity of a celestial object on an inverse logarithmic astronomical magnitude scale. An object's absolute magnitude is defined to be equal to the apparent magnitude that the object would have if it were viewed from a distance of exactly 10 parsecs (32.6 light-years), without extinction (or dimming) of its light due to absorption by interstellar matter and cosmic dust. By hypothetically placing all objects at a standard reference distance from the observer, their luminosities can be directly compared with one another on a magnitude scale.
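This definition yields the standard relation between apparent magnitude m, absolute magnitude M, and distance d in parsecs (a textbook formula, added here for clarity):

```latex
M = m - 5 \log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right),
\qquad
m - M = 5 \log_{10} d - 5
```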
In astronomy, surface brightness (SB) quantifies the apparent brightness or flux density per unit angular area of a spatially extended object such as a galaxy or nebula, or of the night sky background. An object's surface brightness depends on its surface luminosity density, i.e., its luminosity emitted per unit surface area. In visible and infrared astronomy, surface brightness is often quoted on a magnitude scale, in magnitudes per square arcsecond (MPSAS) in a particular filter band or photometric system.
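On that convention, the mean surface brightness S of a source with integrated apparent magnitude m spread over an angular area A in square arcseconds follows the standard magnitude-scale relation (not stated in the excerpt):

```latex
S = m + 2.5 \log_{10}\!\left(\frac{A}{\mathrm{arcsec}^{2}}\right)
```

Larger S means a fainter surface brightness, so a source of fixed total magnitude spread over a larger area has a numerically larger (dimmer) S.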
The Minimal Supersymmetric Standard Model (MSSM) is an extension to the Standard Model that realizes supersymmetry. The MSSM is the minimal supersymmetric model in that it considers only "the [minimum] number of new particle states and new interactions consistent with reality". Supersymmetry pairs bosons with fermions, so every Standard Model particle has a yet-undiscovered superpartner. If discovered, such superparticles could be candidates for dark matter, and could provide evidence for grand unification or the viability of string theory.
A test method is a method for a test in science or engineering, such as a physical test, chemical test, or statistical test. It is a definitive procedure that produces a test result. In order to ensure accurate and relevant test results, a test method should be "explicit, unambiguous, and experimentally feasible", as well as effective and reproducible. A test can be considered an observation or experiment that determines one or more characteristics of a given sample, product, process, or service.
The Belle experiment was a particle physics experiment conducted by the Belle Collaboration, an international collaboration of more than 400 physicists and engineers, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan. The experiment ran from 1999 to 2010. The Belle detector was located at the collision point of the asymmetric-energy electron–positron collider, KEKB.
In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.
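As a minimal sketch of stating a result with its uncertainty, here is Python that summarizes repeated readings by their mean and the standard deviation of the mean (the readings are invented for illustration):

```python
import numpy as np

# Hypothetical repeated readings of the same quantity (units: mm)
readings = np.array([10.03, 9.98, 10.05, 10.01, 9.97, 10.02])

mean = readings.mean()
std = readings.std(ddof=1)          # sample standard deviation
u = std / np.sqrt(readings.size)    # standard uncertainty of the mean

# A measurement result is complete only with its (non-negative) uncertainty
print(f"length = {mean:.3f} mm ± {u:.3f} mm")
```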