Hyperfine structure
In atomic physics, hyperfine structure refers to small shifts and splittings in the energy levels of atoms, molecules, and ions, due to the electromagnetic multipole interaction between the nucleus and the electron clouds. In atoms, hyperfine structure arises from the energy of the nuclear magnetic dipole moment interacting with the magnetic field generated by the electrons, and from the energy of the nuclear electric quadrupole moment in the electric field gradient due to the distribution of charge within the atom.
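For the magnetic dipole contribution, the interaction is conventionally written in terms of a hyperfine constant A (in energy units) and the total angular momentum F = I + J, which gives the familiar interval structure of the split levels:

```latex
% Magnetic dipole hyperfine interaction; A is the hyperfine constant,
% I the nuclear spin, J the electronic angular momentum.
H_{\mathrm{hfs}} = A \, \mathbf{I} \cdot \mathbf{J}, \qquad \mathbf{F} = \mathbf{I} + \mathbf{J}
% Energy shift of a level with quantum numbers F, I, J:
\Delta E_F = \frac{A}{2}\left[ F(F+1) - I(I+1) - J(J+1) \right]
```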
Zeeman effect
The Zeeman effect (ˈzeɪmən; ˈzeːmɑn) is the splitting of a spectral line into several components in the presence of a static magnetic field. It is named after the Dutch physicist Pieter Zeeman, who discovered it in 1896 and received a Nobel Prize for the discovery. It is analogous to the Stark effect, the splitting of a spectral line into several components in the presence of an electric field.
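In the weak-field limit, the size of the splitting follows a simple formula:

```latex
% Weak-field Zeeman shift of a level with magnetic quantum number m_J
% and Landé g-factor g_J; \mu_B is the Bohr magneton, B the field strength.
\Delta E = g_J \, m_J \, \mu_B B
% A level of angular momentum J therefore splits into 2J + 1
% equally spaced components.
```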
Mössbauer spectroscopy
Mössbauer spectroscopy is a spectroscopic technique based on the Mössbauer effect. This effect, discovered by Rudolf Mössbauer (sometimes written "Moessbauer"; German: "Mößbauer") in 1958, consists of the nearly recoil-free emission and absorption of nuclear gamma rays in solids. The resulting nuclear spectroscopy method is exquisitely sensitive to small changes in the chemical environment of certain nuclei.
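"Recoil-free" can be made quantitative: a free nucleus emitting a gamma ray recoils and the photon loses energy, but in a solid the whole lattice can absorb the recoil, so that for a fraction of events the loss is effectively zero:

```latex
% Recoil energy of a free nucleus of mass M emitting a gamma ray
% of energy E_\gamma; in a solid, recoilless events avoid this loss.
E_R = \frac{E_\gamma^2}{2 M c^2}
```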
Decision tree
A decision tree is a hierarchical decision-support model that uses a tree-like structure of decisions and their possible consequences, including chance-event outcomes, resource costs, and utility. It is one way to display an algorithm that contains only conditional control statements. Decision trees are commonly used in operations research, specifically in decision analysis, to help identify the strategy most likely to reach a goal, but they are also a popular tool in machine learning.
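The point that a decision tree is "an algorithm that only contains conditional control statements" can be made concrete with a toy example; the loan-approval rule and its thresholds below are illustrative assumptions, not taken from any real model:

```python
# A small decision tree written directly as conditional control
# statements. Each `if` is an internal node; each `return` is a leaf.
def approve_loan(income: float, debt_ratio: float, has_collateral: bool) -> str:
    if income >= 50_000:            # first split: income
        if debt_ratio < 0.4:        # second split: debt-to-income ratio
            return "approve"
        return "review"
    if has_collateral:              # fallback split for lower incomes
        return "review"
    return "deny"
```

Walking from the root to a leaf reproduces exactly one path of conditions, e.g. `approve_loan(60_000, 0.3, False)` follows the income and debt-ratio branches to "approve".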
Cross-validation (statistics)
Cross-validation, sometimes called rotation estimation or out-of-sample testing, is any of various similar model-validation techniques for assessing how the results of a statistical analysis will generalize to an independent data set. Cross-validation is a resampling method that uses different portions of the data to test and train a model on different iterations. It is mainly used in settings where the goal is prediction and one wants to estimate how accurately a predictive model will perform in practice.
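The "different portions on different iterations" idea is easiest to see in the common k-fold variant. A minimal sketch in plain Python (the function name and interface are assumptions for illustration): each iteration holds out one fold of indices for testing and trains on the rest.

```python
def k_fold_splits(n_samples: int, k: int):
    """Yield (train_indices, test_indices) pairs for k-fold cross-validation."""
    indices = list(range(n_samples))
    # Distribute samples as evenly as possible across the k folds.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]           # held-out fold
        train = indices[:start] + indices[start + size:]  # everything else
        yield train, test
        start += size
```

Each sample appears in exactly one test fold, so averaging the model's score over the k iterations uses every observation for both training and testing.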
Random forest
A random forest (or random decision forest) is an ensemble learning method for classification, regression, and other tasks that operates by constructing a multitude of decision trees at training time. For classification tasks, the output of the random forest is the class selected by the most trees; for regression tasks, the mean prediction of the individual trees is returned. Random decision forests correct for decision trees' habit of overfitting to their training set.
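The bootstrap-and-vote mechanism can be sketched in a few lines. This is a deliberately minimal sketch under strong assumptions: one-dimensional inputs, depth-1 trees ("stumps"), and no random feature subsampling at each split, which a real random forest would also do.

```python
import random
from collections import Counter

def train_stump(X, y):
    """Fit a depth-1 decision tree: one threshold on x minimizing
    training misclassifications."""
    labels = sorted(set(y))
    best = None  # (error, threshold, left_label, right_label)
    for t in X:
        for left in labels:
            for right in labels:
                err = sum((left if x < t else right) != yi
                          for x, yi in zip(X, y))
                if best is None or err < best[0]:
                    best = (err, t, left, right)
    _, t, left, right = best
    return lambda x, t=t, left=left, right=right: left if x < t else right

def random_forest(X, y, n_trees=25, seed=0):
    """Train n_trees stumps, each on a bootstrap sample of the data."""
    rng = random.Random(seed)
    n = len(X)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]  # sample with replacement
        trees.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    def predict(x):
        votes = Counter(tree(x) for tree in trees)
        return votes.most_common(1)[0][0]  # class selected by the most trees
    return predict
```

Because each tree sees a different bootstrap sample, individual trees vary, and the majority vote averages away much of any single tree's overfitting, e.g. `random_forest([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"])` yields a predictor separating the two clusters.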