Ion source
An ion source is a device that creates atomic and molecular ions. Ion sources are used to form ions for mass spectrometers, optical emission spectrometers, particle accelerators, ion implanters and ion engines.

Electron ionization
Electron ionization is widely used in mass spectrometry, particularly for organic molecules. The gas-phase reaction producing electron ionization is
$$\mathrm{M} + \mathrm{e}^- \longrightarrow \mathrm{M}^{+\bullet} + 2\,\mathrm{e}^-,$$
where M is the atom or molecule being ionized, $\mathrm{e}^-$ is the electron, and $\mathrm{M}^{+\bullet}$ is the resulting ion.
J-coupling
In nuclear chemistry and nuclear physics, J-couplings (also called spin-spin coupling or indirect dipole–dipole coupling) are mediated through chemical bonds connecting two spins. It is an indirect interaction between two nuclear spins that arises from hyperfine interactions between the nuclei and local electrons. In NMR spectroscopy, J-coupling contains information about relative bond distances and angles. Most importantly, J-coupling provides information on the connectivity of chemical bonds.
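In NMR spectroscopy the isotropic part of this interaction is commonly written as an additional term in the spin Hamiltonian. A standard textbook form (the coupling constant $J_{12}$ and the spin operators $\hat{\mathbf{I}}_1$, $\hat{\mathbf{I}}_2$ are generic symbols, not quantities defined in the excerpt above) is
$$\hat{H}_J/\hbar = 2\pi J_{12}\,\hat{\mathbf{I}}_1 \cdot \hat{\mathbf{I}}_2,$$
with $J_{12}$ in hertz. Because the coupling is transmitted through the bonding electrons, its magnitude falls off with the number of intervening bonds, which is what makes it a probe of connectivity.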
Cyclotron resonance
Cyclotron resonance describes the interaction of external forces with charged particles in a magnetic field, which are therefore already moving on a circular path. It is named after the cyclotron, a cyclic particle accelerator that uses an oscillating electric field tuned to this resonance to add kinetic energy to charged particles. The cyclotron frequency or gyrofrequency is the frequency of a charged particle moving perpendicular to the direction of a uniform magnetic field B (constant magnitude and direction).
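For a particle of charge $q$ and mass $m$ in a field of magnitude $B$, the cyclotron frequency is $f_c = qB/(2\pi m)$ (angular frequency $\omega_c = qB/m$), independent of the particle's speed in the non-relativistic limit. A small illustrative calculation, with an arbitrarily chosen 1 T field and a helper function named only for this sketch:

```python
from scipy.constants import e, m_e, m_p, pi

def cyclotron_frequency(q, m, B):
    """Cyclotron frequency f_c = q*B / (2*pi*m), in hertz."""
    return q * B / (2 * pi * m)

B = 1.0  # tesla, an arbitrary example field
print(f"electron: {cyclotron_frequency(e, m_e, B):.3e} Hz")  # ~2.8e10 Hz (28 GHz)
print(f"proton:   {cyclotron_frequency(e, m_p, B):.3e} Hz")  # ~1.5e7 Hz (15 MHz)
```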
Fourier inversion theorem
In mathematics, the Fourier inversion theorem says that for many types of functions it is possible to recover a function from its Fourier transform. Intuitively it may be viewed as the statement that if we know all frequency and phase information about a wave then we may reconstruct the original wave precisely. The theorem says that if we have a function $f : \mathbb{R} \to \mathbb{C}$ satisfying certain conditions, and we use the convention for the Fourier transform that
$$(\mathcal{F}f)(\xi) := \int_{\mathbb{R}} e^{-2\pi i y \xi}\, f(y)\, dy,$$
then
$$f(x) = \int_{\mathbb{R}} e^{2\pi i x \xi}\, (\mathcal{F}f)(\xi)\, d\xi.$$
In other words, the theorem says that
$$f(x) = \int_{\mathbb{R}} \int_{\mathbb{R}} e^{2\pi i (x-y) \xi}\, f(y)\, dy\, d\xi.$$
This last equation is called the Fourier integral theorem.
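The discrete analogue of this statement can be checked numerically: applying the inverse discrete Fourier transform to the DFT of a signal returns the original signal. A minimal sketch with NumPy (the random signal is arbitrary; numpy.fft implements the discrete transform, not the continuous integrals above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(256)          # an arbitrary real-valued signal

X = np.fft.fft(x)                     # forward discrete Fourier transform
x_back = np.fft.ifft(X)               # inverse transform recovers the signal

print(np.allclose(x, x_back.real))    # True, up to floating-point error
```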
Ion trap
An ion trap is a combination of electric and/or magnetic fields used to capture charged particles — known as ions — often in a system isolated from an external environment. Atomic and molecular ion traps have a number of applications in physics and chemistry such as precision mass spectrometry, improved atomic frequency standards, and quantum computing. In comparison to neutral atom traps, ion traps have deeper trapping potentials (up to several electronvolts) that do not depend on the internal electronic structure of a trapped ion.
Fractional Fourier transform
In mathematics, in the area of harmonic analysis, the fractional Fourier transform (FRFT) is a family of linear transformations generalizing the Fourier transform. It can be thought of as the Fourier transform to the n-th power, where n need not be an integer — thus, it can transform a function to any intermediate domain between time and frequency. Its applications range from filter design and signal analysis to phase retrieval and pattern recognition.
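Conventions for the FRFT differ in normalization and sign; one widely used definition, for transform order $a$ with rotation angle $\alpha = a\pi/2$ (and $\alpha$ not a multiple of $\pi$), is
$$\mathcal{F}_\alpha[f](u) = \sqrt{\frac{1 - i\cot\alpha}{2\pi}} \int_{-\infty}^{\infty} \exp\!\left( i\,\frac{u^2 + t^2}{2}\cot\alpha - i\,u t\csc\alpha \right) f(t)\,dt.$$
At $\alpha = \pi/2$ this reduces to the ordinary (unitary, angular-frequency) Fourier transform, and $a = 0$ is taken to be the identity; the exact prefactor should be read as convention-dependent rather than as the single canonical form.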
Cyclotron
A cyclotron is a type of particle accelerator invented by Ernest Lawrence in 1929–1930 at the University of California, Berkeley, and patented in 1932. A cyclotron accelerates charged particles outwards from the center of a flat cylindrical vacuum chamber along a spiral path. The particles are held to a spiral trajectory by a static magnetic field and accelerated by a rapidly varying electric field. Lawrence was awarded the 1939 Nobel Prize in Physics for this invention. The cyclotron was the first "cyclical" accelerator.
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
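For discrete variables with joint distribution $p(x,y)$ and marginals $p(x)$, $p(y)$, mutual information can be written as $I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}$. A minimal sketch computing this for a small joint-probability table (the table values are invented for illustration):

```python
import numpy as np

def mutual_information(p_xy, base=2.0):
    """Mutual information I(X;Y), in the given log base, for a joint pmf array p_xy."""
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal distribution of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal distribution of Y (row vector)
    mask = p_xy > 0                          # treat 0 * log(0) as 0
    ratio = p_xy[mask] / (p_x @ p_y)[mask]   # p(x,y) / (p(x) p(y)) where p(x,y) > 0
    return float(np.sum(p_xy[mask] * np.log(ratio)) / np.log(base))

# Example: perfectly correlated binary variables share 1 bit of information.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(mutual_information(p))  # 1.0
```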
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is
$$\mathrm{H}(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$
where $\Sigma$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
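A minimal sketch of this definition, evaluated in the three bases mentioned above (the distributions are arbitrary examples):

```python
import math

def entropy(pmf, base=2.0):
    """Shannon entropy of a discrete pmf, skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

fair_coin = [0.5, 0.5]
biased_coin = [0.9, 0.1]

print(entropy(fair_coin))                # 1.0 bit
print(entropy(biased_coin))              # ~0.469 bits
print(entropy(fair_coin, base=math.e))   # ~0.693 nats
print(entropy(fair_coin, base=10.0))     # ~0.301 hartleys
```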
Selected-ion flow-tube mass spectrometry
Selected-ion flow-tube mass spectrometry (SIFT-MS) is a quantitative mass spectrometry technique for trace gas analysis which involves the chemical ionization of trace volatile compounds by selected positive precursor ions during a well-defined time period along a flow tube. Absolute concentrations of trace compounds present in air, breath or the headspace of bottled liquid samples can be calculated in real time from the ratio of the product and precursor ion signals, without the need for sample preparation or calibration with standard mixtures.
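The quantification relies on pseudo-first-order ion–molecule kinetics: with the precursor ions in large excess, the product ion signal grows in proportion to the analyte concentration. In a simplified form that ignores corrections such as differential ion diffusion and mass discrimination (the symbols here are generic placeholders, not quantities defined in the excerpt above), the analyte number density $[\mathrm{M}]$ follows approximately as
$$[\mathrm{M}] \approx \frac{I_{\mathrm{product}}}{I_{\mathrm{precursor}}\;k\;t_r},$$
where $I$ denotes the respective ion count rates, $k$ the rate coefficient of the precursor–analyte reaction, and $t_r$ the reaction time along the flow tube.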