Fisher information: In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).
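As a concrete illustration (a minimal sketch, not from the article): for a Bernoulli(θ) observation the Fisher information is 1/(θ(1−θ)), and a Monte Carlo check shows that the variance of the score and the mean of the observed information both approach this value. The sample size and the value θ = 0.3 below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    theta = 0.3                              # illustrative Bernoulli parameter
    x = rng.binomial(1, theta, size=100_000)

    # Score of one Bernoulli observation: d/dtheta log p(x | theta)
    score = x / theta - (1 - x) / (1 - theta)

    # Observed information: -d^2/dtheta^2 log p(x | theta)
    obs_info = x / theta**2 + (1 - x) / (1 - theta)**2

    print(score.var())                # variance of the score
    print(obs_info.mean())            # mean observed information
    print(1 / (theta * (1 - theta)))  # analytic Fisher information, ~4.76

Both empirical quantities converge to the same analytic value, matching the two equivalent definitions given above.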
Spectral density estimation: In statistical signal processing, the goal of spectral density estimation (SDE) or simply spectral estimation is to estimate the spectral density (also known as the power spectral density) of a signal from a sequence of time samples of the signal. Intuitively speaking, the spectral density characterizes the frequency content of the signal. One purpose of estimating the spectral density is to detect any periodicities in the data, by observing peaks at the frequencies corresponding to these periodicities.
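For instance, detecting a periodicity as a spectral peak can be sketched with Welch's method from SciPy. The 50 Hz tone, sampling rate, and segment length below are illustrative assumptions, not part of the article:

    import numpy as np
    from scipy import signal

    fs = 1000.0                              # sampling rate in Hz (illustrative)
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(1)
    x = np.sin(2 * np.pi * 50 * t) + rng.normal(scale=1.0, size=t.size)

    # Welch's method: average periodograms over overlapping segments
    f, psd = signal.welch(x, fs=fs, nperseg=2048)
    print(f[np.argmax(psd)])                 # peak near 50 Hz reveals the periodicity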
Tests of general relativity: Tests of general relativity serve to establish observational evidence for the theory of general relativity. The first three tests, proposed by Albert Einstein in 1915, concerned the "anomalous" precession of the perihelion of Mercury, the bending of light in gravitational fields, and the gravitational redshift. The precession of Mercury was already known; experiments showing light bending in accordance with the predictions of general relativity were performed in 1919, with increasingly precise measurements made in subsequent tests; and scientists claimed to have measured the gravitational redshift in 1925, although measurements sensitive enough to actually confirm the theory were not made until 1954.
Astronomical survey: An astronomical survey is a general map or image of a region of the sky (or of the whole sky) that lacks a specific observational target. Alternatively, an astronomical survey may comprise a set of images, spectra, or other observations of objects that share a common type or feature. Surveys are often restricted to one band of the electromagnetic spectrum due to instrumental limitations, although multiwavelength surveys can be made by using multiple detectors, each sensitive to a different bandwidth.
Astronomical seeing: In astronomy, seeing is the degradation of the image of an astronomical object due to turbulence in the atmosphere of Earth that may become visible as blurring, twinkling or variable distortion. The origin of this effect is rapidly changing variations of the optical refractive index along the light path from the object to the detector. Seeing is a major limitation to the angular resolution in astronomical observations with telescopes that would otherwise be limited through diffraction by the size of the telescope aperture.
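To see why seeing rather than diffraction usually limits ground-based telescopes, compare the Rayleigh diffraction limit θ ≈ 1.22 λ/D with typical seeing of about one arcsecond. A minimal sketch; the wavelength and aperture are illustrative values:

    import math

    # Rayleigh criterion: theta ~ 1.22 * lambda / D (radians)
    wavelength = 550e-9          # visible light, metres (illustrative)
    aperture = 4.0               # telescope aperture diameter, metres (illustrative)

    theta_rad = 1.22 * wavelength / aperture
    theta_arcsec = math.degrees(theta_rad) * 3600
    print(f"{theta_arcsec:.3f} arcsec")  # ~0.035"; typical seeing ~1" dominates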
Least-squares spectral analysis: Least-squares spectral analysis (LSSA) is a method of estimating a frequency spectrum based on a least-squares fit of sinusoids to data samples, similar to Fourier analysis. Fourier analysis, the most widely used spectral method in science, generally boosts long-periodic noise in long and gapped records; LSSA mitigates such problems. Unlike in Fourier analysis, data need not be equally spaced to use LSSA.
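A closely related least-squares technique for unevenly spaced data is the Lomb-Scargle periodogram, available in SciPy. A sketch under illustrative assumptions (the 0.2 Hz signal, noise level, and frequency grid are ours):

    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0, 100, size=400))   # irregular sample times
    y = np.sin(2 * np.pi * 0.2 * t) + rng.normal(scale=0.5, size=t.size)

    freqs = np.linspace(0.01, 1.0, 2000)         # trial frequencies in Hz
    omega = 2 * np.pi * freqs                    # lombscargle expects angular frequency
    power = lombscargle(t, y - y.mean(), omega)
    print(freqs[np.argmax(power)])               # peak near 0.2 Hz despite uneven spacing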
Jeffreys prior: In Bayesian probability, the Jeffreys prior, named after Sir Harold Jeffreys, is a non-informative prior distribution for a parameter space; its density function is proportional to the square root of the determinant of the Fisher information matrix: p(θ) ∝ √(det I(θ)). It has the key feature that it is invariant under a change of coordinates for the parameter vector θ. That is, the relative probability assigned to a volume of a probability space using a Jeffreys prior will be the same regardless of the parameterization used to define the Jeffreys prior.
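As a worked example (a sketch assuming the standard Bernoulli model, not anything specific to the article), SymPy can derive the Jeffreys prior for a Bernoulli parameter: the Fisher information comes out as 1/(θ(1−θ)), so the prior is proportional to θ^(−1/2)(1−θ)^(−1/2), i.e. a Beta(1/2, 1/2) distribution.

    import sympy as sp

    theta, x = sp.symbols('theta x')
    # Bernoulli log-likelihood for one observation x in {0, 1}
    loglik = x * sp.log(theta) + (1 - x) * sp.log(1 - theta)

    # Expected observed information: E[-d^2/dtheta^2 loglik], using E[x] = theta
    info = sp.simplify((-sp.diff(loglik, theta, 2)).subs(x, theta))
    print(info)               # equals 1/(theta*(1 - theta))

    jeffreys = sp.sqrt(info)  # density proportional to sqrt(det I(theta))
    print(jeffreys)           # theta**(-1/2)*(1 - theta)**(-1/2): Beta(1/2, 1/2)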
Principle of maximum entropy: The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information). Another way of stating this: Take precisely stated prior data or testable information about a probability distribution function. Consider the set of all trial probability distributions that would encode the prior data. According to this principle, the distribution with maximal information entropy is the best choice.
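A classic concrete case (Jaynes' dice problem, used here as an illustrative assumption): among all distributions on the faces {1, ..., 6} with a prescribed mean, the maximum-entropy one has exponential-family form p_i ∝ exp(λ·i), so λ can be found with a one-dimensional root solve.

    import numpy as np
    from scipy.optimize import brentq

    faces = np.arange(1, 7)
    target_mean = 4.5            # the testable information (illustrative constraint)

    def mean_gap(lam):
        """Difference between the mean of p_i ~ exp(lam * i) and the target."""
        w = np.exp(lam * faces)
        p = w / w.sum()
        return p @ faces - target_mean

    lam = brentq(mean_gap, -10, 10)   # solve for the Lagrange multiplier
    w = np.exp(lam * faces)
    p = w / w.sum()
    print(p)                     # probability tilts toward high faces to meet the mean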
Information theory: Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field, a branch of applied mathematics, lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
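The entropy of a discrete distribution is H(p) = −Σ p_i log2 p_i, measured in bits. A minimal sketch (the helper name shannon_entropy is ours, not a library function):

    import numpy as np

    def shannon_entropy(p):
        """Entropy in bits: -sum p_i log2 p_i (terms with p_i = 0 contribute 0)."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return -(nz * np.log2(nz)).sum()

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable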
Conjugate prior: In Bayesian probability theory, if the posterior distribution p(θ|x) is in the same probability distribution family as the prior probability distribution p(θ), the prior and posterior are then called conjugate distributions, and the prior is called a conjugate prior for the likelihood function p(x|θ). A conjugate prior is an algebraic convenience, giving a closed-form expression for the posterior; otherwise, numerical integration may be necessary. Further, conjugate priors may give intuition by more transparently showing how a likelihood function updates a prior distribution.
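The standard example is the Beta prior for a binomial likelihood: observing k successes in n trials turns a Beta(a, b) prior into a Beta(a + k, b + n − k) posterior. A minimal sketch (the function name and prior values are illustrative):

    # Beta prior is conjugate to the binomial likelihood:
    # Beta(a, b) prior + k successes in n trials -> Beta(a + k, b + n - k) posterior.
    def beta_binomial_update(a, b, k, n):
        return a + k, b + (n - k)

    a, b = 2.0, 2.0                    # illustrative prior pseudo-counts
    a_post, b_post = beta_binomial_update(a, b, k=7, n=10)
    print(a_post, b_post)              # (9.0, 5.0): closed form, no integration needed
    print(a_post / (a_post + b_post))  # posterior mean, ~0.643

The update is pure bookkeeping on the prior's parameters, which is exactly the algebraic convenience the paragraph describes.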