Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
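As a minimal illustration of the idea, the sketch below assumes the unknown parameter is the mean of a noisy measurement process and uses the sample mean as the estimator; the numerical values are made up for the example.

```python
import numpy as np

# Minimal sketch: the unknown parameter is the mean of a noisy measurement
# process, and the sample mean serves as its estimator. The true value and
# noise level below are illustrative assumptions, not from the text.
rng = np.random.default_rng(0)
true_mean, noise_std, n_samples = 3.7, 1.2, 500

measurements = true_mean + noise_std * rng.standard_normal(n_samples)

estimate = measurements.mean()                          # sample-mean estimator
std_error = measurements.std(ddof=1) / np.sqrt(n_samples)

print(f"estimate = {estimate:.3f} +/- {std_error:.3f} (true value {true_mean})")
```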
In mathematics, a symmetric polynomial is a polynomial P(X1, X2, ..., Xn) in n variables, such that if any of the variables are interchanged, one obtains the same polynomial. Formally, P is a symmetric polynomial if for any permutation σ of the subscripts 1, 2, ..., n one has P(Xσ(1), Xσ(2), ..., Xσ(n)) = P(X1, X2, ..., Xn). Symmetric polynomials arise naturally in the study of the relation between the roots of a polynomial in one variable and its coefficients, since the coefficients can be given by polynomial expressions in the roots, and all roots play a similar role in this setting.
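As an illustration of the relation between roots and coefficients, here is the two-variable case written out; the notation e_1, e_2 for the elementary symmetric polynomials is introduced only for this example.

```latex
% Two-variable example: the coefficients of a monic quadratic are, up to sign,
% the elementary symmetric polynomials of its roots X_1, X_2.
\[
  (t - X_1)(t - X_2) = t^2 - (X_1 + X_2)\,t + X_1 X_2
                     = t^2 - e_1(X_1, X_2)\,t + e_2(X_1, X_2),
\]
\[
  e_1(X_1, X_2) = X_1 + X_2, \qquad e_2(X_1, X_2) = X_1 X_2 .
\]
% Both e_1 and e_2 are unchanged by swapping X_1 and X_2, i.e. they are symmetric.
```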
In control theory, a state observer or state estimator is a system that provides an estimate of the internal state of a given real system, from measurements of the input and output of the real system. It is typically computer-implemented, and provides the basis of many practical applications. Knowing the system state is necessary to solve many control theory problems; for example, stabilizing a system using state feedback. In most practical cases, the physical state of the system cannot be determined by direct observation.
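The sketch below shows one common observer structure, a discrete-time Luenberger observer, for an assumed two-state linear system; the matrices A, B, C and the gain L are illustrative values, not taken from any particular plant.

```python
import numpy as np

# Minimal sketch of a discrete-time Luenberger observer. The system matrices
# A, B, C and the observer gain L are illustrative assumptions; in practice
# L is designed (e.g. by pole placement) so that A - L @ C is stable.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.8]])

x = np.array([[1.0], [0.0]])      # true (unobserved) state
x_hat = np.zeros((2, 1))          # observer's estimate

for k in range(50):
    u = np.array([[0.1]])                          # known input
    y = C @ x                                      # measured output
    # observer update: predict with the model, correct with the output error
    x_hat = A @ x_hat + B @ u + L @ (y - C @ x_hat)
    x = A @ x + B @ u                              # true system evolves

print("estimation error:", (x - x_hat).ravel())
```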
Phase-contrast imaging is a method of imaging that has a range of different applications. It measures differences in the refractive index of different materials to differentiate between structures under analysis. In conventional light microscopy, phase contrast can be employed to distinguish between structures of similar transparency, and to examine crystals on the basis of their double refraction. This has uses in biological, medical and geological science.
In mathematics, the binomial coefficients are the positive integers that occur as coefficients in the binomial theorem. Commonly, a binomial coefficient is indexed by a pair of integers n ≥ k ≥ 0 and is written $\binom{n}{k}$. It is the coefficient of the $x^k$ term in the polynomial expansion of the binomial power $(1 + x)^n$; this coefficient can be computed by the multiplicative formula $\binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k(k-1)\cdots 1}$, which using factorial notation can be compactly expressed as $\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$. For example, the fourth power of $1 + x$ is $(1 + x)^4 = 1 + 4x + 6x^2 + 4x^3 + x^4$, and the binomial coefficient $\binom{4}{2} = 6$ is the coefficient of the $x^2$ term.
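A small sketch of the multiplicative formula above (the function name binomial is chosen just for this example):

```python
def binomial(n: int, k: int) -> int:
    """Binomial coefficient n choose k via the multiplicative formula."""
    if k < 0 or k > n:
        return 0
    k = min(k, n - k)                         # exploit symmetry to shorten the product
    result = 1
    for i in range(1, k + 1):
        result = result * (n - k + i) // i    # stays an exact integer at each step
    return result

print(binomial(4, 2))   # 6, the coefficient of x^2 in (1 + x)^4
```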
A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process (call it X) with unobservable ("hidden") states. As part of the definition, HMM requires that there be an observable process Y whose outcomes are "influenced" by the outcomes of X in a known way.
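A minimal sketch of this definition, assuming a two-state hidden process X, a two-symbol observable process Y, and made-up transition and emission probabilities:

```python
import numpy as np

# Minimal sketch of an HMM: a hidden Markov chain X and an observable
# process Y whose distribution depends only on the current hidden state.
# The transition and emission probabilities are illustrative values.
rng = np.random.default_rng(0)

transition = np.array([[0.9, 0.1],      # P(X_{t+1} | X_t)
                       [0.2, 0.8]])
emission = np.array([[0.7, 0.3],        # P(Y_t | X_t)
                     [0.1, 0.9]])

x = 0                                   # initial hidden state
hidden, observed = [], []
for t in range(10):
    y = int(rng.choice(2, p=emission[x]))     # observation influenced by hidden state
    hidden.append(x)
    observed.append(y)
    x = int(rng.choice(2, p=transition[x]))   # hidden state evolves as a Markov chain

print("hidden:  ", hidden)
print("observed:", observed)
```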
In mathematics, the Gaussian binomial coefficients (also called Gaussian coefficients, Gaussian polynomials, or q-binomial coefficients) are q-analogs of the binomial coefficients. The Gaussian binomial coefficient, written as $\binom{n}{k}_q$ or $\begin{bmatrix} n \\ k \end{bmatrix}_q$, is a polynomial in q with integer coefficients, whose value when q is set to a prime power counts the number of subspaces of dimension k in a vector space of dimension n over $\mathbb{F}_q$, a finite field with q elements; i.e. it is the number of points in the finite Grassmannian $\mathrm{Gr}(k, \mathbb{F}_q^n)$.
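As a sketch, the polynomial can be built from the standard q-analog recurrence $\binom{n}{k}_q = \binom{n-1}{k-1}_q + q^k \binom{n-1}{k}_q$; in the example below polynomials are represented as coefficient lists, and the function name is chosen just for the example.

```python
def gaussian_binomial(n: int, k: int) -> list[int]:
    """Coefficients (lowest degree first) of the q-binomial [n choose k]_q,
    computed with the recurrence [n,k]_q = [n-1,k-1]_q + q^k [n-1,k]_q."""
    if k < 0 or k > n:
        return [0]
    if k == 0 or k == n:
        return [1]
    a = gaussian_binomial(n - 1, k - 1)
    b = gaussian_binomial(n - 1, k)
    out = [0] * (k * (n - k) + 1)          # the degree of [n,k]_q is k(n-k)
    for i, c in enumerate(a):
        out[i] += c
    for i, c in enumerate(b):
        out[i + k] += c                    # shift b by q^k
    return out

# [4 choose 2]_q = 1 + q + 2q^2 + q^3 + q^4; at q = 2 it equals 35,
# the number of 2-dimensional subspaces of F_2^4.
coeffs = gaussian_binomial(4, 2)
print(coeffs)                                        # [1, 1, 2, 1, 1]
print(sum(c * 2**i for i, c in enumerate(coeffs)))   # 35
```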
In mathematics and particularly in dynamic systems, an initial condition, in some contexts called a seed value, is a value of an evolving variable at some point in time designated as the initial time (typically denoted t = 0). For a system of order k (the number of time lags in discrete time, or the order of the largest derivative in continuous time) and dimension n (that is, with n different evolving variables, which together can be denoted by an n-dimensional coordinate vector), generally nk initial conditions are needed in order to trace the system's variables forward through time.
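A small illustration of the count nk, assuming one evolving variable (n = 1) with two time lags (k = 2), so two initial conditions are needed; the Fibonacci-type recurrence is used only as an example.

```python
# One evolving variable (n = 1) with two time lags (k = 2): the recurrence
# x[t] = x[t-1] + x[t-2] needs n*k = 2 initial conditions to determine its path.
def trace(x0: float, x1: float, steps: int) -> list[float]:
    xs = [x0, x1]                      # the two required initial conditions
    for _ in range(steps):
        xs.append(xs[-1] + xs[-2])
    return xs

print(trace(0, 1, 8))                  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
# Different initial conditions give a different trajectory of the same system:
print(trace(2, 1, 8))                  # [2, 1, 3, 4, 7, 11, 18, 29, 47, 76]
```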
In mathematics, a coefficient is a multiplicative factor involved in some term of a polynomial, a series, or an expression. It may be a number (dimensionless), in which case it is known as a numerical factor. It may also be a constant with units of measurement, in which case it is known as a constant multiplier. In general, coefficients may be any expression (including variables such as a, b and c). When the combination of variables and constants is not necessarily involved in a product, it may be called a parameter.
Robust statistics are statistics with good performance for data drawn from a wide range of probability distributions, especially for distributions that are not normal. Robust statistical methods have been developed for many common problems, such as estimating location, scale, and regression parameters. One motivation is to produce statistical methods that are not unduly affected by outliers. Another motivation is to provide methods with good performance when there are small departures from a parametric distribution.
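As a small illustration of the outlier motivation, with made-up data: the median, a robust estimate of location, is far less affected by a single outlier than the mean.

```python
import statistics

# Made-up sample with one gross outlier. The mean shifts substantially,
# while the median (a robust estimator of location) barely moves.
clean = [9.8, 10.1, 9.9, 10.2, 10.0, 9.7, 10.3]
contaminated = clean + [100.0]          # a single outlier

print(statistics.mean(clean), statistics.median(clean))                # 10.0, 10.0
print(statistics.mean(contaminated), statistics.median(contaminated)) # 21.25, 10.05
```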