Hilbert's Nullstellensatz: In mathematics, Hilbert's Nullstellensatz (German for "theorem of zeros", or more literally, "zero-locus-theorem") is a theorem that establishes a fundamental relationship between geometry and algebra. This relationship is the basis of algebraic geometry. It relates algebraic sets to ideals in polynomial rings over algebraically closed fields. This relationship was discovered by David Hilbert, who proved the Nullstellensatz in his second major paper on invariant theory in 1893 (following his seminal 1890 paper in which he proved Hilbert's basis theorem).
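To make the correspondence between algebraic sets and ideals precise, the strong form of the theorem can be stated as follows (a standard formulation, added here for clarity rather than quoted from the text above):

```latex
% Strong Nullstellensatz over an algebraically closed field k:
% the ideal of polynomials vanishing on V(J) is the radical of J.
\[
  I\bigl(V(J)\bigr) \;=\; \sqrt{J}
  \qquad\text{for every ideal } J \subseteq k[x_1,\dots,x_n].
\]
```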
Convex function: In mathematics, a real-valued function is called convex if the line segment between any two distinct points on the graph of the function lies above or on the graph between the two points. Equivalently, a function is convex if its epigraph (the set of points on or above the graph of the function) is a convex set. A twice-differentiable function of a single variable is convex if and only if its second derivative is nonnegative on its entire domain.
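In symbols, the segment condition is the familiar convexity inequality (standard notation, not spelled out in the paragraph above):

```latex
% f is convex on a convex domain X iff, for all x_1, x_2 in X and t in [0,1],
\[
  f\bigl(t x_1 + (1-t)\,x_2\bigr) \;\le\; t\,f(x_1) + (1-t)\,f(x_2).
\]
```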
Affine variety: In algebraic geometry, an affine algebraic set is the set of the common zeros over an algebraically closed field k of some family of polynomials in the polynomial ring k[x_1, ..., x_n]. An affine variety, or affine algebraic variety, is an affine algebraic set such that the ideal generated by the defining polynomials is prime. Some texts call any algebraic set a variety, and call an algebraic set whose defining ideal is prime (an affine variety in the above sense) an irreducible variety.
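Written out in standard notation, and following the definition given in the paragraph above, the two notions read:

```latex
% Affine algebraic set cut out by a family S of polynomials over k,
% and the condition (as stated above) for it to be an affine variety.
\[
  V(S) \;=\; \{\, a \in k^n \;:\; f(a) = 0 \ \text{for all } f \in S \,\},
\]
\[
  V(S) \ \text{is an affine variety if the ideal } \langle S \rangle \subseteq k[x_1,\dots,x_n] \ \text{is prime}.
\]
```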
Standard deviation: In statistics, the standard deviation is a measure of the amount of variation or dispersion of a set of values. A low standard deviation indicates that the values tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the values are spread out over a wider range. Standard deviation may be abbreviated SD, and is most commonly represented in mathematical texts and equations by the lower case Greek letter σ (sigma), for the population standard deviation, or the Latin letter s, for the sample standard deviation.
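A minimal sketch of the two conventions mentioned above, using Python's standard statistics module (the data values are invented for illustration):

```python
import statistics

values = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative data

# Population standard deviation (sigma): divides the sum of squares by n.
sigma = statistics.pstdev(values)

# Sample standard deviation (s): divides by n - 1 (Bessel's correction).
s = statistics.stdev(values)

print(f"population sigma = {sigma:.4f}, sample s = {s:.4f}")
```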
Affine hull: In mathematics, the affine hull or affine span of a set S in Euclidean space R^n is the smallest affine set containing S, or equivalently, the intersection of all affine sets containing S. Here, an affine set may be defined as the translation of a vector subspace. The affine hull aff(S) of S is the set of all affine combinations of elements of S, that is, aff(S) = { α1 x1 + ... + αk xk : k > 0, xi ∈ S, αi ∈ R, α1 + ... + αk = 1 }. The affine hull of the empty set is the empty set. The affine hull of a singleton (a set made of one single element) is the singleton itself.
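A small Python illustration (the helper name affine_hull_dimension is mine, not from the text): for a finite set, the dimension of the affine hull equals the rank of the differences x_i − x_0, so three non-collinear points in R^3 span a plane.

```python
import numpy as np

def affine_hull_dimension(points):
    """Dimension of aff(S) for a finite S: rank of the differences x_i - x_0.
    Returns -1 for the empty set (its affine hull is empty), 0 for a singleton."""
    pts = np.asarray(points, dtype=float)
    if pts.size == 0:
        return -1
    diffs = pts[1:] - pts[0]
    return int(np.linalg.matrix_rank(diffs)) if len(diffs) else 0

# Three non-collinear points in R^3: their affine hull is a plane (dimension 2).
print(affine_hull_dimension([[0, 0, 0], [1, 0, 0], [0, 1, 0]]))  # 2
# Two distinct points: their affine hull is the line through them (dimension 1).
print(affine_hull_dimension([[1, 1, 1], [2, 2, 2]]))  # 1
```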
Carathéodory's theorem (convex hull): Carathéodory's theorem is a theorem in convex geometry. It states that if a point x lies in the convex hull Conv(P) of a set P ⊂ R^d, then x can be written as the convex combination of at most d + 1 points in P. More sharply, x can be written as the convex combination of at most d + 1 extremal points in P, as non-extremal points can be removed from P without changing the membership of x in the convex hull. Its equivalent theorem for conical combinations states that if a point x lies in the conical hull Cone(P) of a set P ⊂ R^d, then x can be written as the conical combination of at most d points in P.
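Written out in standard notation, the convex-hull statement says:

```latex
% Caratheodory's theorem: any x in Conv(P), with P a subset of R^d,
% is a convex combination of at most d+1 points of P.
\[
  x \;=\; \sum_{i=1}^{d+1} \lambda_i\, x_i,
  \qquad x_i \in P,\ \ \lambda_i \ge 0,\ \ \sum_{i=1}^{d+1} \lambda_i = 1.
\]
```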
Bessel's correction: In statistics, Bessel's correction is the use of n − 1 instead of n in the formula for the sample variance and sample standard deviation, where n is the number of observations in a sample. This method corrects the bias in the estimation of the population variance. It also partially corrects the bias in the estimation of the population standard deviation. However, the correction often increases the mean squared error in these estimations. This technique is named after Friedrich Bessel.
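A quick numerical sketch (with illustrative parameters of my choosing): averaging many small-sample variance estimates shows that dividing by n underestimates the true population variance, while dividing by n − 1 roughly removes the bias.

```python
import random
import statistics

random.seed(0)
true_var = 4.0          # population variance of a Normal(0, 2) distribution
n, trials = 5, 20000    # small samples make the bias easy to see

biased, unbiased = 0.0, 0.0
for _ in range(trials):
    sample = [random.gauss(0.0, 2.0) for _ in range(n)]
    m = statistics.fmean(sample)
    ss = sum((x - m) ** 2 for x in sample)
    biased += ss / n          # divides by n: systematically too small
    unbiased += ss / (n - 1)  # Bessel's correction: unbiased for the variance

print(f"true variance    : {true_var}")
print(f"mean of ss/n     : {biased / trials:.3f}")    # noticeably below 4
print(f"mean of ss/(n-1) : {unbiased / trials:.3f}")  # close to 4
```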
Coefficient of determination: In statistics, the coefficient of determination, denoted R2 or r2 and pronounced "R squared", is the proportion of the variation in the dependent variable that is predictable from the independent variable(s). It is a statistic used in the context of statistical models whose main purpose is either the prediction of future outcomes or the testing of hypotheses, on the basis of other related information. It provides a measure of how well observed outcomes are replicated by the model, based on the proportion of total variation of outcomes explained by the model.
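A minimal sketch of the usual computation, R2 = 1 − SS_res / SS_tot, for a simple least-squares line (the data values are invented for illustration):

```python
import numpy as np

# Invented data with a roughly linear trend.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Ordinary least-squares line y ≈ a*x + b.
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

ss_res = np.sum((y - y_hat) ** 2)      # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)   # total sum of squares
r_squared = 1.0 - ss_res / ss_tot      # proportion of variation explained

print(f"R^2 = {r_squared:.4f}")
```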
Variance: In probability theory and statistics, variance is the expected value of the squared deviation from the mean of a random variable. Equivalently, the variance is the square of the standard deviation. Variance is a measure of dispersion, meaning it is a measure of how far a set of numbers is spread out from their average value. It is the second central moment of a distribution and the covariance of the random variable with itself, and it is often represented by σ², s², Var(X), V(X), or 𝕍(X).
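In symbols (standard notation):

```latex
% Variance as the expected squared deviation from the mean mu = E[X],
% equivalently the covariance of X with itself and the square of sigma.
\[
  \operatorname{Var}(X) \;=\; \mathbb{E}\!\left[(X - \mu)^2\right]
  \;=\; \operatorname{Cov}(X, X) \;=\; \sigma^2 .
\]
```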
Function of several complex variables: The theory of functions of several complex variables is the branch of mathematics dealing with functions defined on the complex coordinate space C^n, that is, on n-tuples of complex numbers. The field dealing with the properties of these functions is called several complex variables (and analytic space), which the Mathematics Subject Classification treats as a top-level heading. As in complex analysis of functions of one variable, which is the case n = 1, the functions studied are holomorphic or complex analytic, so that, locally, they are power series in the variables z_i.
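Locally, such an expansion takes the form below (multi-index notation, standard but not spelled out above):

```latex
% Local power-series expansion of a holomorphic f around a point a in C^n,
% using multi-indices alpha = (alpha_1, ..., alpha_n).
\[
  f(z) \;=\; \sum_{\alpha \in \mathbb{N}^n} c_\alpha \,(z - a)^\alpha,
  \qquad (z - a)^\alpha := (z_1 - a_1)^{\alpha_1} \cdots (z_n - a_n)^{\alpha_n}.
\]
```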