Sampling bias
In statistics, sampling bias is a bias in which a sample is collected in such a way that some members of the intended population have a lower or higher sampling probability than others. It results in a biased sample of a population (or non-human factors) in which not all individuals, or instances, were equally likely to have been selected. If this is not accounted for, results can be erroneously attributed to the phenomenon under study rather than to the method of sampling.
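A small simulation makes the effect concrete. The sketch below is purely illustrative (the population, the selection weights, and all numbers are hypothetical): individuals are sampled with probability increasing in their height, so the sample mean systematically overestimates the population mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: 100,000 heights (cm), roughly normal.
population = rng.normal(loc=170, scale=10, size=100_000)

# Biased sampling: selection probability increases with height,
# so tall individuals are over-represented in the sample.
weights = np.exp(0.05 * (population - population.mean()))
probs = weights / weights.sum()
biased_sample = rng.choice(population, size=1_000, replace=False, p=probs)

# Unbiased sampling: every individual is equally likely to be selected.
random_sample = rng.choice(population, size=1_000, replace=False)

print(f"population mean:           {population.mean():.1f}")
print(f"simple random sample mean: {random_sample.mean():.1f}")  # close to the population mean
print(f"biased sample mean:        {biased_sample.mean():.1f}")  # systematically too high
```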
Commuting matrices
In linear algebra, two matrices $A$ and $B$ are said to commute if $AB = BA$, or equivalently if their commutator $[A, B] = AB - BA$ is zero. A set of matrices is said to commute if they commute pairwise, meaning that every pair of matrices in the set commutes. Commuting matrices preserve each other's eigenspaces. As a consequence, commuting matrices over an algebraically closed field are simultaneously triangularizable; that is, there are bases over which they are both upper triangular.
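As a quick numerical illustration (a minimal numpy sketch; the example matrices are arbitrary), a matrix and a polynomial in that matrix always commute, so their commutator is the zero matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = 4 * np.eye(2) + 2 * A + A @ A    # B = p(A) for p(t) = 4 + 2t + t^2

commutator = A @ B - B @ A           # [A, B] = AB - BA
print(np.allclose(A @ B, B @ A))     # True: A and B commute
print(np.allclose(commutator, 0))    # True: the commutator is the zero matrix

# Both A and B are upper triangular in the same (standard) basis,
# illustrating simultaneous triangularizability.
```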
Inverse Gaussian distribution
In probability theory, the inverse Gaussian distribution (also known as the Wald distribution) is a two-parameter family of continuous probability distributions with support on $(0, \infty)$. Its probability density function is given by
$$f(x; \mu, \lambda) = \sqrt{\frac{\lambda}{2\pi x^3}} \exp\left(-\frac{\lambda (x - \mu)^2}{2\mu^2 x}\right)$$
for $x > 0$, where $\mu > 0$ is the mean and $\lambda > 0$ is the shape parameter. The inverse Gaussian distribution has several properties analogous to a Gaussian distribution.
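As a sanity check on this density (a minimal sketch, assuming numpy and scipy are available; the parameter values are arbitrary), one can evaluate the formula directly and compare it with scipy.stats.invgauss, whose shape parameter corresponds to $\mu/\lambda$ with scale $= \lambda$:

```python
import numpy as np
from scipy import stats, integrate

def wald_pdf(x, mean, lam):
    """Inverse Gaussian (Wald) density with mean `mean` and shape parameter `lam`."""
    return np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mean)**2 / (2 * mean**2 * x))

mean, lam = 2.0, 5.0                  # arbitrary illustrative parameters
x = np.linspace(0.01, 10.0, 500)

# scipy's invgauss uses the shape mu = mean/lam together with scale = lam.
scipy_pdf = stats.invgauss.pdf(x, mu=mean / lam, scale=lam)
print(np.allclose(wald_pdf(x, mean, lam), scipy_pdf))          # True

total, _ = integrate.quad(wald_pdf, 0, np.inf, args=(mean, lam))
print(total)                                                    # ~1: the density integrates to one
```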
Star domain
In geometry, a set $S$ in the Euclidean space $\mathbb{R}^n$ is called a star domain (or star-convex set, star-shaped set or radially convex set) if there exists an $s_0 \in S$ such that for all $s \in S$ the line segment from $s_0$ to $s$ lies in $S$. This definition is immediately generalizable to any real, or complex, vector space. Intuitively, if one thinks of $S$ as a region surrounded by a wall, $S$ is a star domain if one can find a vantage point $s_0$ in $S$ from which any point $s$ in $S$ is within line-of-sight. A similar, but distinct, concept is that of a radial set.
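The definition can be tested numerically for a concrete region. The sketch below is a hypothetical illustration (the L-shaped set, the candidate points, and the grid resolution are all chosen arbitrarily): it checks whether a candidate point $s_0$ sees every sampled point of the set along a straight segment.

```python
import numpy as np

def in_L_shape(p):
    """Membership test for an L-shaped region: the unit square minus its upper-right quarter."""
    x, y = p
    inside_square = 0 <= x <= 1 and 0 <= y <= 1
    in_cut_corner = x > 0.5 and y > 0.5
    return inside_square and not in_cut_corner

def is_star_center(s0, points, membership, steps=50):
    """Check that the segment from s0 to every sampled point stays inside the set."""
    ts = np.linspace(0.0, 1.0, steps)
    return all(membership((1 - t) * np.asarray(s0) + t * np.asarray(p))
               for p in points for t in ts)

# Sample points of the L-shape on a grid.
grid = [(x, y) for x in np.linspace(0, 1, 21) for y in np.linspace(0, 1, 21)
        if in_L_shape((x, y))]

print(is_star_center((0.25, 0.25), grid, in_L_shape))   # True: every point is visible from (0.25, 0.25)
print(is_star_center((0.9, 0.4), grid, in_L_shape))     # False: some segments leave the region
```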
Companion matrix
In linear algebra, the Frobenius companion matrix of the monic polynomial $p(x) = c_0 + c_1 x + \cdots + c_{n-1} x^{n-1} + x^n$ is the square matrix defined as
$$C(p) = \begin{bmatrix} 0 & 0 & \cdots & 0 & -c_0 \\ 1 & 0 & \cdots & 0 & -c_1 \\ 0 & 1 & \cdots & 0 & -c_2 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & -c_{n-1} \end{bmatrix}.$$
Some authors use the transpose of this matrix, $C(p)^T$, which is more convenient for some purposes such as linear recurrence relations (see below). $C(p)$ is defined from the coefficients of $p(x)$, while the characteristic polynomial as well as the minimal polynomial of $C(p)$ are equal to $p(x)$. In this sense, the matrix $C(p)$ and the polynomial $p(x)$ are "companions". Any matrix $A$ with entries in a field $F$ has characteristic polynomial $p(x) = \det(xI - A)$, which in turn has companion matrix $C(p)$.
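A brief numerical check (a minimal numpy sketch; the example polynomial is arbitrary) builds the companion matrix in the column convention shown above and confirms that its characteristic polynomial and eigenvalues recover $p(x)$ and its roots:

```python
import numpy as np

def companion(coeffs):
    """Companion matrix of the monic polynomial c_0 + c_1 x + ... + c_{n-1} x^{n-1} + x^n,
    where coeffs = [c_0, c_1, ..., c_{n-1}]."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)         # ones on the subdiagonal
    C[:, -1] = -np.asarray(coeffs)     # last column holds -c_0, ..., -c_{n-1}
    return C

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
c = [-6.0, 11.0, -6.0]
C = companion(c)

# np.poly returns characteristic-polynomial coefficients (highest degree first).
print(np.poly(C))             # [ 1. -6. 11. -6.]: the characteristic polynomial recovers p
print(np.linalg.eigvals(C))   # approximately 1, 2, 3: the roots of p
```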
Spectrum of a matrix
In mathematics, the spectrum of a matrix is the set of its eigenvalues. More generally, if $T \colon V \to V$ is a linear operator on any finite-dimensional vector space, its spectrum is the set of scalars $\lambda$ such that $T - \lambda I$ is not invertible. The determinant of the matrix equals the product of its eigenvalues. Similarly, the trace of the matrix equals the sum of its eigenvalues. From this point of view, we can define the pseudo-determinant of a singular matrix to be the product of its nonzero eigenvalues (the density of a multivariate normal distribution requires this quantity).
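These identities are easy to verify numerically; the following is a minimal numpy sketch with an arbitrary singular example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 0.0]])       # singular: one eigenvalue is 0

spectrum = np.linalg.eigvals(A)       # the spectrum: the set of eigenvalues
print(spectrum)                       # [2. 3. 0.]

print(np.isclose(np.linalg.det(A), np.prod(spectrum)))   # True: det = product of eigenvalues
print(np.isclose(np.trace(A), np.sum(spectrum)))         # True: trace = sum of eigenvalues

# Pseudo-determinant: product of the nonzero eigenvalues only.
nonzero = spectrum[np.abs(spectrum) > 1e-12]
print(np.prod(nonzero))               # 6.0, even though det(A) = 0
```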
Fourier transform on finite groups
In mathematics, the Fourier transform on finite groups is a generalization of the discrete Fourier transform from cyclic to arbitrary finite groups. The Fourier transform of a function $f : G \to \mathbb{C}$ at a representation $\rho$ of $G$ is
$$\widehat{f}(\rho) = \sum_{a \in G} f(a) \rho(a).$$
For each representation $\rho$ of $G$, $\widehat{f}(\rho)$ is a $d_\rho \times d_\rho$ matrix, where $d_\rho$ is the degree of $\rho$. The inverse Fourier transform at an element $a$ of $G$ is given by
$$f(a) = \frac{1}{|G|} \sum_{i} d_{\rho_i} \operatorname{tr}\!\left(\rho_i(a^{-1}) \widehat{f}(\rho_i)\right).$$
The convolution of two functions $f, g : G \to \mathbb{C}$ is defined as
$$(f * g)(a) = \sum_{b \in G} f\!\left(a b^{-1}\right) g(b).$$
The Fourier transform of a convolution at any representation $\rho$ of $G$ is given by
$$\widehat{f * g}(\rho) = \widehat{f}(\rho)\, \widehat{g}(\rho).$$
For functions $f, g : G \to \mathbb{C}$, the Plancherel formula states
$$\sum_{a \in G} f(a^{-1}) g(a) = \frac{1}{|G|} \sum_{i} d_{\rho_i} \operatorname{tr}\!\left(\widehat{f}(\rho_i) \widehat{g}(\rho_i)\right),$$
where $\rho_i$ are the irreducible representations of $G$.
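For a cyclic group $G = \mathbb{Z}/N$, every irreducible representation is a one-dimensional character, and the transform above reduces to the ordinary discrete Fourier transform. The following sketch (a minimal numpy illustration; the sequences and the group size are arbitrary) checks the convolution property $\widehat{f * g}(\rho) = \widehat{f}(\rho)\widehat{g}(\rho)$ in that case:

```python
import numpy as np

N = 8
rng = np.random.default_rng(0)
f = rng.standard_normal(N)
g = rng.standard_normal(N)

# On Z/N the irreducible representations are the characters rho_k(a) = exp(-2*pi*i*k*a/N),
# so the group Fourier transform  f_hat(rho_k) = sum_a f(a) rho_k(a)  is the usual DFT.
def group_ft(f):
    a = np.arange(N)
    return np.array([np.sum(f * np.exp(-2j * np.pi * k * a / N)) for k in range(N)])

# Group convolution (f * g)(a) = sum_b f(a - b) g(b), written additively for Z/N.
def group_conv(f, g):
    return np.array([np.sum(f[(a - np.arange(N)) % N] * g) for a in range(N)])

# Convolution property: since every d_rho = 1, the matrix product is an ordinary product.
print(np.allclose(group_ft(group_conv(f, g)), group_ft(f) * group_ft(g)))   # True

# Cross-check: this coincides with numpy's FFT of the same sequence.
print(np.allclose(group_ft(f), np.fft.fft(f)))                              # True
```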