Linear map
In mathematics, and more specifically in linear algebra, a linear map (also called a linear mapping, linear transformation, vector space homomorphism, or in some contexts linear function) is a mapping between two vector spaces that preserves the operations of vector addition and scalar multiplication. The same names and the same definition are also used for the more general case of modules over a ring; see Module homomorphism. If a linear map is a bijection then it is called a linear isomorphism.
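A minimal numerical sketch (not from the article, names are mine): any real matrix A defines a linear map T(v) = Av, and the two defining properties can be spot-checked directly.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))                  # T: R^2 -> R^3, T(v) = A v
u, v = rng.normal(size=2), rng.normal(size=2)
c = 2.5

# Additivity: T(u + v) = T(u) + T(v)
assert np.allclose(A @ (u + v), A @ u + A @ v)
# Homogeneity: T(c u) = c T(u)
assert np.allclose(A @ (c * u), c * (A @ u))
print("both linearity properties hold (numerically)")
```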
Quadratic field
In algebraic number theory, a quadratic field is an algebraic number field of degree two over Q, the rational numbers. Every such quadratic field is some Q(√d) where d is a (uniquely defined) square-free integer different from 0 and 1. If d > 0, the corresponding quadratic field is called a real quadratic field, and, if d < 0, it is called an imaginary quadratic field or a complex quadratic field, corresponding to whether or not it is a subfield of the field of the real numbers.
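A small sketch of the classification above; the helper name classify_quadratic_field is hypothetical, and the square-free check via sympy is one possible choice, not anything prescribed by the article.

```python
from sympy import factorint

def classify_quadratic_field(d: int) -> str:
    """Classify Q(sqrt(d)) for a square-free integer d different from 0 and 1."""
    if d in (0, 1):
        raise ValueError("d must be a square-free integer other than 0 and 1")
    if any(e > 1 for e in factorint(abs(d)).values()):
        raise ValueError("d must be square-free")
    # d > 0 gives a real quadratic field, d < 0 an imaginary one
    return "real quadratic field" if d > 0 else "imaginary quadratic field"

print(classify_quadratic_field(5))    # real quadratic field, Q(sqrt(5))
print(classify_quadratic_field(-1))   # imaginary quadratic field, Q(i)
```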
Generalized least squares
In statistics, generalized least squares (GLS) is a method used to estimate the unknown parameters in a linear regression model when there is a certain degree of correlation between the residuals in the regression model. In these cases, ordinary least squares and weighted least squares can be statistically inefficient, or even give misleading inferences. GLS was first described by Alexander Aitken in 1935. In standard linear regression models one observes data on n statistical units.
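A minimal numpy sketch, under the usual assumption that the residual covariance matrix Omega is known: the GLS estimate is beta_hat = (Xᵀ Omega⁻¹ X)⁻¹ Xᵀ Omega⁻¹ y. The function name and the simulated data are mine, for illustration only.

```python
import numpy as np

def gls(X: np.ndarray, y: np.ndarray, Omega: np.ndarray) -> np.ndarray:
    """GLS coefficient estimate given a known residual covariance Omega."""
    Oinv = np.linalg.inv(Omega)
    return np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)

# Usage: with Omega = identity this reduces to ordinary least squares.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=50)
print(gls(X, y, np.eye(50)))   # roughly recovers [1.0, 2.0]
```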
M-estimator
In statistics, M-estimators are a broad class of extremum estimators for which the objective function is a sample average. Both non-linear least squares and maximum likelihood estimation are special cases of M-estimators. The definition of M-estimators was motivated by robust statistics, which contributed new types of M-estimators. However, M-estimators are not inherently robust, as is clear from the fact that they include maximum likelihood estimators, which are in general not robust.
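A minimal sketch of "objective function is a sample average", using the Huber ρ as one robust choice (the threshold k = 1.345 and all names are my assumptions, not from the article): the location estimate minimizes the average of ρ(xᵢ − θ).

```python
import numpy as np
from scipy.optimize import minimize_scalar

def huber_rho(r: np.ndarray, k: float = 1.345) -> np.ndarray:
    """Quadratic near zero, linear in the tails: downweights large residuals."""
    return np.where(np.abs(r) <= k, 0.5 * r**2, k * np.abs(r) - 0.5 * k**2)

def m_estimate_location(x: np.ndarray) -> float:
    """Minimize the sample average of rho(x_i - theta) over theta."""
    objective = lambda theta: np.mean(huber_rho(x - theta))
    return minimize_scalar(objective).x

x = np.array([0.9, 1.1, 1.0, 0.8, 1.2, 15.0])   # one gross outlier
print(m_estimate_location(x))   # close to 1, unlike the sample mean (~3.3)
```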
Errors and residuals
In statistics and optimization, errors and residuals are two closely related and easily confused measures of the deviation of an observed value of an element of a statistical sample from its "true value" (not necessarily observable). The error of an observation is the deviation of the observed value from the true value of a quantity of interest (for example, a population mean). The residual is the difference between the observed value and the estimated value of the quantity of interest (for example, a sample mean).
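A toy numeric sketch (made-up data, assuming the population mean is known only for illustration) of the distinction between errors and residuals:

```python
import numpy as np

true_mean = 10.0                      # population mean, usually unobservable
sample = np.array([9.5, 10.2, 10.9])

errors = sample - true_mean           # requires knowing the true value
residuals = sample - sample.mean()    # computable from the sample alone

print(errors)            # [-0.5  0.2  0.9]
print(residuals)         # [-0.7  0.   0.7]
print(residuals.sum())   # residuals sum to zero by construction; errors need not
```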
Studentized residual
In statistics, a studentized residual is the quotient resulting from the division of a residual by an estimate of its standard deviation. It is a form of a Student's t-statistic, with the estimate of error varying between points. This is an important technique in the detection of outliers. It is among several named in honor of William Sealy Gosset, who wrote under the pseudonym Student. Dividing a statistic by a sample standard deviation is called studentizing, in analogy with standardizing and normalizing.
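A minimal numpy sketch using the standard textbook formula for the internally studentized residual (variable names and simulated data are mine): each residual eᵢ is divided by its estimated standard deviation s·√(1 − hᵢᵢ), where hᵢᵢ is the leverage from the hat matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(20), rng.normal(size=20)])
y = X @ np.array([0.5, 1.5]) + rng.normal(size=20)

H = X @ np.linalg.solve(X.T @ X, X.T)       # hat matrix
e = y - H @ y                               # ordinary residuals
dof = X.shape[0] - X.shape[1]               # residual degrees of freedom
s = np.sqrt(e @ e / dof)                    # residual standard error

studentized = e / (s * np.sqrt(1 - np.diag(H)))
print(studentized[:5])
```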
Quadratic form
In mathematics, a quadratic form is a polynomial with terms all of degree two ("form" is another name for a homogeneous polynomial). For example, 4x² + 2xy − 3y² is a quadratic form in the variables x and y. The coefficients usually belong to a fixed field K, such as the real or complex numbers, and one speaks of a quadratic form over K. If K = R, and the quadratic form equals zero only when all variables are simultaneously zero, then it is a definite quadratic form; otherwise it is an isotropic quadratic form.
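A minimal sketch for the real case (the symmetric-matrix representation and the eigenvalue test are standard facts, not something stated in this lead): a real quadratic form q(v) = vᵀAv with symmetric A is definite exactly when all eigenvalues of A have the same sign, and otherwise takes the value zero at some nonzero vector.

```python
import numpy as np

# Symmetric matrix of 4x^2 + 2xy - 3y^2; off-diagonal entries are half the xy coefficient.
A = np.array([[4.0, 1.0],
              [1.0, -3.0]])

eigvals = np.linalg.eigvalsh(A)
if np.all(eigvals > 0) or np.all(eigvals < 0):
    print("definite quadratic form, eigenvalues:", eigvals)
else:
    print("isotropic quadratic form, eigenvalues:", eigvals)   # this example is isotropic
```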
Linear independence
In the theory of vector spaces, a set of vectors is said to be linearly independent if there exists no nontrivial linear combination of the vectors that equals the zero vector. If such a linear combination exists, then the vectors are said to be linearly dependent. These concepts are central to the definition of dimension. A vector space can be of finite dimension or infinite dimension depending on the maximum number of linearly independent vectors. The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space is linearly dependent are central to determining the dimension of a vector space.
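A minimal numpy sketch (function name is mine): finitely many vectors in Rⁿ are linearly independent exactly when the matrix having them as columns has full column rank, i.e. no nontrivial combination gives the zero vector.

```python
import numpy as np

def linearly_independent(*vectors: np.ndarray) -> bool:
    """True if the given vectors are linearly independent."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == M.shape[1]

print(linearly_independent(np.array([1., 0.]), np.array([0., 1.])))  # True
print(linearly_independent(np.array([1., 2.]), np.array([2., 4.])))  # False: second is 2x the first
```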
Reduced chi-squared statistic
In statistics, the reduced chi-square statistic is used extensively in goodness of fit testing. It is also known as mean squared weighted deviation (MSWD) in isotopic dating and variance of unit weight in the context of weighted least squares. Its square root is called regression standard error, standard error of the regression, or standard error of the equation. It is defined as chi-square per degree of freedom: χ²_ν = χ²/ν, where the chi-squared is a weighted sum of squared deviations, χ² = Σ_i (O_i − C_i)² / σ_i², with inputs: variance σ_i², observations O, and calculated data C.
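A direct sketch of the formula above (the toy data and the choice ν = n − 2 for a two-parameter fit are mine, for illustration):

```python
import numpy as np

def reduced_chi_squared(O: np.ndarray, C: np.ndarray, sigma2: np.ndarray, nu: int) -> float:
    """Weighted sum of squared deviations divided by the degrees of freedom nu."""
    chi2 = np.sum((O - C) ** 2 / sigma2)
    return chi2 / nu

O = np.array([1.1, 2.0, 2.9, 4.2])     # observations
C = np.array([1.0, 2.0, 3.0, 4.0])     # calculated (model) data
sigma2 = np.full(4, 0.04)              # variance of each observation

print(reduced_chi_squared(O, C, sigma2, nu=2))   # 4 points, 2 fitted parameters -> nu = 2
```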
Performance appraisal
A performance appraisal, also referred to as a performance review, performance evaluation, (career) development discussion, or employee appraisal, sometimes shortened to "PA", is a periodic and systematic process whereby the job performance of an employee is documented and evaluated. This is done after employees are trained about work and settle into their jobs. Performance appraisals are a part of career development and consist of regular reviews of employee performance within organizations.