Total least squares: In applied statistics, total least squares is a type of errors-in-variables regression, a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account. It is a generalization of Deming regression and also of orthogonal regression, and can be applied to both linear and non-linear models. The total least squares approximation of the data is generically equivalent to the best, in the Frobenius norm, low-rank approximation of the data matrix.
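The connection to low-rank approximation suggests the classical SVD solution: stack the predictors and the response into an augmented matrix and read the fit off the right singular vectors. The following is a minimal numpy sketch (illustrative only; the simulated data and the helper name `tls` are not from the source):

```python
import numpy as np

def tls(X, y):
    """Total least squares fit via the SVD of the augmented matrix [X | y].

    The solution comes from the right singular vectors associated with the
    smallest singular values, i.e. the best low-rank approximation of [X | y].
    """
    Z = np.column_stack([X, y])
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    V = Vt.T
    p = X.shape[1]
    Vxy = V[:p, p:]   # top-right block of the partitioned singular vectors
    Vyy = V[p:, p:]   # bottom-right block
    return -Vxy @ np.linalg.inv(Vyy)

# simulated errors-in-variables data: noise on BOTH x and y
rng = np.random.default_rng(0)
n = 200
x_true = rng.normal(size=n)
x_obs = x_true + rng.normal(scale=0.1, size=n)
y_obs = 2.0 * x_true + rng.normal(scale=0.1, size=n)

beta = tls(x_obs.reshape(-1, 1), y_obs)  # estimate of the slope 2.0
```

Unlike ordinary least squares, which is biased toward zero here (attenuation from the noise in `x_obs`), the TLS estimate is consistent under this symmetric-noise setup.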
Elastic net regularization: In statistics and, in particular, in the fitting of linear or logistic regression models, the elastic net is a regularized regression method that linearly combines the L1 and L2 penalties of the lasso and ridge methods. The elastic net method overcomes the limitations of the LASSO (least absolute shrinkage and selection operator) method, which uses a penalty function based on the L1 norm of the coefficients. Use of this penalty function has several limitations. For example, in the "large p, small n" case (high-dimensional data with few examples), the LASSO selects at most n variables before it saturates.
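The combined penalty can be minimized by coordinate descent with soft-thresholding. Below is a minimal self-contained numpy sketch of that idea (the function name, parameterization, and test data are illustrative assumptions, not taken from the source; real use would typically go through a library such as scikit-learn):

```python
import numpy as np

def soft_threshold(z, t):
    """L1 proximal step: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net(X, y, lam=0.1, alpha=0.5, n_iter=500):
    """Coordinate descent for
       (1/2n)||y - Xb||^2 + lam * (alpha*||b||_1 + (1-alpha)/2 * ||b||_2^2).
    alpha=1 gives the lasso, alpha=0 gives ridge regression.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual correlation for coordinate j
            r_j = X[:, j] @ (y - X @ b) / n + col_sq[j] * b[j]
            b[j] = soft_threshold(r_j, lam * alpha) / (col_sq[j] + lam * (1 - alpha))
    return b

# sparse ground truth: only the first 3 of 10 coefficients are nonzero
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_b = np.zeros(10)
true_b[:3] = [3.0, -2.0, 1.5]
y = X @ true_b + rng.normal(scale=0.1, size=100)

b_hat = elastic_net(X, y, lam=0.05, alpha=0.9)
```

The L1 part of the penalty drives the irrelevant coefficients exactly to zero, while the L2 part stabilizes the solution when predictors are correlated.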
Hilbert cube: In topology, the Hilbert cube is the product space [0, 1]^N equipped with the product topology, in other words: the space of sequences with values in [0, 1], equipped with the topology of pointwise convergence. By Tychonoff's theorem, it is a compact space. It is homeomorphic to the following subspace of l2: the sequences (x_n) such that 0 <= x_n <= 1/n for all n. It is therefore metrizable and consequently (since it is compact) separable, and it has the following property: every separable metrizable space is homeomorphic to a subspace of it. This provides, in particular, a convenient way to compactify separable metrizable spaces, and also a criterion for classifying them by their complexity; for example, a space is Polish if and only if it is homeomorphic to the intersection of a sequence of open subsets of the Hilbert cube K.
Robust regression: In robust statistics, robust regression seeks to overcome some limitations of traditional regression analysis. A regression analysis models the relationship between one or more independent variables and a dependent variable. Standard types of regression, such as ordinary least squares, have favourable properties if their underlying assumptions are true, but can give misleading results otherwise (i.e. are not robust to assumption violations).
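One standard robust technique is M-estimation with a Huber loss, computed by iteratively reweighted least squares (IRLS). A minimal numpy sketch follows (the helper name, tuning constant, and simulated data are illustrative assumptions, not from the source):

```python
import numpy as np

def huber_irls(X, y, delta=1.35, n_iter=50):
    """Huber-loss linear regression via iteratively reweighted least squares.

    Observations with large standardized residuals get weight delta/|u| < 1,
    so gross outliers barely influence the fit.
    """
    w = np.ones(len(y))
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        r = y - X @ beta
        # robust scale estimate via the median absolute deviation (MAD)
        scale = np.median(np.abs(r - np.median(r))) / 0.6745
        u = np.abs(r) / max(scale, 1e-12)
        w = np.where(u <= delta, 1.0, delta / u)
    return beta

# data from y = 1 + 2x with a block of gross outliers
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=100)
X = np.column_stack([np.ones(100), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=100)
y[:10] += 15.0  # 10% contamination that distorts ordinary least squares

beta_robust = huber_irls(X, y)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Comparing `beta_robust` with `beta_ols` on this contaminated sample shows the robust fit staying near the true intercept and slope while OLS is pulled toward the outliers.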
Multinomial logistic regression: In statistics, multinomial logistic regression is a classification method that generalizes logistic regression to multiclass problems, i.e. with more than two possible discrete outcomes. That is, it is a model that is used to predict the probabilities of the different possible outcomes of a categorically distributed dependent variable, given a set of independent variables (which may be real-valued, binary-valued, categorical-valued, etc.).
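The model assigns each class a linear score and maps the scores to probabilities with the softmax function; the weights are fit by minimizing the cross-entropy loss. A minimal numpy sketch, trained by plain gradient descent on simulated three-class data (all names and data here are illustrative assumptions):

```python
import numpy as np

def softmax(Z):
    """Row-wise softmax with the usual max-subtraction for numerical stability."""
    Z = Z - Z.max(axis=1, keepdims=True)
    e = np.exp(Z)
    return e / e.sum(axis=1, keepdims=True)

def fit_multinomial(X, y, n_classes, lr=0.1, n_iter=2000):
    """Multinomial logistic regression via gradient descent on cross-entropy."""
    n, p = X.shape
    W = np.zeros((p, n_classes))
    Y = np.eye(n_classes)[y]            # one-hot encoded targets
    for _ in range(n_iter):
        P = softmax(X @ W)              # predicted class probabilities
        W -= lr * X.T @ (P - Y) / n     # gradient of the mean cross-entropy
    return W

# three Gaussian blobs in 2-D as a toy multiclass problem
rng = np.random.default_rng(0)
means = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([rng.normal(m, 1.0, size=(50, 2)) for m in means])
y = np.repeat([0, 1, 2], 50)
X1 = np.column_stack([np.ones(len(X)), X])  # add an intercept column

W = fit_multinomial(X1, y, n_classes=3)
pred = softmax(X1 @ W).argmax(axis=1)
acc = float((pred == y).mean())
```

With two classes the softmax model reduces exactly to ordinary (binary) logistic regression.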
Weighted least squares: Weighted least squares (WLS), also known as weighted linear regression, is a generalization of ordinary least squares and linear regression in which knowledge of the unequal variance of observations (heteroscedasticity) is incorporated into the regression. WLS is also a specialization of generalized least squares, when all the off-diagonal entries of the covariance matrix of the errors are zero.
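In practice, WLS reduces to ordinary least squares after rescaling each row by the square root of its weight, with weights chosen as inverse variances. A minimal numpy sketch on simulated heteroscedastic data (the helper name and data are illustrative assumptions):

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: minimize sum_i w_i * (y_i - x_i'b)^2
    by rescaling each row by sqrt(w_i) and solving ordinary least squares."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

# heteroscedastic data: the noise standard deviation grows with x
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=300)
X = np.column_stack([np.ones(300), x])
sigma = 0.2 * x
y = 1.0 + 0.5 * x + rng.normal(scale=sigma)

beta = wls(X, y, w=1.0 / sigma**2)  # weight each point by its inverse variance
```

Inverse-variance weighting makes the transformed errors homoscedastic, which is exactly the condition under which ordinary least squares is efficient.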
Regression toward the mean: In statistics, regression toward the mean describes the following phenomenon: if a variable is extreme on its first measurement, it will generally be closer to the mean on its second measurement. If it is extreme on its second measurement, it will tend to have been closer to the mean on its first measurement. To avoid erroneous inferences, regression toward the mean must be considered when designing scientific experiments and taken into account when interpreting data.
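The phenomenon is easy to see in a simulation: draw two imperfectly correlated standardized measurements, select the subjects who are extreme on the first, and compare the group means. A small illustrative numpy sketch (the correlation value and cutoff are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 100_000, 0.5

# two standardized measurements (e.g. test and retest) with correlation rho
first = rng.normal(size=n)
second = rho * first + np.sqrt(1 - rho**2) * rng.normal(size=n)

extreme = first > 2.0           # subjects extreme on the first measurement
m1 = first[extreme].mean()      # well above the population mean of 0
m2 = second[extreme].mean()     # closer to the mean, near rho * m1
```

No causal "improvement" is involved: `m2 < m1` follows purely from the imperfect correlation, which is why uncontrolled before/after comparisons on extreme groups are so easily misread.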
Gelfand triple: In functional analysis, a Gelfand triple (also called a Banach-Gelfand triple, Hilbert triad, or rigged Hilbert space) is a triple of spaces consisting of a Hilbert space H, a Banach space B (or, more generally, a topological vector space), and its topological dual B'. The space B is chosen so that B is a dense subspace of H and its inclusion is continuous. This construction has the advantage that the elements of H can be expressed as elements of the dual space B' using the Frechet-Riesz representation theorem.
Generalized least squares: In statistics, generalized least squares (GLS) is a method used to estimate the unknown parameters in a linear regression model when there is a certain degree of correlation between the residuals in the regression model. In these cases, ordinary least squares and weighted least squares can be statistically inefficient, or even give misleading inferences. GLS was first described by Alexander Aitken in 1935. In standard linear regression models one observes data on n statistical units.
Multicollinearity: In statistics, multicollinearity (also collinearity) is a phenomenon in which one predictor variable in a multiple regression model can be linearly predicted from the others with a substantial degree of accuracy. In this situation, the coefficient estimates of the multiple regression may change erratically in response to small changes in the model or the data. Multicollinearity does not reduce the predictive power or reliability of the model as a whole, at least within the sample data set; it only affects calculations regarding individual predictors.
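A common diagnostic is the variance inflation factor (VIF): regress each predictor on the others and compute 1/(1 - R^2). A minimal numpy sketch (the helper name, data, and the rule-of-thumb threshold are illustrative assumptions):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X: 1 / (1 - R_j^2),
    where R_j^2 comes from regressing column j on the remaining columns
    (with an intercept). Large values flag near-collinearity."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ coef
        tss = ((X[:, j] - X[:, j].mean()) ** 2).sum()
        out[j] = tss / resid.dot(resid)   # equals 1 / (1 - R^2)
    return out

# x3 is almost a copy of x1, so x1 and x3 are nearly collinear
rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = rng.normal(size=500)
x3 = x1 + 0.05 * rng.normal(size=500)
X = np.column_stack([x1, x2, x3])

v = vif(X)  # v[0] and v[2] are very large, v[1] is near 1
```

A frequently quoted rule of thumb treats VIF values above roughly 5 or 10 as a sign that the affected coefficient estimates are unstable, consistent with the "change erratically" behavior described above.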