Polynomial regression
(Figure: regression on a scatter of points by polynomials of increasing degree.) Polynomial regression is a statistical analysis that describes the variation of an explained random variable in terms of a polynomial function of an explanatory random variable. It is a special case of multiple linear regression, in which the observations are built from the powers of a single variable.
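As a sketch of the idea, a degree-2 polynomial fit can be computed as an ordinary least-squares fit on the powers of a single variable; the data below is synthetic, made up for illustration:

```python
import numpy as np

# Synthetic noisy quadratic: y = 1 + 2x - 0.5x^2 + noise
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(scale=0.1, size=x.size)

# np.polyfit solves the multiple linear regression on [x^2, x, 1];
# coefficients are returned highest degree first.
coeffs = np.polyfit(x, y, deg=2)
y_hat = np.polyval(coeffs, x)
```

The recovered coefficients should be close to (-0.5, 2.0, 1.0), matching the generating polynomial up to noise.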
Quantile regression
Quantile regressions are statistical tools whose purpose is to describe the impact of explanatory variables on a variable of interest. They allow a richer description than classical linear regressions, since they consider the entire conditional distribution of the variable of interest rather than only its mean. Moreover, they can be better suited to certain types of data (censored or truncated variables, presence of extreme values, nonlinear models).
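A minimal sketch of linear quantile regression: the coefficients for quantile tau minimize the pinball (check) loss. The data and the `pinball` helper below are illustrative assumptions, and a general-purpose optimizer is used rather than the specialized linear-programming formulation:

```python
import numpy as np
from scipy.optimize import minimize

def pinball(beta, X, y, tau):
    # Check loss: tau * r for positive residuals, (tau - 1) * r for negative.
    r = y - X @ beta
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

# Synthetic linear data with symmetric noise.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 500)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)
X = np.column_stack([np.ones_like(x), x])

# Median regression is the tau = 0.5 case.
beta_med = minimize(pinball, x0=np.zeros(2), args=(X, y, 0.5),
                    method="Nelder-Mead").x
```

For serious use, a dedicated implementation such as statsmodels' `QuantReg` is the usual choice; the sketch above only illustrates the loss being minimized.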
Segmented regression
Segmented regression, also known as piecewise regression or broken-stick regression, is a method in regression analysis in which the independent variable is partitioned into intervals and a separate line segment is fit to each interval. Segmented regression analysis can also be performed on multivariate data by partitioning the various independent variables. Segmented regression is useful when the independent variables, clustered into different groups, exhibit different relationships between the variables in these regions.
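A broken-stick fit with one known breakpoint can be sketched by partitioning the data and fitting one line per interval (here the breakpoint is assumed known; estimating it is a harder problem). The data is synthetic:

```python
import numpy as np

# Synthetic data with a kink at x = 5: slope 2 before, slope -1 after.
rng = np.random.default_rng(2)
x = np.linspace(0, 10, 200)
y = np.where(x < 5, 1.0 + 2.0 * x, 11.0 - 1.0 * (x - 5)) \
    + rng.normal(scale=0.1, size=x.size)

bp = 5.0  # assumed known breakpoint
left, right = x < bp, x >= bp

# Fit a separate line segment to each interval.
slope_l, intercept_l = np.polyfit(x[left], y[left], 1)
slope_r, intercept_r = np.polyfit(x[right], y[right], 1)
```

The two fitted slopes should recover the generating slopes (about 2 and -1) on their respective intervals.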
Multinomial logistic regression
In statistics, multinomial logistic regression is a classification method that generalizes logistic regression to multiclass problems, i.e. with more than two possible discrete outcomes. That is, it is a model that is used to predict the probabilities of the different possible outcomes of a categorically distributed dependent variable, given a set of independent variables (which may be real-valued, binary-valued, categorical-valued, etc.).
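A bare-bones sketch of the model: class probabilities come from a softmax over linear scores, and the weights can be fit by gradient descent on the cross-entropy. The three synthetic clusters are made up for illustration:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over rows.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Three well-separated 2-D Gaussian clusters, one per class.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(50, 2))
               for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 50)
Y = np.eye(3)[y]  # one-hot targets

# Gradient descent on the multinomial cross-entropy.
W, b = np.zeros((2, 3)), np.zeros(3)
for _ in range(500):
    P = softmax(X @ W + b)
    W -= 0.5 * (X.T @ (P - Y)) / len(y)
    b -= 0.5 * (P - Y).mean(axis=0)

accuracy = float((softmax(X @ W + b).argmax(axis=1) == y).mean())
```

Production code would use an established implementation (e.g. scikit-learn's `LogisticRegression`, which supports the multinomial case); the loop above only shows the mechanics.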
Total least squares
In applied statistics, total least squares is a type of errors-in-variables regression, a least squares data modeling technique in which observational errors on both dependent and independent variables are taken into account. It is a generalization of Deming regression and also of orthogonal regression, and can be applied to both linear and non-linear models. The total least squares approximation of the data is generically equivalent to the best, in the Frobenius norm, low-rank approximation of the data matrix.
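For a straight line with noise in both coordinates, the orthogonal-regression special case can be sketched via the SVD: the principal direction of the centered point cloud gives the line that minimizes perpendicular distances. The data is synthetic:

```python
import numpy as np

# Synthetic line y = 3 + 2x with noise on BOTH coordinates.
rng = np.random.default_rng(4)
t = np.linspace(0, 10, 100)
x = t + rng.normal(scale=0.2, size=t.size)
y = 3.0 + 2.0 * t + rng.normal(scale=0.2, size=t.size)

# Center the cloud; the top right singular vector is the line direction
# (equivalently, the smallest singular value gives the normal direction).
xm, ym = x.mean(), y.mean()
M = np.column_stack([x - xm, y - ym])
_, _, Vt = np.linalg.svd(M, full_matrices=False)
direction = Vt[0]
slope = direction[1] / direction[0]
intercept = ym - slope * xm
```

Unlike ordinary least squares, which minimizes only vertical residuals, this fit is symmetric in x and y.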
Smoothness
In mathematical analysis, the smoothness of a function is a property measured by the number of continuous derivatives it has over some domain, called its differentiability class. At the very minimum, a function could be considered smooth if it is differentiable everywhere (hence continuous). At the other end, it might also possess derivatives of all orders in its domain, in which case it is said to be infinitely differentiable and referred to as a C-infinity function (or C^∞ function).
Convex function
(Figure: a convex function.) In mathematics, a real function of a real variable is said to be convex: if, for any two points A and B of the graph of the function, the segment [AB] lies entirely above the graph, that is, the curve representing the function always lies below its chords; or if the epigraph of the function (the set of points lying on or above its graph) is a convex set; or if, seen from below, the graph of the function forms a bump.
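The chord definition above can be checked numerically on a single interval. This is only a sketch and a necessary condition, not a proof of convexity (a full check would need every chord); the helper name `below_chord` is made up here:

```python
import numpy as np

def below_chord(f, a, b, n=101):
    # Convexity on [a, b] requires, for every t in [0, 1]:
    #   f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b)
    t = np.linspace(0.0, 1.0, n)
    pts = t * a + (1.0 - t) * b
    return bool(np.all(f(pts) <= t * f(a) + (1.0 - t) * f(b) + 1e-12))

square_ok = below_chord(lambda x: x**2, -3.0, 5.0)  # x^2 is convex
sine_ok = below_chord(np.sin, 0.0, np.pi)           # sin is concave on [0, pi]
```

The quadratic stays below its chord everywhere, while the sine on [0, π] lies above its chord, failing the test.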
Semi-differentiability
In calculus, a branch of mathematics, the notions of one-sided differentiability and semi-differentiability of a real-valued function f of a real variable are weaker than differentiability. Specifically, the function f is said to be right differentiable at a point a if, roughly speaking, a derivative can be defined as the function's argument x moves to a from the right, and left differentiable at a if the derivative can be defined as x moves to a from the left.
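The standard example is f(x) = |x| at 0: both one-sided derivatives exist but disagree, so f is semi-differentiable but not differentiable there. A sketch using one-sided difference quotients (the helper names are illustrative):

```python
# One-sided difference quotients approximating the one-sided derivatives.
def right_derivative(f, a, h=1e-7):
    return (f(a + h) - f(a)) / h

def left_derivative(f, a, h=1e-7):
    return (f(a) - f(a - h)) / h

# f(x) = |x| at a = 0: right derivative +1, left derivative -1.
d_right = right_derivative(abs, 0.0)  # → 1.0
d_left = left_derivative(abs, 0.0)    # → -1.0
```

Since the two one-sided limits differ, the ordinary derivative of |x| at 0 does not exist.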
Kernel (linear algebra)
In mathematics, the kernel of a linear map, also known as the null space or nullspace, is the linear subspace of the domain of the map which is mapped to the zero vector. That is, given a linear map L : V → W between two vector spaces V and W, the kernel of L is the vector space of all elements v of V such that L(v) = 0, where 0 denotes the zero vector in W, or more symbolically: ker(L) = {v ∈ V : L(v) = 0}. The kernel of L is a linear subspace of the domain V.
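For a matrix A representing the linear map, a basis of the kernel can be computed numerically: the right singular vectors belonging to (numerically) zero singular values span the null space. A sketch with a rank-1 matrix:

```python
import numpy as np

# A has rank 1 (second row is twice the first), so its kernel in R^3
# is 2-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
# Singular values below this tolerance are treated as zero.
tol = max(A.shape) * np.finfo(float).eps * s.max()
rank = int(np.sum(s > tol))
null_space = Vt[rank:].T  # columns: orthonormal basis of ker(A)
```

Every column v of `null_space` satisfies A v = 0, and the number of columns equals dim V minus the rank (the rank–nullity theorem).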
Regularized least squares
Regularized least squares (RLS) is a family of methods for solving the least-squares problem while using regularization to further constrain the resulting solution. RLS is used for two main reasons. The first comes up when the number of variables in the linear system exceeds the number of observations. In such settings, the ordinary least-squares problem is ill-posed and is therefore impossible to fit because the associated optimization problem has infinitely many solutions.