# Explained sum of squares

Summary

In statistics, the explained sum of squares (ESS), alternatively known as the model sum of squares or sum of squares due to regression (SSR, not to be confused with the residual sum of squares (RSS) or sum of squares of errors), is a quantity used to describe how well a model, often a regression model, represents the data being modelled. In particular, the explained sum of squares measures how much variation there is in the modelled values; this is compared to the total sum of squares (TSS), which measures how much variation there is in the observed data, and to the residual sum of squares, which measures the variation in the errors between the observed data and the modelled values.
The explained sum of squares (ESS) is the sum of the squares of the deviations of the predicted values from the mean value of the response variable, in a standard regression model such as

$$y_i = a + b_1 x_{1i} + b_2 x_{2i} + \cdots + \varepsilon_i,$$

where $y_i$ is the $i$-th observation of the response variable, $x_{ji}$ is the $i$-th observation of the $j$-th explanatory variable, $a$ and $b_j$ are coefficients, $i$ indexes the observations from $1$ to $n$, and $\varepsilon_i$ is the $i$-th value of the error term. In general, the greater the ESS, the better the estimated model performs.
If $\hat{a}$ and $\hat{b}_j$ are the estimated coefficients, then

$$\hat{y}_i = \hat{a} + \hat{b}_1 x_{1i} + \hat{b}_2 x_{2i} + \cdots$$

is the $i$-th predicted value of the response variable. The ESS is then

$$\mathrm{ESS} = \sum_{i=1}^{n} \left(\hat{y}_i - \bar{y}\right)^2,$$

where $\hat{y}_i$ is the value estimated by the regression line and $\bar{y}$ is the mean of the observed values of the response variable.
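The ESS can be computed directly from the fitted values of a simple regression; a minimal NumPy sketch with made-up illustrative data:

```python
# Sketch: computing the ESS for a simple linear regression fit with NumPy.
# The data below are invented for illustration only.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit y = a + b*x by ordinary least squares (polyfit returns [slope, intercept]).
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x                      # predicted values

ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
```

A larger `ess` relative to the total variation in `y` indicates that more of the observed variation is captured by the fitted line.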
In some cases (see below): total sum of squares (TSS) = explained sum of squares (ESS) + residual sum of squares (RSS).
The following equality, stating that the total sum of squares (TSS) equals the residual sum of squares (RSS, also written SSE, the sum of squared errors of prediction) plus the explained sum of squares (ESS, also written SSR, the sum of squares due to regression), is generally true in simple linear regression:

$$\sum_{i=1}^{n} \left(y_i - \bar{y}\right)^2 = \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2 + \sum_{i=1}^{n} \left(\hat{y}_i - \bar{y}\right)^2.$$

To derive this, write $y_i - \bar{y} = (y_i - \hat{y}_i) + (\hat{y}_i - \bar{y})$. Square both sides and sum over all $i$:

$$\sum_{i=1}^{n} \left(y_i - \bar{y}\right)^2 = \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2 + \sum_{i=1}^{n} \left(\hat{y}_i - \bar{y}\right)^2 + 2\sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)\left(\hat{y}_i - \bar{y}\right).$$

The last term is zero in simple linear regression: writing $\hat{e}_i = y_i - \hat{y}_i$ for the residuals, the OLS normal equations give $\sum_i \hat{e}_i = 0$ and $\sum_i \hat{e}_i x_i = 0$, so

$$\sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)\left(\hat{y}_i - \bar{y}\right) = \hat{a}\sum_i \hat{e}_i + \hat{b}\sum_i \hat{e}_i x_i - \bar{y}\sum_i \hat{e}_i = 0.$$

Therefore,

$$\mathrm{TSS} = \mathrm{RSS} + \mathrm{ESS}.$$
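The decomposition can be checked numerically; a small NumPy sketch, using arbitrary synthetic data and an OLS fit that includes an intercept:

```python
# Sketch: numerically verifying TSS = ESS + RSS for an OLS fit with an
# intercept. The data are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 1.5 + 0.8 * x + rng.normal(scale=0.5, size=x.size)

b, a = np.polyfit(x, y, 1)     # OLS slope and intercept
y_hat = a + b * x

tss = np.sum((y - y.mean()) ** 2)
ess = np.sum((y_hat - y.mean()) ** 2)
rss = np.sum((y - y_hat) ** 2)

# Holds up to floating-point error because the fit includes an intercept,
# which makes the cross term in the derivation vanish.
assert np.isclose(tss, ess + rss)
```

Dropping the intercept from the model breaks the identity in general, since the residuals no longer sum to zero.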
The general regression model with $n$ observations and $k$ explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is

$$y = X\beta + e,$$

where $y$ is an $n \times 1$ vector of dependent variable observations, each column of the $n \times k$ matrix $X$ is a vector of observations on one of the $k$ explanators, $\beta$ is a $k \times 1$ vector of true coefficients, and $e$ is an $n \times 1$ vector of the true underlying errors.
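The matrix form can be sketched in NumPy: estimate $\beta$ via the normal equations $\hat{\beta} = (X'X)^{-1}X'y$, with a constant column as the first explanator. The coefficient values and noise level below are invented for illustration.

```python
# Sketch: the general regression model y = X beta + e, estimated by OLS
# through the normal equations. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 3
X = np.column_stack([np.ones(n),                    # constant unit vector
                     rng.normal(size=(n, k - 1))])  # other explanators
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# beta_hat = (X'X)^{-1} X'y, computed with a linear solve for stability.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat
ess = np.sum((y_hat - y.mean()) ** 2)
```

With the small noise scale used here, `beta_hat` lands close to `beta_true`; in practice one would use `np.linalg.lstsq` rather than forming $X'X$ explicitly.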


Related publications (10)

Related people (2)

Related courses (1)

Related concepts (9)

MATH-413: Statistics for data science

Statistics lies at the foundation of data science, providing a unifying theoretical and methodological backbone for the diverse tasks encountered in this emerging field. This course rigorously develops ...

Lack-of-fit sum of squares

In statistics, a sum of squares due to lack of fit, or more tersely a lack-of-fit sum of squares, is one of the components of a partition of the sum of squares of residuals in an analysis of variance, used in the numerator in an F-test of the null hypothesis that says that a proposed model fits well. The other component is the pure-error sum of squares. The pure-error sum of squares is the sum of squared deviations of each value of the dependent variable from the average value over all observations sharing its independent variable value(s).
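The pure-error sum of squares described above can be sketched directly: for each distinct value of the independent variable, sum the squared deviations of the replicated responses from their group mean. The replicated data below are hypothetical.

```python
# Sketch: pure-error sum of squares from replicated observations.
# For each distinct x value, sum squared deviations of the y replicates
# from their group mean. Data are made up for illustration.
import numpy as np

x = np.array([1, 1, 2, 2, 2, 3, 3])
y = np.array([2.0, 2.4, 3.1, 2.9, 3.0, 4.2, 3.8])

pure_error_ss = 0.0
for level in np.unique(x):
    group = y[x == level]                              # replicates at this x
    pure_error_ss += np.sum((group - group.mean()) ** 2)
```

Subtracting this quantity from the residual sum of squares of a fitted model gives the lack-of-fit component used in the F-test.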

Total sum of squares

In statistical data analysis, the total sum of squares (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. For a set of observations $y_i, i = 1, \ldots, n$, it is defined as the sum over all squared differences between the observations and their overall mean $\bar{y}$:

$$\mathrm{TSS} = \sum_{i=1}^{n} \left(y_i - \bar{y}\right)^2.$$

For wide classes of linear models, the total sum of squares equals the explained sum of squares plus the residual sum of squares. For proof of this in the multivariate OLS case, see partitioning in the general OLS model.

Partition of sums of squares

The partition of sums of squares is a concept that permeates much of inferential statistics and descriptive statistics. More properly, it is the partitioning of sums of squared deviations or errors. Mathematically, the sum of squared deviations is an unscaled, or unadjusted measure of dispersion (also called variability). When scaled for the number of degrees of freedom, it estimates the variance, or spread of the observations about their mean value.
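The scaling step mentioned above is simple to illustrate: dividing the sum of squared deviations by its degrees of freedom, $n - 1$, yields the sample variance. A short NumPy sketch with arbitrary values:

```python
# Sketch: sum of squared deviations scaled by its degrees of freedom
# (n - 1) gives the sample variance, matching np.var with ddof=1.
import numpy as np

y = np.array([4.0, 7.0, 5.0, 6.0, 8.0])
ss = np.sum((y - y.mean()) ** 2)   # unscaled measure of dispersion
var = ss / (y.size - 1)            # scaled: estimates the variance

assert np.isclose(var, np.var(y, ddof=1))
```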

Related lectures (32)

Distribution estimation

Covers the estimation of distributions using various methods such as minimum loss and expectations.

ANOVA: Partitioning the total SS

Covers the ANOVA method, focusing on partitioning the total sum of squares into treatment and error components, mean-square calculations, Fisher statistics, and the F distribution.

Model with interactions

Explains models with interactions, emphasizing the importance of factors and F-values.

This paper presents a method to verify closed-loop properties of optimization-based controllers for deterministic and stochastic constrained polynomial discrete-time dynamical systems. The closed-loop properties amenable to the proposed technique include g ...

Luiz Felippe De Alencastro, Edouard René Gilbert Lehmann, Jean-Jacques Stéphane Nfon Dibie, Morgan Fargues

This study proposes a comprehensive approach to investigate water resource contamination by pesticides under the specific climatic and hydrological conditions of the Sudano-Sahelian climate. Samples were collected from traditional wells, boreholes, and a ...

2018

Olaf Blanke, Mohamed Bouri, Oliver Alan Kannape, Atena Fadaeijouybari, Selim Jean Habiby Alaoui

Background: Sensory reafferents are crucial to correct our posture and movements, both reflexively and in a cognitively driven manner. They are also integral to developing and maintaining a sense of agency for our actions. In cases of compromised reafferen ...

2024