In probability theory and in particular in information theory, total correlation (Watanabe 1960) is one of several generalizations of the mutual information. It is also known as the multivariate constraint (Garner 1962) or multiinformation (Studený & Vejnarová 1999). It quantifies the redundancy or dependency among a set of n random variables.

For a given set of n random variables $\{X_1, X_2, \ldots, X_n\}$, the total correlation $C(X_1, X_2, \ldots, X_n)$ is defined as the Kullback–Leibler divergence from the joint distribution $p(X_1, \ldots, X_n)$ to the independent distribution $p(X_1)\,p(X_2)\cdots p(X_n)$:

$$C(X_1, X_2, \ldots, X_n) = D_{\mathrm{KL}}\!\left[\, p(X_1, \ldots, X_n) \,\big\|\, p(X_1)\,p(X_2)\cdots p(X_n) \,\right].$$

This divergence reduces to the simpler difference of entropies

$$C(X_1, X_2, \ldots, X_n) = \left[\sum_{i=1}^{n} H(X_i)\right] - H(X_1, X_2, \ldots, X_n),$$

where $H(X_i)$ is the information entropy of variable $X_i$, and $H(X_1, \ldots, X_n)$ is the joint entropy of the variable set $\{X_1, \ldots, X_n\}$. In terms of the discrete probability distributions on variables $X_1, \ldots, X_n$, the total correlation is given by

$$C(X_1, \ldots, X_n) = \sum_{x_1 \in \mathcal{X}_1} \sum_{x_2 \in \mathcal{X}_2} \cdots \sum_{x_n \in \mathcal{X}_n} p(x_1, \ldots, x_n) \, \log \frac{p(x_1, \ldots, x_n)}{p(x_1)\,p(x_2)\cdots p(x_n)}.$$

The total correlation is the amount of information shared among the variables in the set. The sum $\sum_{i=1}^{n} H(X_i)$ represents the amount of information in bits (assuming base-2 logarithms) that the variables would possess if they were totally independent of one another (non-redundant), or, equivalently, the average code length needed to transmit the values of all variables if each variable were (optimally) coded independently. The term $H(X_1, \ldots, X_n)$ is the actual amount of information that the variable set contains, or, equivalently, the average code length needed to transmit the values of all variables if the set of variables were (optimally) coded together. The difference between these terms therefore represents the absolute redundancy (in bits) present in the given set of variables, and thus provides a general quantitative measure of the structure or organization embodied in the set of variables (Rothstein 1952). The total correlation is also the Kullback–Leibler divergence between the actual distribution $p(X_1, \ldots, X_n)$ and its maximum-entropy product approximation $\prod_{i=1}^{n} p(X_i)$.

Total correlation quantifies the amount of dependence among a group of variables. A near-zero total correlation indicates that the variables in the group are essentially statistically independent: knowing the value of one variable provides no clue as to the values of the other variables.
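As an illustration (not part of the original article), the entropy-difference form above translates directly into a short computation on a discrete joint probability table. The following Python sketch, with a hypothetical helper `total_correlation`, assumes base-2 logarithms and an n-dimensional NumPy array of joint probabilities:

```python
import numpy as np

def total_correlation(joint: np.ndarray) -> float:
    """C(X1,...,Xn) = sum_i H(Xi) - H(X1,...,Xn), in bits.

    `joint` is an n-dimensional array of joint probabilities summing to 1.
    """
    joint = np.asarray(joint, dtype=float)

    # Joint entropy H(X1,...,Xn), ignoring zero-probability cells.
    p = joint[joint > 0]
    joint_entropy = -np.sum(p * np.log2(p))

    # Sum of marginal entropies: marginalize out all other axes for each variable.
    marginal_entropy_sum = 0.0
    for axis in range(joint.ndim):
        other_axes = tuple(a for a in range(joint.ndim) if a != axis)
        marginal = joint.sum(axis=other_axes)
        m = marginal[marginal > 0]
        marginal_entropy_sum += -np.sum(m * np.log2(m))

    return marginal_entropy_sum - joint_entropy

# Two perfectly correlated binary variables share exactly 1 bit.
xy = np.array([[0.5, 0.0],
               [0.0, 0.5]])
print(total_correlation(xy))      # -> 1.0

# Independent variables give (near-)zero total correlation.
indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(total_correlation(indep))   # -> ~0.0 (up to floating-point error)
```

The same quantity could equivalently be computed as the Kullback–Leibler divergence between the joint table and the outer product of its marginals; the entropy form is used here only because it mirrors the formula in the text.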

Related lectures (1)
Maximal Correlation: Information Measures
Explores maximal correlation in information theory, mutual information properties, Rényi's measures, and mathematical foundations of information theory.
Related publications (22)

From Generalisation Error to Transportation-cost Inequalities and Back

Michael Christoph Gastpar, Amedeo Roberto Esposito

In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches we are able to generalise them and go beyond Kullback–Leibler Divergences/Mutu ...
2022

A Wasserstein-based measure of conditional dependence

Negar Kiyavash, Seyed Jalal Etesami, Kun Zhang

Measuring conditional dependencies among the variables of a network is of great interest to many disciplines. This paper studies some shortcomings of the existing dependency measures in detecting direct causal influences or their lack of ability for group ...
2022

Contextual Games: Multi-Agent Learning with Side Information

Maryam Kamgarpour, Andreas Krause, Ilija Bogunovic

We formulate the novel class of contextual games, a type of repeated games driven by contextual information at each round. By means of kernel-based regularity assumptions, we model the correlation between different contexts and game outcomes and propose ...
Curran Associates, Inc., 2020
Related concepts (1)
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
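For two variables, the total correlation defined above reduces to the mutual information; in the entropy form used earlier (a standard identity, not stated on this page):

$$I(X_1; X_2) = H(X_1) + H(X_2) - H(X_1, X_2) = C(X_1, X_2).$$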
