Variation of information

In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements). It is closely related to mutual information; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the variation of information is a true metric, in that it obeys the triangle inequality.

Suppose we have two partitions $X$ and $Y$ of a set $A$ into disjoint subsets, namely $X = \{X_1, X_2, \ldots, X_k\}$ and $Y = \{Y_1, Y_2, \ldots, Y_l\}$.
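The linear expression alluded to above is $\mathrm{VI}(X; Y) = H(X) + H(Y) - 2I(X; Y) = H(X \mid Y) + H(Y \mid X)$, computed from the block probabilities $p_i = |X_i|/n$, $q_j = |Y_j|/n$, and $r_{ij} = |X_i \cap Y_j|/n$ for a ground set of $n$ elements. Below is a minimal Python sketch of this computation; the function name `variation_of_information` and the list-of-sets representation of a partition are illustrative choices, not from the source.

```python
from math import log

def variation_of_information(X, Y, n):
    """Variation of information between two partitions of an n-element set.

    X and Y are partitions given as lists of disjoint sets (blocks), each
    covering the same n elements.  Natural logarithms, so the result is
    in nats.  (Illustrative sketch; names are hypothetical.)
    """
    vi = 0.0
    for Xi in X:
        p = len(Xi) / n                    # p_i = P(element lands in block Xi)
        for Yj in Y:
            q = len(Yj) / n                # q_j = P(element lands in block Yj)
            r = len(Xi & Yj) / n           # r_ij = probability of the overlap
            if r > 0:
                # VI = -sum_ij r_ij * [log(r_ij / p_i) + log(r_ij / q_j)]
                vi -= r * (log(r / p) + log(r / q))
    return vi

# Example: two different partitions of {0, 1, 2, 3, 4}.
X = [{0, 1, 2}, {3, 4}]
Y = [{0, 1}, {2, 3, 4}]
print(variation_of_information(X, Y, n=5))   # positive, since X != Y
```

The double sum is exactly $H(X \mid Y) + H(Y \mid X)$, which vanishes precisely when the two partitions coincide, as a metric should.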
Total correlation

In probability theory and in particular in information theory, total correlation (Watanabe 1960) is one of several generalizations of the mutual information. It is also known as the multivariate constraint (Garner 1962) or multiinformation (Studený & Vejnarová 1999). It quantifies the redundancy or dependency among a set of n random variables.

For a given set of $n$ random variables $\{X_1, X_2, \ldots, X_n\}$, the total correlation $C(X_1, X_2, \ldots, X_n)$ is defined as the Kullback–Leibler divergence from the joint distribution $p(X_1, \ldots, X_n)$ to the independent distribution $p(X_1)\,p(X_2)\cdots p(X_n)$:

$$C(X_1, X_2, \ldots, X_n) = D_{\mathrm{KL}}\!\left[\, p(X_1, \ldots, X_n) \,\big\|\, p(X_1)\,p(X_2)\cdots p(X_n) \,\right].$$

This divergence reduces to the simpler difference of entropies,

$$C(X_1, X_2, \ldots, X_n) = \left[\sum_{i=1}^{n} H(X_i)\right] - H(X_1, X_2, \ldots, X_n),$$

where $H(X_i)$ is the information entropy of variable $X_i$, and $H(X_1, X_2, \ldots, X_n)$ is the joint entropy of the variable set $\{X_1, X_2, \ldots, X_n\}$.
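As an illustrative sketch of the entropy-difference form above (the function name `total_correlation` and the dense-array representation of the joint distribution are assumptions of this example, not from the source), the following Python code computes $\sum_i H(X_i) - H(X_1, \ldots, X_n)$ for a discrete joint distribution stored as an $n$-dimensional NumPy array:

```python
import numpy as np

def total_correlation(joint):
    """Total correlation of a discrete joint distribution, in nats.

    `joint` is an n-dimensional array of probabilities summing to 1,
    where axis i indexes the outcomes of variable X_i.
    (Illustrative sketch; names are hypothetical.)
    """
    joint = np.asarray(joint, dtype=float)
    p = joint[joint > 0]
    joint_entropy = -np.sum(p * np.log(p))              # H(X_1, ..., X_n)
    marginal_entropy_sum = 0.0
    for axis in range(joint.ndim):
        # Sum out every other axis to obtain the marginal p(X_i).
        others = tuple(a for a in range(joint.ndim) if a != axis)
        m = joint.sum(axis=others)
        m = m[m > 0]
        marginal_entropy_sum += -np.sum(m * np.log(m))  # H(X_i)
    return marginal_entropy_sum - joint_entropy

# Example: two perfectly correlated fair bits.  Here the total correlation
# equals the mutual information I(X_1; X_2) = log 2 ≈ 0.693 nats.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(total_correlation(joint))
```

For $n = 2$ the quantity reduces to the ordinary mutual information, which is the sense in which total correlation generalizes it.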