The interaction information is a generalization of the mutual information to more than two variables.
There are many names for interaction information, including amount of information, information correlation, co-information, and simply mutual information. Interaction information expresses the amount of information (redundancy or synergy) bound up in a set of variables, beyond that which is present in any subset of those variables. Unlike the mutual information, the interaction information can be either positive or negative. These functions, their negativity, and their minima have a direct interpretation in algebraic topology.
The conditional mutual information can be used to inductively define the interaction information for any finite number of variables as follows:

$$I(X_1;\ldots;X_{n+1}) \;=\; I(X_1;\ldots;X_n) \;-\; I(X_1;\ldots;X_n \mid X_{n+1}),$$

where

$$I(X_1;\ldots;X_n \mid X_{n+1}) \;=\; \mathbb{E}_{X_{n+1}}\!\big[\, I(X_1;\ldots;X_n) \mid X_{n+1} \,\big].$$
Some authors define the interaction information differently, by swapping the two terms being subtracted in the preceding equation. This has the effect of reversing the sign for an odd number of variables.
For three variables $\{X, Y, Z\}$, the interaction information is given by

$$I(X;Y;Z) \;=\; I(X;Y) \;-\; I(X;Y \mid Z),$$

where $I(X;Y)$ is the mutual information between variables $X$ and $Y$, and $I(X;Y \mid Z)$ is the conditional mutual information between variables $X$ and $Y$ given $Z$. The interaction information is symmetric, so it does not matter which variable is conditioned on. This is easy to see when the interaction information is written in terms of entropy and joint entropy, as follows:

$$I(X;Y;Z) \;=\; \big(H(X) + H(Y) + H(Z)\big) \;-\; \big(H(X,Y) + H(X,Z) + H(Y,Z)\big) \;+\; H(X,Y,Z).$$
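The symmetric form can be checked by substituting the standard entropy expansions of the two terms (a short derivation filling in identities not written out above):

$$I(X;Y) = H(X) + H(Y) - H(X,Y), \qquad I(X;Y \mid Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z),$$

so that their difference collects exactly the alternating sum of single, pairwise, and triple joint entropies displayed above, in which $X$, $Y$, and $Z$ enter interchangeably.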
In general, for the set of variables $\mathcal{V} = \{X_1, X_2, \ldots, X_n\}$, the interaction information can be written in the following form (compare with the Kirkwood approximation):

$$I(\mathcal{V}) \;=\; -\sum_{\emptyset \neq \mathcal{T} \subseteq \mathcal{V}} (-1)^{|\mathcal{T}|}\, H(\mathcal{T}),$$

where the sum runs over all non-empty subsets $\mathcal{T}$ of $\mathcal{V}$ and $H(\mathcal{T})$ denotes the joint entropy of the variables in $\mathcal{T}$.
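A minimal computational sketch of this inclusion–exclusion form, assuming the joint distribution is supplied as a NumPy array with one axis per variable; the helper names `joint_entropy` and `interaction_information` are illustrative and not taken from any particular library:

```python
import itertools
import numpy as np

def joint_entropy(p, keep):
    """Joint entropy (in bits) of the variables whose axes are listed in `keep`,
    obtained by marginalizing the full joint pmf `p` over all other axes."""
    drop = tuple(ax for ax in range(p.ndim) if ax not in keep)
    marg = p.sum(axis=drop) if drop else p
    q = marg[marg > 0]
    return float(-(q * np.log2(q)).sum())

def interaction_information(p):
    """Interaction information of all p.ndim variables of the joint pmf `p`,
    via I(V) = - sum over non-empty subsets T of V of (-1)**|T| * H(T)."""
    n = p.ndim
    total = 0.0
    for k in range(1, n + 1):
        for subset in itertools.combinations(range(n), k):
            total -= (-1) ** k * joint_entropy(p, subset)
    return total
```

For two variables this reduces to $H(X) + H(Y) - H(X,Y)$, i.e., the ordinary mutual information, which is a convenient sanity check on the sign convention used above.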
For three variables, the interaction information $I(X;Y;Z)$ measures the influence of the variable $Z$ on the amount of information shared between $X$ and $Y$. Because the term $I(X;Y \mid Z)$ can be larger than $I(X;Y)$, the interaction information can be negative as well as positive. This will happen, for example, when $X$ and $Y$ are independent but not conditionally independent given $Z$. Positive interaction information indicates that the variable $Z$ inhibits (i.e., accounts for or explains some of) the correlation between $X$ and $Y$, whereas negative interaction information indicates that $Z$ facilitates or enhances the correlation.
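As a concrete illustration of the negative case (a worked example, not from the text above): let $X$ and $Y$ be independent fair bits and $Z = X \oplus Y$. Then $I(X;Y) = 0$ while $I(X;Y \mid Z) = 1$ bit, so $I(X;Y;Z) = -1$ bit. The sketch below checks this numerically via the entropy form:

```python
import numpy as np

def H(pmf):
    """Shannon entropy in bits of a pmf given as a NumPy array."""
    q = pmf[pmf > 0]
    return float(-(q * np.log2(q)).sum())

# Joint pmf of (X, Y, Z) with X, Y independent fair bits and Z = X XOR Y.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

# Entropy form of the three-variable interaction information.
I_xyz = (H(p.sum(axis=(1, 2))) + H(p.sum(axis=(0, 2))) + H(p.sum(axis=(0, 1)))
         - H(p.sum(axis=2)) - H(p.sum(axis=1)) - H(p.sum(axis=0))
         + H(p))
print(I_xyz)  # -1.0: Z creates dependence between the otherwise independent X and Y
```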