Mutual Information Disentangles Interactions from Changing Environments
Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
EPFL, 2022
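As a minimal, self-contained reminder of the two measures this abstract cites as born from a concrete problem, the Python sketch below computes Shannon entropy and mutual information for a small discrete joint distribution; the pmf values are made up for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    p = p[p > 0]                      # 0 log 0 = 0 by convention
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D array."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Example: a binary symmetric channel with crossover 0.1 and uniform input.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(mutual_information(pxy))        # about 0.531 bits
```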
Biochemistry, ecology, and neuroscience are examples of prominent fields aiming at describing interacting systems that exhibit nontrivial couplings to complex, ever-changing environments. We have recently shown that linear interactions and a switching envi ...
American Physical Society, 2022
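The paper's decomposition is not reproduced here, but the effect it addresses is easy to simulate: a shared switching environment induces mutual information between two variables even in the absence of any interaction. A minimal sketch, with all dynamics and parameter values made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100_000, coupling=0.0, switching=False):
    """Two AR(1) variables with optional linear coupling and a shared
    two-state environment that switches the noise amplitude."""
    x = np.zeros(n); y = np.zeros(n)
    sigma = 1.0
    for t in range(1, n):
        if switching and rng.random() < 0.01:   # rare environment switches
            sigma = 3.0 if sigma == 1.0 else 1.0
        x[t] = 0.8 * x[t-1] + coupling * y[t-1] + sigma * rng.normal()
        y[t] = 0.8 * y[t-1] + coupling * x[t-1] + sigma * rng.normal()
    return x, y

def mi_histogram(x, y, bins=40):
    """Plug-in mutual-information estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

print(mi_histogram(*simulate(coupling=0.15)))                   # interactions only
print(mi_histogram(*simulate(switching=True)))                  # environment only
print(mi_histogram(*simulate(coupling=0.15, switching=True)))   # both
```

The first two estimates are both positive for different reasons (coupling versus shared environment), which is exactly the ambiguity the paper's decomposition is meant to resolve.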
This paper presents explicit solutions for two related non-convex information extremization problems due to Gray and Wyner in the Gaussian case. The first problem is the Gray-Wyner network subject to a sum-rate constraint on the two private links. Here, ou ...
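The explicit solutions derived in the paper are not reproduced here. For context, the sketch below evaluates two standard closed-form quantities for a bivariate Gaussian with correlation rho: Shannon's mutual information and Wyner's common information, the quantity that the Gray-Wyner network operationally characterizes. The correlation values are arbitrary.

```python
import numpy as np

def gaussian_mi(rho):
    """I(X;Y) in nats for a bivariate Gaussian with correlation rho."""
    return -0.5 * np.log(1.0 - rho**2)

def wyner_common_information(rho):
    """Wyner's common information for a bivariate Gaussian,
    C(X;Y) = 0.5 * log((1 + |rho|) / (1 - |rho|)), a known closed form."""
    r = abs(rho)
    return 0.5 * np.log((1.0 + r) / (1.0 - r))

for rho in (0.3, 0.7, 0.9):
    print(rho, gaussian_mi(rho), wyner_common_information(rho))
```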
Due to conservative approaches in construction design and practice, infrastructure often has hidden reserve capacity. When quantified, this reserve has potential to improve decisions related to asset management. Field measurements, collected through load t ...
We are living in the era of "Big Data", an era characterized by a voluminous amount of available data. This abundance is mainly due to continuing advances in the computational capabilities for capturing, storing, transmitting and processing data. However, ...
We examine a class of stochastic deep learning models with a tractable method to compute information-theoretic quantities. Our contributions are three-fold: (i) We show how entropies and mutual informations can be derived from heuristic statistical physics ...
Neural Information Processing Systems (NIPS), 2018
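The heuristic statistical-physics formulas from the paper are not reproduced here. As a baseline, the sketch below covers the one case where such quantities are exactly tractable: a single stochastic linear layer with Gaussian input and Gaussian noise, where I(X;Y) has a log-determinant closed form. The layer width and noise level are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

def linear_gaussian_mi(W, sigma):
    """I(X;Y) in nats for Y = W X + N with X ~ N(0, I) and N ~ N(0, sigma^2 I).
    Closed form: 0.5 * logdet(I + W W^T / sigma^2)."""
    m = W.shape[0]
    gram = W @ W.T / sigma**2
    _, logdet = np.linalg.slogdet(np.eye(m) + gram)
    return 0.5 * logdet

W = rng.normal(size=(20, 50)) / np.sqrt(50)   # made-up layer weights
print(linear_gaussian_mi(W, sigma=0.5))
```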
The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi α-Divergences and Sibson’s α-Mutual Information, generalizations, respectively, of the Kullback-Leibler Divergence and Shannon’s Mutual ...
ETHZ, 2020
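A minimal sketch of Sibson's α-mutual information for a discrete joint pmf, evaluated from its standard closed form; the example pmf and α values are arbitrary. As α → 1 the quantity recovers Shannon's mutual information, and as α → ∞ it approaches maximal leakage.

```python
import numpy as np

def sibson_mi(pxy, alpha):
    """Sibson's alpha-mutual information (nats) of a discrete joint pmf:
    I_alpha = alpha/(alpha-1) * log sum_y ( sum_x P(x) P(y|x)^alpha )^(1/alpha)."""
    px = pxy.sum(axis=1, keepdims=True)
    py_given_x = pxy / px                            # row i holds P(y | x = i)
    inner = np.sum(px * py_given_x**alpha, axis=0)   # sum over x, one term per y
    return alpha / (alpha - 1.0) * np.log(np.sum(inner ** (1.0 / alpha)))

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
for a in (1.5, 2.0, 10.0):
    print(a, sibson_mi(pxy, a))
```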
We rigorously derive a single-letter variational expression for the mutual information of the asymmetric two-groups stochastic block model in the dense graph regime. Existing proofs in the literature are indirect, as they involve mapping the model to a ran ...
IEEE, 2019
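The single-letter variational expression itself is beyond a short sketch; the code below only samples the model whose mutual information (between the group labels and the adjacency matrix) the paper characterizes: a dense, asymmetric two-group stochastic block model. All parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(2)

def two_group_sbm(n, p_in_a, p_in_b, p_out, frac_a=0.3):
    """Sample a dense, asymmetric two-group stochastic block model.
    Returns the group labels and the symmetric adjacency matrix."""
    labels = (rng.random(n) < frac_a).astype(int)   # 1 = group A, 0 = group B
    # Edge probability depends on the group memberships of each pair.
    P = np.where(labels[:, None] == labels[None, :],
                 np.where(labels[:, None] == 1, p_in_a, p_in_b),
                 p_out)
    upper = np.triu(rng.random((n, n)) < P, k=1)    # independent upper-triangle edges
    A = (upper | upper.T).astype(int)
    return labels, A

labels, A = two_group_sbm(500, p_in_a=0.6, p_in_b=0.5, p_out=0.2)
print(A.mean())   # dense regime: edge density stays of order one as n grows
```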
The following problem is considered: given a joint distribution P_XY and an event E, bound P_XY(E) in terms of P_X P_Y(E) (where P_X P_Y is the product of the marginals of P_XY) and a measure of dependence of X and Y. Such bounds have direct application ...
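One elementary member of this family of bounds can be checked numerically: since P_XY(x, y) ≤ exp(I_∞) · P_X(x) P_Y(y) pointwise, where I_∞ is the worst-case log ratio between P_XY and P_X P_Y, it follows that P_XY(E) ≤ exp(I_∞) · P_X P_Y(E) for every event E. A sketch with a made-up joint pmf and a random event:

```python
import numpy as np

rng = np.random.default_rng(3)

# A small made-up joint pmf P_XY over a 3x3 alphabet.
pxy = np.array([[0.30, 0.05, 0.05],
                [0.05, 0.30, 0.05],
                [0.02, 0.03, 0.15]])
px = pxy.sum(axis=1, keepdims=True)
py = pxy.sum(axis=0, keepdims=True)
pxpy = px * py                           # product of the marginals

E = rng.random(pxy.shape) < 0.4          # an arbitrary event: a set of (x, y) pairs

# Worst-case log density ratio between P_XY and P_X P_Y.
i_max = np.log(np.max(pxy / pxpy))

p_joint = pxy[E].sum()
p_prod = pxpy[E].sum()
print(p_joint, "<=", np.exp(i_max) * p_prod)   # holds for every event E
```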