Ask any question about EPFL courses, lectures, exercises, research, news, etc., or try the example questions below.
DISCLAIMER: The Graph chatbot is not programmed to provide explicit or categorical answers to your questions. Instead, it turns your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to help you answer your questions.
Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound that includes part of the family of f-Divergences. The results are then applied to specific settings of interest and compared to other notable results i ...
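As background for the f-Divergence terminology used throughout these abstracts (a standard definition, recalled here for reference rather than taken from the works themselves): for a convex function $f:(0,\infty)\to\mathbb{R}$ with $f(1)=0$ and $P \ll Q$,
\[
D_f(P\,\|\,Q) \;=\; \mathbb{E}_Q\!\left[ f\!\left( \frac{\mathrm{d}P}{\mathrm{d}Q} \right) \right].
\]
Taking $f(t)=t\log t$ recovers the Kullback-Leibler Divergence, and $f(t)=\tfrac{1}{2}\lvert t-1\rvert$ the total variation distance.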
In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches, we are able to generalise them and go beyond Kullback-Leibler Divergences/Mutu ...
2022
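For orientation, the classical mutual-information generalization bound that this line of work sets out to go beyond (commonly attributed to Xu and Raginsky; recalled here as standard background, not as the result proved above) reads: if the loss $\ell(w,Z)$ is $\sigma$-sub-Gaussian under $Z\sim\mu$ for every $w$, and the hypothesis $W$ is learned from the i.i.d. sample $S=(Z_1,\dots,Z_n)$, then
\[
\bigl|\mathbb{E}[\mathrm{gen}(\mu,W)]\bigr| \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W;S)}\,.
\]
Transportation-cost inequalities offer one route to replacing the Kullback-Leibler/Mutual-Information dependence in such bounds with other divergences, which is the direction described in the abstract above.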
In this work, the probability of an event under some joint distribution is bounded by measuring it with the product of the marginals instead (which is typically easier to analyze), together with a measure of the dependence between the two random variables. ...
2021
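A minimal sketch of how a bound of this type can be obtained, using the Rényi divergence between the joint distribution and the product of its marginals (the published results rely on Sibson's α-Mutual Information and related measures, whose exact statements differ): for $\alpha>1$, Hölder's inequality gives
\[
P_{XY}(E) \;=\; \mathbb{E}_{P_X P_Y}\!\left[\frac{\mathrm{d}P_{XY}}{\mathrm{d}(P_X P_Y)}\,\mathbf{1}_E\right]
\;\le\; \exp\!\Bigl(\tfrac{\alpha-1}{\alpha}\,D_\alpha\bigl(P_{XY}\,\|\,P_X P_Y\bigr)\Bigr)\,\bigl(P_X P_Y(E)\bigr)^{\frac{\alpha-1}{\alpha}},
\]
so the event is measured under the product of the marginals, while the dependence between $X$ and $Y$ enters only through the divergence term.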
The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi α-Divergences and Sibson’s α-Mutual Information, generalizations of, respectively, the Kullback-Leibler Divergence and Shannon’s Mutual ...
ETHZ, 2020
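A small numerical sanity check of a bound of this shape on a toy finite alphabet; the joint distribution, the event, and the α values below are illustrative assumptions, and the Rényi divergence $D_\alpha(P_{XY}\,\|\,P_X P_Y)$ stands in for the papers' exact information measures.

    # Check P(E) <= exp(((a-1)/a) * D_a(P||Q)) * Q(E)**((a-1)/a) for a > 1,
    # with P the joint distribution and Q the product of its marginals.
    import numpy as np

    def renyi_divergence(p, q, a):
        """Renyi divergence D_a(p||q), a > 1, for discrete distributions (nats)."""
        mask = p > 0
        return float(np.log(np.sum(p[mask] ** a * q[mask] ** (1.0 - a))) / (a - 1.0))

    # Toy joint distribution P_XY on a 3x3 alphabet and its product of marginals.
    P_XY = np.array([[0.20, 0.05, 0.05],
                     [0.05, 0.25, 0.05],
                     [0.05, 0.05, 0.25]])
    P_XPY = np.outer(P_XY.sum(axis=1), P_XY.sum(axis=0))

    E = np.eye(3, dtype=bool)          # illustrative event: {X == Y}
    p_e, q_e = P_XY[E].sum(), P_XPY[E].sum()

    for a in (1.5, 2.0, 4.0):
        d = renyi_divergence(P_XY.ravel(), P_XPY.ravel(), a)
        bound = np.exp((a - 1) / a * d) * q_e ** ((a - 1) / a)
        print(f"alpha={a}: P_XY(E)={p_e:.4f} <= bound={bound:.4f}")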
The following problem is considered: given a joint distribution $P_{XY}$ and an event $E$, bound $P_{XY}(E)$ in terms of $P_X P_Y(E)$ (where $P_X P_Y$ is the product of the marginals of $P_{XY}$) and a measure of dependence of $X$ and $Y$. Such bounds have direct application ...
IEEE, 2019
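Sibson's α-Mutual Information tends to the maximal leakage as α → ∞; a bound in that limiting regime that can be verified directly on finite alphabets (shown as an illustration, not as the paper's exact statement) is
\[
P_{XY}(E) \;\le\; \Bigl(\sum_{y}\max_{x} P_{Y|X}(y\mid x)\Bigr)\,\max_{y} P_X(E_y)
\;=\; e^{\mathcal{L}(X\to Y)}\,\max_{y} P_X(E_y),
\qquad E_y=\{x:(x,y)\in E\},
\]
where $\mathcal{L}(X\to Y)$ denotes the maximal leakage; here the event enters through its slices $E_y$ under $P_X$ rather than through $P_X P_Y(E)$.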
Given two probability measures P and Q and an event E, we provide bounds on P(E) in terms of Q(E) and f-divergences. In particular, the bounds are instantiated when the measures considered are a joint distribution and the corresponding product of marginals ...
IEEE, 2020
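One standard route to bounds of this kind is the data-processing inequality for f-divergences (sketched here as a plausible mechanism; the paper's actual derivation may differ): processing $P$ and $Q$ through the indicator of $E$ gives, with $p=P(E)$ and $q=Q(E)$,
\[
D_f(P\,\|\,Q) \;\ge\; q\,f\!\Bigl(\frac{p}{q}\Bigr) \;+\; (1-q)\,f\!\Bigl(\frac{1-p}{1-q}\Bigr),
\]
and solving this scalar inequality for $p$ yields a bound on $P(E)$ in terms of $Q(E)$ and $D_f(P\,\|\,Q)$. Instantiating $P=P_{XY}$ and $Q=P_X P_Y$ then expresses the bound through a dependence measure between $X$ and $Y$.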
There has been growing interest in studying connections between the generalization error of learning algorithms and information measures. In this work, we generalize a result that employs the maximal leakage, a measure of information leakage, and explore ho ...
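On finite alphabets, maximal leakage has a simple closed form, $\mathcal{L}(X\to Y)=\log\sum_{y}\max_{x\,:\,P_X(x)>0} P_{Y|X}(y\mid x)$, which depends on $P_X$ only through its support. A minimal sketch of computing it (the channel matrix is an arbitrary illustrative example, not data from the paper):

    # Maximal leakage L(X -> Y) for a finite-alphabet channel, in nats,
    # assuming the input X has full support.
    import numpy as np

    def maximal_leakage(channel):
        """L(X -> Y) = log sum_y max_x P(y|x) for a row-stochastic matrix P(y|x)."""
        return float(np.log(channel.max(axis=0).sum()))

    # Rows index inputs x, columns index outputs y; each row is P(.|x).
    P_Y_given_X = np.array([[0.8, 0.1, 0.1],
                            [0.1, 0.8, 0.1],
                            [0.2, 0.2, 0.6]])

    print(f"L(X -> Y) = {maximal_leakage(P_Y_given_X):.4f} nats")
    # A noiseless channel on k symbols gives log(k); identical rows give 0.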