Publication

Generalization Error Bounds Via Rényi-, f-Divergences and Maximal Leakage

Related publications (63)

Information Spectrum Converse for Minimum Entropy Couplings and Functional Representations

Given two jointly distributed random variables (X,Y), a functional representation of X consists of a random variable Z independent of Y and a deterministic function g(⋅,⋅) such that X=g(Y,Z). The problem of finding a minimum entropy functional representation is kn ...
2023
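The definition above can be made concrete with a small illustrative sketch (not drawn from the paper): for a doubly symmetric binary source, the noise Z = X XOR Y is independent of Y and recovers X via g(y, z) = y XOR z. The crossover probability p below is an arbitrary choice for the demo.

```python
# Illustrative sketch (not the paper's method): verify a functional
# representation X = g(Y, Z) with Z independent of Y for a doubly
# symmetric binary source.
p = 0.2  # crossover probability, arbitrary demo value

# Joint law: X ~ Bernoulli(1/2), Y = X XOR N with N ~ Bernoulli(p).
joint = {(x, y): 0.5 * (p if x != y else 1 - p) for x in (0, 1) for y in (0, 1)}

def g(y, z):
    return y ^ z  # deterministic reconstruction function

# Candidate representation: Z = X XOR Y.
pz = {z: sum(pr for (x, y), pr in joint.items() if x ^ y == z) for z in (0, 1)}
py = {y: sum(pr for (x, yy), pr in joint.items() if yy == y) for y in (0, 1)}

for (x, y), pr in joint.items():
    z = x ^ y
    assert g(y, z) == x  # X = g(Y, Z) holds pointwise
    # Independence check: P(Z=z, Y=y) == P(Z=z) * P(Y=y)
    pzy = sum(q for (xx, yy), q in joint.items() if xx ^ yy == z and yy == y)
    assert abs(pzy - pz[z] * py[y]) < 1e-12

print("functional representation verified; P(Z=1) =", pz[1])
```

This Z attains entropy h(p), but the sketch makes no claim about minimality, which is the question the paper studies.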

A KL Divergence-Based Loss for In Vivo Ultrafast Ultrasound Image Enhancement with Deep Learning

Jean-Philippe Thiran

Ultrafast ultrasound imaging, characterized by high frame rates, generates low-quality images. Convolutional neural networks (CNNs) have demonstrated great potential to enhance image quality without compromising the frame rate. However, CNNs have been most ...
2023

A Functional Perspective on Information Measures

Amedeo Roberto Esposito

Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
EPFL, 2022

Lower-bounds on the Bayesian Risk in Estimation Procedures via f–Divergences

Michael Christoph Gastpar, Adrien Vandenbroucque, Amedeo Roberto Esposito

We consider the problem of parameter estimation in a Bayesian setting and propose a general lower-bound that includes part of the family of f-Divergences. The results are then applied to specific settings of interest and compared to other notable results i ...
2022

From Generalisation Error to Transportation-cost Inequalities and Back

Michael Christoph Gastpar, Amedeo Roberto Esposito

In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches, we are able to generalise them and go beyond Kullback-Leibler Divergences/Mutu ...
2022

Feedback and Common Information: Bounds and Capacity for Gaussian Networks

Erixhen Sula

Network information theory studies the communication of information in a network and considers its fundamental limits. Motivated by the ubiquity of networks in daily life, the thesis studies the fundamental limits of particular network ...
EPFL, 2021

Is there an analog of Nesterov acceleration for gradient-based MCMC?

Nicolas Henri Bernard Flammarion, Xiang Cheng

We formulate gradient-based Markov chain Monte Carlo (MCMC) sampling as optimization on the space of probability measures, with Kullback-Leibler (KL) divergence as the objective functional. We show that an under-damped form of the Langevin algorithm perfor ...
INT STATISTICAL INST, 2021
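As background for the abstract above, here is a minimal sketch of the plain *overdamped* Langevin algorithm that the paper's under-damped variant builds on (assumptions: a standard Gaussian target and a hand-picked step size; this is not the paper's accelerated method).

```python
import math
import random

# Unadjusted (overdamped) Langevin: x' = x + h * grad(log pi)(x) + sqrt(2h) * xi,
# which samples approximately from pi when h is small.
def grad_log_density(x):
    return -x  # gradient of log pi for pi = N(0, 1)

def langevin(n_steps=50000, step=0.05, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x = x + step * grad_log_density(x) + math.sqrt(2 * step) * rng.gauss(0, 1)
        samples.append(x)
    return samples

samples = langevin()
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"empirical mean {mean:.2f}, variance {var:.2f}")
```

The empirical mean and variance should land near 0 and 1; the paper's framing treats this kind of sampler as gradient descent on KL divergence over the space of probability measures, with the under-damped form playing the role of an accelerated method.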

Common Information Components Analysis

Michael Christoph Gastpar, Erixhen Sula

Wyner's common information is a measure that quantifies and assesses the commonality between two random variables. Based on this, we introduce a novel two-step procedure to construct features from data, referred to as Common Information Components Analysis ...
MDPI, 2021

Locally Differentially-Private Randomized Response for Discrete Distribution Learning

Michael Christoph Gastpar, Adriano Pastore

We consider a setup in which confidential i.i.d. samples X1, ..., Xn from an unknown finite-support distribution p are passed through n copies of a discrete privatization channel (a.k.a. mechanism) producing outputs Y1, ..., Yn. The channel law gua ...
2021
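A concrete instance of the kind of privatization channel described above is standard k-ary randomized response, a classic ε-LDP mechanism (the parameters below are illustrative, not taken from the paper): each sample is reported truthfully with probability e^ε/(e^ε + k − 1) and otherwise replaced by a uniformly random other symbol.

```python
import math
import random

# Sketch of k-ary randomized response, a classic epsilon-LDP mechanism
# resembling the setup in the abstract (illustrative parameters only).
def randomized_response(x, k, eps, rng):
    keep_prob = math.exp(eps) / (math.exp(eps) + k - 1)
    if rng.random() < keep_prob:
        return x  # report the true symbol
    return rng.choice([s for s in range(k) if s != x])  # lie uniformly

rng = random.Random(1)
k, eps = 4, 1.0
ys = [randomized_response(0, k, eps, rng) for _ in range(100000)]

# The true symbol 0 should appear with frequency about e^1 / (e^1 + 3) ~ 0.475.
freq0 = ys.count(0) / len(ys)
print(f"empirical P(Y=0 | X=0) = {freq0:.3f}")
```

Any two inputs produce each output with probabilities within a factor e^ε of one another, which is exactly the local differential privacy guarantee the abstract refers to.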

Robust Generalization via α-Mutual Information

Michael Christoph Gastpar, Amedeo Roberto Esposito, Ibrahim Issa

The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi α-Divergences and Sibson's α-Mutual Information, a generalization of respectively the Kullback-Leibler Divergence and Shannon's Mutual ...
ETHZ, 2020
