Related publications (35)

Lower-bounds on the Bayesian Risk in Estimation Procedures via f-Divergences

Michael Christoph Gastpar, Adrien Vandenbroucque, Amedeo Roberto Esposito

We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound that includes part of the family of f-Divergences. The results are then applied to specific settings of interest and compared to other notable results i ...
2022
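For context only (this is not the bound derived in the paper), the classical Fano inequality is the kind of KL-based lower bound that such f-divergence families generalize; the statement below assumes a uniform prior over M hypotheses and natural logarithms.

```latex
% Classical KL-based special case (Fano's inequality), shown only as context
% for the f-divergence generalizations discussed above; not the paper's bound.
% Setup: X uniform on {1,...,M}, observation Y, estimator \hat{X} = g(Y),
% error probability P_e = \Pr[\hat{X} \neq X], natural logarithms.
\[
  P_e \;\ge\; 1 - \frac{I(X;Y) + \log 2}{\log M},
  \qquad
  I(X;Y) = D_{\mathrm{KL}}\!\left(P_{XY} \,\middle\|\, P_X \otimes P_Y\right),
\]
% with the KL divergence being one member of the f-divergence family
% (f(t) = t \log t).
```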

Evaluating the Explainers: Black-Box Explainable Machine Learning for Student Success Prediction in MOOCs

Vinitra Swamy, Bahar Radmehr, Mirko Marras, Natasa Krco

Neural networks are ubiquitous in applied machine learning for education. Their pervasive success in predictive performance comes alongside a severe weakness: the lack of explainability of their decisions, which is especially relevant in human-centric fields. We im ...
2022

A Wasserstein-based measure of conditional dependence

Negar Kiyavash, Seyed Jalal Etesami, Kun Zhang

Measuring conditional dependencies among the variables of a network is of great interest to many disciplines. This paper studies shortcomings of existing dependency measures in detecting direct causal influences, as well as their lack of ability for group ...
2022
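As a rough, purely illustrative companion to the preceding entry (this is not the measure proposed in the paper), one can probe whether Y depends on X beyond what Z explains by averaging 1-D Wasserstein distances between empirical conditional distributions; the quantile binning scheme and toy data below are assumptions made for this sketch.

```python
import numpy as np
from scipy.stats import wasserstein_distance  # 1-D W1 between empirical samples

def toy_conditional_dependence(x, y, z, n_bins=5):
    """Crude proxy for dependence of Y on X given Z: average W1 distance between
    the empirical law of Y within a Z-bin and the empirical law of Y within each
    (X-bin, Z-bin) cell. Illustrative only; not the paper's measure."""
    levels = np.linspace(0, 1, n_bins + 1)[1:-1]
    x_bins = np.digitize(x, np.quantile(x, levels))
    z_bins = np.digitize(z, np.quantile(z, levels))
    scores, weights = [], []
    for zb in np.unique(z_bins):
        y_ref = y[z_bins == zb]                     # samples of Y | Z-bin
        for xb in np.unique(x_bins[z_bins == zb]):
            cell = (z_bins == zb) & (x_bins == xb)  # samples of Y | X-bin, Z-bin
            if cell.sum() > 1 and y_ref.size > 1:
                scores.append(wasserstein_distance(y[cell], y_ref))
                weights.append(cell.sum())
    return float(np.average(scores, weights=weights)) if scores else 0.0

rng = np.random.default_rng(0)
z = rng.normal(size=5000)
x = z + rng.normal(size=5000)
y_dep = x + rng.normal(size=5000)   # Y depends on X beyond Z
y_ind = z + rng.normal(size=5000)   # Y depends on X only through Z
print(toy_conditional_dependence(x, y_dep, z))  # noticeably larger
print(toy_conditional_dependence(x, y_ind, z))  # smaller: X adds little beyond Z
```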

Locally Differentially-Private Randomized Response for Discrete Distribution Learning

Michael Christoph Gastpar, Adriano Pastore

We consider a setup in which confidential i.i.d. samples X1, ..., Xn from an unknown finite-support distribution p are passed through n copies of a discrete privatization channel (a.k.a. mechanism) producing outputs Y1, ..., Yn. The channel law gua ...
2021
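To make the setup above concrete, here is a minimal sketch of the standard k-ary randomized response mechanism together with the corresponding unbiased frequency estimator; the parameter names and the specific channel are assumptions of this sketch and need not match the exact channel laws analyzed in the paper.

```python
import numpy as np

def k_randomized_response(samples, k, epsilon, rng=None):
    """Standard k-ary randomized response: report the true symbol with probability
    e^eps / (e^eps + k - 1), otherwise a uniform draw over the other k - 1 symbols.
    The resulting channel satisfies epsilon-local differential privacy."""
    rng = rng or np.random.default_rng()
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    keep = rng.random(samples.size) < p_keep
    # A uniform offset in {1, ..., k-1} taken mod k never reproduces the true symbol.
    flipped = (samples + rng.integers(1, k, size=samples.size)) % k
    return np.where(keep, samples, flipped)

def estimate_distribution(outputs, k, epsilon):
    """Unbiased estimate of the input distribution p from the privatized outputs,
    obtained by inverting the channel: E[freq_j] = p_other + (p_keep - p_other) * p_j."""
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    p_other = (1 - p_keep) / (k - 1)
    freq = np.bincount(outputs, minlength=k) / outputs.size
    return (freq - p_other) / (p_keep - p_other)

rng = np.random.default_rng(1)
p_true = np.array([0.5, 0.3, 0.15, 0.05])
x = rng.choice(4, size=100_000, p=p_true)                 # confidential samples X1, ..., Xn
y = k_randomized_response(x, k=4, epsilon=1.0, rng=rng)   # privatized outputs Y1, ..., Yn
print(estimate_distribution(y, k=4, epsilon=1.0))         # approximately recovers p_true
```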

Generalization Error Bounds Via Rényi-, f-Divergences and Maximal Leakage

Michael Christoph Gastpar, Amedeo Roberto Esposito, Ibrahim Issa

In this work, the probability of an event under some joint distribution is bounded by measuring it with the product of the marginals instead (which is typically easier to analyze) together with a measure of the dependence between the two random variables. ...
2021
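The flavour of these bounds can be conveyed by the crude max-divergence (Rényi order α → ∞) change of measure below; this is only an indicative special case, since the paper obtains sharper and more general statements via Rényi- and f-divergences and maximal leakage.

```latex
% Indicative change-of-measure sketch only; not the paper's sharpest bound.
% For any event E and any joint law P_{XY} absolutely continuous with respect
% to the product of its marginals P_X \otimes P_Y:
\[
  P_{XY}(E) \;\le\; \big(P_X \otimes P_Y\big)(E)\,
  \exp\!\Big( D_{\infty}\big(P_{XY} \,\big\|\, P_X \otimes P_Y\big) \Big),
\]
% which follows because the density dP_{XY}/d(P_X \otimes P_Y) is bounded by
% \exp(D_\infty) almost surely: the (easier) probability under the product of
% the marginals is traded against a measure of dependence between X and Y.
```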

Sequential Domain Adaptation by Synthesizing Distributionally Robust Experts

Daniel Kuhn, Viet Anh Nguyen, Bahar Taskesen

Least squares estimators, when trained on a few target domain samples, may predict poorly. Supervised domain adaptation aims to improve the predictive accuracy by exploiting additional labeled training samples from a source distribution that is close to th ...
2021
