Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
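The Kullback-Leibler divergence between two Gaussian mixture models has no closed-form expression, which is what motivates lower and upper bounds in the first place; in practice such bounds are typically compared against a Monte Carlo estimate. Below is a minimal sketch of such an estimator in Python; the helper names (`gmm_logpdf`, `gmm_sample`, `kl_gmm_mc`) and the mixture parameterization are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def gmm_logpdf(x, weights, means, covs):
    # Log-density of the mixture sum_k w_k N(mean_k, cov_k) at points x of shape (n, d).
    comp = np.stack([
        np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
        for w, m, c in zip(weights, means, covs)
    ])  # shape (K, n)
    return logsumexp(comp, axis=0)

def gmm_sample(rng, n, weights, means, covs):
    # Draw n samples: pick a component index, then sample from that Gaussian.
    ks = rng.choice(len(weights), size=n, p=weights)
    return np.stack([rng.multivariate_normal(means[k], covs[k]) for k in ks])

def kl_gmm_mc(rng, f, g, n=20_000):
    # Monte Carlo estimate of D(f || g) = E_f[log f(X) - log g(X)];
    # f and g are (weights, means, covs) triples.
    x = gmm_sample(rng, n, *f)
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))

# Example: two one-dimensional mixtures.
rng = np.random.default_rng(0)
f = (np.array([0.5, 0.5]), np.array([[-2.0], [2.0]]), np.array([[[1.0]], [[1.0]]]))
g = (np.array([1.0]), np.array([[0.0]]), np.array([[[4.0]]]))
print(kl_gmm_mc(rng, f, g))
```

With enough samples the estimator converges to the true divergence, which is the quantity that cheap analytical lower and upper bounds are meant to bracket.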
We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound that encompasses part of the family of f-Divergences. The results are then applied to specific settings of interest and compared to other notable results i ...
2022
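For context on the divergence family invoked in the abstract above (stated as standard background, not a formula from the paper): given a convex function f with f(1) = 0, the f-divergence between measures P and Q is

$$D_f(P\|Q) = \mathbb{E}_Q\!\left[f\!\left(\frac{dP}{dQ}\right)\right],$$

which recovers the Kullback-Leibler divergence for f(t) = t log t, the total variation distance for f(t) = |t - 1|/2, and the χ²-divergence for f(t) = t² - 1.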
Many scientific systems are studied using computer codes that simulate the phenomena of interest. Computer simulation enables scientists to study a broad range of possible conditions, generating large quantities of data at a faster rate than the laboratory ...
In this work, we connect the problem of bounding the expected generalisation error with transportation-cost inequalities. Exposing the underlying pattern behind both approaches, we are able to generalise them and go beyond Kullback-Leibler Divergences/Mutu ...
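A representative bound of the family this work generalises is the mutual-information bound of Xu and Raginsky: if the loss ℓ(w, Z) is σ-sub-Gaussian under the data distribution for every w, and W is the hypothesis learned from the sample S of n i.i.d. points, then

$$\bigl|\mathbb{E}[\operatorname{gen}(W,S)]\bigr| \;\le\; \sqrt{\frac{2\sigma^2\, I(W;S)}{n}}.$$

Replacing the KL-based quantity I(W;S) with other divergence measures is, per the abstract, the direction pursued here.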
In this work, the probability of an event under some joint distribution is bounded by measuring it with the product of the marginals instead (which is typically easier to analyze) together with a measure of the dependence between the two random variables. ...
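One concrete instance of this idea, stated here as standard background rather than the paper's exact theorem: by Hölder's inequality, for any α > 1 and any event E (with P absolutely continuous with respect to Q),

$$P(E) \;\le\; Q(E)^{\frac{\alpha-1}{\alpha}} \exp\!\left(\frac{\alpha-1}{\alpha}\, D_\alpha(P\|Q)\right),$$

so taking P = P_{XY} and Q = P_X ⊗ P_Y bounds the joint probability of E by the product-of-marginals probability times a factor quantifying the dependence between the two random variables.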
Deep neural networks have been empirically successful in a variety of tasks; however, their theoretical understanding is still poor. In particular, modern deep neural networks have many more parameters than training data. Thus, in principle they should over ...
Ultrafast ultrasound imaging, characterized by high frame rates, generates low-quality images. Convolutional neural networks (CNNs) have demonstrated great potential to enhance image quality without compromising the frame rate. However, CNNs have been most ...
2023
The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi α-Divergences and Sibson’s α-Mutual Information, generalizations of, respectively, the Kullback-Leibler Divergence and Shannon’s Mutual ...
ETHZ, 2020
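For reference, the two quantities named in the abstract above are standardly defined, for α ∈ (0, 1) ∪ (1, ∞), as

$$D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \mathbb{E}_Q\!\left[\left(\frac{dP}{dQ}\right)^{\alpha}\right], \qquad I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\left(P_{XY}\,\middle\|\,P_X \otimes Q_Y\right),$$

and they recover the Kullback-Leibler divergence and Shannon's mutual information, respectively, in the limit α → 1.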
Given two probability measures P and Q and an event E, we provide bounds on P(E) in terms of Q(E) and f-divergences. In particular, the bounds are instantiated when the measures considered are a joint distribution and the corresponding product of marginals ...
IEEE, 2020
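One standard route to bounds of this type (given here as background; the paper's instantiated bounds may be sharper) is the data-processing inequality applied to the indicator of E, which collapses D_f to a binary divergence:

$$D_f(P\|Q) \;\ge\; Q(E)\, f\!\left(\frac{P(E)}{Q(E)}\right) + \bigl(1 - Q(E)\bigr)\, f\!\left(\frac{1 - P(E)}{1 - Q(E)}\right).$$

Inverting this inequality in P(E) then yields an upper bound on P(E) in terms of Q(E) and D_f(P‖Q).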
We present an improved analysis of the Euler-Maruyama discretization of the Langevin diffusion. Our analysis does not require global contractivity, and yields polynomial dependence on the time horizon. Compared to existing approaches, we make an additional ...
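For concreteness, here is a minimal sketch of the discretization in question, the unadjusted Langevin algorithm, targeting a density proportional to exp(-U). The function names and parameters are illustrative, and this naive scheme omits the refinements the improved analysis concerns.

```python
import numpy as np

def ula(grad_U, x0, step, n_steps, rng):
    # Euler-Maruyama discretization of dX_t = -grad U(X_t) dt + sqrt(2) dB_t,
    # whose stationary distribution is proportional to exp(-U).
    x = np.array(x0, dtype=float)
    traj = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.size)
        traj[k] = x
    return traj

# Example: sampling a standard Gaussian, U(x) = ||x||^2 / 2, so grad U(x) = x.
rng = np.random.default_rng(0)
samples = ula(grad_U=lambda x: x, x0=np.zeros(2), step=0.05, n_steps=5_000, rng=rng)
```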
We formulate gradient-based Markov chain Monte Carlo (MCMC) sampling as optimization on the space of probability measures, with Kullback-Leibler (KL) divergence as the objective functional. We show that an under-damped form of the Langevin algorithm perfor ...
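A minimal sketch of an under-damped (kinetic) Langevin sampler, using a naive Euler-Maruyama step for the position-velocity SDE; the paper's analysis and practical implementations typically rely on more careful integrators, and all names below are illustrative.

```python
import numpy as np

def underdamped_langevin(grad_U, x0, gamma, step, n_steps, rng):
    # Naive Euler-Maruyama discretization of the kinetic Langevin SDE
    #   dX_t = V_t dt,
    #   dV_t = -(gamma * V_t + grad U(X_t)) dt + sqrt(2 * gamma) dB_t,
    # whose X-marginal at stationarity is proportional to exp(-U).
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    traj = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x + step * v
        v = (v - step * (gamma * v + grad_U(x))
             + np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.size))
        traj[k] = x
    return traj

# Example: standard Gaussian target with friction gamma = 2.
rng = np.random.default_rng(1)
samples = underdamped_langevin(lambda x: x, np.zeros(2), gamma=2.0,
                               step=0.05, n_steps=5_000, rng=rng)
```

The added velocity variable is what distinguishes the under-damped dynamics from the over-damped algorithm sketched above, and is the source of the acceleration phenomena the abstract alludes to.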