Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
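The KL divergence between two Gaussian mixture models has no closed form, which is why deterministic lower and upper bounds such as those studied in this work are of interest; in practice such bounds are compared against Monte Carlo estimates. Below is a minimal sketch of that Monte Carlo baseline only, not the paper's bounds; the function names and the use of numpy/scipy are my own assumptions.

```python
# Minimal sketch (assumed helper names): Monte Carlo estimate of KL(f || g)
# between two Gaussian mixture models, the usual baseline against which
# deterministic lower/upper bounds are checked.
import numpy as np
from scipy.stats import multivariate_normal

def gmm_logpdf(x, weights, means, covs):
    """Log-density of a Gaussian mixture evaluated at the rows of x."""
    comp = np.stack([
        np.log(w) + multivariate_normal.logpdf(x, mean=mu, cov=c)
        for w, mu, c in zip(weights, means, covs)
    ])
    m = comp.max(axis=0)                       # log-sum-exp over components
    return m + np.log(np.exp(comp - m).sum(axis=0))

def kl_gmm_monte_carlo(f, g, n_samples=20_000, seed=0):
    """KL(f || g) ~ mean_i [log f(x_i) - log g(x_i)], with x_i drawn from f."""
    rng = np.random.default_rng(seed)
    weights_f, means_f, covs_f = f
    idx = rng.choice(len(weights_f), size=n_samples, p=weights_f)
    x = np.stack([rng.multivariate_normal(means_f[k], covs_f[k]) for k in idx])
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

# Toy check: single-component "mixtures" recover the closed-form Gaussian KL,
# here 0.5 * ||mu_f - mu_g||^2 = 1.0 for unit covariances in two dimensions.
f = ([1.0], [np.zeros(2)], [np.eye(2)])
g = ([1.0], [np.ones(2)], [np.eye(2)])
print(kl_gmm_monte_carlo(f, g))
```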
The computational prediction of crystal structures has emerged as a useful alternative to expensive and often cumbersome experiments. We propose an approach to the prediction of crystal structures and polymorphism based on reproducing the crystallization ...
In recent years, Machine Learning-based Computer Vision techniques have made impressive progress. These algorithms have proved particularly efficient for image classification or the detection of isolated objects. From a probabilistic perspective, these methods can predi ...
The discrete cosine transform (DCT) is known to be asymptotically equivalent to the Karhunen-Loève transform (KLT) of Gaussian first-order auto-regressive (AR(1)) processes. Since being uncorrelated under the Gaussian hypothesis is synonymous with independ ... (see the DCT/AR(1) sketch after this list)
Mean Field inference is central to statistical physics. It has attracted much interest in the Computer Vision community to efficiently solve problems expressible in terms of large Conditional Random Fields. However, since it models the posterior probabilit ...
We introduce a sequence-dependent coarse-grain model of double-stranded DNA with an explicit description of both the bases and the phosphate groups as interacting rigid bodies. The model parameters are trained on extensive, state-of-the-art large-scale mol ...
Source imaging maps back boundary measurements to underlying generators within the domain; e.g., retrieving the parameters of the generating dipoles from electrical potential measurements on the scalp, such as in electroencephalography (EEG). Fitting such ...
This paper presents a score that can be used for evaluating probabilistic forecasts of multicategory events. The score is a reinterpretation of the logarithmic score or ignorance score, now formulated as the relative entropy or Kullback–Leibler divergence ... (see the ignorance-score sketch after this list)
Recently, an information-theoretical decomposition of Kullback–Leibler divergence into uncertainty, reliability, and resolution was introduced. In this article, this decomposition is generalized to the case where the observation is uncertain. Along with a ...
We study the distributed inference task over regression and classification models where the likelihood function is strongly log-concave. We show that diffusion strategies allow the KL divergence between two likelihood functions to converge to zero at the r ...
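For the DCT/KLT item above: a minimal numerical check, under my own choice of size and correlation parameters, that the orthonormal DCT-II nearly diagonalizes the Toeplitz covariance of a unit-variance Gaussian AR(1) process, whose exact diagonalizer is the KLT.

```python
# Sketch (assumed parameter values): how close the DCT-II comes to
# diagonalizing the covariance of a Gaussian AR(1) process.
import numpy as np
from scipy.fft import dct

def ar1_covariance(n, rho):
    """Toeplitz covariance C[i, j] = rho**|i - j| of a unit-variance AR(1) process."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

n, rho = 64, 0.95
C = ar1_covariance(n, rho)
D = dct(np.eye(n), type=2, norm="ortho", axis=0)   # orthonormal DCT-II matrix
T = D @ C @ D.T                                    # covariance in the DCT domain

off_diag = T - np.diag(np.diag(T))
# A small ratio means the DCT basis is close to the KLT (the eigenbasis of C).
print("off-diagonal energy fraction:", np.linalg.norm(off_diag) / np.linalg.norm(T))
```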
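And for the forecast-verification items: a minimal sketch, with a hypothetical function name, of the logarithmic (ignorance) score written as the Kullback–Leibler divergence between an observation distribution and the forecast; a one-hot observation gives the familiar -log p(observed category), while a non-degenerate observation covers the uncertain-observation case.

```python
# Sketch (hypothetical helper name): ignorance / logarithmic score as a
# KL divergence between the observation distribution and the forecast.
import numpy as np

def ignorance_score(obs, forecast, eps=1e-12):
    """KL(obs || forecast) = sum_k obs_k * log(obs_k / forecast_k)."""
    obs = np.asarray(obs, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    mask = obs > 0                                  # convention: 0 * log 0 = 0
    return float(np.sum(obs[mask] * np.log(obs[mask] / (forecast[mask] + eps))))

forecast = np.array([0.2, 0.5, 0.3])
print(ignorance_score([0.0, 1.0, 0.0], forecast))   # certain obs: -log 0.5 ~ 0.693
print(ignorance_score([0.1, 0.8, 0.1], forecast))   # uncertain observation
```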