Ask any question about EPFL courses, lectures, exercises, research, news, and more, or try the example questions below.
DISCLAIMER: The Graph Chatbot is not programmed to provide explicit or categorical answers to your questions. Rather, it transforms your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its purpose is solely to collect and recommend relevant references to content that you can explore to help you answer your questions.
Deep neural networks have been empirically successful in a variety of tasks; however, their theoretical understanding is still poor. In particular, modern deep neural networks have many more parameters than training data. Thus, in principle they should over ...
We derive generalization and excess risk bounds for neural networks using a family of complexity measures based on a multilevel relative entropy. The bounds are obtained by introducing the notion of generated hierarchical coverings of neural networks and b ...
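For orientation, a standard single-level relative-entropy (PAC-Bayes) generalization bound has the shape below. This is only a simplified illustration of how a relative-entropy term controls the generalization gap; the paper's multilevel, hierarchical-covering refinement is not reproduced here.

```latex
% One common single-level PAC-Bayes bound (McAllester-style), shown only to
% illustrate how a relative-entropy term bounds the generalization gap.
% With probability at least 1 - \delta over an i.i.d. sample of size n,
% simultaneously for every posterior \rho (prior \pi fixed in advance):
\[
  \mathbb{E}_{h \sim \rho}\big[L(h)\big]
  \;\le\;
  \mathbb{E}_{h \sim \rho}\big[\widehat{L}_n(h)\big]
  \;+\;
  \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}} .
\]
```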
We prove an asymptotic formula for the second moment of a product of two Dirichlet L-functions on the critical line, which has a power saving in the error term and which is uniform with respect to the involved Dirichlet characters. As special cases we give ...
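Schematically, an asymptotic formula of this kind has the shape below; the exact main-term polynomial and the size of the power saving are specific to the paper and are not reproduced here.

```latex
% Schematic shape only: P is a polynomial whose degree and coefficients depend
% on the characters \chi_1, \chi_2, and \delta > 0 denotes the power saving.
\[
  \int_{0}^{T}
    \Big| L\big(\tfrac{1}{2}+it,\chi_1\big)\, L\big(\tfrac{1}{2}+it,\chi_2\big) \Big|^{2}
  \, dt
  \;=\;
  T \, P_{\chi_1,\chi_2}(\log T) \;+\; O\!\big(T^{\,1-\delta}\big).
\]
```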
2020
In this study, we develop statistical relationships between radar observables and drop size distribution properties in different latitude bands to inform radar rainfall retrieval techniques and understand underpinning microphysical reasons for differences ...
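As a minimal illustration of how radar-rainfall relationships are typically expressed, the sketch below fits a power law Z = a R^b between reflectivity and rain rate by least squares in log-log space. The variable names and synthetic data are assumptions; the paper's latitude-dependent relationships are not reproduced.

```python
import numpy as np

# Hypothetical example: fit a power law Z = a * R**b (reflectivity vs. rain rate)
# in log-log space, the usual form of a radar rainfall retrieval relationship.
rng = np.random.default_rng(0)
R = rng.uniform(0.5, 50.0, size=200)                  # synthetic rain rates [mm/h]
Z = 200.0 * R**1.6 * rng.lognormal(0.0, 0.1, 200)     # synthetic reflectivities with noise

b, log_a = np.polyfit(np.log(R), np.log(Z), deg=1)    # linear fit: log Z = b*log R + log a
a = np.exp(log_a)
print(f"fitted Z = {a:.1f} * R^{b:.2f}")
```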
We present a voxel-wise Bayesian multi-compartment T2 relaxometry fitting method based on Hamiltonian Markov Chain Monte Carlo (HMCMC) sampling. The T2 spectrum is modeled as a mixture of truncated Gaussian components, which involves the estimation of par ...
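As background, the core of any Hamiltonian Monte Carlo sampler is leapfrog integration of the log-posterior gradient followed by a Metropolis accept/reject step. The generic sketch below (function names and step sizes are assumptions) illustrates that mechanism only; it is not the paper's voxel-wise T2-spectrum model.

```python
import numpy as np

def hmc_sample(logp, grad_logp, theta0, n_samples=1000,
               step_size=0.05, n_leapfrog=20, seed=0):
    """Generic Hamiltonian Monte Carlo: leapfrog dynamics + Metropolis correction."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(theta.shape)              # resample momentum
        th, pm = theta.copy(), p.copy()
        pm += 0.5 * step_size * grad_logp(th)             # half step for momentum
        for _ in range(n_leapfrog - 1):
            th += step_size * pm                          # full step for position
            pm += step_size * grad_logp(th)               # full step for momentum
        th += step_size * pm
        pm += 0.5 * step_size * grad_logp(th)             # final half step
        h_old = -logp(theta) + 0.5 * p @ p                # Hamiltonian before the move
        h_new = -logp(th) + 0.5 * pm @ pm                 # Hamiltonian after the move
        if np.log(rng.uniform()) < h_old - h_new:         # Metropolis accept/reject
            theta = th
        samples.append(theta.copy())
    return np.array(samples)

# Usage sketch: sample a 2-D standard normal.
draws = hmc_sample(lambda x: -0.5 * x @ x, lambda x: -x, np.zeros(2))
```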
We tackle the fundamentally ill-posed problem of 3D human localization from monocular RGB images. Driven by the limitation of neural networks outputting point estimates, we address the ambiguity in the task by predicting confidence intervals through a loss ...
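One common way to make a regression network output an interval rather than a point estimate is to train it with a distributional negative log-likelihood, e.g. a Laplace NLL that jointly predicts a distance and a spread. The sketch below is a generic illustration under that assumption, not necessarily the paper's exact loss.

```python
import numpy as np

def laplace_nll(d_true, d_pred, log_b_pred):
    """Laplace negative log-likelihood (constant log 2 dropped).
    Training with it makes the network output both a distance d_pred and a
    scale b = exp(log_b_pred); a confidence interval is then e.g. d_pred +/- b."""
    b = np.exp(log_b_pred)                 # log-parameterization keeps the scale positive
    return np.mean(np.abs(d_true - d_pred) / b + log_b_pred)

# Toy check: the loss penalizes both large errors and over/under-confident spreads.
print(laplace_nll(np.array([10.0]), np.array([9.0]), np.array([0.0])))
```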
2019
Author summary How do humans make predictions when the critical factor that influences the quality of the prediction is hidden? Here, we address this question by conducting a simple psychophysical experiment in which participants had to extrapolate a parabo ...
2020
Two characteristics that make convex decomposition algorithms attractive are simplicity of operations and generation of parallelizable structures. In principle, these schemes require that all coordinates update at the same time, i.e., they are synchronous ...
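The contrast with synchronous schemes can be seen in a randomized, one-coordinate-at-a-time update, where only the sampled coordinate is refreshed while the others keep their stale values. The sketch below is a serial simulation of that idea on a generic smooth convex objective; it is not the paper's decomposition algorithm, and all names are assumptions.

```python
import numpy as np

def random_coordinate_descent(grad_i, x0, step=0.1, n_iters=5000, seed=0):
    """Update one randomly chosen coordinate per iteration (asynchronous-style),
    instead of refreshing all coordinates simultaneously."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        i = rng.integers(x.size)            # pick a coordinate uniformly at random
        x[i] -= step * grad_i(x, i)         # gradient step on that coordinate only
    return x

# Usage sketch: minimize 0.5 * ||A x - b||^2 coordinate by coordinate.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
grad_i = lambda x, i: A[:, i] @ (A @ x - b)
print(random_coordinate_descent(grad_i, np.zeros(2)))
```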
Springer/Plenum Publishers, 2019
Small variance asymptotics is emerging as a useful technique for inference in large scale Bayesian non-parametric mixture models. This paper analyses the online learning of robot manipulation tasks with Bayesian non-parametric mixture models under small va ...
Small-variance asymptotics is emerging as a useful technique for inference in large-scale Bayesian non-parametric mixture models. This paper analyzes the online learning of robot manipulation tasks with Bayesian non-parametric mixture models under small-va ...
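For context, the canonical small-variance limit of a Dirichlet-process Gaussian mixture is the DP-means procedure sketched below: hard assignments, with a new cluster opened whenever a point is farther than a penalty lam from every existing centroid. This is a generic, batch illustration of the small-variance idea, not the online robot-manipulation formulation analyzed in these papers.

```python
import numpy as np

def dp_means(X, lam, n_iters=50):
    """DP-means: hard-assignment (small-variance) limit of a DP Gaussian mixture.
    lam is the cost of opening a new cluster; larger lam -> fewer clusters."""
    centroids = [X.mean(axis=0)]
    assignments = np.zeros(len(X), dtype=int)
    for _ in range(n_iters):
        # Assignment step: reuse the closest centroid, or open a new cluster.
        for n, x in enumerate(X):
            d2 = [np.sum((x - c) ** 2) for c in centroids]
            if min(d2) > lam:
                centroids.append(x.copy())
                assignments[n] = len(centroids) - 1
            else:
                assignments[n] = int(np.argmin(d2))
        # Update step: recompute the mean of every non-empty cluster.
        keep = [k for k in range(len(centroids)) if np.any(assignments == k)]
        remap = {k: i for i, k in enumerate(keep)}
        centroids = [X[assignments == k].mean(axis=0) for k in keep]
        assignments = np.array([remap[a] for a in assignments])
    return np.array(centroids), assignments

# Usage sketch on two synthetic blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, labels = dp_means(X, lam=2.0)
```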