Strengthened Information-theoretic Bounds on the Generalization Error
In this paper, we investigate, by means of statistical and information-theoretic measures, to what extent sensory-motor coordinated activity can generate and structure information in the sensory channels of a simulated agent interacting with its surrounding ...
The goals of neural processing assemblies are varied, and in many cases still rather unclear. However, a possibly reasonable subgoal is that sensory information may be encoded efficiently in a population of neurons. In this context, Mutual Information is a l ...
We investigate the spreading of information in a one-dimensional Bose-Hubbard system after a sudden parameter change. In particular, we study the time evolution of correlations and entanglement following a quench. The investigated quantities show a light-c ...
Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in the application of information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that ...
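The snippet stops before stating the bound itself; as a hedged reconstruction, a variational lower bound of this kind (the form due to Barber and Agakov) replaces the intractable posterior p(x|y) with a tractable decoder distribution q(x|y):

\[
I(X;Y) \;=\; H(X) - H(X \mid Y) \;\ge\; H(X) + \mathbb{E}_{p(x,y)}\big[\ln q(x \mid y)\big],
\]

with equality exactly when q(x|y) = p(x|y). For a Gaussian channel, choosing a Gaussian q(x|y) lets the expectation be evaluated in closed form, which is presumably what makes the Gaussian application tractable.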
Understanding the guiding principles of sensory coding strategies is a main goal in computational neuroscience. Among others, the principles of predictive coding and slowness appear to capture aspects of sensory processing. Predictive coding postulates tha ...
We investigate the capacity and mutual information of a broadband fading channel consisting of a finite number of time-varying paths. We show that the capacity of the channel in the wideband limit is the same as that of a wideband Gaussian channel with the ...
We address the issue of how statistical and information-theoretic measures can be employed to quantify the categorization process of a simulated robotic agent interacting with its local environment. We show how correlation, entropy, and mutual information ca ...
In this paper we explore the most appropriate number of data samples needed to measure the temporal correspondence between a chosen set of video and audio cues in a given audio-visual sequence. Presently the optimal model that connects s ...
We present a method that exploits an information-theoretic framework to extract optimal audio features with respect to the video features. A simple measure of mutual information between the resulting audio features and the video ones allows us to detect the a ...
Mutual information is an attractive registration criterion because it provides a meaningful comparison of images that represent different physical properties. In this paper, we review the shortcomings of three published methods for its computation. We iden ...
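As a point of reference for the computation being reviewed, here is a minimal sketch of the standard joint-histogram estimator of mutual information between two same-sized images; the function name and bin count are illustrative choices, not the paper's:

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 64) -> float:
    """Histogram-based estimate of I(A; B), in nats, for two same-shaped images."""
    # Joint histogram of corresponding pixel intensities.
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()                 # joint probability table
    p_a = p_ab.sum(axis=1, keepdims=True)      # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)      # marginal of image B
    mask = p_ab > 0                            # skip empty cells to avoid log(0)
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / (p_a @ p_b)[mask])))
```

In registration, this score is typically maximized over the parameters of a spatial transform applied to one of the images; the shortcomings such reviews identify usually concern how the underlying densities are estimated, e.g. binning and interpolation effects.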