Strengthened Information-theoretic Bounds on the Generalization Error
We address the issue of how statistical and information-theoretic measures can be employed to quantify the categorization process of a simulated robotic agent interacting with its local environment. We show how correlation, entropy, and mutual information ca ...
Understanding the guiding principles of sensory coding strategies is a main goal in computational neuroscience. Among others, the principles of predictive coding and slowness appear to capture aspects of sensory processing. Predictive coding postulates tha ...
We investigate the spreading of information in a one-dimensional Bose-Hubbard system after a sudden parameter change. In particular, we study the time evolution of correlations and entanglement following a quench. The investigated quantities show a light-c ...
Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in the application of information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that ...
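For context on the Gaussian-channel setting this abstract refers to, the classical closed-form result (not taken from the paper itself, just the standard textbook identity) is: for an additive channel $Y = X + Z$ with independent noise $Z \sim \mathcal{N}(0, \sigma^2)$ and Gaussian input $X \sim \mathcal{N}(0, P)$, the mutual information is

$$
I(X; Y) = \frac{1}{2} \log\!\left(1 + \frac{P}{\sigma^2}\right),
$$

which is also the channel capacity under the power constraint $\mathbb{E}[X^2] \le P$; variational bounds on mutual information can be checked against this exact value in the Gaussian case.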
The goal of neural processing assemblies is varied, and in many cases still rather unclear. However, a possibly reasonable subgoal is that sensory information may be encoded efficiently in a population of neurons. In this context, Mutual Information is a l ...
In this paper, we investigate by means of statistical and information-theoretic measures, to what extent sensory-motor coordinated activity can generate and structure information in the sensory channels of a simulated agent interacting with its surrounding ...
Mutual information is an attractive registration criterion because it provides a meaningful comparison of images that represent different physical properties. In this paper, we review the shortcomings of three published methods for its computation. We iden ...
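A common baseline for the mutual-information registration criterion discussed in this abstract is the plug-in histogram estimator: bin the two images' intensities jointly, then compute MI of the empirical distribution. This is a generic sketch of that standard technique, not the specific method reviewed in the paper; the function name and bin count are illustrative choices.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Plug-in estimate of I(X; Y) in nats from a joint intensity histogram.

    x, y: arrays of equal size (e.g. two images flattened), whose paired
    values are binned into a `bins` x `bins` joint histogram.
    """
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()                    # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x (column vector)
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y (row vector)
    nz = pxy > 0                                 # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

For registration, one would evaluate this score over candidate spatial transforms and keep the transform that maximizes it; an image compared with itself yields a high score, while two unrelated images yield a score near zero.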
We investigate the capacity and mutual information of a broadband fading channel consisting of a finite number of time-varying paths. We show that the capacity of the channel in the wideband limit is the same as that of a wideband Gaussian channel with the ...
In this paper we aim to determine the most appropriate number of data samples needed when measuring the temporal correspondence between a chosen set of video and audio cues in a given audio-visual sequence. Presently the optimal model that connects s ...
We present a method that exploits an information-theoretic framework to extract optimal audio features with respect to the video features. A simple measure of mutual information between the resulting audio features and the video ones allows us to detect the a ...