The Gray-Wyner Network and Wyner's Common Information for Gaussian Sources
Related publications (39)
We address the issue of how statistical and information-theoretic measures can be employed to quantify the categorization process of a simulated robotic agent interacting with its local environment. We show how correlation, entropy, and mutual information ca ...
Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in the application of information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that ...
We study the rationality of learning and the biases in expectations in the Iowa Experimental Markets. Using novel tests developed in (Bossaerts, P., 1996. Martingale restrictions on equilibrium security prices under rational expectations and consistent bel ...
This correspondence provides bounds on the rate-distortion region for the distributed compression scenario where two (or more) sources are compressed separately for a decoder that has access to side information. Conclusive rate-distortion results are found ...
We consider the problem of compressing a Gaussian source for two decoders, one of which has access to side-information. Kaspi has solved this problem for discrete memoryless sources for the two cases with and without encoder side-information. We focus on t ...
We investigate the capacity and mutual information of a broadband fading channel consisting of a finite number of time-varying paths. We show that the capacity of the channel in the wideband limit is the same as that of a wideband Gaussian channel with the ...
It is well known and surprising that the uncoded transmission of an independent and identically distributed Gaussian source across an additive white Gaussian noise channel is optimal: No amount of sophistication in the coding strategy can ever perform bett ...
In this paper we aim to explore what is the most appropriate number of data samples needed when measuring the temporal correspondence between a chosen set of video and audio cues in a given audio-visual sequence. Presently the optimal model that connects s ...
Mutual information is an attractive registration criterion because it provides a meaningful comparison of images that represent different physical properties. In this paper, we review the shortcomings of three published methods for its computation. We iden ...