This paper considers an additive Gaussian noise channel with arbitrarily distributed finite-variance input signals. It studies the differential entropy of the minimum mean-square error (MMSE) estimator and provides a new lower bound that connects the differential entropies of the input, output, and conditional mean. That is, the sum of the differential entropies of the conditional mean and the output is always greater than or equal to twice the input differential entropy. Various other properties, such as upper bounds, asymptotics, a Taylor series expansion, and the connection to Fisher information, are obtained. An application of the lower bound to the remote-source coding problem is discussed, and extensions of the lower and upper bounds to the vector Gaussian channel are given.
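In symbols, a minimal statement of this lower bound, assuming the usual scalar model $Y = X + N$ with Gaussian noise $N$ independent of the finite-variance input $X$, and writing $h(\cdot)$ for differential entropy and $\mathbb{E}[X \mid Y]$ for the conditional-mean (MMSE) estimator (notation chosen here for illustration, not quoted from the paper):

\[
  h\big(\mathbb{E}[X \mid Y]\big) + h(Y) \;\ge\; 2\,h(X).
\]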
Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik