The distributed remote source coding (the so-called CEO) problem is studied in the case where the underlying source, not necessarily Gaussian, has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance), whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon’s work. While the bounds hold generally, their value is most pronounced when interpreted as a function of the number of agents in the CEO problem.
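As a point of reference for the power versus entropy-power duality mentioned above (a standard fact about the direct, point-to-point setting the abstract alludes to, not a statement taken from this paper): for a continuous source $X$ with variance $\sigma_X^2$ and differential entropy $h(X)$ in nats, the entropy power is $N(X) = \frac{1}{2\pi e}\, e^{2h(X)} \le \sigma_X^2$, and under mean-squared error the classical Shannon bounds on the rate-distortion function read
\[
\frac{1}{2}\log\frac{N(X)}{D} \;\le\; R(D) \;\le\; \frac{1}{2}\log\frac{\sigma_X^2}{D}, \qquad 0 < D \le \sigma_X^2,
\]
with both bounds tight when $X$ is Gaussian (in which case $N(X) = \sigma_X^2$): the upper bound carries the source power, the lower bound the source entropy power, which is the same pattern the sum-rate bounds described above exhibit.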
Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik