Publication

Remote Source Coding Under Gaussian Noise: Dueling Roles of Power and Entropy Power

Michael Christoph Gastpar
2019
Journal paper
Abstract

The distributed remote source coding (the so-called CEO) problem is studied in the case where the underlying source, not necessarily Gaussian, has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance), whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon’s work. While the bounds hold generally, their value is most pronounced when interpreted as a function of the number of agents in the CEO problem.
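For orientation, the following is the classical single-source fact the abstract alludes to, not the paper's CEO bounds: writing N(X) = e^{2h(X)}/(2πe) for the entropy power of a source X with variance σ_X², Shannon's bounds on the mean-squared-error rate-distortion function already exhibit the same mirroring of power and entropy power.

```latex
% Classical single-source illustration (not this paper's CEO sum-rate bounds).
% N(X) = \tfrac{1}{2\pi e} e^{2 h(X)} is the entropy power, \sigma_X^2 the variance,
% and D the mean-squared-error distortion level.
\frac{1}{2}\log\frac{N(X)}{D} \;\le\; R(D) \;\le\; \frac{1}{2}\log\frac{\sigma_X^2}{D}
% Both bounds coincide when X is Gaussian, since then N(X) = \sigma_X^2.
```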

Related concepts (37)
Entropy
Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory.
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable X, which takes values in the alphabet 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is H(X) = −∑_{x∈𝒳} p(x) log p(x), where ∑ denotes the sum over the variable's possible values. The choice of base for log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
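A minimal sketch (assuming only NumPy; function name and example distribution are illustrative) of computing the discrete entropy above in the three logarithm bases mentioned:

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution p, in the given log base."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log(0) = 0, so zero-probability outcomes are dropped
    return float(-np.sum(p * np.log(p)) / np.log(base))

p = [0.5, 0.25, 0.25]
print(entropy(p, base=2))     # 1.5 bits
print(entropy(p, base=np.e))  # ~1.0397 nats
print(entropy(p, base=10))    # ~0.4515 hartleys
```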
Mean squared error
In statistics, the mean squared error (MSE) or mean squared deviation (MSD) of an estimator (of a procedure for estimating an unobserved quantity) measures the average of the squares of the errors—that is, the average squared difference between the estimated values and the actual value. MSE is a risk function, corresponding to the expected value of the squared error loss. The fact that MSE is almost always strictly positive (and not zero) is because of randomness or because the estimator does not account for information that could produce a more accurate estimate.
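A minimal sketch of the MSE definition above (assuming NumPy; the arrays are illustrative):

```python
import numpy as np

def mse(estimates, targets):
    """Mean squared error: average of the squared differences between estimates and targets."""
    estimates = np.asarray(estimates, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return float(np.mean((estimates - targets) ** 2))

y_true = [1.0, 2.0, 3.0]
y_hat = [1.1, 1.9, 3.3]
print(mse(y_hat, y_true))  # ((0.1)^2 + (0.1)^2 + (0.3)^2) / 3 ≈ 0.0367
```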
Related publications (38)

Improving SAM Requires Rethinking its Optimization Formulation

Volkan Cevher, Kimon Antonakopoulos, Thomas Michaelsen Pethick, Wanyun Xie, Fabian Ricardo Latorre Gomez

This paper rethinks Sharpness-Aware Minimization (SAM), which is originally formulated as a zero-sum game where the weights of a network and a bounded perturbation try to minimize/maximize, respectively, the same differentiable loss. We argue that SAM shou ...
2024

Quantization for Decentralized Learning Under Subspace Constraints

Ali H. Sayed, Stefan Vlaski, Roula Nassif, Marco Carpentiero

In this article, we consider decentralized optimization problems where agents have individual cost functions to minimize subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained fo ...
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2023

Differential Entropy of the Conditional Expectation Under Additive Gaussian Noise

Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the co ...
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2022
Related MOOCs (16)
Electrical Engineering I
Discover linear electrical circuits. Learn to master and solve them, first in DC operation and then in AC operation.
Analyse I
The content of this course corresponds to that of the Analyse I course as taught to EPFL students during their first semester. Each chapter of the course corresponds ...