In 1948, Claude Shannon laid the foundations of information theory, which grew out of a study to find the ultimate limits of source compression and of reliable communication. Since then, information theory has proved itself not only as a quest to find these limits but also as a toolbox that provides new machinery and perspectives on problems in various fields. Shannon's original description of the communication problem omitted semantic aspects. Modern communication systems, however, necessitate the consideration of semantics, such as the fidelity and freshness of data. Shannon did study a problem related to the fidelity of data, known as rate-distortion theory, which can be seen as an attempt to incorporate semantics in a weak sense. Freshness, on the other hand, was not widely studied until 2011, when Kaul, Yates and Gruteser introduced a new metric for its assessment, called the age of information (AoI).

Since 2011, AoI has become a widely studied notion as data freshness has grown increasingly important. At the same time, not all data is equally important. Aligned with this observation, in Part 1 we study a discrete-time model in which each packet incurs a cost for not being sent, which may depend on the packet content. We study the tradeoff between age and cost when the sender is confined to packet-based strategies. We show that the optimal tradeoff can be attained with finite-memory strategies, and we devise an efficient policy iteration algorithm to find these optimal strategies. Allowing coding across packets significantly extends the packet-based strategies, and we show that when the packet payloads are small, performance can be improved by coding. Furthermore, we study a related problem in which some of the equally important packets must be sent in order to control the output rate. "Which packet to send, and when?" is the central question of this problem, and we show that if the packet arrival process is memoryless, a simple class of strategies attains the optimal tradeoff. The same class of strategies also solves the analogous continuous-time problem, where packets arrive as a Poisson process.

In Part 2, we study two distributed hypothesis testing problems: (i) with a centralized architecture and (ii) with a fully decentralized architecture. In the centralized problem, we consider peripheral nodes that send quantized data to the fusion center in a memoryless fashion, where the expected number of bits sent by each node under the null hypothesis is kept limited. We characterize the optimal decay rate of the misdetection probability provided that false alarms are rare, and study the tradeoff between the communication rate and the maximal decay rate of the misdetection probability. We use the information theory toolbox and resort to rate-distortion methods to provide upper bounds on the tradeoff curve, and we show that at high rates, lattice quantization achieves near-optimal performance. In the decentralized problem, we study a locally Bayesian scheme in which, at every time instant, each node chooses to receive information from one of its neighbors at random. We show that under this sparser communication scheme, the agents eventually learn the truth and the asymptotic convergence rate remains the same as that of standard algorithms. We also derive large-deviation estimates of the log-belief ratios for a special case in which each agent replaces its belief with that of the chosen neighbor.
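For readers unfamiliar with the AoI metric introduced above, the following minimal Python sketch (not from the thesis; the update timestamps are made up) illustrates the standard definition: the age at time t is t minus the generation time of the freshest update received so far, which produces the familiar sawtooth sample path.

```python
# Minimal illustration of the age-of-information (AoI) metric.
# Assumed toy data: update i is generated at gen[i] and received
# at rcv[i]; receptions occur in increasing order.

gen = [0.0, 1.0, 2.5, 4.0]   # generation timestamps
rcv = [0.4, 1.9, 2.8, 5.1]   # reception timestamps

def age(t, gen, rcv):
    """Age at time t: t minus the generation time of the freshest
    update received by time t (infinite before the first reception)."""
    freshest = max((g for g, r in zip(gen, rcv) if r <= t), default=None)
    return float('inf') if freshest is None else t - freshest

# Time-average age over [t0, T], approximated on a fine grid.
t0, T, n = rcv[0], 6.0, 10000
samples = [age(t0 + (T - t0) * k / n, gen, rcv) for k in range(n + 1)]
print(sum(samples) / len(samples))  # approximate average AoI
```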
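Part 1 mentions a policy iteration algorithm over finite-memory strategies. As a hedged sketch only, the snippet below runs textbook policy iteration on a hypothetical discounted MDP where the state is the current age (capped at A_MAX) and the action is whether to transmit, resetting the age at a fixed cost, or stay silent, paying an age-dependent cost; the thesis's actual state space, cost structure, and optimality criterion are not reproduced here.

```python
# Textbook policy iteration on a toy age/cost MDP -- a sketch only;
# the state space, costs, and dynamics in the thesis differ.
A_MAX = 10          # cap on the age (finite state space 0..A_MAX)
SEND_COST = 3.0     # hypothetical cost of transmitting a packet
GAMMA = 0.95        # discount factor

states = range(A_MAX + 1)
actions = ('send', 'wait')

def cost(a, act):
    return SEND_COST if act == 'send' else float(a)   # age-dependent holding cost

def nxt(a, act):
    return 0 if act == 'send' else min(a + 1, A_MAX)  # deterministic toy dynamics

def policy_iteration():
    policy = {a: 'wait' for a in states}
    while True:
        # Policy evaluation: iterate V = c + GAMMA * V(next) to convergence.
        V = {a: 0.0 for a in states}
        for _ in range(1000):
            V = {a: cost(a, policy[a]) + GAMMA * V[nxt(a, policy[a])] for a in states}
        # Policy improvement: act greedily with respect to V.
        new = {a: min(actions, key=lambda u: cost(a, u) + GAMMA * V[nxt(a, u)])
               for a in states}
        if new == policy:
            return policy, V
        policy = new

policy, V = policy_iteration()
print({a: policy[a] for a in states})  # threshold-type policy: wait while young, then send
```

The printed policy is of threshold type, which is the qualitative behavior one expects when holding costs grow with age and transmission carries a fixed cost.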
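For the centralized problem in Part 2, the following hypothetical simulation conveys the general flavor of quantized data fusion: each peripheral node quantizes its local log-likelihood ratio on a one-dimensional lattice (a stand-in for the lattice quantizers the abstract refers to) and the fusion center thresholds the sum. The Gaussian model, step size, and threshold are all made-up choices, not the thesis's setup.

```python
import random
random.seed(1)

# Hypothetical centralized setup: K peripheral nodes observe Gaussian
# samples (mean 0 under H0, mean 1 under H1), quantize their local
# log-likelihood ratios to the lattice STEP*Z, and the fusion center
# thresholds the sum of the quantized values.
K, STEP = 20, 0.5      # number of nodes, quantizer step size

def llr(x):
    return x - 0.5     # LLR of N(1,1) vs N(0,1) is x - 1/2

def quantize(v, step=STEP):
    return step * round(v / step)   # nearest lattice point in step*Z

def decide(true_mean):
    xs = [random.gauss(true_mean, 1.0) for _ in range(K)]
    stat = sum(quantize(llr(x)) for x in xs)
    return 1 if stat > 0 else 0     # declare H1 iff the sum is positive

trials = 2000
md = sum(decide(1.0) == 0 for _ in range(trials)) / trials  # misdetection rate
fa = sum(decide(0.0) == 1 for _ in range(trials)) / trials  # false-alarm rate
print(md, fa)  # both small: coarsely quantized LLRs retain most detection power
```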
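Finally, for the decentralized problem, this sketch simulates the special case mentioned at the end of the abstract: at each instant every agent picks one neighbor uniformly at random, adopts that neighbor's belief, and then performs a Bayesian update with its own private observation. The ring graph, Bernoulli likelihoods, and parameters are assumptions for illustration; the thesis's actual update rule and analysis are more general.

```python
import random, math

# Hypothetical setup: N agents on a ring, two hypotheses H0/H1, and
# Bernoulli observations whose bias depends on the true hypothesis.
random.seed(0)
N, T = 8, 2000
P = {0: 0.4, 1: 0.6}       # P(obs = 1 | hypothesis)
TRUE = 1                   # ground truth
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

# Log-belief ratios log(b_i(1)/b_i(0)), initialized to 0 (uniform beliefs).
L = [0.0] * N
for _ in range(T):
    obs = [1 if random.random() < P[TRUE] else 0 for _ in range(N)]
    newL = []
    for i in range(N):
        j = random.choice(neighbors[i])        # receive from one neighbor at random
        llr = math.log(P[1] if obs[i] else 1 - P[1]) \
            - math.log(P[0] if obs[i] else 1 - P[0])
        newL.append(L[j] + llr)                # adopt neighbor's belief, then Bayes-update
    L = newL

print([round(l, 1) for l in L])  # large positive values: every agent favors the true H1
```

In this toy run the log-belief ratios grow roughly linearly in time, consistent with the abstract's claim that agents eventually learn the truth under the sparser communication scheme.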