Publication

Pliable Index Coding

Abstract

We formulate a new variant of the index coding problem where, instead of demanding a specific message, clients are pliable and are interested in receiving any t messages that they do not have. We term this problem pliable index coding, or PICOD(t). We prove that, with this formulation, although some instances of the problem become simple, in general the problem of finding the optimal linear code remains NP-hard. However, we show that it is possible to construct pliable index codes that are substantially smaller than index codes in many cases. If there are n clients, the server has m messages, and each client has a side information set of cardinality s > log² n, the number of broadcast transmissions required depends only linearly on t. We generalize the results to instances where the side information sets are not necessarily of equal cardinality. When m = O(n^δ) for some constant δ > 0, we show that codes of size O(min{t log² n, t log n + log³ n}) are sufficient in general. We also consider the scenario in which the server knows only the cardinalities of the clients' side information sets, and each client is interested in receiving any t messages that it does not have. We term this formulation oblivious pliable index coding, or OB-PICOD(t). If the cardinality of every client's side information set is s (with s …
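
To make the setting concrete, below is a minimal sketch (an illustration with made-up toy data, not a construction from the paper) of a PICOD(1) instance with binary messages and linear (XOR) codes: broadcasting the XOR of a subset S of messages lets a client decode one new message exactly when a single element of S lies outside its side information set.

```python
from itertools import combinations

# Toy PICOD(1) instance (made-up data): m messages, each client knows a
# subset of them (side information) and is satisfied by decoding ANY one
# message it does not have.
m = 4
side_info = [
    {0, 1},
    {1, 2},
    {0, 2, 3},
    {2, 3},
]

def satisfied(S, known):
    # The XOR of the messages in S reveals a new message to a client
    # iff exactly one element of S is outside its side information set
    # (everything else in S can be XORed out).
    return len(S - known) == 1

def smallest_linear_code(m, side_info):
    # Brute-force search over sets of XOR broadcasts; exponential, so
    # only viable for toy sizes (consistent with the NP-hardness of
    # finding optimal linear codes noted in the abstract).
    subsets = [frozenset(c) for r in range(1, m + 1)
               for c in combinations(range(m), r)]
    for k in range(1, m + 1):
        for code in combinations(subsets, k):
            if all(any(satisfied(S, known) for S in code)
                   for known in side_info):
                return code
    return None

code = smallest_linear_code(m, side_info)
print([sorted(S) for S in code])  # [[1, 3]]: one broadcast, x1 XOR x3, suffices
```

On this instance a single transmission satisfies all four clients, whereas fixed demands could force several; this is the kind of saving the pliable formulation exploits.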

Related concepts (32)
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
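
For concreteness, a short sketch (with a made-up joint distribution) that evaluates the defining formula I(X;Y) = Σ p(x,y) log₂[p(x,y)/(p(x)p(y))] in bits:

```python
import numpy as np

# Mutual information from a small (made-up) joint distribution, in bits.
p_xy = np.array([[0.25, 0.25],
                 [0.00, 0.50]])   # rows index x, columns index y

p_x = p_xy.sum(axis=1)            # marginal distribution of X
p_y = p_xy.sum(axis=0)            # marginal distribution of Y

mask = p_xy > 0                   # convention: 0 * log 0 = 0
mi = np.sum(p_xy[mask] * np.log2(p_xy[mask] / np.outer(p_x, p_y)[mask]))
print(f"I(X;Y) = {mi:.4f} bits")  # ~0.3113
```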
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
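
As a one-formula illustration of that key measure (the distribution here is made up), Shannon entropy in bits:

```python
import math

# H(X) = -sum_x p(x) * log2 p(x), e.g. for a biased coin with P(heads) = 0.9.
p = [0.9, 0.1]
H = -sum(px * math.log2(px) for px in p if px > 0)
print(f"H(X) = {H:.4f} bits")  # ~0.4690, versus 1 bit for a fair coin
```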
Variation of information
In probability theory and information theory, the variation of information or shared information distance is a measure of the distance between two clusterings (partitions of elements). It is closely related to mutual information; indeed, it is a simple linear expression involving the mutual information. Unlike the mutual information, however, the variation of information is a true metric, in that it obeys the triangle inequality. Suppose we have two partitions, X and Y, of a set A into disjoint subsets.
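
A small sketch of that linear relationship (toy helper and data, assumed here): VI(X;Y) = H(X) + H(Y) − 2I(X;Y), equivalently 2H(X,Y) − H(X) − H(Y), for two partitions encoded as element-to-block mappings:

```python
import math
from collections import Counter

def variation_of_information(part_x, part_y, universe):
    # VI(X;Y) = H(X) + H(Y) - 2*I(X;Y) in bits, for two partitions of
    # `universe` given as element -> block-label mappings.
    # (Illustrative helper, not a library API.)
    n = len(universe)
    px = Counter(part_x[e] for e in universe)                # block sizes of X
    py = Counter(part_y[e] for e in universe)                # block sizes of Y
    pxy = Counter((part_x[e], part_y[e]) for e in universe)  # joint blocks

    H = lambda cnt: -sum(c / n * math.log2(c / n) for c in cnt.values())
    hx, hy, hxy = H(px), H(py), H(pxy)
    mi = hx + hy - hxy            # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return hx + hy - 2 * mi       # = 2*H(X,Y) - H(X) - H(Y)

# Two partitions of {0,...,5}; unlike mutual information, VI is a metric.
X = {0: 1, 1: 1, 2: 1, 3: 2, 4: 2, 5: 2}
Y = {0: 1, 1: 1, 2: 2, 3: 2, 4: 2, 5: 2}
print(variation_of_information(X, Y, range(6)))  # 1.0 bit
```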
Related publications (33)

A Functional Perspective on Information Measures

Amedeo Roberto Esposito

Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
EPFL, 2022

A Comparison of optimal measurement-system design with engineering judgement for bridge load testing

Ian Smith, Numa Joy Bertola

Due to conservative approaches in construction design and practice, infrastructure often has hidden reserve capacity. When quantified, this reserve has potential to improve decisions related to asset management. Field measurements, collected through load t ...
2021

Feedback and Common Information: Bounds and Capacity for Gaussian Networks

Erixhen Sula

Network information theory studies the communication of information in a network and considers its fundamental limits. Motivated by the extensive presence of networks in daily life, the thesis studies the fundamental limits of particular network ...
EPFL, 2021
Related MOOCs (11)
Information, Calcul, Communication: Introduction à la pensée informatique
In a first part, we will study very concretely how to solve a problem by means of an algorithm, which will then bring us to one of the great questions of ...
Algebra (part 1)
A French-language MOOC on linear algebra, accessible to everyone, taught rigorously and requiring no prerequisites.
