
Unit: Information Laboratory in Recall Systems

Laboratory



Related people (25)

Related research domains (94)

Communication channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking.

Computer network

A computer network is a set of computers sharing resources located on or provided by network nodes. Computers use common communication protocols over digital interconnections to communicate with each other.

Wireless sensor network

Wireless sensor networks (WSNs) refer to networks of spatially dispersed and dedicated sensors that monitor and record the physical conditions of the environment and forward the collected data to a central location.

Related publications (92)


Units doing similar research (102)

Since the birth of Information Theory, researchers have defined and exploited various information measures, and endowed them with operational meanings. Some were born as the solution to a problem, like Shannon's Entropy and Mutual Information. Others were the fruit of generalisation and the mathematical genius of bright minds like Rényi, Csiszár and Sibson. These powerful objects allow us to manipulate probabilities intuitively and always seem to be connected to concrete settings in communication, coding or estimation theory. A common theme is: take a problem in one of these areas, try to control (upper- or lower-bound) the expected value of some function of interest (often, a probability of error) and, with enough work, an information measure appears as a fundamental limit of the problem.

The most striking example is Shannon's seminal 1948 paper. His purpose was to characterise the smallest possible expected length of a uniquely decodable encoding that compresses the realisations of a random variable. As he brilliantly proved, the smallest expected length one can hope for is the Entropy of the random variable. In establishing this connection, another quantity needed to be implicitly controlled: the Kraft sum of the code. Seemingly unrelated before, these three objects joined forces to provide a beautiful and fundamental result.

But why are they related? The answer seems to be: duality. Duality is an abstract notion commonly used in linear algebra and functional analysis; it has been expanded and generalised over the years, and several incarnations have been discovered throughout mathematics. One particular instance involves vector spaces: given two vector spaces and a "duality pairing", one can jump from one space to the other (its dual) through Legendre-Fenchel-like transforms.
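The interplay between the three objects just mentioned can be sketched in a few lines. This is the standard derivation (not specific to this thesis): for a uniquely decodable code with length function \(\ell\), the Kraft sum satisfies \(\sum_x 2^{-\ell(x)} \le 1\), and the entropy lower-bounds the expected length:

```latex
\begin{align*}
\mathbb{E}[\ell(X)] - H(X)
  &= \sum_x p(x)\,\ell(x) + \sum_x p(x)\log_2 p(x) \\
  &= \sum_x p(x)\log_2 \frac{p(x)}{2^{-\ell(x)}} \\
  &\ge -\log_2 \sum_x 2^{-\ell(x)} && \text{(log-sum inequality)} \\
  &\ge 0 && \text{(Kraft's inequality)}.
\end{align*}
```

The Kraft sum is exactly the quantity that must be controlled for the entropy to emerge as the fundamental limit.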
In the most common settings in Information Theory, the two spaces and the pairing are, respectively: 1) the space of (probability) measures defined on X; 2) the space of bounded functions defined on X; 3) the Lebesgue integral of the function (the expected value of the function, if the measure is a probability measure). Once these are set, Legendre-Fenchel-like transforms allow us to connect a) a functional acting on the space described in 1) and b) a functional acting on the space described in 2), with c) the (expected) value described in 3) as the anchor point. These three pieces (a), b) and c)) represent the actors of many of the results provided in Information Theory. Once they are found, one usually bounds the functional described in b) and obtains a bound connecting the expected value and the functional of measures (e.g., an information measure).

Going back to Shannon's result: fix a random variable (and thus a probability measure) and select the function to be the length of a code. Then the functional a) is the Shannon Entropy of the source, the functional b) is the Kraft sum of the code, and the pairing c) is the expected length of the code.

We explore this connection and this pattern throughout the thesis. We will see how it can be found in notable results such as coding theorems for one-to-one codes, Campbell's Coding Theorem, Arikan's Guessing Theorem, Fano-like inequalities and Transportation-Cost Inequalities. Moreover, unearthing the pattern allows us to generalise it to other information measures and to apply the technique in a variety of fields, including Learning Theory, Estimation Theory and Hypothesis Testing.
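The three actors in Shannon's example can be computed numerically. The sketch below uses a hypothetical dyadic source distribution and Shannon code lengths ⌈−log₂ p(x)⌉ (one concrete uniquely decodable choice, not the only one):

```python
import math

# A hypothetical source distribution (any finite pmf works).
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Functional a): the Shannon Entropy of the source, in bits.
H = -sum(px * math.log2(px) for px in p.values())

# Shannon code lengths ceil(-log2 p(x)) admit a uniquely decodable code.
lengths = {x: math.ceil(-math.log2(px)) for x, px in p.items()}

# Functional b): the Kraft sum of the code.
kraft = sum(2.0 ** -l for l in lengths.values())

# Pairing c): the expected length of the code.
exp_len = sum(p[x] * lengths[x] for x in p)

print(H, kraft, exp_len)
```

For a dyadic distribution such as this one, the Kraft sum is exactly 1 and the expected length meets the entropy bound with equality; in general one only gets H ≤ E[ℓ] < H + 1 for this construction.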

Ahmet Arda Atalik, Michael Christoph Gastpar, Alper Köse

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the conditional mean in the case of finite-variance input signals and additive Gaussian noise. The main result is a new lower bound in terms of the differential entropies of the input signal and the noisy observation. The main results are also extended to the vector Gaussian channel and to the natural exponential family. Various other properties such as upper bounds, asymptotics, Taylor series expansion, and connection to Fisher Information are obtained. Two applications of the lower bound in the remote-source coding and CEO problem are discussed.


This paper considers an additive Gaussian noise channel with arbitrarily distributed finite variance input signals. It studies the differential entropy of the minimum mean-square error (MMSE) estimator and provides a new lower bound which connects the differential entropy of the input, output, and conditional mean. That is, the sum of differential entropies of the conditional mean and output is always greater than or equal to twice the input differential entropy. Various other properties such as upper bounds, asymptotics, Taylor series expansion, and connection to Fisher Information are obtained. An application of the lower bound in the remote-source coding problem is discussed, and extensions of the lower and upper bounds to the vector Gaussian channel are given.
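For intuition about the stated inequality h(E[X|Y]) + h(Y) ≥ 2h(X), one can check the Gaussian case in closed form, where the MMSE estimator is linear and (by the Gaussian entropy formula) the bound holds with equality. This is a sanity-check sketch with arbitrarily chosen variances, not code from the paper:

```python
import math

def h_gauss(var):
    """Differential entropy (in nats) of a Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Hypothetical variances for the input X and the noise N, with Y = X + N.
var_x, var_n = 2.0, 0.5
var_y = var_x + var_n

# For Gaussian X, the conditional mean is linear: E[X|Y] = c * Y.
c = var_x / var_y
h_y = h_gauss(var_y)
h_cond_mean = h_gauss(c ** 2 * var_y)  # variance of c*Y is c^2 * var_y
h_x = h_gauss(var_x)

lhs = h_cond_mean + h_y   # h(E[X|Y]) + h(Y)
rhs = 2 * h_x             # 2 h(X)
print(lhs, rhs)
```

A one-line calculation confirms the equality here: h(cY) + h(Y) = ½ln(2πe·var_x²/var_y) + ½ln(2πe·var_y) = ln(2πe·var_x) = 2h(X).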