How physical process knowledge adds information to predictions; an Algorithmic Information Theory perspective
Communications is about conveying information from one point to another subject to certain performance constraints. The information is assumed to be generated by a source and may, for example, represent a voice waveform, the reading of a thermal sensor, or ...
Satellites and ground-based stations have recorded various types of data from the solar-terrestrial system during recent decades. The new type of particle detectors in the SEVAN (Space Environmental Viewing and Analysis Network) project will be able to measure ...
With a novel, less classical approach to the subject, the authors have written a book with the conviction that signal processing should be taught to be fun. The treatment is therefore less focused on the mathematics and more on the conceptual aspects, the ...
In 1971, the first mass-produced microprocessor had 2300 transistors and was able to compute 60'000 operations per second at 740 kHz. Nowadays, even a common gaming console uses a central unit including 243 million transistors running at 4 ...
We propose an information theoretic model that unifies a wide range of existing information theoretic signal processing algorithms in a compact mathematical framework. It is mainly based on stochastic processes, Markov chains and error probabilities. The p ...
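As a small, generic illustration of the Markov-chain machinery mentioned in this abstract (not the unified model it proposes), the following Python sketch computes the stationary distribution and entropy rate of an assumed two-state chain with made-up transition probabilities.

```python
# Generic two-state Markov chain example; the transition matrix is an
# assumption for illustration, not a model from the publication above.
import numpy as np

P = np.array([[0.9, 0.1],   # assumed transition matrix
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi = pi / pi.sum()

# Entropy rate H = sum_i pi_i * H(row i), in bits per step.
row_entropies = -np.sum(P * np.log2(P), axis=1)
entropy_rate = float(pi @ row_entropies)

print(f"stationary distribution: {pi}")
print(f"entropy rate: {entropy_rate:.3f} bits/step")
```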
Signal processing algorithms are becoming more and more complex, and the algorithm-architecture adaptation and design processes can no longer rely only on the intuition of the designers to build efficient systems. Specific tools and methods are needed to cope ...
Many different algorithms developed in statistical physics, coding theory, signal processing, and artificial intelligence can be expressed by graphical models and solved (either exactly or approximately) with iterative message-passing algorithms on the mod ...
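This abstract refers to iterative message passing on graphical models. A minimal, hypothetical illustration of the general technique (sum-product belief propagation on a three-variable chain with made-up potentials, not the specific models from the publication) might look as follows; the marginals are checked against brute-force enumeration.

```python
# Sum-product (belief propagation) sketch on a chain x1 - x2 - x3 of binary
# variables with pairwise and unary potentials. All numbers are assumptions
# for the example, not taken from the abstract above.
import numpy as np

phi = [np.array([0.6, 0.4]),      # unary potentials phi_i(x_i)
       np.array([0.5, 0.5]),
       np.array([0.3, 0.7])]
psi12 = np.array([[1.0, 0.5],     # pairwise potential psi_12(x1, x2)
                  [0.5, 1.0]])
psi23 = np.array([[1.2, 0.4],     # pairwise potential psi_23(x2, x3)
                  [0.4, 1.2]])

# Forward messages: m_{1->2}(x2), then m_{2->3}(x3).
m12 = (phi[0][:, None] * psi12).sum(axis=0)          # sum over x1
m23 = ((phi[1] * m12)[:, None] * psi23).sum(axis=0)  # sum over x2

# Backward messages: m_{3->2}(x2), then m_{2->1}(x1).
m32 = (psi23 * phi[2][None, :]).sum(axis=1)          # sum over x3
m21 = (psi12 * (phi[1] * m32)[None, :]).sum(axis=1)  # sum over x2

# Marginals: local potential times incoming messages, then normalize.
p1 = phi[0] * m21
p2 = phi[1] * m12 * m32
p3 = phi[2] * m23
for i, p in enumerate([p1, p2, p3], start=1):
    print(f"P(x{i}) =", p / p.sum())

# Brute-force check by enumerating the full joint distribution.
joint = np.einsum('i,j,k,ij,jk->ijk', phi[0], phi[1], phi[2], psi12, psi23)
joint /= joint.sum()
print("brute force P(x1) =", joint.sum(axis=(1, 2)))
```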
Distributed Video Coding (DVC) is a new video coding paradigm based on two major Information Theory results: the Slepian-Wolf and Wyner-Ziv theorems. Recently, practical DVC solutions have been proposed with promising results; however, there is still a ne ...
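For intuition about the Slepian-Wolf result this abstract builds on, the short sketch below evaluates the rate bound H(X|Y) for an assumed binary correlation model Y = X XOR N; the model and the crossover probability are illustrative choices, not parameters from the cited work.

```python
# Illustrative Slepian-Wolf rate calculation: X ~ Bernoulli(1/2) at the
# encoder, Y = X XOR N available at the decoder, N ~ Bernoulli(p).
# Slepian-Wolf says X can be compressed to H(X|Y) = H(p) bits/symbol even
# though the encoder never sees Y. The value of p is an assumption.
import math

def binary_entropy(p: float) -> float:
    """H(p) in bits, with the convention 0 * log2(0) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                                 # crossover probability of the noise
rate_no_side_info = 1.0                 # H(X) for a uniform binary source
rate_slepian_wolf = binary_entropy(p)   # H(X|Y) = H(p) for this model

print(f"H(X)   = {rate_no_side_info:.3f} bits/symbol")
print(f"H(X|Y) = {rate_slepian_wolf:.3f} bits/symbol "
      f"({100 * (1 - rate_slepian_wolf):.0f}% rate saving with side information)")
```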
We examine the problem of multiple sources transmitting information to one or more receivers that require the information from all the sources, over a network where the network nodes perform randomized network coding. We consider the noncoherent case, wher ...
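As a generic, simplified illustration of randomized network coding (not the noncoherent multicast analysis in this abstract), the following sketch shows a receiver recovering source packets from random GF(2) combinations by Gaussian elimination, keeping only combinations whose coefficient headers are linearly independent.

```python
# Toy random linear network coding over GF(2): coded packets carry their
# coefficient vectors as headers; once k independent combinations arrive,
# the k source packets are recovered by Gauss-Jordan elimination mod 2.
import numpy as np

def gf2_rank(M: np.ndarray) -> int:
    """Rank of a binary matrix over GF(2) by Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    for col in range(M.shape[1]):
        pivots = [r for r in range(rank, M.shape[0]) if M[r, col]]
        if not pivots:
            continue
        M[[rank, pivots[0]]] = M[[pivots[0], rank]]
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]
        rank += 1
    return rank

def gf2_solve(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Solve A X = B over GF(2) for a square, full-rank binary matrix A."""
    A, B = A.copy() % 2, B.copy() % 2
    n = A.shape[0]
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col])
        A[[col, pivot]] = A[[pivot, col]]
        B[[col, pivot]] = B[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] ^= A[col]
                B[r] ^= B[col]
    return B

rng = np.random.default_rng(0)
k, packet_len = 3, 8                                   # 3 source packets, 8 bits each
source = rng.integers(0, 2, size=(k, packet_len), dtype=np.uint8)

# Receiver: keep coded packets whose coefficient headers increase the GF(2) rank.
coeffs, payloads = [], []
while len(coeffs) < k:
    c = rng.integers(0, 2, size=k, dtype=np.uint8)     # random coefficient header
    if gf2_rank(np.array(coeffs + [c], dtype=np.uint8)) > len(coeffs):
        coeffs.append(c)
        payloads.append((c @ source) % 2)              # coded payload over GF(2)

decoded = gf2_solve(np.array(coeffs, dtype=np.uint8),
                    np.array(payloads, dtype=np.uint8))
print("recovered all packets:", np.array_equal(decoded, source))
```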
The increasing complexity of signal processing algorithms has led to the need of developing the algorithm specifications using generic software implementations that become, in practice, the reference implementation. This fact can be particularly observed i ...