
COM-621: Advanced Topics in Information Theory

Summary

The class will focus on information-theoretic progress of the last decade. Topics include: network information theory; information measures (definitions, properties, and applications to probabilistic models).

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related courses (10)

COM-406: Foundations of Data Science

We discuss a set of topics that are important for the understanding of modern data science but that are typically not taught in an introductory ML course. In particular we discuss fundamental ideas an…

FIN-417: Quantitative risk management

This course is an introduction to quantitative risk management that covers standard statistical methods, multivariate risk factor models, non-linear dependence structures (copula models), as well as p…

PHYS-426: Quantum physics IV

Introduction to the path integral formulation of quantum mechanics. Derivation of the perturbation expansion of Green's functions in terms of Feynman diagrams. Several applications will be presented,…

PHYS-324: Classical electrodynamics

The goal of this course is the study of the physical and conceptual consequences of Maxwell equations.

CH-222: Coordination chemistry

Fundamental knowledge of coordination compounds.

Instructors (2)

Michael Christoph Gastpar

Michael Gastpar is a (full) Professor at EPFL. From 2003 to 2011, he was a professor at the University of California at Berkeley, earning his tenure in 2008.
He received his Dipl. El.-Ing. degree from ETH Zürich, Switzerland, in 1997 and his MS degree from the University of Illinois at Urbana-Champaign, IL, USA, in 1999. He defended his doctoral thesis at EPFL on Santa Claus day, 2002. He was also a (full) Professor at Delft University of Technology, The Netherlands.
His research interests are in network information theory and related coding and signal processing techniques, with applications to sensor networks and neuroscience.
He is a Fellow of the IEEE. He is the co-recipient of the 2013 Communications Society & Information Theory Society Joint Paper Award. He was an Information Theory Society Distinguished Lecturer (2009-2011). He won an ERC Starting Grant in 2010, an Okawa Foundation Research Grant in 2008, an NSF CAREER award in 2004, and the 2002 EPFL Best Thesis Award. He has served as an Associate Editor for Shannon Theory for the IEEE Transactions on Information Theory (2008-11), and as Technical Program Committee Co-Chair for the 2010 International Symposium on Information Theory, Austin, TX.

Related concepts (67)

Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.

Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable X, which takes values in the alphabet 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is H(X) = −∑_{x∈𝒳} p(x) log p(x), where ∑ denotes the sum over the variable's possible values. The choice of base for log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
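The definition above can be checked numerically. The following sketch is purely illustrative (the function name and interface are not part of the course or of any standard library); it implements H(X) = −∑ p(x) log p(x) for a finite distribution, with the usual continuity convention 0 · log 0 = 0:

```python
import math

def entropy(pmf, base=2.0):
    """Shannon entropy of a discrete distribution given as a list of probabilities.

    base=2 gives bits (shannons), base=math.e gives nats, base=10 gives hartleys.
    """
    if abs(sum(pmf) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p(x) = 0 are skipped: 0 * log 0 = 0 by the continuity convention.
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# A fair coin carries exactly 1 bit of entropy:
print(entropy([0.5, 0.5]))           # 1.0 (bits)
print(entropy([0.5, 0.5], math.e))   # ~0.693 (nats)
```

Changing the base only rescales the result by a constant factor, since log_b(x) = ln(x) / ln(b); this is why bits, nats, and hartleys are interchangeable units.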


Related MOOCs (4)

Neuronal Dynamics - Computational Neuroscience of Single Neurons

The activity of neurons in the brain and the code used by these neurons is described by mathematical neuron models at different levels of detail.

Neuronal Dynamics 2 - Computational Neuroscience: Neuronal Dynamics of Cognition

This course explains the mathematical and computational models that are used in the field of theoretical neuroscience to analyze the collective dynamics of thousands of interacting neurons.

Lectures in this course (3)

Advanced Information Theory: Random Binning

Explores random binning in advanced information theory, focusing on assigning labels based on typicality and achieving negligible error rates in source coding.

Advanced Information Theory: F-Divergences and Generalization Error

Covers f-divergences and generalization error in advanced information theory.

Maximal Correlation: Information Measures

Explores maximal correlation in information theory, mutual information properties, Rényi's measures, and mathematical foundations of information theory.