This lecture covers advanced topics in information theory, computation, and communication. It introduces conditional entropy, joint entropy, Huffman coding, and the IID source model. The instructor explains how to compress long strings efficiently using Huffman codes and develops the fundamental compression theorem for IID sources. The lecture also discusses conditional probability, conditional expectation, and the effect of conditioning on entropy. Through examples like the Bit Flipper Channel, students learn about entropy bounds and the theorem that conditioning reduces entropy.
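The "conditioning reduces entropy" theorem, H(X|Y) ≤ H(X), can be checked numerically on a bit flipper (binary symmetric) channel. The sketch below is illustrative, not taken from the lecture: the flip probability `eps = 0.1` and the fair input bit are assumed choices. It computes the joint distribution of input X and output Y, then compares H(X) with the conditional entropy H(X|Y).

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Bit flipper channel: a fair input bit X is flipped with
# probability eps (eps = 0.1 is an illustrative assumption).
eps = 0.1

# Joint distribution P(x, y) for x, y in {0, 1}
joint = {(x, y): 0.5 * (eps if x != y else 1 - eps)
         for x in (0, 1) for y in (0, 1)}

# Marginals of X and Y
p_x = [sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)]
p_y = [sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)]

h_x = entropy(p_x)  # H(X) = 1 bit for a fair input bit

# H(X|Y) = sum over y of P(y) * H(X | Y = y)
h_x_given_y = sum(
    p_y[y] * entropy([joint[(x, y)] / p_y[y] for x in (0, 1)])
    for y in (0, 1)
)

print(f"H(X)   = {h_x:.3f} bits")
print(f"H(X|Y) = {h_x_given_y:.3f} bits")
assert h_x_given_y <= h_x  # conditioning reduces entropy
```

Here H(X) = 1 bit, while H(X|Y) equals the binary entropy of eps, about 0.469 bits: observing the channel output leaves roughly half a bit of uncertainty about the input.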