This lecture covers the concept of conditional entropy, with a focus on Huffman coding for efficient data compression. Starting from an observation about ternary trees, the lecture moves on to joint entropy, Shannon-Fano codes, and the compression of long strings using Huffman codes. The instructor explains how to compute conditional entropy and conditional expectation, using examples such as the 'Bit Flipper Channel'. The lecture concludes with a discussion of how codeword length behaves as the number of symbols increases, and of the complexity of finding optimal coding strategies. Students will gain insight into efficient data compression techniques and the trade-offs involved in choosing a coding strategy.
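Since the summary mentions Huffman coding for compression, the following is a minimal Python sketch of the classic greedy construction; the function name `huffman_code` and the example input are illustrative choices, not taken from the lecture.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary Huffman code from a {symbol: frequency} mapping.

    Returns a {symbol: codeword} dict. A sketch for illustration only;
    the lecture's own construction and notation may differ.
    """
    # Each heap entry: (subtree frequency, unique tiebreaker, {symbol: partial codeword}).
    # The tiebreaker keeps heapq from ever comparing the dicts.
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent subtrees, prefixing '0' and '1'.
        f0, _, c0 = heapq.heappop(heap)
        f1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (f0 + f1, counter, merged))
        counter += 1
    return heap[0][2]

# Example: more frequent symbols receive shorter codewords.
codes = huffman_code(Counter("abracadabra"))
print(codes)  # e.g. 'a' (5 occurrences) gets the shortest codeword
```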
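As a concrete illustration of the kind of conditional-entropy calculation described above, here is the standard computation for a bit flipper channel; the flip probability $p$ and the numerical example are assumptions for illustration, not values quoted from the lecture.

```latex
% Bit flipper channel: the output Y equals the input X with probability 1-p,
% and the flipped bit with probability p (illustrative setup).
\[
H(Y \mid X) = \sum_{x} p_X(x)\, H(Y \mid X = x)
            = \sum_{x} p_X(x)\, h_b(p)
            = h_b(p),
\]
\[
\text{where } h_b(p) = -p \log_2 p - (1-p)\log_2(1-p)
\text{ is the binary entropy function.}
\]
% Example with an assumed p = 0.1:
% h_b(0.1) = -0.1 \log_2 0.1 - 0.9 \log_2 0.9 \approx 0.469 bits.
```

Note that $H(Y \mid X)$ here is independent of the input distribution $p_X$, because every conditional distribution $Y \mid X = x$ has the same entropy $h_b(p)$.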