This lecture covers entropy bounds and conditional-entropy theorems. It introduces the concept of conditional entropy and then derives the theorems that bound the conditional entropy H(X|Y) of a discrete random variable X given Y, namely 0 ≤ H(X|Y) ≤ H(X). Through examples such as the 'Bit Flipper Channel' and 'Lisa Rolls Two Dice', the lecture shows how to verify and apply these bounds. It concludes with a detailed treatment of the chain rule for entropy, which decomposes the joint entropy of a collection of random variables into a sum of conditional entropies, one per variable.
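The bounds and the chain rule mentioned above can be checked numerically. The sketch below is not taken from the lecture; it assumes a binary symmetric "bit flipper" channel with an illustrative flip probability eps = 0.1 and a uniform input, and verifies that 0 ≤ H(X|Y) ≤ H(X) and that H(X, Y) = H(Y) + H(X|Y):

```python
import math
from itertools import product

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Joint distribution p(x, y) for a bit-flipper channel:
# X ~ Bernoulli(0.5) is the input; Y flips X with probability eps.
# (eps = 0.1 and the uniform input are illustrative assumptions.)
eps = 0.1
joint = {(x, y): 0.5 * (eps if x != y else 1 - eps)
         for x, y in product((0, 1), repeat=2)}

# Marginal distributions of X and Y.
p_x = [sum(v for (x, _), v in joint.items() if x == xv) for xv in (0, 1)]
p_y = [sum(v for (_, y), v in joint.items() if y == yv) for yv in (0, 1)]

H_X = entropy(p_x)
H_Y = entropy(p_y)
H_XY = entropy(list(joint.values()))
H_X_given_Y = H_XY - H_Y  # chain rule: H(X, Y) = H(Y) + H(X|Y)

# Conditioning cannot increase entropy: 0 <= H(X|Y) <= H(X).
assert 0 <= H_X_given_Y <= H_X
print(f"H(X) = {H_X:.4f} bits, H(X|Y) = {H_X_given_Y:.4f} bits")
```

For this symmetric channel H(X|Y) equals the binary entropy h(0.1) ≈ 0.469 bits, strictly below H(X) = 1 bit, so observing Y genuinely reduces the uncertainty about X.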