This lecture covers information measures, including entropy, joint entropy, conditional entropy, mutual information, and Kullback-Leibler divergence. These concepts are developed through definitions, formulas, and worked examples, with emphasis on their role in information theory and data-processing applications.
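As a concrete illustration of the measures listed above, the sketch below computes entropy, mutual information, and Kullback-Leibler divergence for small discrete distributions (distributions given as probability lists; all logarithms in base 2, so results are in bits). The particular distributions chosen are illustrative examples, not taken from the lecture.

```python
import math

def entropy(p):
    # Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.
    return -sum(px * math.log2(px) for px in p if px > 0)

def kl_divergence(p, q):
    # D(p || q) = sum_x p(x) log2(p(x)/q(x)).
    # Assumes q(x) > 0 wherever p(x) > 0 (otherwise the divergence is infinite).
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

# Joint distribution of (X, Y) as a 2x2 table: two independent fair bits.
joint = [[0.25, 0.25], [0.25, 0.25]]
p_x = [sum(row) for row in joint]           # marginal of X
p_y = [sum(col) for col in zip(*joint)]     # marginal of Y

h_xy = entropy([p for row in joint for p in row])   # joint entropy H(X,Y)
# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y);
# it is zero here because X and Y are independent.
mi = entropy(p_x) + entropy(p_y) - h_xy

# KL divergence between a fair coin and a biased coin is strictly positive,
# and D(p || p) = 0, consistent with Gibbs' inequality.
d = kl_divergence([0.5, 0.5], [0.9, 0.1])
```

For the independent fair bits above, H(X) = H(Y) = 1 bit, H(X,Y) = 2 bits, and I(X;Y) = 0; the conditional entropy H(X|Y) = H(X,Y) - H(Y) then recovers the full 1 bit, since Y tells us nothing about X.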