This lecture covers maximal correlation in information theory, focusing on mutual information and its properties. It discusses how to compute mutual information for discrete variables, the chain rule, and maximal correlation as a measure of dependence between random variables. The lecture also examines Rényi's generalized information measures and their use in characterizing the relationship between variables, and it explores the continuity and concavity of mutual information, giving insight into the mathematical foundations of information theory.
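As a concrete illustration of the quantities mentioned above, the sketch below computes the mutual information of two discrete variables and their Hirschfeld–Gebelein–Rényi maximal correlation from a joint pmf. The joint distribution `p_xy` is a made-up example, and the SVD characterization used here (maximal correlation equals the second-largest singular value of the normalized joint matrix) is a standard result for finite alphabets, offered as a sketch rather than the lecture's own derivation.

```python
import numpy as np

# Hypothetical joint pmf of two discrete variables X and Y (rows: x, cols: y).
# Any nonnegative matrix summing to 1 works; this one is assumed for illustration.
p_xy = np.array([
    [0.20, 0.05, 0.05],
    [0.05, 0.30, 0.05],
    [0.05, 0.05, 0.20],
])

p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y

# Mutual information I(X;Y) = sum_{x,y} p(x,y) log( p(x,y) / (p(x) p(y)) ),
# in nats; terms with p(x,y) = 0 contribute 0 by convention.
mask = p_xy > 0
mi = np.sum(p_xy[mask] * np.log(p_xy[mask] / np.outer(p_x, p_y)[mask]))

# Hirschfeld-Gebelein-Renyi maximal correlation: for finite alphabets it equals
# the second-largest singular value of Q, where
# Q[i, j] = p(x_i, y_j) / sqrt(p(x_i) p(y_j)).
# The largest singular value is always 1, achieved by constant functions.
Q = p_xy / np.sqrt(np.outer(p_x, p_y))
singular_values = np.linalg.svd(Q, compute_uv=False)  # sorted in descending order
max_corr = singular_values[1]

print(f"I(X;Y) = {mi:.4f} nats, maximal correlation = {max_corr:.4f}")
```

Both quantities vanish exactly when X and Y are independent (the joint pmf factorizes, so every log term is zero and Q has rank one), which is why each serves as a measure of dependence.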