This lecture covers the chain rule for entropy, which states that the joint entropy of a collection of random variables decomposes as the entropy of the first variable plus a sum of conditional entropies, each conditioned on the variables that precede it. The theorem shows how the uncertainty in a collection of random variables can be broken down term by term. Several examples illustrate how the chain rule is applied to compute conditional entropies. The lecture also introduces source models, entropy rates, and the fundamental limits of source coding. The instructor emphasizes the importance of specifying the joint distribution of a random process and offers insights on evolving sequences and independent events.
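As a minimal sketch of the statements the summary refers to (the notation is assumed here, not taken from the lecture): for random variables X_1, ..., X_n the chain rule reads, in LaTeX,

H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}),

and for a stationary source the entropy rate mentioned alongside it is commonly defined as

H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n}\, H(X_1, \ldots, X_n).

For an i.i.d. source every conditional term reduces to H(X_1), so the entropy rate is simply H(X_1); this is presumably why independent events appear as the simplest special case in the lecture.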