This lecture presents the proof of Shannon's source coding theorem, which states that for any binary code C used to represent a given sequence X, the expected code length is always greater than or equal to the entropy of X. The proof draws on the Kraft inequality, binary trees, and the concavity of the log function. The lecture then examines the implications of Shannon's theorem for data compression: lossless compression can never go below Shannon's bound. Finally, it discusses lossy compression, focusing on image and sound compression techniques such as reducing image resolution and exploiting psychoacoustic effects in sound encoding.
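The two results summarised above can be checked numerically: a sketch in Python that computes the entropy of a sequence, verifies the Kraft inequality for a small prefix code, and confirms that the average code length stays above the entropy. The example text and the code table are illustrative choices, not taken from the lecture.

```python
import math
from collections import Counter

def entropy(text):
    """Shannon entropy H(X), in bits per symbol, of the empirical distribution."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kraft_sum(code):
    """Kraft sum: sum of 2^(-l_i) over codeword lengths; <= 1 for any prefix code."""
    return sum(2 ** -len(word) for word in code.values())

def avg_length(text, code):
    """Expected codeword length under the empirical symbol distribution."""
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * len(code[s]) for s, c in counts.items())

text = "abracadabra"
# A hypothetical prefix code for the symbols of `text` (not claimed optimal):
code = {"a": "0", "b": "10", "r": "110", "c": "1110", "d": "1111"}

H = entropy(text)
L = avg_length(text, code)
print(f"Kraft sum       = {kraft_sum(code):.3f}")  # 1.000, so a prefix code exists
print(f"entropy H(X)    = {H:.3f} bits/symbol")
print(f"average length  = {L:.3f} bits/symbol")
assert L >= H  # Shannon's bound: no lossless code beats the entropy
```

For this distribution the average length (about 2.09 bits/symbol) sits just above the entropy (about 2.04 bits/symbol), as the theorem requires; no choice of prefix code could reverse that inequality.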