This lecture quantifies information in terms of probability and motivates why communication systems need such a measure. It covers the information content of high- and low-probability messages, the entropy of a source, and how to store the most information under different constraints. It then applies the method of Lagrange multipliers to maximize information content, relates the mutual information between random sources to their entropies, and examines the capacity of a communication channel in the presence of noise. Overall, it gives a comprehensive account of how information is quantified and stored efficiently in various systems.
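As a numerical sketch of the entropy ideas above (the function names here are my own, not from the lecture): Shannon entropy assigns more information to a source whose symbols are closer to equiprobable, and the Lagrange-multiplier argument mentioned in the lecture shows the uniform distribution attains the maximum, H = log2(n) bits for n symbols.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased source carries less information per symbol than a uniform one.
biased = [0.7, 0.1, 0.1, 0.1]
uniform = [0.25] * 4

print(entropy(uniform))                    # 2.0 bits = log2(4), the maximum for 4 symbols
print(entropy(biased) < entropy(uniform))  # True
```

Checking a few perturbed distributions against the uniform one is a quick empirical stand-in for the constrained-optimization argument.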
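For the effect of noise on channel capacity, a standard illustrative case (a sketch; the lecture may treat a different channel model) is the binary symmetric channel, whose capacity is C = 1 - H2(p), where H2 is the binary entropy of the crossover probability p:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))  # 1.0 bit per use: a noiseless channel
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
```

Capacity falls from one bit per channel use to zero as the flip probability approaches one half, which is the sense in which noise limits reliable communication.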