This lecture introduces high-dimensional generalized linear models and the teacher-student scenario as a framework for understanding the optimal achievable error in machine learning. It covers the information-theoretic error, phase transitions, and the comparison between the Generalized Approximate Message Passing (GAMP) algorithm and the information-theoretic performance. Through examples such as compressive sensing and the binary perceptron, the lecture shows how different algorithms behave in various scenarios, shedding light on the interplay between model architecture and algorithm performance. The analysis provides insight into the gap between optimal and algorithmically achievable performance, guiding better algorithm design and a deeper understanding of the complexities of machine learning.
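As a concrete illustration of the teacher-student scenario for compressive sensing, the sketch below (with assumed parameters; the lecture's exact setup is not given here) has a "teacher" draw a sparse signal and produce noiseless linear measurements, while a "student" who sees only the measurements and the sensing matrix tries to recover the signal. Iterative soft thresholding is used as a simple stand-in for an AMP-style solver.

```python
import numpy as np

# Teacher: generate a sparse ground-truth signal and linear measurements.
# Parameters (dimension n, measurement ratio alpha, sparsity rho) are
# illustrative choices, not values from the lecture.
rng = np.random.default_rng(0)
n, alpha, rho = 200, 0.6, 0.2
m = int(alpha * n)                                        # number of measurements

x_star = rng.standard_normal(n) * (rng.random(n) < rho)   # Gauss-Bernoulli signal
A = rng.standard_normal((m, n)) / np.sqrt(n)              # i.i.d. Gaussian sensing matrix
y = A @ x_star                                            # noiseless measurements

# Student: iterative soft thresholding (a simple proxy for GAMP/AMP solvers),
# i.e. gradient steps on the residual followed by a sparsity-promoting shrinkage.
def soft(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2                    # safe step size
for _ in range(2000):
    x = soft(x + step * A.T @ (y - A @ x), t=step * 0.01)

# The mean-squared error between the student's estimate and the teacher's
# signal is exactly the quantity the information-theoretic analysis tracks.
mse = np.mean((x - x_star) ** 2)
```

In this regime (measurement ratio above the relevant phase transition) the reconstruction error is small; shrinking `alpha` below the transition would make recovery fail, which is the phenomenon the lecture's phase-transition analysis characterizes.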