This lecture presents gradient-based algorithms in high-dimensional non-convex learning, focusing on supervised learning, neural networks, and stochastic gradient descent. It discusses why non-convex optimization problems sit at the core of machine learning and why deep learning theory remains largely mysterious. It then reviews overfitting, underfitting, and the double-descent phenomenon, shedding light on the generalization behavior of modern over-parameterized neural networks, and examines what is and is not understood about gradient descent, stressing the value of tractable, assumption-free models of data. As a concrete example, the talk introduces toy models such as the spiked matrix-tensor model, analyzes the dynamics of learning in them, and argues that generalization and the role of gradient descent in deep learning need to be rethought. Illustrative sketches of the double-descent phenomenon and the spiked matrix-tensor model follow below.
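The summary names the double-descent phenomenon without spelling it out. Below is a minimal, self-contained illustration of one standard way to observe it: minimum-norm least squares on random ReLU features, sweeping the number of features p past the number of training points. The sizes, the feature map, and all variable names are illustrative assumptions, not details taken from the lecture.

```python
# Minimal double-descent sketch (illustrative; not the lecture's experiment).
# Test error of the minimum-norm least-squares fit on random ReLU features
# typically rises as p approaches n_train (the interpolation threshold) and
# then descends again as the model becomes heavily over-parameterized.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, d = 100, 2000, 20

# Fixed nonlinear target, with a little label noise on the training set.
w_true = rng.standard_normal(d)
X_tr = rng.standard_normal((n_train, d))
X_te = rng.standard_normal((n_test, d))
y_tr = np.tanh(X_tr @ w_true) + 0.1 * rng.standard_normal(n_train)
y_te = np.tanh(X_te @ w_true)

def relu_features(X, W):
    """Random ReLU features: phi(x) = max(0, W x)."""
    return np.maximum(X @ W.T, 0.0)

for p in [10, 50, 90, 100, 110, 150, 300, 1000, 3000]:
    W = rng.standard_normal((p, d)) / np.sqrt(d)   # fresh random features
    Phi_tr, Phi_te = relu_features(X_tr, W), relu_features(X_te, W)
    # The pseudo-inverse gives the minimum-norm solution; it interpolates
    # the training data exactly once p >= n_train.
    coef = np.linalg.pinv(Phi_tr) @ y_tr
    mse = np.mean((Phi_te @ coef - y_te) ** 2)
    print(f"p = {p:5d}   test MSE = {mse:.3f}")
```

A typical run shows the classical U-shape for p below n_train, a spike near p = n_train, and a second descent beyond it, which is the pattern at issue when the lecture asks how over-parameterized networks can still generalize.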
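The spiked matrix-tensor model is only named in the summary. The sketch below follows the common formulation from the statistical-physics literature: a planted signal x* on the sphere is observed through a noisy rank-one matrix and a noisy rank-one order-3 tensor, and gradient descent is run on the resulting non-convex likelihood landscape. The exact normalizations and parameter values here are simplifying assumptions for illustration, not taken from the lecture.

```python
# Minimal sketch of the spiked matrix-tensor model (p = 3) with projected
# gradient descent; normalizations are simplified for illustration.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 100          # dimension (the theory concerns the N -> infinity limit)
delta2 = 0.5     # noise variance on the spiked matrix
delta3 = 1.0     # noise variance on the spiked tensor
lr, steps = 0.05, 300

# Planted signal x* on the sphere ||x*||^2 = N.
x_star = rng.standard_normal(N)
x_star *= np.sqrt(N) / np.linalg.norm(x_star)

# Observations: spiked matrix Y and spiked order-3 tensor T, Gaussian noise.
Y = np.outer(x_star, x_star) / np.sqrt(N) \
    + np.sqrt(delta2) * rng.standard_normal((N, N))
T = (np.einsum("i,j,k->ijk", x_star, x_star, x_star) / N
     + np.sqrt(delta3) * rng.standard_normal((N, N, N)))
# Symmetrize T so that the gradient expression below is exact.
T = sum(T.transpose(perm) for perm in itertools.permutations(range(3))) / 6

def grad(x):
    """Gradient of the non-convex loss
    L(x) = -x^T Y x / (2*delta2*sqrt(N))
           - sum_ijk T_ijk x_i x_j x_k / (3*delta3*N)."""
    g2 = -(Y + Y.T) @ x / (2 * delta2 * np.sqrt(N))
    g3 = -np.einsum("ijk,j,k->i", T, x, x, optimize=True) / (delta3 * N)
    return g2 + g3

# Projected gradient descent from a random start on the sphere.
x = rng.standard_normal(N)
x *= np.sqrt(N) / np.linalg.norm(x)
for _ in range(steps):
    x = x - lr * grad(x)
    x *= np.sqrt(N) / np.linalg.norm(x)   # re-project onto ||x||^2 = N

# Overlap m = <x, x*>/N: near 0 means the dynamics stayed lost in the
# landscape; near +/-1 means the planted signal was recovered.
print("overlap m =", float(x @ x_star) / N)
```

The appeal of this toy model, in the spirit of the lecture, is that the gradient-descent dynamics simulated above can be tracked by closed dynamical equations in the high-dimensional limit, so the non-convex landscape and the success or failure of gradient descent become analyzable.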