This lecture discusses the curse of dimensionality and how modern architectures overcome it by being both local and translation invariant, emphasizing the importance of stability to smooth transformations in deep learning models. It explores how feature learning lets networks adapt their pooling scales, and why stability to deformations matters for network performance. The instructor presents empirical evidence that deep learning converges to well-defined algorithms that beat the curse of dimensionality by treating objects as collections of local parts whose relative positions fluctuate.
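To make the notion of stability to deformations concrete, here is a minimal sketch (not the lecture's actual code; the architecture, input, and displacement field are all illustrative assumptions). It probes a small local, translation-equivariant CNN by comparing how much its output changes under a smooth spatial deformation of the input versus under random noise of the same input-space norm; a deformation-stable network should change far less in the first case.

```python
# Illustrative sketch: probing a CNN's stability to smooth deformations.
# All choices (architecture, input size, displacement field) are assumptions
# for demonstration, not the model discussed in the lecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# A small local, translation-equivariant network: conv + pooling layers.
net = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Flatten(), nn.Linear(16 * 8 * 8, 10),
)

n = 32
x = torch.randn(1, 1, n, n)  # stand-in input image

# Smooth deformation x(u) -> x(u - tau(u)), implemented via grid_sample.
# tau is a small, slowly varying displacement field.
ys, xs = torch.meshgrid(
    torch.linspace(-1, 1, n), torch.linspace(-1, 1, n), indexing="ij"
)
identity_grid = torch.stack((xs, ys), dim=-1).unsqueeze(0)
tau = 0.05 * torch.stack(
    (torch.sin(3.14 * ys), torch.cos(3.14 * xs)), dim=-1
).unsqueeze(0)
x_def = F.grid_sample(x, identity_grid + tau, align_corners=True)

# Random noise perturbation scaled to the same L2 norm as the deformation.
delta = x_def - x
noise = torch.randn_like(x)
noise = noise * delta.norm() / noise.norm()
x_noise = x + noise

with torch.no_grad():
    f_x, f_def, f_noise = net(x), net(x_def), net(x_noise)

# Ratio < 1 means the network is more stable to the smooth deformation
# than to generic noise of equal magnitude.
ratio = (f_def - f_x).norm() / (f_noise - f_x).norm()
print(f"deformation/noise sensitivity ratio: {ratio.item():.3f}")
```

The printed ratio gives a rough relative-stability measure: values well below 1 indicate the network discounts smooth relative displacements of local parts, in the spirit of the lecture's argument.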