This lecture covers the back-propagation algorithm for deep neural networks, including weight initialization, the forward pass with pre-activations and post-activations, error terms, weight updates, and the role of activation functions. The slides also discuss the importance of locality and translational invariance in convolutional neural networks.
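As a rough companion to the steps listed above, the following is a minimal sketch of one back-propagation step for a small network. The layer sizes, sigmoid activation, squared-error loss, and learning rate are illustrative assumptions, not details taken from the lecture slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Initialization: small random weights, zero biases (assumed scheme) ---
n_in, n_hidden, n_out = 2, 3, 1
W1 = rng.normal(0, 0.5, (n_hidden, n_in));  b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_out, n_hidden)); b2 = np.zeros(n_out)

x = np.array([1.0, -1.0])   # toy input (assumed)
y = np.array([1.0])         # toy target (assumed)
lr = 0.1                    # learning rate (assumed)

# --- Forward pass: pre-activations z, post-activations a ---
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
a2 = sigmoid(z2)            # network output

# --- Backward pass: error terms (deltas) for squared-error loss ---
delta2 = (a2 - y) * a2 * (1 - a2)            # output-layer error
delta1 = (W2.T @ delta2) * a1 * (1 - a1)     # hidden-layer error

# --- Weight updates: one gradient-descent step ---
W2 -= lr * np.outer(delta2, a1); b2 -= lr * delta2
W1 -= lr * np.outer(delta1, x);  b1 -= lr * delta1
```

Repeating the forward, backward, and update steps over many examples is what constitutes training; the sketch shows only a single step for one input.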
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.
Watch on Mediaspace