This lecture covers training neural networks with stochastic gradient descent, the chain rule for forward and backward propagation, computing gradients with respect to the parameters, weight decay, and dropout, which prevents overfitting by randomly dropping subsets of units during training.
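As a rough illustration of two of the techniques mentioned above, here is a minimal NumPy sketch (the function names, defaults, and hyperparameter values are illustrative, not from the lecture) of an SGD update with weight decay and of inverted dropout:

```python
import numpy as np

def sgd_step(w, grad, lr=0.1, weight_decay=0.01):
    # Weight decay adds lambda * w to the gradient, which is
    # equivalent to L2 regularisation of the parameters.
    return w - lr * (grad + weight_decay * w)

def dropout(x, p=0.5, rng=None, train=True):
    # Inverted dropout: zero each unit with probability p during
    # training and rescale the survivors by 1/(1-p), so the expected
    # activation is unchanged and no rescaling is needed at test time.
    if not train or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(np.shape(x)) >= p
    return x * mask / (1.0 - p)
```

At test time `dropout` is called with `train=False` and simply passes activations through unchanged.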