This lecture covers the training and optimization of neural networks. The instructor walks through forward and backward passes, stochastic gradient descent, and mini-batch stochastic gradient descent. The lecture also addresses the challenges of training neural networks, including the non-convex nature of the loss function and the presence of local minima. Practical aspects, such as using Google Colab for faster computation and the environmental implications of training neural networks, are also discussed.
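As a rough illustration of the ideas mentioned above, the sketch below shows a mini-batch SGD training loop with explicit forward and backward passes for a tiny one-hidden-layer network. This is not the lecture's code; the network size, synthetic data, learning rate, and batch size are all assumptions made for the example.

```python
# Minimal sketch: mini-batch SGD with hand-written forward/backward passes.
# Layer sizes, data, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (assumed for illustration): y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(512, 1))
y = np.sin(X) + 0.1 * rng.normal(size=(512, 1))

# Parameters of a 1 -> 32 -> 1 network with tanh hidden units.
W1 = rng.normal(scale=0.5, size=(1, 32)); b1 = np.zeros((1, 32))
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros((1, 1))

lr, batch_size = 0.05, 32
for epoch in range(200):
    perm = rng.permutation(len(X))            # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], y[idx]

        # Forward pass: hidden activations and predictions.
        h = np.tanh(xb @ W1 + b1)
        pred = h @ W2 + b2

        # Backward pass: gradients of the mean-squared error
        # with respect to each parameter (chain rule by hand).
        d_pred = 2 * (pred - yb) / len(xb)
        dW2 = h.T @ d_pred
        db2 = d_pred.sum(axis=0, keepdims=True)
        d_h = (d_pred @ W2.T) * (1 - h ** 2)
        dW1 = xb.T @ d_h
        db1 = d_h.sum(axis=0, keepdims=True)

        # Mini-batch SGD update: step against the gradient.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

final_mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print("final training MSE:", final_mse)
```

Because the loss surface of such a network is non-convex, different random initializations or batch orderings can lead the loop above to different local minima, which is the issue the lecture highlights.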