This lecture covers neural network training, stochastic gradient descent (SGD), and backpropagation. The instructor explains why non-linear transformations in the initial layers are necessary to represent complex functions. The lecture also covers the computation of gradients and the reuse of intermediate results during training.
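The ideas in this summary can be sketched in code. Below is a minimal, self-contained illustration (not the lecture's own code) of a tiny one-hidden-layer network with a tanh non-linearity: the forward pass caches its intermediate values, the backward pass reuses them via the chain rule, and an SGD step updates the parameters with the resulting gradients. All function and variable names here are illustrative assumptions.

```python
import math

def forward(params, x):
    """Forward pass; returns the prediction and cached intermediates."""
    w1, b1, w2, b2 = params
    z = [w1[i] * x + b1[i] for i in range(len(w1))]   # pre-activations
    h = [math.tanh(zi) for zi in z]                   # non-linear hidden layer
    yhat = sum(w2[i] * h[i] for i in range(len(w2))) + b2
    return yhat, (z, h)

def backward(params, x, y, yhat, cache):
    """Backprop for squared error, reusing the cached forward intermediates."""
    w1, b1, w2, b2 = params
    z, h = cache
    dyhat = 2.0 * (yhat - y)                          # d(loss)/d(yhat)
    dw2 = [dyhat * h[i] for i in range(len(w2))]      # reuses h from forward
    db2 = dyhat
    dw1, db1 = [], []
    for i in range(len(w1)):
        dz = dyhat * w2[i] * (1.0 - h[i] ** 2)        # tanh'(z) = 1 - tanh(z)^2
        dw1.append(dz * x)
        db1.append(dz)
    return dw1, db1, dw2, db2

def sgd_step(params, x, y, lr=0.1):
    """One stochastic gradient descent update on a single example."""
    yhat, cache = forward(params, x)
    dw1, db1, dw2, db2 = backward(params, x, y, yhat, cache)
    w1, b1, w2, b2 = params
    w1 = [w1[i] - lr * dw1[i] for i in range(len(w1))]
    b1 = [b1[i] - lr * db1[i] for i in range(len(b1))]
    w2 = [w2[i] - lr * dw2[i] for i in range(len(w2))]
    return (w1, b1, w2, b2 - lr * db2)
```

Note how `backward` never recomputes `h`: the hidden activations from the forward pass are cached and reused, which is the key efficiency idea behind backpropagation.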