This lecture covers the fundamentals of multi-layer neural networks, focusing on the structure and training of fully connected networks with hidden layers. It explains activation functions, weight initialization, and the role of gradient descent in optimizing the network. The lecture also introduces non-polynomial activation functions and the universal approximation theorem.
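As a rough illustration of the topics listed above (and not material taken from the lecture itself), the following is a minimal NumPy sketch of a fully connected network with one hidden layer, a non-polynomial tanh activation, small random weight initialization, and plain gradient descent on a toy sine-regression task; all names and hyperparameters are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: one-hidden-layer fully connected network trained with
# plain gradient descent on a toy 1-D regression task (fit y = sin(x)).
rng = np.random.default_rng(0)

# Toy data: 200 points of y = sin(x) with a little noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.05 * rng.normal(size=X.shape)

# Small random weight initialization, one hidden layer of 16 units.
hidden = 16
W1 = rng.normal(scale=0.5, size=(1, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.5, size=(hidden, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    # Forward pass with a non-polynomial activation (tanh).
    h_pre = X @ W1 + b1          # hidden-layer pre-activations
    h = np.tanh(h_pre)           # hidden activations
    y_hat = h @ W2 + b2          # linear output layer
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: gradients of the mean-squared error w.r.t. each parameter.
    n = X.shape[0]
    d_out = 2.0 * (y_hat - y) / n           # dL/dy_hat
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T                      # backpropagate into the hidden layer
    d_pre = d_h * (1.0 - h ** 2)            # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_pre
    db1 = d_pre.sum(axis=0)

    # Gradient-descent update of all weights and biases.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

    if step % 500 == 0:
        print(f"step {step:4d}  mse {loss:.4f}")
```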
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.
Watch on MediaSpace