This lecture covers the derivation of the stochastic gradient descent update rule for a simple perceptron, starting from a quadratic loss function. It introduces the sigmoidal output unit, supervised learning with a sigmoidal output, gradient descent, and artificial neural networks, and includes exercises on simple perceptrons. The lecture then works through the gradient descent update in both the batch and the online (stochastic) setting, emphasizing the geometric interpretation and the differences between the two: batch descent sums the gradient over all patterns before each update, while online descent updates after each single, randomly chosen pattern. The instructor concludes with the learning outcomes: understanding classification as a geometrical problem, discriminant functions, linear versus nonlinear discriminant functions, linearly separable problems, the perceptron algorithm, and gradient descent for simple perceptrons.
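For reference, here is a minimal sketch of the derivation the summary refers to, assuming the standard setup for a simple perceptron with sigmoidal output; the notation (weights w, input patterns x^mu, targets t^mu, learning rate eta, sigmoid g) is assumed, not taken from the summary itself. With output \hat{y}^{\mu} = g(\mathbf{w} \cdot \mathbf{x}^{\mu}) and quadratic loss

E(\mathbf{w}) = \frac{1}{2} \sum_{\mu=1}^{P} \left( t^{\mu} - g(\mathbf{w} \cdot \mathbf{x}^{\mu}) \right)^{2},

the chain rule gives the gradient

\frac{\partial E}{\partial w_j} = - \sum_{\mu=1}^{P} \left( t^{\mu} - \hat{y}^{\mu} \right) \, g'(\mathbf{w} \cdot \mathbf{x}^{\mu}) \, x_j^{\mu}.

The batch update moves against the full gradient, \Delta w_j = \eta \sum_{\mu} ( t^{\mu} - \hat{y}^{\mu} ) \, g'(\mathbf{w} \cdot \mathbf{x}^{\mu}) \, x_j^{\mu}, whereas the online (stochastic) update drops the sum and applies \Delta w_j = \eta \, ( t^{\mu} - \hat{y}^{\mu} ) \, g'(\mathbf{w} \cdot \mathbf{x}^{\mu}) \, x_j^{\mu} for one randomly drawn pattern \mu per step. For the logistic sigmoid g(a) = 1/(1 + e^{-a}), the derivative simplifies to g'(a) = g(a)\,(1 - g(a)).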
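To make the batch versus online distinction concrete, the following NumPy sketch implements both update rules under the same assumptions; the function names, learning rates, and toy dataset are hypothetical illustrations, not code from the lecture.

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def batch_gradient_step(w, X, t, eta):
    # Batch: sum gradient contributions over ALL patterns, then update once.
    y = sigmoid(X @ w)
    # dE/dw for E = 1/2 * sum_mu (t_mu - y_mu)^2, with g'(a) = y*(1-y)
    grad = -(X.T @ ((t - y) * y * (1.0 - y)))
    return w - eta * grad

def online_gradient_step(w, x_mu, t_mu, eta):
    # Online/stochastic: update after a SINGLE randomly drawn pattern mu.
    y = sigmoid(x_mu @ w)
    grad = -(t_mu - y) * y * (1.0 - y) * x_mu
    return w - eta * grad

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 patterns, 3 inputs
t = (X[:, 0] + X[:, 1] > 0).astype(float)     # toy linearly separable targets

w_batch = np.zeros(3)
for epoch in range(200):                      # one batch step per pass over the data
    w_batch = batch_gradient_step(w_batch, X, t, eta=0.5)

w_sgd = np.zeros(3)
for step in range(200 * len(X)):              # same number of pattern presentations
    mu = rng.integers(len(X))                 # draw one pattern at random
    w_sgd = online_gradient_step(w_sgd, X[mu], t[mu], eta=0.05)

Geometrically, the batch step follows the exact gradient of the total loss surface, while each online step follows a noisy single-pattern estimate of it; with a suitably small learning rate the online trajectory fluctuates around the batch trajectory.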