This lecture covers the representation of data through vectorization, bag of words, and histograms, as well as the handling of missing data, noisy data, and data normalization. It introduces the multilayer perceptron (MLP) model, explaining its training algorithm and the activation functions used in the hidden and output layers. The lecture also discusses the challenges of gradient-based learning and backpropagation, and shows how an MLP is trained with gradient descent, stochastic gradient descent, and mini-batch gradient descent. Finally, it addresses vanishing gradients, weight initialization, and regularization in deep neural networks.
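To make the vectorization idea concrete, here is a minimal bag-of-words sketch in plain Python. The toy corpus and the `bag_of_words` helper are our own illustration, not code from the lecture: each document is mapped to a fixed-length histogram of word counts over a shared vocabulary.

```python
from collections import Counter

# Hypothetical toy corpus; any list of documents works the same way.
corpus = [
    "the cat sat on the mat",
    "the dog chased the cat",
]

# Build the vocabulary: one vector dimension per distinct word.
vocabulary = sorted({word for doc in corpus for word in doc.split()})

def bag_of_words(doc):
    """Map a document to a fixed-length count vector (a word histogram)."""
    counts = Counter(doc.split())
    return [counts.get(word, 0) for word in vocabulary]

vectors = [bag_of_words(doc) for doc in corpus]
print(vocabulary)
print(vectors)  # e.g. the first document counts "the" twice
```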
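The preprocessing steps can be sketched the same way. Below is a small, assumed NumPy example (the feature matrix and the mean-imputation choice are ours, not the lecture's): missing entries are filled with the per-feature mean, and each feature is then standardized to zero mean and unit variance so that features on different scales contribute comparably.

```python
import numpy as np

# Hypothetical feature matrix with missing entries encoded as NaN.
X = np.array([
    [1.0, 200.0],
    [2.0, np.nan],    # missing value
    [3.0, 180.0],
    [2.0, 220.0],
])

# Impute missing entries with the per-feature mean (one common choice).
col_mean = np.nanmean(X, axis=0)
rows, cols = np.where(np.isnan(X))
X[rows, cols] = col_mean[cols]

# Normalize: standardize each feature to zero mean and unit variance.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_norm)
```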
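Finally, a compact NumPy sketch of the training loop itself, assuming one hidden layer, He weight initialization, L2 regularization, and mini-batch stochastic gradient descent; the toy data, layer sizes, and hyperparameters are invented for illustration and are not the lecture's code. The backward pass implements backpropagation for a ReLU hidden layer and a sigmoid output trained with binary cross-entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (hypothetical, XOR-like labels).
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hidden, n_out = 2, 16, 1

# He initialization (scale sqrt(2 / fan_in)) suits ReLU layers and
# helps keep gradients from vanishing or exploding early in training.
W1 = rng.normal(scale=np.sqrt(2.0 / n_in), size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=np.sqrt(2.0 / n_hidden), size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, lam, batch_size = 0.1, 1e-4, 32  # step size, L2 strength, mini-batch size

for epoch in range(200):
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], y[idx]

        # Forward pass.
        h = np.maximum(0.0, xb @ W1 + b1)   # hidden layer: ReLU
        p = sigmoid(h @ W2 + b2)            # output layer: sigmoid

        # Backward pass: with binary cross-entropy, the error at the
        # sigmoid output simplifies to (p - y).
        d_out = (p - yb) / len(xb)
        dW2 = h.T @ d_out + lam * W2        # L2 regularization term
        db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * (h > 0)      # ReLU gradient gates the signal
        dW1 = xb.T @ d_h + lam * W1
        db1 = d_h.sum(axis=0)

        # Mini-batch SGD update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

preds = sigmoid(np.maximum(0.0, X @ W1 + b1) @ W2 + b2) > 0.5
print(f"training accuracy: {(preds == y).mean():.2f}")
```

Setting `batch_size` to 1 in this sketch recovers plain stochastic gradient descent, while setting it to `len(X)` recovers full-batch gradient descent, which is how the three variants discussed in the lecture relate to one another.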