This lecture covers the fundamentals of deep learning for natural language processing (NLP), focusing on word embeddings, vector composition, context representations, and learning techniques such as word2vec and GloVe. It examines the continuous bag-of-words (CBOW) and skip-gram models, explains the softmax function, and discusses the vanishing-gradient problem in recurrent neural networks. The lecture also introduces the transformer model and its self-attention mechanism, along with the integration of structured and unstructured knowledge in NLP tasks, and concludes with a discussion of successes, challenges, and ethical considerations in NLP.
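As a rough illustration of two of the building blocks named above, the sketch below implements a numerically stable softmax and a minimal single-head, scaled dot-product self-attention in NumPy. It is not code from the lecture: the projection-free attention (using the embeddings directly as queries, keys, and values) and the toy dimensions are simplifying assumptions; a real transformer layer would first apply learned projection matrices.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax: shift by the max before exponentiating."""
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)

def self_attention(X):
    """Minimal single-head, unmasked scaled dot-product self-attention.

    X: (seq_len, d) matrix of token embeddings. For simplicity, queries,
    keys, and values are all taken to be X itself; a full transformer
    layer would first apply learned projections W_q, W_k, W_v.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise similarities, scaled by sqrt(d)
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ X                   # context-weighted mixture of values

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = self_attention(X)
print(out.shape)  # (4, 8): one context vector per token
```

The scaling by the square root of the embedding dimension keeps the dot products from growing large enough to push the softmax into regions with vanishing gradients, which connects the two topics the summary mentions.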
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.