Lecture

Deep Learning for NLP

Description

This lecture covers the fundamentals of deep learning for natural language processing (NLP), focusing on word vectors and their composition, embeddings, context representations, and learning techniques such as Word2vec and GloVe. It delves into the continuous bag-of-words (CBOW) and skip-gram models, explaining the softmax function and the challenges of vanishing gradients in recurrent neural networks. The lecture also introduces the transformer model, the self-attention mechanism, and the integration of structured and unstructured knowledge in NLP tasks, and concludes with a discussion of successes, challenges, and ethical considerations in NLP.
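
As a minimal illustrative sketch of one concept mentioned above (not material from the lecture itself), the snippet below shows a numerically stable softmax applied to toy skip-gram-style similarity scores; the vectors and values are hypothetical and chosen only for demonstration.

```python
import numpy as np

def softmax(scores):
    # Subtract the max score for numerical stability before exponentiating.
    shifted = scores - np.max(scores)
    exp_scores = np.exp(shifted)
    return exp_scores / exp_scores.sum()

# Toy example: dot products between a hypothetical center-word vector and
# three hypothetical context-word vectors, as in a skip-gram-style model.
center = np.array([0.2, 0.5, -0.1])
contexts = np.array([[0.3, 0.4, 0.0],
                     [-0.2, 0.1, 0.5],
                     [0.6, -0.3, 0.2]])
scores = contexts @ center      # one similarity score per context word
print(softmax(scores))          # probabilities over context words, summing to 1
```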

This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.

Watch on Mediaspace