This lecture covers the fundamentals of modern Natural Language Processing (NLP): how to set up NLP problems, word embeddings, and neural models for NLP tasks. It discusses how to represent words as vectors, select a vocabulary, and compose word embeddings into representations of longer texts, along with the challenges each of these steps raises, notably tokenization. It then turns to training word embeddings, sequence labeling, text generation, and predicting labels with logistic regression. The instructor emphasizes learning embeddings that enable successful task completion and applying these models to tasks beyond classification.
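As a minimal illustration of two of the ideas mentioned above, the sketch below composes word vectors by averaging and feeds the result to a logistic-regression classifier. All names and values here are hypothetical: the embeddings are toy 4-dimensional vectors and the weights `w`, `b` are made up, whereas in practice both would be learned from data.

```python
import numpy as np

# Toy vocabulary with hypothetical 4-dimensional word embeddings
# (illustrative values only; real embeddings are learned from data).
embeddings = {
    "great":  np.array([ 0.9,  0.1,  0.3, -0.2]),
    "movie":  np.array([ 0.1,  0.5, -0.4,  0.3]),
    "boring": np.array([-0.8,  0.2,  0.1,  0.4]),
}

def embed_text(tokens):
    """Compose word vectors by averaging them into one text vector."""
    return np.mean([embeddings[t] for t in tokens], axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical logistic-regression weights for a binary sentiment label.
w = np.array([1.0, 0.0, 0.5, -0.5])
b = 0.0

def predict(tokens):
    """P(label = positive | text) under the logistic model."""
    return sigmoid(w @ embed_text(tokens) + b)

p_pos = predict(["great", "movie"])
p_neg = predict(["boring", "movie"])
```

Averaging is the simplest composition function the lecture's setup allows; neural models replace it with learned compositions while keeping the same overall pipeline of tokenize, embed, compose, predict.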