Lecture

Text Representation Learning: Word Embeddings

Description

This lecture covers the fundamental concepts of text representation learning, with a focus on word embeddings. It explains how numerical representations of words can be learned by factorizing a word co-occurrence matrix, and introduces the GloVe model as an instance of this approach (a sketch of the idea follows below). It also discusses word analogies, the training methods involved in learning word embeddings, and the use of large models such as transformers to predict the next word in a text. The lecture concludes with pointers to resources on word2vec, GloVe, FastText, and sent2vec.
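
As a rough illustration of the matrix-factorization approach mentioned above, the sketch below builds a word-word co-occurrence matrix from a toy corpus and factorizes it with a truncated SVD to obtain low-dimensional word vectors. The corpus, window size, and embedding dimension are illustrative assumptions, not details taken from the lecture.

```python
import numpy as np

# Toy corpus; a real example would use a much larger text collection.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "the cat chased the dog",
]

window = 2  # symmetric context window size (illustrative choice)
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count how often each word pair co-occurs within the context window.
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                C[index[w], index[sent[j]]] += 1.0

# Factorize the log-smoothed counts with a truncated SVD; the scaled
# left singular vectors serve as low-dimensional word embeddings.
d = 2  # embedding dimension (illustrative choice)
U, S, _ = np.linalg.svd(np.log1p(C))
emb = U[:, :d] * S[:d]

def cosine(a, b):
    # Cosine similarity between two vectors in the learned space.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(emb[index["cat"]], emb[index["dog"]]))
```

GloVe refines this idea by fitting the dot products of word vectors to the logarithms of the co-occurrence counts with a weighted least-squares objective, rather than using a single SVD.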

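Word analogies can then be explored with simple vector arithmetic over pretrained embeddings. A minimal sketch, assuming the gensim library and its downloadable glove-wiki-gigaword-50 vectors (an assumption for illustration; the lecture's pointers cover word2vec, GloVe, FastText, and sent2vec more broadly):

```python
import gensim.downloader as api

# Fetch pretrained 50-dimensional GloVe vectors (downloads on first use;
# requires internet access).
glove = api.load("glove-wiki-gigaword-50")

# vec("king") - vec("man") + vec("woman") should land near vec("queen").
print(glove.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```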