This lecture covers the concept of word embeddings, focusing on the GloVe model and the semantic relationships it captures. It explains how the model's parameters are learned from positive and negative examples, including regularization techniques and the computation of derivatives. The instructor discusses subword embeddings, Byte-Pair Encoding, and hierarchical softmax as ways to improve efficiency. The lecture also examines GloVe's loss function, its use of global co-occurrence counts, and the modeling of co-occurrence ratios. It explores word analogies, syntactic relationships, and semantic dimensions, and highlights properties of word embeddings such as clustering similar terms and encoding relationships between them. References to relevant articles in the field are provided.
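To make the Byte-Pair Encoding step mentioned above concrete, here is a minimal sketch of how BPE merge rules are learned: the most frequent adjacent symbol pair is repeatedly fused into a new subword unit. This is a toy illustration (it omits the end-of-word marker used in the original algorithm), and the function name and arguments are hypothetical, not taken from the lecture.

```python
from collections import Counter

def learn_bpe_merges(corpus_words, num_merges):
    """Learn BPE merges: repeatedly fuse the most frequent
    adjacent symbol pair into a single new subword symbol."""
    # Represent each word as a tuple of symbols (initially characters).
    vocab = Counter(tuple(w) for w in corpus_words)
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for word, freq in vocab.items():
            for pair in zip(word, word[1:]):
                pairs[pair] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rebuild the vocabulary with the chosen pair fused everywhere.
        new_vocab = Counter()
        for word, freq in vocab.items():
            merged, i = [], 0
            while i < len(word):
                if i < len(word) - 1 and (word[i], word[i + 1]) == best:
                    merged.append(word[i] + word[i + 1])
                    i += 2
                else:
                    merged.append(word[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges
```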
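The GloVe loss over global co-occurrence counts is the standard weighted least-squares objective from Pennington et al. (2014): each observed count X_ij is fit by w_i · w̃_j + b_i + b̃_j ≈ log X_ij, with a clipped weighting function f damping rare pairs. The dense-matrix loop below is a minimal sketch (real co-occurrence matrices are sparse); the parameter arrays W, W_tilde, b, b_tilde are illustrative names, not the lecture's notation.

```python
import numpy as np

def glove_loss(W, W_tilde, b, b_tilde, X, x_max=100.0, alpha=0.75):
    """J = sum over observed (i, j) of
       f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2,
       with f(x) = min((x / x_max)^alpha, 1)."""
    loss = 0.0
    for i, j in zip(*np.nonzero(X)):  # sum only over observed pairs
        weight = min((X[i, j] / x_max) ** alpha, 1.0)  # clipped weighting f
        diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
        loss += weight * diff ** 2
    return loss
```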
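The word-analogy property the lecture explores is usually tested by vector arithmetic, e.g. vec("king") − vec("man") + vec("woman") landing near vec("queen"). A minimal sketch, assuming `emb` is a dict mapping each word to a unit-norm NumPy vector (the helper name and its interface are assumptions for illustration):

```python
import numpy as np

def analogy(emb, a, b, c, topk=1):
    """Words closest (by cosine similarity) to vec(b) - vec(a) + vec(c),
    excluding the three query words themselves."""
    target = emb[b] - emb[a] + emb[c]
    target /= np.linalg.norm(target)
    # Dot product equals cosine similarity for unit-norm vectors.
    scores = {w: float(v @ target) for w, v in emb.items() if w not in (a, b, c)}
    return sorted(scores, key=scores.get, reverse=True)[:topk]
```

For example, `analogy(emb, "man", "king", "woman")` should return `["queen"]` with well-trained embeddings; the same arithmetic recovers syntactic relations such as singular/plural or verb tense.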