Lecture

Optimization in Machine Learning

Description

This lecture covers optimization techniques in machine learning, focusing on loss functions, probability models, and computing derivatives for backpropagation. It explains hierarchical softmax, word embeddings, and the GloVe model, then turns to subword embeddings, FastText, and Byte Pair Encoding. It discusses the challenge of overfitting and the role of regularization in controlling model complexity. Finally, it explores collaborative and content-based recommendation systems, matrix factorization, and latent semantic indexing.
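The Byte Pair Encoding step mentioned above can be illustrated with a minimal sketch: starting from character-level symbols, repeatedly merge the most frequent adjacent symbol pair. The function name, the toy vocabulary, and the `</w>` end-of-word marker are illustrative assumptions, not the lecture's own code.

```python
import re
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merges from a word-frequency dict (illustrative sketch).

    words: dict mapping a whitespace-separated symbol string
           (e.g. "l o w </w>") to its corpus frequency.
    Returns the list of merged symbol pairs, in merge order.
    """
    words = dict(words)
    merges = []
    for _ in range(num_merges):
        # Count how often each adjacent symbol pair occurs in the corpus.
        pairs = Counter()
        for word, freq in words.items():
            symbols = word.split()
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Fuse the winning pair into one symbol; the lookarounds keep the
        # replacement aligned on whole symbols, not substrings of them.
        pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(best)) + r"(?!\S)")
        words = {pattern.sub("".join(best), w): f for w, f in words.items()}
    return merges

# Toy vocabulary: word frequencies with characters split apart.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}
print(bpe_merges(vocab, 3))  # the first three learned merges
```

On this vocabulary the frequent suffix symbols are merged first, which is exactly why BPE yields subword units that FastText-style models can share across rare and frequent words.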

About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related lectures (63)
Quantum Random Number Generation
Explores quantum random number generation, discussing the challenges and implementations of generating good randomness using quantum devices.
Vision-Language-Action Models: Training and Applications
Delves into training and applications of Vision-Language-Action models, emphasizing large language models' role in robotic control and the transfer of web knowledge. Results from experiments and future research directions are highlighted.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Document Analysis: Topic Modeling
Explores document analysis, topic modeling, and generative models for data generation in machine learning.
Quantum Source Coding
Covers entropic notions in quantum sources, Shannon entropy, Von Neumann entropy, and source coding.