Lecture

Optimization in Machine Learning

Description

This lecture covers optimization techniques in machine learning, focusing on loss functions, probability models, and computing derivatives for backpropagation. It explains hierarchical softmax, word embeddings, and the GloVe model, then delves into subword embeddings, FastText, and Byte Pair Encoding. It discusses the challenge of overfitting and the role of regularization in controlling model complexity. Additionally, it explores collaborative and content-based recommendation systems, matrix factorization, and latent semantic indexing.
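As an illustration of the optimization theme the description refers to, here is a minimal sketch of gradient descent minimizing a squared-error loss. This is not material from the lecture itself; the function names and parameters are hypothetical, chosen only to show the basic update rule w ← w − η·∇L(w).

```python
# Illustrative sketch (not from the lecture): plain gradient descent.
# `grad` is the gradient of the loss, `w0` the starting point,
# `lr` the learning rate, `steps` the number of update iterations.
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a differentiable function given its gradient."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)  # move against the gradient
    return w

# Example loss L(w) = (w - 3)^2 has gradient 2*(w - 3) and minimum at w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_star, 4))  # converges toward 3.0
```

The same loop underlies backpropagation in neural networks, where the gradient is computed automatically through the chain rule rather than supplied by hand.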

About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related lectures (63)
Quantum Random Number Generation
Explores quantum random number generation, discussing the challenges and implementations of generating good randomness using quantum devices.
Vision-Language-Action Models: Training and Applications
Delves into training and applications of Vision-Language-Action models, emphasizing large language models' role in robotic control and the transfer of web knowledge. Results from experiments and future research directions are highlighted.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Document Analysis: Topic Modeling
Explores document analysis, topic modeling, and generative models for data generation in machine learning.
Quantum Source Coding
Covers entropic notions in quantum sources, Shannon entropy, Von Neumann entropy, and source coding.