Lecture

Generative Models: Self-Attention and Transformers

Description

This lecture covers generative models, focusing on self-attention and transformers. Topics include autoencoders, Boltzmann machines, masked training, attention mechanisms, and the maximum entropy principle. The slides discuss sampling methods, empirical means, and correlations in detail.
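Among the listed topics, the attention mechanism is the core building block of transformers. As a rough illustration only (not taken from the lecture slides), a single-head scaled dot-product self-attention layer can be sketched in NumPy as follows; the weight matrices `Wq`, `Wk`, `Wv` and all dimensions are hypothetical:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (T, d_model)."""
    Q = X @ Wq                                # queries, shape (T, d_k)
    K = X @ Wk                                # keys,    shape (T, d_k)
    V = X @ Wv                                # values,  shape (T, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # pairwise similarities, shape (T, T)
    # row-wise softmax turns scores into attention weights that sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # weighted average of values, (T, d_v)

# toy example with random inputs and weights (illustrative values only)
rng = np.random.default_rng(0)
T, d = 4, 8                                   # sequence length, model dimension
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Each output row is a convex combination of the value vectors, with mixing weights determined by query-key similarity; stacking such layers (with masking during training) yields the transformer architectures discussed in the lecture.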
