Lecture

Generative Models: Self-Attention and Transformers

Description

This lecture covers generative models, with a focus on self-attention and transformers. Topics include autoencoders, Boltzmann machines, masked training, attention mechanisms, and the maximum entropy principle. The slides discuss sampling methods, empirical means, and correlations in detail.
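To make the attention mechanism mentioned above concrete, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. All names, shapes, and the random initialization are illustrative assumptions for this sketch, not material taken from the lecture slides.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (T, d) sequence of T token embeddings of dimension d.
    Wq, Wk, Wv: (d, d_k) projection matrices (learned in practice;
    random here purely for illustration)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Attention weights: each row is a distribution over the T positions.
    A = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
    # Each output is a convex combination of the value vectors.
    return A @ V  # shape (T, d_k)

rng = np.random.default_rng(0)
T, d, d_k = 5, 8, 4
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into a near one-hot regime. A full transformer layer would add multiple heads, a feed-forward block, residual connections, and layer normalization on top of this core operation.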

Related lectures (508)
Document Analysis: Topic Modeling
Explores document analysis, topic modeling, and generative models for data generation in machine learning.
Boltzmann Machine
Introduces the Boltzmann Machine, covering expectation consistency, data clustering, and probability distribution functions.
Machine Learning Fundamentals
Introduces fundamental machine learning concepts, covering regression, classification, dimensionality reduction, and deep generative models.
Linear Regression: Basics and Estimation
Covers the basics of linear regression and how to solve estimation problems using least squares and matrix notation.
Introduction to Machine Learning: Supervised Learning
Introduces supervised learning, covering classification, regression, model optimization, overfitting, and kernel methods.
