Variational Autoencoders
Related lectures (10 of 30 shown)
Generative Models: Self-Attention and Transformers
Covers generative models with a focus on self-attention and transformers, discussing sampling methods and empirical means.
Deep Generative Models
Covers deep generative models, including variational autoencoders, GANs, and deep convolutional GANs.
Deep Generative Models: Part 2
Explores deep generative models, including mixtures of multinomials, PCA, deep autoencoders, convolutional autoencoders, and GANs.
Variational Auto-Encoders and NVIB
Explores Variational Auto-Encoders, Bayesian inference, attention-based latent spaces, and the effectiveness of Transformers in language processing.
Document Analysis: Topic Modeling
Explores document analysis, topic modeling, and generative models for data generation in machine learning.
Dimensionality Reduction: PCA & Autoencoders
Explores PCA, Autoencoders, and their applications in dimensionality reduction and data generation.
Natural Language Processing: Understanding Transformers and Tokenization
Provides an overview of Natural Language Processing, focusing on transformers, tokenization, and self-attention mechanisms for effective language analysis and synthesis.
Non-Linear Dimensionality Reduction
Covers non-linear dimensionality reduction techniques using autoencoders, deep autoencoders, and convolutional autoencoders for various applications.
Deep Generative Models
Covers deep generative models, including LDA, autoencoders, GANs, and DCGANs.
Dimensionality Reduction: PCA and Kernel PCA
Covers PCA, Kernel PCA, and autoencoders for dimensionality reduction in data analysis.