Lecture

Mathematics of Data: From Theory to Computation

Description

This lecture covers key concepts in data mathematics, including automatic differentiation, linear layers, training loops for classification, and attention layers. It also explores the complexity of backpropagation and graphical layers in deep learning models.
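To make the listed topics concrete, here is a minimal sketch (an illustration written for this summary, not material from the lecture itself) of a linear layer trained for binary classification with a manually derived backpropagation step, using only NumPy; the data, learning rate, and step count are arbitrary choices for the example.

```python
import numpy as np

# Minimal sketch: a single linear layer + sigmoid trained for binary
# classification, with the backpropagation gradient derived by hand.
rng = np.random.default_rng(0)

# Toy linearly separable data: label is 1 when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

W = np.zeros(2)   # linear layer weights
b = 0.0           # bias
lr = 0.5          # learning rate (arbitrary for this toy problem)

def forward(X):
    """Linear layer followed by a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

for step in range(100):
    p = forward(X)
    # For binary cross-entropy, the gradient w.r.t. the pre-activation
    # simplifies to (p - y), so backprop through this layer is one line.
    grad = p - y
    W -= lr * (X.T @ grad) / len(X)
    b -= lr * grad.mean()

accuracy = ((forward(X) > 0.5) == y).mean()
```

Automatic differentiation frameworks such as PyTorch compute the `(p - y)` gradient above automatically from the loss; the lecture's complexity discussion concerns the cost of doing this for deep computation graphs.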

Related lectures (33)
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Deep Learning Building Blocks: Linear Layers
Explains the fundamental building blocks of deep learning, focusing on linear layers and activation functions.
Deep Learning Building Blocks
Covers tensors, loss functions, autograd, and convolutional layers in deep learning.
Deep Learning Fundamentals
Introduces deep learning, from logistic regression to neural networks, emphasizing the need for handling non-linearly separable data.
Document Analysis: Topic Modeling
Explores document analysis, topic modeling, and generative models for data generation in machine learning.