This lecture covers the structure and optimization of convolutional networks, focusing on variance-preserving initialization, convolutional layers, weight sharing, pooling, data augmentation, weight decay, and dropout. It explains the importance of learned convolutional filters, skip connections, residuals, and interpretable activations. The lecture also discusses popular architectures like VGG and ResNet, as well as the entangled effects of various methods on datasets like CIFAR10 and ImageNet.
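Among the topics listed above, variance-preserving initialization can be illustrated concretely. The NumPy sketch below (an illustration, not material from the lecture) uses He initialization, where weights are drawn with variance 2/fan_in so the second moment of the activations stays roughly constant through a stack of ReLU layers instead of exploding or vanishing; the layer width, depth, and batch size are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in, fan_out):
    """He (Kaiming) initialization: zero-mean Gaussian weights with
    variance 2 / fan_in, chosen so ReLU layers preserve the second
    moment of the activations."""
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Push a batch of standard-normal inputs through 10 ReLU layers and
# watch the mean squared activation: it should stay on the order of 1.
x = rng.normal(0.0, 1.0, size=(1024, 256))
for _ in range(10):
    w = he_init(256, 256)
    x = np.maximum(x @ w, 0.0)  # linear layer followed by ReLU

print(float((x ** 2).mean()))  # stays near 1 rather than collapsing to 0
```

Replacing the factor 2 with a much smaller constant makes the printed value shrink geometrically with depth, which is exactly the signal-propagation failure that variance-preserving schemes are designed to avoid.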
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.
Watch on Mediaspace