This lecture covers the fundamentals of Convolutional Neural Networks (CNNs), including the concepts of convolution, pooling, and weight sharing. It explains the importance of non-linearity in CNNs and the role of normalization layers such as Batch Normalization. The lecture also delves into data augmentation techniques and their significance in training deep learning models. Additionally, it discusses weight decay as a regularization method and the use of dropout to prevent overfitting in neural networks.
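As a rough illustration of the first three concepts in the abstract — convolution with a shared kernel, a ReLU non-linearity, and max pooling — here is a minimal NumPy sketch. The function names and shapes are illustrative choices, not taken from the lecture; real CNN layers would add channels, padding, stride, and learned kernels.

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2D convolution (cross-correlation, as in CNN usage).
    The same kernel k is applied at every spatial position: weight sharing."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    """Element-wise non-linearity; without it, stacked convolutions
    collapse into a single linear map."""
    return np.maximum(x, 0.0)

def max_pool2d(x, s=2):
    """Non-overlapping s-by-s max pooling, discarding any ragged border."""
    H, W = x.shape
    return x[:H - H % s, :W - W % s].reshape(H // s, s, W // s, s).max(axis=(1, 3))

# A 6x6 input through a 3x3 averaging kernel, ReLU, then 2x2 pooling
x = np.arange(36, dtype=float).reshape(6, 6)
k = np.ones((3, 3)) / 9.0
y = max_pool2d(relu(conv2d(x, k)))   # shape (2, 2)
```

The pipeline shrinks a 6x6 input to a 4x4 feature map (valid convolution) and then to 2x2 (pooling), mirroring the progressive spatial downsampling typical of CNNs.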
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.