Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and network architectures like LeNet, AlexNet, and VGG-16.
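A minimal back-propagation sketch for a two-layer network, written here in NumPy; the layer sizes, learning rate, and toy data are illustrative assumptions, not taken from the lectures:

```python
import numpy as np

# Toy data: 4 samples, 3 features, binary targets (illustrative only).
X = np.array([[0., 1., 0.], [1., 0., 1.], [1., 1., 0.], [0., 0., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.5, size=(3, 4)), np.zeros(4)   # hidden layer
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)   # output layer
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)              # hidden activations
    p = sigmoid(h @ W2 + b2)              # predicted probabilities

    # Backward pass: gradients of the mean cross-entropy loss.
    delta2 = (p - y) / len(X)             # dL/dz2 for sigmoid output + cross-entropy
    dW2 = h.T @ delta2
    db2 = delta2.sum(axis=0)
    delta1 = (delta2 @ W2.T) * h * (1 - h)  # chain rule through the hidden sigmoid
    dW1 = X.T @ delta1
    db1 = delta1.sum(axis=0)

    # Gradient descent update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(p.round(2))  # predictions after training
```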
Covers optimization in machine learning, focusing on gradient descent for linear and logistic regression, stochastic gradient descent, and practical considerations.
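As a rough illustration of the stochastic gradient descent material, a sketch of SGD for logistic regression in NumPy; the synthetic data, learning rate, and epoch count are placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binary classification data (illustrative only).
n, d = 200, 2
X = rng.normal(size=(n, d))
true_w = np.array([2.0, -1.0])
y = (X @ true_w + 0.5 > 0).astype(float)

w, b = np.zeros(d), 0.0
lr, epochs = 0.1, 20

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(epochs):
    for i in rng.permutation(n):      # shuffle, then update on one sample at a time
        p = sigmoid(X[i] @ w + b)     # predicted probability for sample i
        grad = p - y[i]               # d(cross-entropy)/d(logit)
        w -= lr * grad * X[i]         # stochastic gradient step
        b -= lr * grad

print("learned weights:", w, "bias:", b)
```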
Covers Convolutional Neural Networks, including layers, training strategies, standard architectures, tasks like semantic segmentation, and deep learning tricks.
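A LeNet-style convolutional network sketch, written here in PyTorch as an assumed framework (the source names no library); the channel counts and 32x32 input follow the classic LeNet layout, but the details are illustrative:

```python
import torch
from torch import nn

# A small LeNet-style CNN: conv -> pool -> conv -> pool -> fully connected layers.
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5),   # 1x32x32 -> 6x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                  # 6x28x28 -> 6x14x14
    nn.Conv2d(6, 16, kernel_size=5),  # 6x14x14 -> 16x10x10
    nn.ReLU(),
    nn.MaxPool2d(2),                  # 16x10x10 -> 16x5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120),
    nn.ReLU(),
    nn.Linear(120, 84),
    nn.ReLU(),
    nn.Linear(84, 10),                # 10 class scores
)

x = torch.randn(8, 1, 32, 32)         # dummy batch of 8 grayscale images
print(model(x).shape)                 # torch.Size([8, 10])
```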