Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and network architectures such as LeNet, AlexNet, and VGG-16.
Covers Multi-Layer Perceptrons (MLPs) and their applications to classification and regression, including the Universal Approximation Theorem and gradient-related training challenges such as vanishing gradients.
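The back-propagation and MLP topics above can be illustrated with a minimal sketch: a one-hidden-layer perceptron trained by gradient descent on the XOR problem. All hyperparameters here (8 hidden units, learning rate 1.0, 2000 steps) are illustrative choices, not values taken from the material itself.

```python
import numpy as np

# Minimal one-hidden-layer MLP trained by back-propagation on XOR.
# Hyperparameters are illustrative, not from the source material.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 1.0, []
for _ in range(2000):
    # Forward pass: input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))

    # Backward pass: chain rule applied layer by layer (MSE loss).
    dp = 2 * (p - y) * p * (1 - p) / len(X)
    dW2, db2 = h.T @ dp, dp.sum(axis=0)
    dh = dp @ W2.T * h * (1 - h)
    dW1, db1 = X.T @ dh, dh.sum(axis=0)

    # Gradient-descent parameter update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The saturating sigmoid used here is also a simple way to see the vanishing-gradient issue mentioned above: the factor `h * (1 - h)` in the backward pass shrinks toward zero whenever activations saturate.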