Covers Multi-Layer Perceptrons (MLPs) and their applications, from classification to regression, including the Universal Approximation Theorem and gradient-related challenges.
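As a companion to this topic, here is a minimal sketch of an MLP, assuming a PyTorch setting; the layer sizes, class name, and random data are illustrative, not taken from the source.

```python
# Hypothetical minimal sketch: a two-layer MLP used for regression in PyTorch.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim=10, hidden_dim=64, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),                       # non-linearity, the ingredient behind universal approximation
            nn.Linear(hidden_dim, out_dim),  # linear output head for regression
        )

    def forward(self, x):
        return self.net(x)

# One gradient step on random data (illustration only).
model = MLP()
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.MSELoss()(model(x), y)
loss.backward()   # gradients flow back through every layer; with depth, they can vanish or explode
```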
Covers Convolutional Neural Networks (CNNs), including layer types, training strategies, standard architectures, tasks such as semantic segmentation, and practical deep learning tricks.
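As a companion to this topic, here is a minimal sketch of a small CNN classifier, again assuming PyTorch; the channel counts, class name, and input shape are illustrative assumptions.

```python
# Hypothetical minimal sketch: a small CNN classifier in PyTorch.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolutional layer over RGB input
            nn.ReLU(),
            nn.MaxPool2d(2),                              # spatial downsampling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # global average pooling
        )
        self.classifier = nn.Linear(32, num_classes)      # linear head over pooled features

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = SmallCNN()
logits = model(torch.randn(8, 3, 32, 32))  # batch of 8 RGB 32x32 images
print(logits.shape)                        # torch.Size([8, 10])
```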