Covers Convolutional Neural Networks, including layers, training strategies, standard architectures, tasks like semantic segmentation, and deep learning tricks.
Introduces feed-forward networks, covering neural network structure, training, activation functions, and optimization, with applications in forecasting and finance.
Covers Multi-Layer Perceptrons (MLPs) and their applications, from classification to regression, including the Universal Approximation Theorem and challenges with gradients.