Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and convolutional architectures such as LeNet, AlexNet, and VGG-16.
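As a companion to this topic, below is a minimal sketch of back-propagation for a two-layer network written in plain NumPy. The XOR data, layer sizes, and learning rate are illustrative assumptions, not taken from the course material.

```python
# A minimal sketch of back-propagation for a two-layer network in NumPy.
# Data, sizes, and hyperparameters are illustrative, not from the course.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units, randomly initialized.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    p = sigmoid(h @ W2 + b2)        # output predictions

    # Backward pass: chain rule from the squared error down to each
    # parameter. This propagation of error signals is back-propagation.
    dp = (p - y) * p * (1 - p)      # error signal at the output
    dh = (dp @ W2.T) * h * (1 - h)  # error signal at the hidden layer

    W2 -= lr * h.T @ dp / len(X)
    b2 -= lr * dp.mean(axis=0)
    W1 -= lr * X.T @ dh / len(X)
    b1 -= lr * dh.mean(axis=0)

print(p.round(2))  # typically approaches [[0], [1], [1], [0]]
```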
Covers linear models, including regression, derivatives, gradients, hyperplanes, and the transition from regression to classification, with a focus on risk minimization and evaluation metrics.
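To make the risk-minimization idea concrete, here is a small sketch of fitting a linear model by gradient descent on the empirical risk (mean squared error). The synthetic data and hyperparameters are illustrative assumptions.

```python
# A minimal sketch: gradient descent on the empirical risk (MSE) for a
# one-dimensional linear model. Data and settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x - 2 plus Gaussian noise.
x = rng.uniform(-1, 1, 100)
y = 3 * x - 2 + rng.normal(0, 0.1, 100)

w, b = 0.0, 0.0
lr = 0.5
for _ in range(200):
    err = w * x + b - y
    # Gradients of the MSE risk: d/dw mean(err^2) = 2*mean(err*x), etc.
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)

mse = np.mean((w * x + b - y) ** 2)  # evaluation metric on the fit
print(f"w={w:.2f}, b={b:.2f}, MSE={mse:.4f}")  # roughly w=3, b=-2
```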
Covers Multi-Layer Perceptrons (MLPs) and their applications from classification to regression, including the Universal Approximation Theorem and gradient-related training challenges.
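The sketch below illustrates the Universal Approximation idea in miniature: a single-hidden-layer MLP trained by gradient descent to approximate sin(x) on an interval. The hidden width, learning rate, and iteration count are illustrative assumptions.

```python
# A minimal sketch of universal approximation: a one-hidden-layer MLP
# fitted to sin(x). Sizes and rates are illustrative, not from the course.
import numpy as np

rng = np.random.default_rng(1)

X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

H = 32  # hidden width; wider hidden layers can approximate more closely
W1, b1 = rng.normal(0, 1, (1, H)), np.zeros(H)
W2, b2 = rng.normal(0, 0.1, (H, 1)), np.zeros(1)

lr = 0.05
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)      # hidden layer
    pred = h @ W2 + b2            # linear output head for regression

    g = 2 * (pred - y) / len(X)   # gradient of MSE at the output
    dh = (g @ W2.T) * (1 - h**2)  # chain rule through tanh

    W2 -= lr * h.T @ g
    b2 -= lr * g.sum(axis=0)
    W1 -= lr * X.T @ dh
    b1 -= lr * dh.sum(axis=0)

print(np.mean((pred - y) ** 2))  # MSE should be small: sin is well fit
```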