Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and classic network architectures such as LeNet, AlexNet, and VGG-16.
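As a concrete illustration of back-propagation, here is a minimal NumPy sketch that trains a one-hidden-layer network on XOR by hand-deriving the chain-rule gradients; the data, layer sizes, and learning rate are illustrative assumptions, not taken from the material:

```python
import numpy as np

# Toy XOR data: 4 samples, 2 features (illustrative only).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 8))   # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)              # hidden activations
    p = sigmoid(h @ W2 + b2)              # output probabilities

    # Backward pass: squared-error gradients via the chain rule.
    dp = (p - y) * p * (1 - p)            # dL/dz2
    dW2, db2 = h.T @ dp, dp.sum(axis=0, keepdims=True)
    dh = dp @ W2.T * h * (1 - h)          # dL/dz1
    dW1, db1 = X.T @ dh, dh.sum(axis=0, keepdims=True)

    # Gradient-descent update.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(np.round(p, 2))  # should approach the XOR targets [0, 1, 1, 0]
```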
Covers Multi-Layer Perceptrons (MLPs) and their application to tasks ranging from classification to regression, including the Universal Approximation Theorem and gradient-related training challenges such as vanishing gradients.
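To make the gradient challenges concrete, the following sketch (width, depth, and initialization are assumptions chosen only for illustration) pushes a gradient backward through a stack of sigmoid layers and prints its norm, which shrinks roughly geometrically because the sigmoid derivative never exceeds 0.25:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
width, depth = 32, 20
x = rng.normal(size=(1, width))
Ws = [rng.normal(scale=1.0 / np.sqrt(width), size=(width, width))
      for _ in range(depth)]

# Forward pass, storing activations for the backward pass.
acts = [x]
for W in Ws:
    acts.append(sigmoid(acts[-1] @ W))

# Backward pass: start from an all-ones gradient at the output.
grad = np.ones_like(acts[-1])
for W, a in zip(reversed(Ws), reversed(acts[1:])):
    grad = (grad * a * (1 - a)) @ W.T   # chain rule through one sigmoid layer
    print(f"gradient norm: {np.linalg.norm(grad):.2e}")
```

By contrast, the Universal Approximation Theorem only guarantees that a wide enough single hidden layer can represent a continuous function; it says nothing about whether gradient descent will find those weights.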
Covers regression diagnostics for linear models, emphasizing the importance of checking assumptions and identifying outliers and influential observations.
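A minimal sketch of such diagnostics, assuming statsmodels is available (the synthetic data and the |t| > 3 and 4/n cutoffs are illustrative conventions, not prescriptions from the material):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic linear data with one planted outlier (illustrative only).
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=50)
y[0] += 8.0                                  # outlier to flag below

X = sm.add_constant(x)                       # design matrix with intercept
fit = sm.OLS(y, X).fit()

infl = fit.get_influence()
leverage = infl.hat_matrix_diag              # unusual x-values
student = infl.resid_studentized_internal    # outlying residuals
cooks_d, _ = infl.cooks_distance             # influence: both combined

flagged = np.where((np.abs(student) > 3) | (cooks_d > 4 / len(y)))[0]
print("flagged observations:", flagged)      # should include index 0
```

Plotting `fit.resid` against `fit.fittedvalues` is the usual companion step for checking the linearity and constant-variance assumptions.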
Covers the fundamentals of deep learning, including data representations such as bag-of-words, data pre-processing, artificial neural networks, and convolutional neural networks.
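As a quick example of the bag-of-words representation mentioned above, this sketch uses scikit-learn's CountVectorizer on a made-up two-document corpus:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus (illustrative only).
corpus = [
    "deep learning learns data representations",
    "bag of words counts words",
]
vectorizer = CountVectorizer()       # lowercases and tokenizes by default
counts = vectorizer.fit_transform(corpus)

print(vectorizer.get_feature_names_out())  # the learned vocabulary
print(counts.toarray())                    # one row of word counts per document
```

Each document becomes a fixed-length count vector over the vocabulary, at the cost of discarding word order.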