Covers optimization in machine learning, focusing on gradient descent for linear and logistic regression, stochastic gradient descent, and practical considerations.
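As a quick illustration of the gradient-descent ideas named above, here is a minimal sketch (not taken from the source) that fits a linear regression with both full-batch gradient descent and a mini-batch stochastic variant; the synthetic data, learning rate, and batch size are all assumptions chosen for the example.

```python
# Illustrative sketch: batch vs. stochastic gradient descent for linear
# regression with a mean-squared-error loss (toy data, assumed settings).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                  # 200 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)    # noisy linear targets

def mse_gradient(w, Xb, yb):
    """Gradient of 0.5 * mean((Xb @ w - yb)^2) with respect to w."""
    return Xb.T @ (Xb @ w - yb) / len(yb)

# Batch gradient descent: one update per pass over the full dataset.
w = np.zeros(3)
lr = 0.1
for _ in range(500):
    w -= lr * mse_gradient(w, X, y)

# Stochastic gradient descent: updates on small random mini-batches.
w_sgd = np.zeros(3)
for epoch in range(50):
    idx = rng.permutation(len(y))
    for start in range(0, len(y), 20):          # mini-batches of 20
        batch = idx[start:start + 20]
        w_sgd -= lr * mse_gradient(w_sgd, X[batch], y[batch])

print("batch GD estimate:", np.round(w, 3))
print("SGD estimate:     ", np.round(w_sgd, 3))
```

Both estimates should land close to the true weights; the stochastic version trades noisier steps for much cheaper per-update cost, which is the practical motivation for using it on large datasets.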
Covers the fundamentals of multilayer neural networks and deep learning, including back-propagation and network architectures like LeNet, AlexNet, and VGG-16.
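To make the back-propagation topic concrete, the following is a small sketch (not from the source) of a one-hidden-layer network trained by hand-derived back-propagation on a toy classification task; the network size, activations, and learning rate are assumptions for illustration only.

```python
# Illustrative sketch: forward and backward passes for a tiny 2 -> 8 -> 1
# network trained with back-propagation on an XOR-like toy problem.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)   # XOR-like labels

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)           # hidden activations
    p = sigmoid(h @ W2 + b2)           # predicted probabilities

    # Backward pass: gradients of the mean cross-entropy loss.
    dlogits = (p - y) / len(y)         # gradient at the output pre-activation
    dW2 = h.T @ dlogits;  db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T * (1 - h**2)   # chain rule through tanh
    dW1 = X.T @ dh;       db1 = dh.sum(axis=0)

    # Gradient-descent parameter updates.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("training accuracy:", ((p > 0.5) == y).mean())
```

Architectures such as LeNet, AlexNet, and VGG-16 are trained with the same chain-rule machinery, just with many more layers and convolutional rather than fully connected weights.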
Covers the fundamentals of deep learning, including data representations, bag of words, data pre-processing, artificial neural networks, and convolutional neural networks.
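As a brief example of the bag-of-words representation mentioned above, here is a minimal sketch (not from the source) that turns a few short documents into count vectors over a shared vocabulary; the example corpus is an assumption made for illustration.

```python
# Illustrative sketch: building a bag-of-words representation for a
# handful of short documents (toy corpus, assumed for the example).
from collections import Counter

docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats make good pets",
]

# Vocabulary: every distinct token across the corpus, in a fixed order.
vocab = sorted({word for doc in docs for word in doc.split()})

# Each document becomes a vector of word counts over that vocabulary.
def bag_of_words(doc):
    counts = Counter(doc.split())
    return [counts.get(word, 0) for word in vocab]

vectors = [bag_of_words(doc) for doc in docs]
for doc, vec in zip(docs, vectors):
    print(f"{doc!r} -> {vec}")
```

Vectors like these (after pre-processing such as lowercasing or stop-word removal) can then be fed to a classifier or to an artificial neural network; convolutional networks instead operate directly on grid-structured inputs such as images.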