Covers Multi-Layer Perceptrons (MLPs) and their applications from classification to regression, including the Universal Approximation Theorem and the challenges of gradient-based training.
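The MLP computation this lecture covers can be sketched as a forward pass through one hidden layer; a minimal NumPy sketch, assuming illustrative layer sizes, a tanh hidden activation, and a linear output for regression:

```python
import numpy as np

# Minimal sketch of a one-hidden-layer MLP forward pass (NumPy only);
# the layer sizes and tanh activation are illustrative assumptions.
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 1             # input, hidden, output widths
W1 = rng.normal(0, 0.1, (n_in, n_hidden))   # input-to-hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))  # hidden-to-output weights
b2 = np.zeros(n_out)

def mlp_forward(x):
    """Forward pass: affine -> tanh -> affine (linear output for regression)."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

x = rng.normal(size=(5, n_in))  # batch of 5 input vectors
y = mlp_forward(x)
print(y.shape)                  # (5, 1)
```

Swapping the linear output for a softmax turns the same network into a classifier, which is how the lecture's range from classification to regression is usually realized.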
Explores the history, models, training, convergence, and limitations of neural networks, including the backpropagation algorithm and universal approximation.
Covers the BackProp algorithm, including initialization, forward signal propagation, error computation, and weight updates, along with a complexity comparison against numerical differentiation.
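The four BackProp steps named above, and the complexity argument against numerical differentiation, can be sketched on a tiny network; all sizes, the sigmoid activation, and the squared-error loss are illustrative assumptions, not the lecture's exact setup:

```python
import numpy as np

# Hedged sketch of backprop on a one-hidden-layer network with a
# squared-error loss; sizes and activations are illustrative assumptions.
rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=3)             # single input vector
t = np.array([1.0])                # target
W1 = rng.normal(0, 0.5, (3, 4))    # step 1: initialize the weights
W2 = rng.normal(0, 0.5, (4, 1))

def loss(W1, W2):
    h = sigmoid(x @ W1)            # step 2: propagate the signal forward
    y = h @ W2
    return 0.5 * np.sum((y - t) ** 2)  # step 3: compute the error

# Step 4: backpropagate the error to get the weight-update gradients.
h = sigmoid(x @ W1)
y = h @ W2
delta_out = y - t                           # dL/dy
grad_W2 = np.outer(h, delta_out)            # dL/dW2
delta_hid = (W2 @ delta_out) * h * (1 - h)  # chain rule through sigmoid
grad_W1 = np.outer(x, delta_hid)            # dL/dW1

# Complexity comparison: central differences need 2 forward passes PER
# weight, while backprop gets every gradient from one backward pass.
eps = 1e-6
num_W1 = np.zeros_like(W1)
for i in range(W1.shape[0]):
    for j in range(W1.shape[1]):
        Wp, Wm = W1.copy(), W1.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num_W1[i, j] = (loss(Wp, W2) - loss(Wm, W2)) / (2 * eps)

print(np.allclose(grad_W1, num_W1, atol=1e-6))  # True
```

The loop makes the cost asymmetry concrete: for a network with millions of weights, the finite-difference check runs millions of forward passes where backprop runs one.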
Explores the evolution of CNNs in image processing, covering classical and deep neural networks, training algorithms, backpropagation, non-linear steps, loss functions, and software frameworks.
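The convolution-plus-non-linearity step at the heart of these CNNs can be sketched directly; the loop-based valid-mode convolution (cross-correlation, as in most frameworks) and the Sobel edge kernel are illustrative assumptions:

```python
import numpy as np

# Minimal sketch of one CNN building block: a valid-mode 2D convolution
# (cross-correlation) followed by a ReLU non-linear step; the 3x3
# edge-detector kernel and toy image are illustrative assumptions.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

relu = lambda z: np.maximum(z, 0.0)  # the non-linear step

image = np.zeros((6, 6))
image[:, 3:] = 1.0                       # image with a vertical edge
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], float)  # horizontal-gradient kernel
feature_map = relu(conv2d(image, sobel_x))
print(feature_map.shape)                 # (4, 4)
```

In a trained CNN the kernel weights are not hand-picked like this Sobel filter but learned by backpropagation, with the deep-learning frameworks mentioned above supplying the heavily optimized convolution routines.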
Covers the fundamentals of deep learning, including data representations, bag of words, data pre-processing, artificial neural networks, and convolutional neural networks.
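The bag-of-words representation mentioned above maps each document to a fixed-length count vector over a shared vocabulary; a minimal sketch, assuming a toy corpus and whitespace tokenization (real pipelines add lowercasing, stop-word removal, and so on):

```python
from collections import Counter

# Hedged sketch of a bag-of-words representation; the toy corpus and
# whitespace tokenization are illustrative assumptions.
corpus = ["the cat sat", "the dog sat on the mat"]
vocab = sorted({w for doc in corpus for w in doc.split()})

def bag_of_words(doc):
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]  # fixed-length count vector

vectors = [bag_of_words(doc) for doc in corpus]
print(vocab)    # ['cat', 'dog', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 0, 0, 1, 1], [0, 1, 1, 1, 1, 2]]
```

These count vectors are one way raw text becomes the numeric input that the artificial neural networks covered later in the lecture require.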