Covers the foundational concepts of deep learning and the Transformer architecture, focusing on neural networks, attention mechanisms, and their applications in sequence modeling tasks.
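As a concrete reference for the attention mechanism named above, here is a minimal sketch of scaled dot-product attention, the core operation in the Transformer; the NumPy implementation, shapes, and toy inputs are illustrative assumptions, not material drawn from the lecture itself.

```python
# Illustrative sketch (not the lecture's code): scaled dot-product attention.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                  # weighted sum of values

# Toy usage: 3 tokens with 4-dimensional queries, keys, and values.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)      # (3, 4)
```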
Covers the fundamentals of deep learning, including data representations, bag-of-words models, data pre-processing, artificial neural networks, and convolutional neural networks.
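To make the bag-of-words idea concrete, a minimal sketch follows: each document is mapped to a fixed-length vector of word counts over a shared vocabulary. The toy corpus and helper name are illustrative assumptions.

```python
# Illustrative sketch (not the lecture's code): bag-of-words count vectors.
from collections import Counter

docs = ["the cat sat on the mat", "the dog sat"]
vocab = sorted({word for doc in docs for word in doc.split()})

def bag_of_words(doc, vocab):
    counts = Counter(doc.split())
    return [counts[word] for word in vocab]   # fixed-length count vector

for doc in docs:
    print(doc, "->", bag_of_words(doc, vocab))
```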
Explores deep learning for perception in autonomous vehicles, covering image classification, optimization methods, and the role of representation in machine learning.
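As a pointer to the kind of optimization such material builds on, here is a minimal sketch of gradient descent on a linear softmax classifier over flattened "images"; the dimensions, random data, and learning rate are illustrative assumptions.

```python
# Illustrative sketch (not the lecture's code): gradient descent on a
# linear classifier with softmax cross-entropy loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 32 * 32))          # 8 flattened 32x32 "images"
y = rng.integers(0, 10, size=8)            # 10 class labels
W = np.zeros((32 * 32, 10))
lr = 0.1

for step in range(100):
    logits = X @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(len(y)), y]).mean()
    grad = probs.copy()
    grad[np.arange(len(y)), y] -= 1        # d(loss)/d(logits) = probs - onehot
    W -= lr * (X.T @ grad) / len(y)        # gradient descent update
print(f"final loss: {loss:.3f}")
```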
Explores deep learning for NLP, covering word embeddings, context representations, learning techniques, challenges such as vanishing gradients, and ethical considerations.
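The vanishing-gradient challenge mentioned above can be shown numerically: backpropagating through a chain of sigmoid units multiplies derivatives that are at most 0.25, so the gradient shrinks geometrically with depth. The depth and starting input in this sketch are illustrative assumptions.

```python
# Illustrative sketch (not the lecture's code): vanishing gradients through
# a stack of sigmoid activations.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x, grad = 0.5, 1.0
for layer in range(20):
    y = sigmoid(x)
    grad *= y * (1 - y)   # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)) <= 0.25
    x = y
print(f"gradient after 20 sigmoid layers: {grad:.2e}")   # on the order of 1e-13
```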