Delves into Deep Learning for Natural Language Processing, exploring Neural Word Embeddings, Recurrent Neural Networks, and Attentive Neural Modeling with Transformers.
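To ground the pieces named above, here is a minimal PyTorch sketch, not the section's own code, showing how learned word embeddings feed a recurrent encoder. All names, dimensions, and the toy batch are assumptions for illustration.

```python
import torch
import torch.nn as nn

class RNNEncoder(nn.Module):
    """Toy encoder: a word-embedding lookup followed by a GRU (illustrative)."""
    def __init__(self, vocab_size: int, embed_dim: int, hidden_dim: int):
        super().__init__()
        # Learned lookup table: one dense vector per vocabulary entry.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Recurrent layer consumes the embedded sequence step by step.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids: torch.Tensor):
        embedded = self.embedding(token_ids)  # (batch, seq_len, embed_dim)
        outputs, hidden = self.rnn(embedded)  # per-step states, final state
        return outputs, hidden

# Assumed toy usage: a batch of 2 sequences, 5 token ids each.
encoder = RNNEncoder(vocab_size=1000, embed_dim=64, hidden_dim=128)
tokens = torch.randint(0, 1000, (2, 5))
outputs, hidden = encoder(tokens)
print(outputs.shape, hidden.shape)  # [2, 5, 128] and [1, 2, 128]
```

The final hidden state summarizes the sequence; Transformer-based models replace the recurrence with attention over all positions at once.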
Explores Seq2Seq models with and without attention, covering the encoder-decoder architecture, context vectors, the decoding process, and different types of attention mechanisms.
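As a rough sketch of how a context vector is formed during decoding, the snippet below implements dot-product (Luong-style) attention, one of the attention variants this section covers. The function name and toy shapes are assumptions, not the section's own code.

```python
import torch
import torch.nn.functional as F

def dot_product_attention(decoder_state: torch.Tensor,
                          encoder_outputs: torch.Tensor):
    """Dot-product attention over encoder states (illustrative sketch).

    decoder_state:   (batch, hidden)          current decoder hidden state
    encoder_outputs: (batch, seq_len, hidden) per-step encoder states
    """
    # Alignment scores: similarity of the decoder state to each encoder step.
    scores = torch.bmm(encoder_outputs, decoder_state.unsqueeze(2)).squeeze(2)
    weights = F.softmax(scores, dim=1)  # (batch, seq_len), sums to 1 per example
    # Context vector: attention-weighted sum of encoder states.
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
    return context, weights

# Assumed toy shapes: batch of 2, source length 5, hidden size 128.
enc_out = torch.randn(2, 5, 128)
dec_state = torch.randn(2, 128)
context, weights = dot_product_attention(dec_state, enc_out)
print(context.shape, weights.shape)  # [2, 128] and [2, 5]
```

Without attention, the decoder conditions on a single fixed context vector (the encoder's final state); with attention, a fresh context vector is recomputed at every decoding step from the weighted encoder states.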