Natural Language Generation
Related lectures (32)
Neuro-symbolic Representations: Commonsense Knowledge & Reasoning
Delves into neuro-symbolic representations for commonsense knowledge and reasoning in natural language processing applications.
Deep Learning Techniques: Recurrent Networks and LSTM Models
Discusses the implementation and optimization of recurrent networks using LSTM models in deep learning.
Sparse Communication: Transformations and Applications
Explores the evolution from sparse modeling to sparse communication in neural networks for natural language processing tasks.
Deep Learning for NLP
Delves into Deep Learning for Natural Language Processing, exploring Neural Word Embeddings, Recurrent Neural Networks, and Attentive Neural Modeling with Transformers.
Pretraining Sequence-to-Sequence Models: BART and T5
Covers the pretraining of sequence-to-sequence models, focusing on BART and T5 architectures.
Transformer Architecture: Subquadratic Attention Mechanisms
Covers transformer architecture, focusing on encoder-decoder models and subquadratic attention mechanisms for efficient processing of input sequences.
Data Annotation: Collection and Biases in NLP
Addresses data collection, annotation processes, and biases in natural language processing.
Ethical Considerations in Natural Language Processing
Explores ethical challenges in NLP systems, including biases, toxicity, privacy, and disinformation.
Natural Language Generation: Understanding the Task and Techniques
Provides an overview of Natural Language Generation, focusing on tasks, challenges, and methodologies for creating coherent text.
Machine Translation: Attention Mechanism
Explores the attention mechanism in machine translation, addressing the bottleneck problem and improving NMT performance significantly.