Machine Translation: Attention Mechanism
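The mechanism named in the lecture title can be sketched as scaled dot-product attention: each decoder query scores every encoder state, the scores are normalized with a softmax, and the output is the weighted sum of the values. A minimal NumPy illustration (the function names and the toy vectors below are my own, not taken from the lecture):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (queries, keys) similarity
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights          # context vectors and weights

# toy example: 2 decoder queries attend over 3 encoder states
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[1.0], [2.0], [3.0]])
context, weights = attention(Q, K, V)
```

Each row of `weights` is a probability distribution over the encoder states, which is what lets the decoder "look back" at the source sentence while translating.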
Related lectures (32)
Transformer Architecture: The X Gomega
Delves into the Transformer architecture, self-attention, and training strategies for machine translation and image recognition.
Chemical Reaction Prediction: Molecular Transformer
Explores chemical reaction prediction using generative models and molecular transformers, emphasizing the importance of molecular language processing and stereochemistry.
Model Analysis
Explores neural model analysis in NLP, covering evaluation, probing, and ablation studies to understand model behavior and interpretability.
Natural Language Generation: Understanding the Task and Techniques
Provides an overview of Natural Language Generation, focusing on tasks, challenges, and methodologies for creating coherent text.
Language Modelling and Recurrent Neural Networks
Explores language modelling, RNNs, n-gram models, LSTMs, and bidirectional RNNs.
Non-conceptual Knowledge Systems
Explores the impact of deep learning on the Digital Humanities, focusing on non-conceptual knowledge systems and recent advancements in AI.
Deep Learning Techniques: Recurrent Networks and LSTM Models
Discusses the implementation and optimization of recurrent networks using LSTM models in deep learning.
Deep Learning for Question Answering
Explores deep learning for question answering, analyzing neural networks and model robustness to noise.
Language Models: Fixed-context and Recurrent Neural Networks
Discusses language models, focusing on fixed-context neural models and recurrent neural networks.
Neuro-symbolic Representations: Commonsense Knowledge & Reasoning
Delves into neuro-symbolic representations for commonsense knowledge and reasoning in natural language processing applications.