Words Tokens: Lexical Level Overview
Related lectures (32)
Lexicons, n-grams and Language Models
Explores lexicons, n-grams, and language models, emphasizing their importance in recognizing words and the effectiveness of n-grams for various tasks.
Words and Tokens: Language Models and Probabilities
Reviews language models, tokenization, and probability estimation in NLP systems.
Neural Word Embeddings: Learning Representations for Natural Language
Covers neural word embeddings and methods for learning word representations in natural language processing.
Language Models: Fixed-context and Recurrent Neural Networks
Discusses neural language models, focusing on fixed-context architectures and recurrent neural networks.
Words, tokens, n-grams and Language Models
Explores words, tokens, n-grams, and language models, focusing on probabilistic approaches for language identification and spelling error correction.
Natural Language Processing: Understanding Transformers and Tokenization
Provides an overview of Natural Language Processing, focusing on transformers, tokenization, and self-attention mechanisms for effective language analysis and synthesis.
Prompting and Alignment
Explores prompting, alignment, and the capabilities of large language models for natural language processing tasks.
Modern NLP and Ethics in NLP
Delves into advancements and challenges in NLP, along with ethical considerations and potential harms.
Neural Word Embeddings
Introduces neural word embeddings and dense vector representations for natural language processing.
Classical Language Models: Foundations and Applications
Introduces classical language models, their applications, and foundational concepts like count-based modeling and evaluation metrics.