Lecture
Scaling Language Models: Efficiency and Deployment
Related lectures (31)
Deep Learning: Principles and Applications
Covers the fundamentals of deep learning, including data, architecture, and ethical considerations in model deployment.
Model Compression: Techniques for Efficient NLP Models
Explores model compression techniques in NLP, discussing pruning, quantization, weight factorization, knowledge distillation, and attention mechanisms.
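As a concrete reference for one of the techniques this entry names, the sketch below shows symmetric int8 post-training quantization of a weight tensor in plain numpy. It is a minimal, illustrative example: the function names and the per-tensor scaling choice are assumptions made here, not material from the lecture.

```python
# Minimal sketch of symmetric post-training quantization (illustrative;
# names and per-tensor scaling are assumptions, not the lecture's code).
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8 codes with a single per-tensor scale."""
    scale = np.abs(w).max() / 127.0 if w.size else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print("max abs reconstruction error:", np.abs(w - dequantize(q, s)).max())
```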
Deep Learning Techniques: Recurrent Networks and LSTM Models
Discusses the implementation and optimization of recurrent networks using LSTM models in deep learning.
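For reference, the LSTM update equations such a lecture typically derives can be written in a few lines of numpy. This is a sketch of a single cell step; the fused-gate layout, dimensions, and all names are illustrative assumptions, not the lecture's own implementation.

```python
# One LSTM cell step in numpy (illustrative sketch; fused-gate layout
# and all variable names are assumptions made for this example).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """Advance hidden state h and cell state c by one time step."""
    z = W @ x + U @ h + b              # fused input/forget/output/candidate
    n = h.shape[0]
    i, f, o = (sigmoid(z[k * n:(k + 1) * n]) for k in range(3))
    g = np.tanh(z[3 * n:4 * n])        # candidate cell state
    c_new = f * c + i * g              # gated memory update
    h_new = o * np.tanh(c_new)         # exposed hidden state
    return h_new, c_new

n, d = 8, 16                           # hidden size, input size
rng = np.random.default_rng(0)
W, U, b = rng.normal(size=(4 * n, d)), rng.normal(size=(4 * n, n)), np.zeros(4 * n)
h, c = lstm_step(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, b)
```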
Language Models: Fixed-context and Recurrent Neural Networks
Discusses language models, focusing on fixed-context neural models and recurrent neural networks.
Prompting and Alignment
Explores prompting, alignment, and the capabilities of large language models for natural language processing tasks.
Machine Learning: Supervised and Unsupervised Learning Techniques
Covers supervised and unsupervised learning techniques in machine learning, highlighting their applications in finance and environmental analysis.
Transformers in Vision: Applications and Architectures
Covers the impact of transformers in computer vision, discussing their architecture, applications, and advancements in various tasks.
Second-Order Model Compression
Explores second-order model compression for massive deep neural networks, showcasing compression techniques and their impact on model accuracy.
Data Annotation: Collection and Biases in NLP
Addresses data collection, annotation processes, and biases in natural language processing.
Natural Language Processing: Understanding Transformers and Tokenization
Provides an overview of Natural Language Processing, focusing on transformers, tokenization, and self-attention mechanisms for effective language analysis and synthesis.
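To make the self-attention mechanism named in the last entry concrete, here is a minimal single-head scaled dot-product attention sketch in numpy. It omits masking and multi-head projections, and all names are illustrative assumptions rather than the lecture's own code.

```python
# Single-head scaled dot-product self-attention (illustrative sketch;
# no masking or multi-head splitting; all names are assumptions).
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Attend over a sequence of token embeddings X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])        # scaled similarities
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)      # row-wise softmax
    return weights @ V                             # mix value vectors

rng = np.random.default_rng(0)
T, d = 5, 16                                       # tokens, model width
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)         # (5, 16)
```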