Source Coding and Prefix-Free Codes
Related lectures (28)
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Data Compression and Shannon's Theorem: Entropy Calculation Example
Works through the calculation of entropy for a specific example, arriving at an entropy value of 2.69.
Conditional Entropy: Huffman Coding
Explores conditional entropy and Huffman coding as techniques for efficient data compression.
Mutual Information and Entropy
Explores the calculation of mutual information and entropy between random variables.
Information Theory: Prefix-Free Codes
Covers prefix-free codes, Kraft inequality, Huffman coding, and entropy.
Data Compression and Shannon's Theorem: Huffman Codes
Explores the performance of the Shannon-Fano algorithm and introduces Huffman codes for efficient data compression.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Error Correction Codes: Theory and Applications
Covers the theory and applications of error correction codes, emphasizing the importance of maximizing the minimum distance for reliable communication.
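The recurring themes in the lectures above — entropy, Huffman coding, the Kraft inequality, and prefix-free codes — can be illustrated with a short sketch. The code below is not taken from any of the lectures; the example distribution and all function names are assumptions for illustration. It computes the Shannon entropy of a source, builds a binary Huffman code, and verifies that the code satisfies the Kraft inequality and that its average length matches the entropy bound.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Build a binary Huffman code; returns {symbol_index: codeword}."""
    # Heap entries are (probability, unique tie-breaker, subtree);
    # a subtree is either a symbol index or a (left, right) pair.
    heap = [(p, i, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, counter, (t1, t2)))
        counter += 1
    codes = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"  # single-symbol edge case
    assign(heap[0][2], "")
    return codes

# Hypothetical dyadic source (chosen so Huffman coding is exactly optimal).
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
codes = huffman_code(probs)
avg_len = sum(p * len(codes[i]) for i, p in enumerate(probs))
kraft_sum = sum(2 ** -len(c) for c in codes.values())
print(H, avg_len, kraft_sum)
```

For this dyadic distribution the Huffman code achieves the entropy exactly (average length 1.75 bits equals H), and the Kraft sum is 1, as it must be for any complete prefix-free binary code.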