Data Compression: Shannon-Fano Algorithm
Related lectures (26)
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and conditioning's impact on entropy reduction.
Data Compression and Entropy: Conclusion
Covers the definition of entropy, the Shannon-Fano algorithm, and upcoming topics.
Data Compression and Entropy 2: Entropy as 'Question Game'
Explores entropy as a 'question game' to guess letters efficiently and its relation to data compression.
Data Compression and Shannon's Theorem: Recap
Explores entropy, compression algorithms, and optimal coding methods for data compression.
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69.
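The entropy in these lectures is the Shannon entropy of a symbol distribution. A minimal sketch of the calculation (the specific distribution giving 2.69 bits is not listed here, so the probabilities below are illustrative):

```python
# Shannon entropy H = -sum(p * log2(p)) of a discrete distribution.
# Illustrative sketch; the lecture's own example distribution is not shown here.
from math import log2

def entropy(probs):
    """Entropy in bits; terms with p == 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A uniform distribution over 4 symbols has entropy exactly 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```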
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Data Compression and Shannon's Theorem Summary
Summarizes Shannon's theorem, emphasizing the importance of entropy in data compression.
Data Compression and Shannon's Theorem: Huffman Codes
Explores the performance of the Shannon-Fano algorithm and introduces Huffman codes for efficient data compression.
JPEG XS & JPEG XL: Next-Gen Image Compression
Explores the cutting-edge JPEG XS and JPEG XL image compression standards, emphasizing their efficiency and versatility in various applications.
Lossless Compression: Shannon-Fano and Huffman
Explores lossless compression using Shannon-Fano and Huffman algorithms, showcasing Huffman's superior efficiency and speed over Shannon-Fano.
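The Huffman construction these lectures compare against Shannon-Fano can be sketched with a min-heap: repeatedly merge the two least-probable nodes, prepending a bit to each side's codewords. This is a hedged illustration, not the course's implementation; the symbol frequencies are made up.

```python
# Huffman coding via a min-heap of (frequency, tie-breaker, codebook) tuples.
# Sketch with illustrative frequencies, not the lecture's own example.
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build prefix-free codes by merging the two least-frequent nodes."""
    tick = count()  # tie-breaker so equal frequencies never compare dicts
    heap = [(f, next(tick), {s: ""}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, next(tick), merged))
    return heap[0][2]

codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# For this dyadic distribution, code lengths equal -log2(p):
# 1 bit for "a", 2 for "b", 3 each for "c" and "d".
```

On a dyadic distribution like this one, Huffman achieves the entropy exactly; Shannon-Fano's top-down split can end up one bit longer per symbol on less convenient distributions, which is the efficiency gap the lecture showcases.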