Claude Shannon's Story
Related lectures (26)
Lossless Compression: Shannon-Fano and Huffman
Explores lossless compression with the Shannon-Fano and Huffman algorithms, showing that Huffman coding is both more efficient and faster than Shannon-Fano.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding, with emphasis on optimizing codeword lengths and on conditional entropy.
Data Compression and Shannon's Theorem: Shannon's Theorem Demonstration
Covers the proof of Shannon's theorem, focusing on data compression.
Information Theory: Review and Mutual Information
Reviews information measures such as entropy and introduces mutual information as a measure of the information shared between two random variables.
Data Compression and Entropy Interpretation
Explores the origins and interpretation of entropy, emphasizing its role in measuring disorder and information content in a system.
Data Compression and Entropy: Conclusion
Covers the definition of entropy, the Shannon-Fano algorithm, and upcoming topics.
Compression with Loss: Data and Signals
Explores lossy compression, covering the limitations of lossless compression and techniques for compressing images and sound effectively.
Data Compression and Shannon's Theorem: Performance Analysis
Explores Shannon's theorem on data compression and the performance of Shannon-Fano codes.
Huffman Coding
Explores Huffman coding by comparing it to organizing a kitchen for efficiency.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy (a small worked example follows this list).
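Several of the lectures above revolve around the same chain of ideas: the entropy of a source, Huffman's prefix code, and the source coding theorem bounding the achievable average codeword length by H <= L < H + 1. As a rough illustration only, not taken from any of the lectures themselves and using made-up symbol probabilities, here is a minimal Python sketch that builds a Huffman code for a small source and compares its average codeword length L with the Shannon entropy H.

import heapq
from math import log2

def entropy(probs):
    # Shannon entropy H(X) = -sum p * log2(p), in bits per symbol.
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(source):
    # Build a binary Huffman code from a dict {symbol: probability}.
    # Heap entries: (probability, tiebreak id, {symbol: codeword so far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(source.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}   # prefix one subtree with 0
        merged.update({s: "1" + w for s, w in c2.items()})  # the other with 1
        heapq.heappush(heap, (p1 + p2, next_id, merged))
        next_id += 1
    return heap[0][2]

# Hypothetical source distribution, for illustration only.
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(source)
H = entropy(source.values())
L = sum(source[s] * len(code[s]) for s in source)
print("codewords:", code)
print(f"H = {H:.3f} bits/symbol, average length L = {L:.3f} bits/symbol")
# For this dyadic source the Huffman code meets the entropy exactly (L = H = 1.75).

For a general source the probabilities are not powers of two, and the Huffman average length then sits strictly between H and H + 1, which is exactly the gap the source coding theorem describes.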