Data Compression and Entropy: Compression
Related lectures (28)
Lossless Compression: Shannon-Fano and Huffman
Explores lossless compression with the Shannon-Fano and Huffman algorithms, demonstrating Huffman's superior efficiency and speed.
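To make the efficiency comparison above concrete, here is a minimal sketch of Huffman coding (not taken from the lecture itself): it builds the tree by repeatedly merging the two least-frequent subtrees, and checks that the resulting average codeword length approaches the entropy bound.

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Build a Huffman tree over symbol frequencies and return the
    codeword length assigned to each symbol."""
    # Heap items: (weight, tiebreaker, {symbol: depth_so_far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees; every symbol inside them
        # moves one level deeper, i.e. gains one code bit.
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
lengths = huffman_code_lengths(freqs)
# Average bits per symbol under the Huffman code.
avg = sum(freqs[s] * lengths[s] for s in freqs) / len(text)
```

For "abracadabra" the most frequent symbol `a` gets a 1-bit code and the average length is 23/11 ≈ 2.09 bits, just above the source entropy of about 2.04 bits.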
JPEG XS & JPEG XL: Next-Gen Image Compression
Explores the cutting-edge JPEG XS and JPEG XL image compression standards, emphasizing their efficiency and versatility in various applications.
Data Compression and Entropy 2: Entropy as 'Question Game'
Explores entropy as a 'question game' to guess letters efficiently and its relation to data compression.
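The "question game" framing can be sketched numerically: the Shannon entropy is the best achievable average number of yes/no questions needed to identify an outcome. This short example (illustrative, not from the lecture) computes it for a uniform and a skewed distribution.

```python
import math

def entropy(p):
    """Shannon entropy in bits: the minimum average number of
    yes/no questions needed to identify the outcome."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Uniform over 8 letters: exactly 3 binary questions suffice.
print(entropy([1/8] * 8))                     # -> 3.0
# Skewed distribution: asking about likely letters first pays off.
print(entropy([0.5, 0.25, 0.125, 0.125]))     # -> 1.75
```

The skewed case shows the link to compression: 1.75 questions (bits) on average beats the 2 bits a fixed-length code over four letters would need.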
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.
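As a companion to that review, here is a hedged sketch of mutual information computed from a joint probability table, using the standard definition I(X;Y) = Σ p(x,y) log2 [p(x,y) / (p(x)p(y))] (the table values below are illustrative):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table given as a
    nested list: rows index x, columns index y."""
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent fair bits: knowing one tells nothing about the other.
indep = [[0.25, 0.25], [0.25, 0.25]]   # I = 0 bits
# Perfectly correlated fair bits: one bit fully determines the other.
corr = [[0.5, 0.0], [0.0, 0.5]]        # I = 1 bit
```

The two extremes bracket the quantity: mutual information ranges from 0 (independence) up to min(H(X), H(Y)).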
Compression: Introduction
Introduces data compression, exploring how redundancy in data can be reduced to achieve smaller file sizes without losing information.
Data Compression and Shannon's Theorem: Shannon-Fano Coding
Explores Shannon-Fano coding for efficient data compression and its comparison to Huffman coding.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Entropy and Algorithms
Explores entropy's role in coding strategies and search algorithms, showing how it bounds both compression ratios and search efficiency.
Data Compression and Shannon's Theorem
Explores lossless data compression, entropy, and data loss thresholds.
Data Compression and Shannon's Theorem Summary
Summarizes Shannon's theorem, emphasizing the importance of entropy in data compression.
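The theorem these summaries refer to, Shannon's noiseless source coding theorem, can be stated compactly (standard form, not quoted from the lectures):

```latex
% Any uniquely decodable binary code for a source X has expected
% codeword length L at least H(X); a prefix code (e.g. Huffman)
% achieves L* within one bit of that bound.
H(X) \le L^{*} < H(X) + 1,
\qquad
H(X) = -\sum_{i} p_i \log_2 p_i
```

This is why entropy appears throughout the lectures above as the fundamental limit on lossless compression.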