Lossless Compression: Shannon-Fano and Huffman
Related lectures (26)
Data Compression: Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for efficient data compression and its applications in lossless and lossy compression techniques.
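The Shannon-Fano construction this lecture covers sorts symbols by frequency and recursively splits the list where the two halves' total frequencies are as equal as possible. A minimal Python sketch (the function name and the sample frequencies are illustrative, not taken from the lecture):

```python
def shannon_fano(symbols):
    """Shannon-Fano sketch: `symbols` is a list of (symbol, freq)
    pairs sorted by descending frequency. Split where the two halves'
    total frequencies are as balanced as possible; the left half is
    prefixed with '0', the right with '1', recursively."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}   # single symbol: empty suffix
    total = sum(f for _, f in symbols)
    best_i, best_gap = 1, None
    for i in range(1, len(symbols)):
        left = sum(f for _, f in symbols[:i])
        gap = abs(total - 2 * left)  # |left total - right total|
        if best_gap is None or gap < best_gap:
            best_i, best_gap = i, gap
    codes = {}
    for s, c in shannon_fano(symbols[:best_i]).items():
        codes[s] = "0" + c
    for s, c in shannon_fano(symbols[best_i:]).items():
        codes[s] = "1" + c
    return codes

# Hypothetical input: letter counts of "abracadabra", sorted.
codes = shannon_fano([('a', 5), ('b', 2), ('r', 2), ('c', 1), ('d', 1)])
```

The result is prefix-free, so the encoded bitstream can be decoded unambiguously; frequent symbols such as 'a' receive short codewords.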
Data Compression and Entropy 2: Entropy as 'Question Game'
Explores entropy as a 'question game' to guess letters efficiently and its relation to data compression.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.
Data Compression and Shannon's Theorem: Lossy Compression
Explores data compression, including lossless methods and the necessity of lossy compression for real numbers and signals.
Compression: Introduction
Introduces data compression, exploring how redundancy in data can be reduced to achieve smaller file sizes without losing information.
Data Compression and Shannon's Theorem: Performance Analysis
Explores Shannon's theorem on data compression and the performance of Shannon-Fano codes.
Data Compression and Shannon's Theorem: Huffman Codes
Explores the performance of the Shannon-Fano algorithm and introduces Huffman codes for efficient data compression.
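Unlike Shannon-Fano's top-down splitting, Huffman's algorithm builds the code tree bottom-up by repeatedly merging the two least-frequent subtrees. A minimal sketch (the function name and sample input are illustrative, not from the lecture):

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a Huffman code from a {symbol: frequency} map.
    Returns {symbol: bitstring}."""
    # Heap entries are (frequency, tiebreak, tree); a tree is either
    # a leaf symbol or a (left, right) pair of subtrees.
    heap = [(f, i, s) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        return {heap[0][2]: "0"}        # degenerate one-symbol source
    count = len(heap)                   # fresh tiebreak counter
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap) # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):             # assign 0/1 along root-to-leaf paths
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes(Counter("abracadabra"))  # a:5, b:2, r:2, c:1, d:1
```

With these frequencies, the most common symbol 'a' gets a 1-bit codeword and the whole string encodes in 23 bits, versus 33 bits (11 symbols at 3 bits each) for a fixed-length code over 5 symbols.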
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69.
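The lecture's specific source distribution is not reproduced on this page, but the quantity it computes, H(X) = -Σ p log₂ p, can be sketched with a hypothetical dyadic distribution:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical dyadic distribution (not the lecture's example):
# H = 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```

The `p > 0` guard follows the convention 0·log 0 = 0, so symbols of probability zero contribute nothing to the entropy.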
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.