Shannon's source coding theorem
Applied sciences › Information engineering › Signal processing › Data compression
Related lectures (27)
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
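As a minimal sketch of the conditional-entropy concept this lecture covers: H(Y|X) can be computed directly from a joint pmf. The distribution below is a hypothetical example, not the one used in the lecture.

```python
import math

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), from a joint pmf {(x, y): p}."""
    # Marginal p(x)
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return -sum(p * math.log2(p / px[x]) for (x, _), p in joint.items() if p > 0)

# Hypothetical joint distribution of (X, Y)
joint = {("0", "0"): 0.5, ("0", "1"): 0.25, ("1", "0"): 0.0, ("1", "1"): 0.25}
print(conditional_entropy(joint))  # about 0.689 bits
```

Since H(Y|X) ≤ H(Y), knowing X can only reduce the bits needed to describe Y, which is why conditional entropy matters for compression.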
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69.
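The entropy calculation the lecture demonstrates follows the standard formula; a small sketch with a made-up dyadic distribution (not the lecture's 2.69-bit example):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source distribution
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))  # 1.75 bits
```

Shannon's source coding theorem says this value lower-bounds the average number of bits per symbol of any uniquely decodable code for the source.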
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.
Data Compression and Shannon's Theorem: Performance Analysis
Explores Shannon's theorem on data compression and the performance of Shannon-Fano codes.
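The performance guarantee behind Shannon-type codes comes from assigning each symbol a codeword of length ceil(-log2 p); a sketch (the distribution is illustrative, not from the lecture):

```python
import math

def shannon_code_lengths(probs):
    """Codeword lengths l_i = ceil(-log2 p_i). These satisfy Kraft's
    inequality, and the resulting average length is within 1 bit of H(X)."""
    return [math.ceil(-math.log2(p)) for p in probs]

p = [0.4, 0.3, 0.2, 0.1]
print(shannon_code_lengths(p))  # [2, 2, 3, 4]
```

Because each l_i < -log2 p_i + 1, the expected length E[l] is below H(X) + 1, which is the "within one bit of entropy" performance the lecture analyzes.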
Source Coding and Prefix-Free Codes
Covers source coding, injective codes, prefix-free codes, and Kraft's inequality.
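Kraft's inequality, which this lecture covers, is easy to check numerically: a prefix-free code with lengths l_1, ..., l_n over a q-ary alphabet exists iff sum q^(-l_i) ≤ 1. A minimal sketch:

```python
def kraft_sum(lengths, q=2):
    """Kraft sum over a q-ary alphabet; <= 1 iff a prefix-free code
    with these codeword lengths exists."""
    return sum(q ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0 -> realizable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))     # 1.25 > 1 -> no prefix-free code possible
```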
Polar Codes: Wrapping and Decoding
Covers the wrapping and decoding process of polar codes, exploring the trade-off between quality and efficiency in lossy compression.
Compression: Kraft Inequality
Explains compression and Kraft inequality in codes and sequences.
Data Compression and Shannon's Theorem: Definitions
Explains binary codes, prefix-free codes, and representing letters with codes.
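The prefix-free property the lecture defines can be verified by a direct pairwise check; a small sketch with illustrative codes:

```python
def is_prefix_free(codewords):
    """A code is prefix-free iff no codeword is a prefix of another;
    such codes are instantaneously (uniquely) decodable."""
    for c in codewords:
        for d in codewords:
            if c != d and d.startswith(c):
                return False
    return True

print(is_prefix_free(["0", "10", "110", "111"]))  # True
print(is_prefix_free(["0", "01", "11"]))          # False: "0" prefixes "01"
```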
Data Compression: Source Coding
Covers data compression techniques, including source coding and unique decodability concepts.
Information Theory and Coding
Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences.