Lecture
Kraft-McMillan Theorem
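The theorem states that every uniquely decodable D-ary code with codeword lengths l_1, ..., l_n satisfies the Kraft inequality sum_i D^(-l_i) <= 1 (McMillan's part), and conversely that any lengths satisfying the inequality can be realized by a prefix-free code (Kraft's part). As a minimal, illustrative sketch in Python (not material from the lecture itself, and with function names of our choosing), the code below checks the inequality exactly and runs the standard canonical construction of a prefix-free code from a feasible length profile.

from fractions import Fraction

def kraft_sum(lengths, D=2):
    # Exact Kraft sum: sum over i of D**(-l_i).
    return sum(Fraction(1, D ** l) for l in lengths)

def prefix_free_code(lengths, D=2):
    # Canonical construction of a prefix-free code with the given lengths.
    # Possible exactly when the Kraft sum is <= 1 (Kraft-McMillan).
    # Digits are written as single characters, so D <= 10 is assumed here.
    if kraft_sum(lengths, D) > 1:
        raise ValueError("Kraft inequality violated: no uniquely decodable code has these lengths")
    order = sorted(range(len(lengths)), key=lambda i: lengths[i])
    codewords = [None] * len(lengths)
    value, prev_len = 0, 0
    for i in order:
        l = lengths[i]
        value *= D ** (l - prev_len)   # extend the running counter to the new length
        v, digits = value, []
        for _ in range(l):             # base-D expansion of value, padded to l digits
            digits.append(str(v % D))
            v //= D
        codewords[i] = "".join(reversed(digits))
        value += 1                     # next free codeword of this length
        prev_len = l
    return codewords

# Illustrative run: lengths 1, 2, 3, 3 over a binary alphabet.
print(kraft_sum([1, 2, 3, 3]))         # 1  (inequality tight)
print(prefix_free_code([1, 2, 3, 3]))  # ['0', '10', '110', '111']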
Related lectures (23)
Source Coding: Compression
Covers entropy, source coding, encoding maps, decodability, prefix-free codes, and the Kraft-McMillan inequality.
Compression: Kraft Inequality
Explains compression and the Kraft inequality for codes and sequences.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Shannon-Fano Codes
Covers Shannon-Fano codes, a method for constructing uniquely decodable, prefix-free codes, and discusses their near-optimality relative to the entropy.
Compression: Strong Connection and Prefix-Free Codes
Explores the relationship between code word length and probability distribution, focusing on designing prefix-free codes for efficient compression.
Entropy and Algorithms: Twenty Questions Problem
Explores the Twenty Questions problem, Huffman codes, and optimal querying strategies, showing how prefix-free and ternary codes yield efficient solutions.
Data Compression and Shannon's Theorem: Entropy Calculation Example
Works through the entropy calculation for a concrete example, arriving at an entropy value of 2.69 (a generic sketch of the computation follows this list).
Stochastic Processes: Sequences and Compression
Explores compression in stochastic processes through injective codes and prefix-free codes.
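A companion sketch for the entropy-calculation entry above: the distribution used in that lecture is not reproduced here, so the example probabilities below are illustrative only.

import math

def shannon_entropy(probabilities):
    # Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits;
    # zero-probability outcomes contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Illustrative distribution (not the one from the lecture):
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits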