Stochastic Processes: Sequences and Compression
Related lectures (26)
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Compression: Kraft Inequality
Explains compression and the Kraft inequality for codes and sequences.
Source Coding: Compression
Covers entropy, source coding, encoding maps, decodability, prefix-free codes, and the Kraft-McMillan inequality.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Information Theory: Source Coding & Channel Coding
Covers the fundamentals of information theory, focusing on source coding and channel coding.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Data Compression and Shannon's Theorem: Entropy Calculation Example
Demonstrates the calculation of entropy for a specific example, resulting in an entropy value of 2.69.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy.
Data Compression and Shannon-Fano Algorithm
Explores the Shannon-Fano algorithm for data compression and how it efficiently assigns prefix-free binary codes to letters.
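Several of the lectures above revolve around the same trio of ideas: prefix-free codes, the Kraft inequality, and entropy as a lower bound on expected codeword length. A minimal sketch (illustrative only, not taken from any of the lectures) that ties them together: build Huffman codeword lengths for a small text, check that they satisfy the Kraft inequality, and compare the average length to the source entropy.

```python
import heapq
import math
from collections import Counter

def huffman_lengths(freqs):
    """Return codeword lengths of an optimal binary prefix-free (Huffman) code."""
    # Heap entries: (weight, unique tiebreaker, symbols in this subtree).
    heap = [(w, i, [s]) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in freqs}
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge adds one bit to every symbol below it
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

text = "abracadabra"
freqs = Counter(text)
total = sum(freqs.values())
lengths = huffman_lengths(freqs)

# Kraft inequality: a prefix-free code must satisfy sum(2^-l_i) <= 1.
kraft = sum(2 ** -l for l in lengths.values())

# Source coding theorem: entropy lower-bounds the expected codeword length,
# and Huffman coding stays within one bit of it.
entropy = -sum((w / total) * math.log2(w / total) for w in freqs.values())
avg_len = sum((w / total) * lengths[s] for s, w in freqs.items())
```

For `"abracadabra"` the Kraft sum comes out to exactly 1 (the Huffman tree is complete), and the average codeword length falls between the entropy and entropy plus one bit.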