Category: Information theory (Applied sciences > Information engineering)
Related lectures (24)
Page 2 of 3
Source Coding: Compression
Covers entropy, source coding, encoding maps, decodability, prefix-free codes, and the Kraft-McMillan inequality.
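As a quick illustration of the Kraft-McMillan inequality named in this summary, here is a minimal Python sketch; the codeword lengths are invented for the example and are not taken from the lecture.

    def kraft_sum(lengths, q=2):
        """Sum of q**(-l) over the codeword lengths; <= 1 is the Kraft-McMillan condition."""
        return sum(q ** (-l) for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))   # 1.0  -> a prefix-free binary code with these lengths exists (0, 10, 110, 111)
    print(kraft_sum([1, 1, 2]))      # 1.25 -> no uniquely decodable binary code can have these lengths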
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy.
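For reference, the theorem's guarantee can be stated compactly: an optimal prefix-free code for a source X has expected length L* with H(X) <= L* < H(X) + 1, and conditioning never increases entropy, H(X | Y) <= H(X).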
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
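A minimal sketch of what "prefix-free" means in code, assuming binary codewords given as strings (the example codes are invented for illustration):

    def is_prefix_free(codewords):
        # After sorting, any codeword that is a prefix of another sits immediately
        # before some codeword that extends it, so checking adjacent pairs suffices.
        codewords = sorted(codewords)
        return all(not b.startswith(a) for a, b in zip(codewords, codewords[1:]))

    print(is_prefix_free(["0", "10", "110", "111"]))   # True  -> instantaneously decodable
    print(is_prefix_free(["0", "01", "11"]))           # False -> "0" is a prefix of "01"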
Quantum Error-Correcting Codes
Covers quantum error-correcting codes, error correction through redundancy, the connection to classical coding, and parity-check matrices.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding, with emphasis on optimizing codeword lengths and on conditional entropy.
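A minimal Huffman-coding sketch in Python, using the standard greedy merge of the two least probable symbols; the toy distribution is invented for the example, and tie-breaking may produce a different (equally optimal) code.

    import heapq

    def huffman_code(probs):
        """probs: dict symbol -> probability. Returns dict symbol -> binary codeword."""
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)          # two least probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(probs)
    print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(sum(probs[s] * len(w) for s, w in code.items()))  # expected length 1.75 bits, equal to H(X) here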
Error Correction Codes: Structure and Properties
Explores error correction codes, focusing on structure, properties, and the Singleton bound.
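For reference, the Singleton bound mentioned here reads d <= n - k + 1 for an (n, k) block code with minimum distance d; for example, the (255, 223) Reed-Solomon code attains it with equality (d = 33), which is why Reed-Solomon codes are called maximum distance separable.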
Error Correction Codes: Hamming Code
Explores the Hamming Code for error correction, emphasizing its ability to correct single-bit errors.
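A minimal sketch of (7,4) Hamming encoding and syndrome computation in Python with NumPy; the systematic generator and parity-check matrices below are one standard choice, not necessarily the convention used in the lecture.

    import numpy as np

    G = np.array([[1, 0, 0, 0, 0, 1, 1],      # systematic generator matrix [I_4 | P]
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 1, 1, 0],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[0, 1, 1, 1, 1, 0, 0],      # parity-check matrix [P^T | I_3]
                  [1, 0, 1, 1, 0, 1, 0],
                  [1, 1, 0, 1, 0, 0, 1]])

    msg = np.array([1, 0, 1, 1])
    codeword = msg @ G % 2
    received = codeword.copy()
    received[2] ^= 1                           # flip one bit to simulate a channel error
    print(H @ codeword % 2)                    # [0 0 0]: valid codewords have zero syndrome
    print(H @ received % 2)                    # [1 1 0]: matches column 2 of H, locating the flipped bit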
Variational Formulation: Information Measures
Explores a variational formulation for measuring information content and the divergence between probability distributions.
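One canonical example of such a variational formulation, which may or may not be the exact one used in the lecture, is the Donsker-Varadhan representation of the Kullback-Leibler divergence: D(P || Q) = sup_f { E_P[f(X)] - log E_Q[exp f(X)] }, where the supremum runs over bounded measurable functions f.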
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
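A minimal Python sketch of that idea, computing the Shannon entropy (in bits) of a few invented distributions:

    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))     # 1.0 bit: a fair coin, maximal uncertainty over 2 outcomes
    print(entropy([0.9, 0.1]))     # ~0.47 bits: a biased coin is more predictable
    print(entropy([0.25] * 4))     # 2.0 bits: uniform over 4 outcomes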
Calderbank-Steane-Shor Codes
Explores Calderbank-Steane-Shor (CSS) codes, their construction from pairs of classical error-correcting codes, and how they protect quantum states from errors.