Lecture: Information Theory Basics
Related lectures (28)
Page 3 of 3
Chain Rule for Entropy
Explores the chain rule for entropy, decomposing uncertainty in random variables and illustrating its application with examples.
Variational Formulation: Information Measures
Explores a variational formulation for measuring the information content of, and the divergence between, probability distributions.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and conditioning's impact on entropy reduction.
Source Coding Theorem: Fundamentals and Models
Covers the Source Coding Theorem, source models, entropy, regular sources, and examples.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Information Theory: Quantifying Messages and Source Entropy
Covers quantifying information in messages, source entropy, common information, and communication channel capacity.
Information Measures
Covers information measures like entropy and Kullback-Leibler divergence.
Information in Networked Systems: Functional Representation and Data Compression
Explores traditional information theory, data compression, data transmission, and functional representation lemmas in networked systems.
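Several of the lectures above revolve around the same core quantities: entropy, conditional entropy, the chain rule, and Kullback-Leibler divergence. As a minimal illustrative sketch (not taken from any of the lectures; the joint distribution below is hypothetical), these quantities can be computed directly from a finite distribution:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q[k] > 0 wherever p[k] > 0."""
    return sum(pk * math.log2(pk / q[k]) for k, pk in p.items() if pk > 0)

# Hypothetical joint distribution over (X, Y), for illustration only.
joint = {
    ("a", 0): 0.25, ("a", 1): 0.25,
    ("b", 0): 0.40, ("b", 1): 0.10,
}

# Marginal distribution of X.
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
h_y_given_x = 0.0
for x, p_x in px.items():
    cond = {y: joint[(xx, y)] / p_x for (xx, y) in joint if xx == x}
    h_y_given_x += p_x * entropy(cond)

# Chain rule for entropy: H(X, Y) = H(X) + H(Y|X).
assert abs(entropy(joint) - (entropy(px) + h_y_given_x)) < 1e-12
```

The final assertion is exactly the chain-rule decomposition that the "Chain Rule for Entropy" lecture above describes: the uncertainty of the pair splits into the uncertainty of X plus the average remaining uncertainty of Y once X is known.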
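The Source Coding Theorem entries above connect entropy to compression: Huffman coding achieves an average codeword length within one bit of the source entropy. A small sketch of the standard Huffman construction (this is a generic textbook version, not code from the lectures):

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for {symbol: weight}; returns {symbol: bitstring}."""
    # Heap entries are (weight, tiebreaker, partial code table); the unique
    # tiebreaker prevents Python from ever comparing the dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lightest subtrees, prefixing 0 and 1.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]
```

For the dyadic source {a: 1/2, b: 1/4, c: 1/4} the resulting average codeword length is 1.5 bits, which equals the source entropy, so the code is optimal in that case.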