Conditional Entropy: Review and Definitions
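As a quick illustration of the concept this lecture covers (this sketch is not taken from the lecture itself, and the joint distribution values below are hypothetical), conditional entropy H(Y|X) can be computed directly from a joint probability mass function:

```python
import math

def conditional_entropy(joint):
    """Compute H(Y|X) = -sum_{x,y} p(x,y) * log2(p(y|x))
    for a joint pmf given as {(x, y): probability}."""
    # Marginal p(x), accumulated from the joint distribution
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    h = 0.0
    for (x, _), p in joint.items():
        if p > 0:
            # p(y|x) = p(x, y) / p(x)
            h -= p * math.log2(p / px[x])
    return h

# Hypothetical joint pmf over two binary random variables
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
h = conditional_entropy(joint)  # about 0.94 bits for this distribution
```

When Y is fully determined by X (e.g. the joint pmf {(0, 0): 0.5, (1, 1): 0.5}), the function returns 0, matching the intuition that knowing X leaves no uncertainty about Y.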
Related lectures (28)
Information Theory Basics
Introduces information theory basics, including entropy, independence, and the binary entropy function.
Thermodynamic Identity: Entropy and Energy
Explores the thermodynamic identity, entropy-temperature relationship, and pressure definition, illustrating key principles with practical examples.
Information Theory: Entropy and Capacity
Covers concepts of entropy, Gaussian distributions, and channel capacity with constraints.
Entropy and the Second Law of Thermodynamics
Covers entropy, its definition, and its implications in thermodynamics.
Mutual Information: Understanding Random Variables
Explores mutual information, which quantifies the relationship between random variables and measures information gain and statistical dependence.
Source Coding Theorem
Explores the Source Coding Theorem, entropy, Huffman coding, and how conditioning reduces entropy.
Information Measures
Covers information measures like entropy and Kullback-Leibler divergence.
Thermodynamics: Entropy and Ideal Gases
Explores entropy, ideal gases, and the T dS equations in thermodynamics, emphasizing the Clausius inequality and the Carnot cycle.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Conditional Entropy: Huffman Coding
Explores conditional entropy and Huffman coding for efficient data compression.
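The final entry above pairs conditional entropy with Huffman coding. As an illustration (not code from any of the listed lectures; the symbol weights are hypothetical), a minimal Huffman-code construction using a binary heap:

```python
import heapq

def huffman_codes(freqs):
    """Build a prefix-free code from {symbol: weight} via Huffman's
    algorithm: repeatedly merge the two lowest-weight subtrees."""
    # Heap entries: (weight, tiebreak index, {symbol: code-so-far}).
    # The unique tiebreak index keeps tuple comparison well-defined.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prefix "0" onto the lighter subtree's codes, "1" onto the other
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Dyadic weights: codeword lengths come out as 1, 2, 3, 3,
# so the average length equals the source entropy (1.75 bits)
codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

For dyadic distributions like this one the Huffman code is exactly optimal, matching the entropy bound from the Source Coding Theorem mentioned above.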