Lecture
Entropy and KL Divergence
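The page carries no abstract, so the standard definitions of the two quantities named in the title are reproduced here for context; the lecture's own notation and choice of logarithm base may differ.

H(X) = -\sum_{x} p(x)\,\log_2 p(x)

D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x)\,\log_2 \frac{p(x)}{q(x)}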
Related lectures (31)
Mutual Information: Understanding Random Variables
Explores mutual information as a measure of statistical dependence and information gain between random variables (a toy computation is sketched after this list).
Generalization Error
Explores tail bounds, information bounds, and maximal leakage in the context of generalization error.
Entropy and the Second Law of Thermodynamics
Covers entropy, its definition, and its implications in thermodynamics.
Sunny Rainy Source: Markov Model
Explores a first-order Markov model using a sunny-rainy source example, demonstrating how past events influence future outcomes.
Gibbs Entropy and Information Theory
Explores Gibbs's entropy, information theory, and the information content of events in non-equiprobable scenarios.
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Data Compression: Entropy Definition
Explores data compression through the definition and types of entropy and through practical examples, illustrating its role in efficient information storage and transmission.
Heat Transfer: Equilibrium Approach and Thermodynamic Potentials
Reviews heat transfer mechanisms, entropy, and thermodynamic potentials in relation to equilibrium and Carnot cycles.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
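Several of the related lectures (mutual information, conditional entropy, the sunny-rainy Markov source) revolve around the same handful of quantities. The sketch below is a minimal, self-contained illustration, not material from any of the lectures: it computes entropy, conditional entropy, mutual information, and a KL-divergence consistency check for a made-up sunny/rainy joint distribution.

```python
import numpy as np

# Hypothetical joint distribution P(today, tomorrow) for a sunny/rainy source.
# The probabilities are illustrative only, not taken from the lectures.
p_joint = np.array([
    [0.5, 0.1],   # today sunny -> tomorrow sunny / rainy
    [0.1, 0.3],   # today rainy -> tomorrow sunny / rainy
])

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_today = p_joint.sum(axis=1)      # marginal P(today)
p_tomorrow = p_joint.sum(axis=0)   # marginal P(tomorrow)

h_joint = entropy(p_joint.ravel())   # H(X, Y)
h_today = entropy(p_today)           # H(X)
h_tomorrow = entropy(p_tomorrow)     # H(Y)

h_cond = h_joint - h_today           # chain rule: H(Y | X) = H(X, Y) - H(X)
mutual_info = h_tomorrow - h_cond    # I(X; Y) = H(Y) - H(Y | X)

# I(X; Y) also equals the KL divergence between the joint distribution
# and the product of its marginals, which gives a consistency check.
p_prod = np.outer(p_today, p_tomorrow)
kl = np.sum(p_joint * np.log2(p_joint / p_prod))

print(f"H(X,Y) = {h_joint:.3f} bits")
print(f"H(Y|X) = {h_cond:.3f} bits")
print(f"I(X;Y) = {mutual_info:.3f} bits  (KL check: {kl:.3f} bits)")
```

With these numbers the two estimates of I(X; Y) agree, as they must, since mutual information is exactly D_KL(P_XY || P_X P_Y).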