Information Measures: Entropy and Information Theory
Related lectures (28)
Quantifying Entropy in Neuroscience Data
Delves into quantifying entropy in neuroscience data, exploring how neuronal activity represents sensory information and how it can be encoded as sequences of binary digits.
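As a rough illustration of this idea (not taken from the lecture itself), the entropy of a binarized spike train can be estimated from empirical symbol frequencies; the spike train below is made-up data:

```python
import numpy as np

# Hypothetical binarized spike train: 1 = spike in a time bin, 0 = no spike.
spikes = np.array([0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0])

# Empirical probabilities of the two symbols.
p = np.bincount(spikes, minlength=2) / spikes.size

# Shannon entropy in bits; terms with zero probability are skipped.
H = -sum(pi * np.log2(pi) for pi in p if pi > 0)
print(f"H = {H:.3f} bits per bin")  # at most 1 bit for a binary sequence
```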
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.
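For a discrete random variable with known probabilities, both quantities reduce to simple sums; a minimal sketch, assuming a made-up distribution:

```python
import math

# Hypothetical discrete random variable: values and their probabilities.
values = [1, 2, 3, 4]
probs  = [0.5, 0.25, 0.125, 0.125]

expected_value = sum(x * p for x, p in zip(values, probs))  # E[X]
entropy = -sum(p * math.log2(p) for p in probs if p > 0)    # H(X) in bits

print(expected_value)  # 1.875
print(entropy)         # 1.75 bits
```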
Generalization Error
Discusses mutual information, the data processing inequality, and properties related to leakage in discrete systems.
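Mutual information can be computed directly from a joint distribution via I(X;Y) = Σ p(x,y) log2 [p(x,y) / (p(x)p(y))]; a sketch with an assumed 2x2 joint table, not data from the lecture:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px = pxy.sum(axis=1)  # marginal p(x)
py = pxy.sum(axis=0)  # marginal p(y)

# I(X;Y) = sum over x, y of p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
I = sum(pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))
        for i in range(2) for j in range(2) if pxy[i, j] > 0)
print(f"I(X;Y) = {I:.3f} bits")  # ~0.278 bits for this table
```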
Entropy and KL Divergence
Explores entropy, KL divergence, and the maximum entropy principle in probability models for data science.
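The KL divergence D(P||Q) = Σ p log2(p/q) measures how far a model distribution Q is from a reference P; a minimal sketch, with both distributions assumed for illustration:

```python
import math

# Hypothetical reference distribution P and model Q over the same support.
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

# D(P || Q) = sum_x p(x) * log2( p(x) / q(x) ); nonnegative, zero iff P == Q.
D = sum(p * math.log2(p / q) for p, q in zip(P, Q) if p > 0)
print(f"D(P||Q) = {D:.4f} bits")
```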
Quantifying Information: Probability, Entropy, and Constraints
Explores quantifying information based on probability, entropy, and constraints in communication systems.
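The core idea is that an event's information content is minus the log of its probability, so rarer events carry more information; a tiny illustration:

```python
import math

# Self-information of an event with probability p, in bits: I(p) = -log2(p).
def self_information(p: float) -> float:
    return -math.log2(p)

print(self_information(0.5))   # 1.0 bit  (a fair coin flip)
print(self_information(1 / 8)) # 3.0 bits (a rarer event is more informative)
```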
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples.
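For reference, the chain rule and the conditioning bound this title refers to can be stated in standard notation as:

```latex
H(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \dots, X_{i-1}),
\qquad H(Y \mid X) \le H(Y),
```

with equality in the bound if and only if X and Y are independent.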
Information Theory Basics
Introduces information theory basics, including entropy, independence, and the binary entropy function.
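The binary entropy function h(p) = -p log2 p - (1-p) log2(1-p) peaks at p = 1/2; a short sketch:

```python
import math

def binary_entropy(p: float) -> float:
    """h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0: a fair coin is hardest to predict
print(binary_entropy(0.11))  # ~0.5: a biased coin carries less uncertainty
```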
Probability Theory: Joint Marginals and Granger Causality
Covers joint marginals and Granger causality in probability theory, explaining their role in predicting outcomes.
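Granger causality asks whether past values of one time series improve prediction of another. A hedged sketch using statsmodels' grangercausalitytests (one common implementation, not necessarily the lecture's approach), on synthetic data where x drives y by construction:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)

# Synthetic series: y_t = 0.8 * x_{t-1} + noise, so x should Granger-cause y.
x_raw = rng.normal(size=501)
x = x_raw[1:]                                              # x_t
y = 0.8 * x_raw[:-1] + rng.normal(scale=0.5, size=500)     # y_t

# Column order is [effect, candidate cause]; tests lags 1..2 and prints
# F-test and chi-squared p-values for each lag.
data = np.column_stack([y, x])
grangercausalitytests(data, maxlag=2)
```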