Information Theory Basics
Related lectures (28)
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples.
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, and channels with errors, and previews related follow-up courses.
Mutual Information and Entropy
Explores the calculation of entropy and mutual information for pairs of random variables.
Entropy and Algorithms: Applications in Sorting and Weighing
Covers the application of entropy in algorithms, focusing on sorting and decision-making strategies.
Entropy and Information Theory
Explores entropy, uncertainty, coding theory, and data compression applications.
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
Mutual Information: Understanding Random Variables
Explores mutual information as a measure of the relationship between random variables, quantifying information gain and statistical dependence.
Information Theory: Source Coding & Channel Coding
Covers the fundamentals of information theory, focusing on source coding and channel coding.
Biological Randomness and Data Analysis
Explores randomness in biology, covering thermal fluctuations, random walks, and data analysis techniques.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information for quantifying statistical dependence between random variables; a short numerical sketch of these quantities follows below.
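
Several of the lectures above revolve around the same quantities: entropy H(X), conditional entropy H(Y|X), and mutual information I(X;Y). As a purely illustrative sketch, not drawn from any of the listed lectures and using a hypothetical joint distribution of two binary random variables, the Python snippet below computes these quantities from the standard definitions H(X) = -sum p log2 p, H(Y|X) = H(X,Y) - H(X), and I(X;Y) = H(X) + H(Y) - H(X,Y).

import numpy as np

# Hypothetical joint distribution of two binary random variables X and Y
# (values chosen only for illustration).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal distribution of X
p_y = p_xy.sum(axis=0)   # marginal distribution of Y

def entropy(p):
    # Shannon entropy in bits, ignoring zero-probability outcomes.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X = entropy(p_x)
H_Y = entropy(p_y)
H_XY = entropy(p_xy.ravel())

# Chain rule: H(X, Y) = H(X) + H(Y | X)
H_Y_given_X = H_XY - H_X

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y)
I_XY = H_X + H_Y - H_XY

print(f"H(X)   = {H_X:.3f} bits")
print(f"H(Y|X) = {H_Y_given_X:.3f} bits")
print(f"I(X;Y) = {I_XY:.3f} bits")

For this example distribution the two variables are dependent, so I(X;Y) comes out strictly positive (about 0.28 bits); for independent variables it would be zero.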