Mutual Information and Entropy
Related lectures (30)
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information in quantifying statistical dependence between random variables.
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.
Entropy and Information Theory
Explores entropy, uncertainty, coding theory, and data compression applications.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.
Entropy and Mutual Information
Explores entropy and mutual information as tools for quantifying information in data science through probability distributions.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Information Theory Basics
Introduces information theory basics, including entropy, independence, and binary entropy function.
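The two quantities at the heart of these lectures, Shannon entropy and mutual information, can be sketched for discrete distributions as follows (a minimal illustration, not tied to any particular lecture's notation):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2D list."""
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0

# Independent variables share no information: I(X;Y) = 0.
joint = [[0.25, 0.25], [0.25, 0.25]]
print(mutual_information(joint))  # 0.0
```

The independence example illustrates the basic identity: when the joint distribution factors into its marginals, H(X,Y) = H(X) + H(Y), so the mutual information vanishes.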