Lecture
Mutual Information: Understanding Random Variables
Related lectures (31)
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Information Measures
Covers information measures like entropy and Kullback-Leibler divergence.
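To make these two measures concrete, here is a minimal sketch of Shannon entropy and Kullback-Leibler divergence for discrete distributions; the function names and example distributions are illustrative, not taken from the lecture itself.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete pmf given as a list."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximal-entropy pmf over 4 outcomes
skewed  = [0.7, 0.1, 0.1, 0.1]       # a more predictable pmf

print(entropy(uniform))                 # 2.0 bits, the maximum for 4 outcomes
print(entropy(skewed))                  # strictly less than 2.0 bits
print(kl_divergence(skewed, uniform))   # positive: the pmfs differ
```

Note that D(p || q) is asymmetric and vanishes only when the two distributions coincide.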
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information in quantifying statistical dependence between random variables.
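As a small illustration of mutual information as a dependence measure, the sketch below computes I(X;Y) from a joint pmf of two discrete variables; the function name and the toy distributions are assumptions for the example, not material from the lecture.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a 2-D list p[x][y]."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

independent = [[0.25, 0.25], [0.25, 0.25]]  # product of marginals: I(X;Y) = 0
dependent   = [[0.5, 0.0], [0.0, 0.5]]      # X determines Y: I(X;Y) = 1 bit

print(mutual_information(independent))  # 0.0
print(mutual_information(dependent))    # 1.0
```

Unlike covariance and correlation, which capture only linear association, I(X;Y) is zero exactly when X and Y are independent.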
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Entropy and Mutual Information
Explores entropy and mutual information, quantifying information in data science through probability distributions.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Quantifying Entropy in Neuroscience Data
Delves into quantifying entropy in neuroscience data, exploring how neuron activity represents sensory information and the implications of binary digit sequences.
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples.
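For reference, the chain rule mentioned here takes the following standard form (an illustrative statement of the identity, not an excerpt from the lecture):

```latex
H(X, Y) = H(X) + H(Y \mid X),
\qquad
H(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \dots, X_{i-1}).
```

Combined with the bound $H(Y \mid X) \le H(Y)$, it yields the subadditivity of joint entropy.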
Mutual Information: Continued
Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.