Lecture: Mutual Information: Continued
Related lectures (31)
Quantifying Statistical Dependence
Delves into quantifying statistical dependence through covariance, correlation, and mutual information.
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information in quantifying statistical dependence between random variables.
Variational Formulation: Information Measures
Explores variational formulation for measuring information content and divergence between probability distributions.
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Dependence and Correlation
Explores dependence, correlation, and conditional expectations in probability and statistics, highlighting their significance and limitations.
Entropy in Neuroscience and Ecology
Delves into entropy in neuroscience data and ecology, exploring the representation of sensory information and the diversity of biological populations.
Information Theory: Review and Mutual Information
Reviews information measures like entropy and introduces mutual information as a measure of information between random variables.
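The lectures above all build on entropy and mutual information as measures of statistical dependence. As an illustration (not drawn from any specific lecture), a minimal sketch of these quantities for a discrete joint distribution, using the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D array."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X
    py = joint.sum(axis=0)  # marginal of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Independent binary variables: the joint factorizes, so I(X;Y) = 0
indep = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(indep))  # -> 0.0

# Perfectly dependent binary variables: I(X;Y) = H(X) = 1 bit
dep = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(dep))  # -> 1.0
```

Unlike covariance or correlation, which capture only linear dependence, mutual information is zero exactly when the variables are independent, which is why several of the lectures listed here contrast the two approaches.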