Lecture
Mutual Information in Biological Data
Related lectures (31)
Mutual Information: Continued
Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.
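The lectures above treat mutual information as a measure of statistical dependence between variables. As an illustration (not taken from any of these lectures), here is a minimal NumPy sketch that computes I(X;Y) directly from a discrete joint probability table, using the definition I(X;Y) = Σ p(x,y) log₂[p(x,y) / (p(x)p(y))]:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits from a joint probability table P(X,Y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    nz = p_xy > 0                           # skip zero cells to avoid log(0)
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Independent binary variables: joint factorizes, so I(X;Y) = 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0

# Perfectly dependent binary variables: knowing X gives one full bit about Y
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

The two test tables bracket the extremes: independence gives zero mutual information, and a deterministic one-to-one relationship between two binary variables gives exactly one bit.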
Quantifying Statistical Dependence
Delves into quantifying statistical dependence through covariance, correlation, and mutual information.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information in quantifying statistical dependence between random variables.
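A standard motivating contrast in these covariance-versus-mutual-information lectures is that correlation only captures linear dependence. The following sketch (an illustrative example, not lecture material) shows a fully deterministic but nonlinear relationship Y = X² whose covariance and correlation are both approximately zero:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # symmetric around zero
y = x**2                       # deterministic, but purely nonlinear dependence

cov = np.cov(x, y)[0, 1]
corr = np.corrcoef(x, y)[0, 1]
print(f"covariance ~ {cov:.3f}, correlation ~ {corr:.3f}")  # both near 0
```

Because E[X³] = 0 for a symmetric distribution, covariance vanishes even though Y is a function of X; mutual information, by contrast, would be large here, which is why it is the preferred dependence measure in these lectures.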
Variational Formulation: Information Measures
Explores variational formulation for measuring information content and divergence between probability distributions.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
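The measures listed in this entry, entropy and Kullback-Leibler divergence, can be computed for discrete distributions in a few lines. A minimal sketch (illustrative, not from the lecture), working in bits:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.
    Assumes q > 0 wherever p > 0 (absolute continuity)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / q[nz])))

print(entropy([0.5, 0.5]))                       # 1.0 bit for a fair coin
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))     # 0.0: divergence from itself
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))     # positive for distinct p, q
```

Note the connection to the surrounding entries: mutual information is exactly the KL divergence between the joint distribution and the product of its marginals, I(X;Y) = D(P(X,Y) ‖ P(X)P(Y)).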
Information Theory: Review and Mutual Information
Reviews information measures such as entropy and introduces mutual information as a measure of the information shared between random variables.
Mutual Information: Understanding Random Variables
Explores mutual information as a way to quantify relationships between random variables, measuring both information gain and statistical dependence.
Dependence and Correlation
Explores dependence, correlation, and conditional expectations in probability and statistics, highlighting their significance and limitations.
Elements of Statistics: Probability and Random Variables
Introduces key concepts in probability and random variables, covering statistics, distributions, and covariance.
Central Limit Theorem: Properties and Applications
Explores the Central Limit Theorem, covariance, correlation, joint random variables, quantiles, and the law of large numbers.
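The Central Limit Theorem named in this entry is easy to demonstrate empirically. In this sketch (illustrative, not from the lecture), sample means of n = 50 uniform draws cluster around the population mean 0.5 with standard deviation close to the CLT prediction √(1/(12·50)) ≈ 0.0408:

```python
import numpy as np

rng = np.random.default_rng(1)

# 10_000 independent sample means, each over n = 50 Uniform(0, 1) draws.
# CLT: the means are approximately Normal(0.5, 1 / (12 * 50)).
means = rng.uniform(size=(10_000, 50)).mean(axis=1)

print(f"mean of means ~ {means.mean():.4f}")   # close to 0.5
print(f"std of means  ~ {means.std():.4f}")    # close to sqrt(1/600) ~ 0.0408
```

Increasing n tightens the distribution of the sample mean at the rate 1/√n, which is the quantitative content of the law of large numbers mentioned alongside the CLT in this lecture.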