Quantifying Randomness and Information in Biological Data
Related lectures (31)
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.
Random Variables and Expected Value
Introduces random variables, probability distributions, and expected values through practical examples.
Quantifying Statistical Dependence
Delves into quantifying statistical dependence through covariance, correlation, and mutual information.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Graph Coloring: Random vs Symmetrical
Compares random and symmetrical graph coloring in terms of cluster colorability and equilibrium.
Maximum Entropy Modeling: Applications & Inference
Explores maximum entropy modeling applications in neuroscience and protein sequence data.
Probability Distributions in Environmental Studies
Explores probability distributions for random variables in air pollution and climate change studies, covering descriptive and inferential statistics.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Propagation of Uncertainty: Estimation and Distribution
Discusses estimation and propagation of uncertainty in random variables and the importance of managing uncertainty in statistical analysis.
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
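Several of the lectures listed above center on entropy and mutual information as measures of statistical dependence, for example between columns of a protein sequence alignment. As a minimal illustrative sketch (not taken from any of these lectures; the alignment columns, function names, and toy data below are invented for this example), mutual information between two alignment columns can be estimated from empirical symbol frequencies via I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np
from collections import Counter

def entropy(column):
    """Shannon entropy (in bits) of one alignment column, from empirical frequencies."""
    counts = np.array(list(Counter(column).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(col_i, col_j):
    """Mutual information (in bits) between two columns: I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    joint = [a + b for a, b in zip(col_i, col_j)]  # joint symbol = pair of residues
    return entropy(col_i) + entropy(col_j) - entropy(joint)

# Toy columns of a hypothetical alignment (hypothetical data, for illustration only).
col_1 = list("AAGGAAGG")
col_2 = list("TTCCTTCC")  # perfectly coupled with col_1 (A<->T, G<->C)
col_3 = list("ATATATAT")  # statistically independent of col_1 in this sample

print(mutual_information(col_1, col_2))  # 1.0 bit: the columns co-vary
print(mutual_information(col_1, col_3))  # 0.0 bits: no dependence
```

In practice, such plug-in estimates are biased upward for small sample sizes, which is one reason maximum entropy models and careful statistical treatment appear alongside mutual information in these lectures.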