Mutual Information in Biological Data
Related lectures (31), page 3 of 4:
Information Measures
Covers information measures like entropy and Kullback-Leibler divergence.
Elements of Statistics
Introduces key statistical concepts like probability, random variables, and correlation, with examples and explanations.
Elements of Statistics: Probability, Distributions, and Estimation
Covers probability theory, distributions, and estimation in statistics, emphasizing accuracy, precision, and resolution of measurements.
Variance and Covariance: Properties and Examples
Explores variance, covariance, and practical applications in statistics and probability.
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Quantifying Entropy in Neuroscience Data
Delves into quantifying entropy in neuroscience data, exploring how neuron activity represents sensory information and the implications of binary digit sequences.
Probability Models: Fundamentals
Introduces the basics of probability models, covering random variables, distributions, and statistical estimation.
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.
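The lectures listed above revolve around a small set of quantities: Shannon entropy, Kullback-Leibler divergence, conditional entropy, and the mutual information named in this lecture's title. As a minimal illustrative sketch (not taken from any of these lectures), the Python snippet below computes these quantities for a hypothetical 2x2 joint distribution, such as a binary stimulus and a binary neural response; the probabilities are invented purely for illustration.

```python
import numpy as np

# Hypothetical joint distribution P(X, Y) over two binary variables,
# e.g. a stimulus X and a neural response Y; values are illustrative only.
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

p_x = p_xy.sum(axis=1)  # marginal P(X)
p_y = p_xy.sum(axis=0)  # marginal P(Y)

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y)
h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
mi = h_x + h_y - h_xy

# Equivalently, I(X;Y) is the Kullback-Leibler divergence
# D( P(X,Y) || P(X)P(Y) ), i.e. how far X and Y are from independence.
kl = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

print(f"H(X) = {h_x:.3f} bits, H(Y) = {h_y:.3f} bits, I(X;Y) = {mi:.3f} bits")
assert np.isclose(mi, kl)
```

The same identities underlie the entropy-based analyses of neural and biological data mentioned in the lecture blurbs; in practice the joint distribution is estimated from data rather than given, which introduces bias and estimation issues not addressed in this sketch.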