Quantifying Entropy in Neuroscience Data
Related lectures (30)
Mutual Information: Understanding Random Variables
Explores mutual information, which quantifies the relationship between random variables and measures information gain and statistical dependence.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Energy Minimization in Biological Systems: Equilibrium Models
Covers energy minimization models in biological systems, focusing on equilibrium and the roles of entropy and hydrophobicity.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Random Walks and Moran Model in Population Genetics
Explores random walks, the Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Entropy and Mutual Information
Explores how entropy and mutual information quantify information in data science through probability distributions.
Probability Distribution and Entropy
Explains probability distributions, entropy, and Gibbs free entropy, along with the Weiss model.
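
The common thread across these lectures is quantifying information via probability distributions. As a minimal illustrative sketch (not drawn from any lecture above, and assuming base-2 logarithms so that results are in bits), the snippet below computes Shannon entropy H(X) and mutual information I(X; Y) for a discrete joint distribution:

    import numpy as np

    def entropy(p):
        """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # by convention, 0 * log(0) contributes 0
        return -np.sum(p * np.log2(p))

    def mutual_information(joint):
        """I(X; Y) = H(X) + H(Y) - H(X, Y) for a joint probability table p(x, y)."""
        joint = np.asarray(joint, dtype=float)
        px = joint.sum(axis=1)  # marginal distribution of X
        py = joint.sum(axis=0)  # marginal distribution of Y
        return entropy(px) + entropy(py) - entropy(joint.ravel())

    # Example joint distribution p(x, y): two positively correlated binary variables.
    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
    print(f"H(X)    = {entropy(joint.sum(axis=1)):.3f} bits")  # 1.000
    print(f"I(X; Y) = {mutual_information(joint):.3f} bits")   # ~0.278

For this example table, H(X) = 1 bit (the marginal of X is uniform) and I(X; Y) ≈ 0.278 bits, reflecting the positive dependence between the two variables.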