Entropy (information theory)
Concept in: Applied sciences › Information engineering › Information theory
Related concept: Channel capacity
Related lectures (30)
Quantifying Entropy in Neuroscience Data
Delves into quantifying entropy in neuroscience data, exploring how neuron activity represents sensory information and the implications of binary digit sequences.
Conditional Entropy and Information Theory Concepts
Discusses conditional entropy and its role in information theory and data compression.
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Entropy and the Second Law of Thermodynamics
Covers entropy, its definition, and its implications in thermodynamics.
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples.
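The chain rule mentioned in this lecture states that H(X, Y) = H(X) + H(Y | X). A minimal sketch of a numerical check, using an arbitrary illustrative joint distribution (not one from the lecture):

```python
import math

joint = {  # illustrative joint distribution p(x, y); an assumption, not lecture data
    ("a", 0): 0.25,
    ("a", 1): 0.25,
    ("b", 0): 0.40,
    ("b", 1): 0.10,
}

def entropy(dist):
    """Shannon entropy in bits of a probability mapping."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(x)
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Conditional entropy H(Y | X) = sum_x p(x) * H(Y | X = x)
h_y_given_x = 0.0
for x, p_x in px.items():
    cond = {y: p / p_x for (xx, y), p in joint.items() if xx == x}
    h_y_given_x += p_x * entropy(cond)

h_xy = entropy(joint)
h_x = entropy(px)
print(round(h_xy, 6), round(h_x + h_y_given_x, 6))  # the two values agree
```

For this distribution both sides come out to about 1.861 bits, confirming the identity on a concrete example.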
Information Theory Basics
Introduces information theory basics, including entropy, independence, and binary entropy function.
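The binary entropy function introduced here is the entropy in bits of a Bernoulli(p) source. A minimal sketch (the sample values of p are illustrative, not from the lecture):

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) = -p log2 p - (1 - p) log2 (1 - p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"H_b({p}) = {binary_entropy(p):.4f} bits")
```

The function is symmetric about p = 0.5, where it reaches its maximum of exactly 1 bit, and drops to 0 at the deterministic endpoints.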
Chapter 3: Entropy of Subsystems
Covers entropy in subsystems, reversible processes, equilibrium, and simple systems.
Quantifying Randomness and Information in Biological Data
Explores entropy, randomness, and information quantification in biological data analysis, including neuroscience and protein structure prediction.
Information Theory: Quantifying Messages and Source Entropy
Covers quantifying information in messages, source entropy, common information, and communication channel capacity.
Heat Transfer: Equilibrium Approach and Thermodynamic Potentials
Reviews heat transfer mechanisms, entropy, and thermodynamic potentials in relation to equilibrium and Carnot cycles.