Lecture: Information Measures
Related lectures (31)
Entropy in Neuroscience and Ecology
Delves into entropy in neuroscience data and ecology, exploring the representation of sensory information and the diversity of biological populations.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
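As a rough illustration of the Huffman construction this lecture covers — a minimal sketch using only the standard library, with function and variable names of my own choosing, not taken from the course:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary Huffman code from a symbol -> frequency mapping."""
    # Each heap entry: (weight, tiebreak index, {symbol: codeword-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:
        # Degenerate case: a lone symbol still needs a 1-bit codeword.
        return {sym: "0" for sym in heap[0][2]}
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing "0"/"1" to their codewords.
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code(Counter("abracadabra"))
# The most frequent symbol "a" gets the shortest codeword (1 bit here);
# rarer symbols get longer ones, minimizing the expected codeword length.
```

The resulting codeword lengths satisfy the Kraft inequality with equality, which is how optimality relates back to the entropy bound discussed in the lecture.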
Mutual Information and Entropy
Explores how to calculate mutual information and entropy for pairs of random variables.
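A small sketch of the kind of calculation this lecture treats — computing I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint distribution given as a 2-D table (the helper names are mine, not the course's):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits for a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint pmf as a list of rows."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    pxy = [x for row in joint for x in row]     # flattened joint pmf
    return entropy(px) + entropy(py) - entropy(pxy)

# Two perfectly correlated fair bits share exactly 1 bit of information.
mutual_information([[0.5, 0.0], [0.0, 0.5]])  # → 1.0
```

For independent variables the joint factorizes, H(X,Y) = H(X) + H(Y), and the same function returns 0.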
Generalization Error
Discusses mutual information, data processing inequality, and properties related to leakage in discrete systems.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Probability Distribution and Entropy
Explains probability distribution, entropy, and Gibbs free entropy, along with the Weiss model.
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.
Entropy and KL Divergence
Explores entropy, KL divergence, and maximum entropy principle in probability models for data science.
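A minimal sketch of the KL divergence this lecture covers, for discrete distributions over the same support (the function name is my own):

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum_i p_i * log2(p_i / q_i) in bits.

    Assumes q_i > 0 wherever p_i > 0; terms with p_i = 0 contribute 0.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Fair coin vs. a coin biased 1:3 — about 0.2075 bits of divergence.
kl_divergence([0.5, 0.5], [0.25, 0.75])  # ≈ 0.2075
```

Note that D(p||q) is non-negative, equals zero only when p = q, and is not symmetric in its arguments — properties the lecture's maximum entropy discussion relies on.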
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Random Variables and Information Theory Concepts
Introduces random variables and their significance in information theory, covering concepts like expected value and Shannon's entropy.