Channel capacity
Related lectures (31)
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
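The codeword-length optimization mentioned here is the classic greedy Huffman construction: repeatedly merge the two least frequent symbols into a subtree. A minimal sketch in Python (the function name and heap-based layout are illustrative, not from the lecture):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the symbols in `text`.
    Returns a dict mapping each symbol to its bitstring."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, tree); a tree is a symbol or a pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:
        # Degenerate single-symbol alphabet: one-bit code.
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)  # least frequent
        f2, _, t2 = heapq.heappop(heap)  # second least frequent
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    assign(heap[0][2], "")
    return codes
```

Rarer symbols end up deeper in the tree, so they get longer codewords; the resulting code is prefix-free by construction.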
Information Measures
Covers variational representation and information measures such as entropy and mutual information.
Mutual Information: Continued
Explores mutual information for quantifying statistical dependence between variables and inferring probability distributions from data.
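Quantifying statistical dependence with mutual information reduces, for finite alphabets, to summing p(x, y) log [p(x, y) / (p(x) p(y))] over the joint distribution. A minimal sketch under that assumption (the dict-based pmf representation is my own, not the lecture's):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): p}."""
    # Marginalize to get p(x) and p(y).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # Sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ) over the support.
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)
```

Independent variables give I(X;Y) = 0; two perfectly correlated fair bits give 1 bit.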
Information Measures
Covers information measures like entropy and Kullback-Leibler divergence.
Conditional Entropy: Review and Definitions
Covers conditional entropy, a worked weather example, the entropy of a function of a random variable, and the chain rule.
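The chain rule referred to here is H(X, Y) = H(X) + H(Y|X), which also gives a convenient way to compute conditional entropy. A minimal sketch (function names and the pmf-as-dict convention are mine):

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: p}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def conditional_entropy(joint):
    """H(Y|X) from a joint pmf {(x, y): p},
    computed via the chain rule: H(Y|X) = H(X, Y) - H(X)."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return entropy(joint) - entropy(px)
```

When Y is a deterministic function of X (e.g. Y = X), H(Y|X) = 0; when Y is independent of X, H(Y|X) = H(Y).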
Information Theory: Review and Mutual Information
Reviews information measures such as entropy and introduces mutual information as a measure of the information shared between random variables.
Generalization Error
Discusses mutual information, data processing inequality, and properties related to leakage in discrete systems.
Random Walks and Moran Model in Population Genetics
Explores random walks, Moran model, bacterial chemotaxis, entropy, information theory, and coevolving sites in proteins.
Achievable Rate & Capacity
Explores achievable rate, channel capacity, spectral efficiency, and fading channels in wireless communication systems.
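A standard concrete instance of channel capacity is the binary symmetric channel with crossover probability p, whose capacity is C = 1 − H₂(p) bits per channel use, with H₂ the binary entropy function. A minimal sketch (function names are mine; the formula is the textbook BSC result):

```python
import math

def binary_entropy(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H2(p) bits per channel use."""
    return 1.0 - binary_entropy(p)
```

A noiseless channel (p = 0) carries 1 bit per use; at p = 0.5 the output is independent of the input and the capacity drops to 0.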
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.