Information Theory and Coding
Related lectures (30)
Achievable Rate & Capacity
Explores achievable rate, channel capacity, spectral efficiency, and fading channels in wireless communication systems.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Compression: Prefix-Free Codes
Explains prefix-free codes for efficient data compression and the significance of uniquely decodable codes.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy.
Information Measures: Estimation & Detection
Covers information measures, entropy, mutual information, and data processing inequality in signal representation.
Information Theory: Source Coding, Cryptography, Channel Coding
Covers source coding, cryptography, and channel coding in communication systems, exploring entropy, codes, error channels, and future related courses.
Generalization Error
Discusses mutual information, data processing inequality, and properties related to leakage in discrete systems.
Mutual Information: Understanding Random Variables
Explores mutual information as a way to quantify relationships between random variables, measuring information gain and statistical dependence.
Mutual Information in Biological Data
Explores mutual information in biological data, emphasizing its role in quantifying statistical dependence and analyzing protein sequences.
Entropy and Mutual Information
Explores how entropy and mutual information quantify information in data science through probability distributions.