Entropy and Mutual Information
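For context, here is a minimal sketch of the two quantities in the title, computed for discrete distributions with NumPy. The joint distribution below is illustrative only and is not taken from the lecture materials; mutual information is obtained via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint probability table."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)  # marginal of X
    py = pxy.sum(axis=0)  # marginal of Y
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Illustrative joint distribution of two correlated binary variables
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(entropy(pxy.sum(axis=1)))  # H(X) = 1 bit
print(mutual_information(pxy))   # I(X;Y) ≈ 0.278 bits
```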
Related lectures (31)
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Entropy and Data Compression: Huffman Coding Techniques
Discusses entropy, data compression, and Huffman coding techniques, emphasizing their applications in optimizing codeword lengths and understanding conditional entropy (a minimal Huffman coding sketch follows this list).
Information Measures: Part 1
Covers information measures, tail bounds, sub-Gaussian and sub-Poisson distributions, an independence proof, and conditional expectation.
Information Measures: Estimation & Detection
Covers information measures, entropy, mutual information, and data processing inequality in signal representation.
Source Coding Theorems: Entropy and Source Models
Covers source coding theorems, entropy, and various source models in information theory.
Stationary Sources: Properties and Entropy
Explores stationary sources, entropy, regularity, and coding efficiency, including a challenging problem with billiard balls.
Information Theory: Basics and Applications
Covers the basics of information theory and its applications in various fields.
Information Measures
Covers variational representation and information measures such as entropy and mutual information.
Information Measures: Part 2
Covers information measures like entropy, joint entropy, and mutual information in information theory and data processing.
Quantifying Statistical Dependence
Delves into quantifying statistical dependence through covariance, correlation, and mutual information.
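Several entries above mention Huffman coding and codeword lengths. The following is a minimal, self-contained sketch, an illustrative implementation rather than the one used in any listed lecture, that builds a prefix-free code by repeatedly merging the two least-frequent subtrees with Python's heapq:

```python
import heapq
from collections import Counter

def huffman_code(symbol_freqs):
    """Build a prefix-free code; frequent symbols get shorter codewords."""
    # Heap entries: (frequency, unique tiebreaker, {symbol: codeword})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(symbol_freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}   # prefix left subtree with 0
        merged.update({s: "1" + w for s, w in c2.items()})  # right subtree with 1
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

freqs = Counter("abracadabra")  # illustrative input
code = huffman_code(freqs)
print(code)  # e.g. {'a': '0', 'r': '110', ...}
avg = sum(freqs[s] * len(w) for s, w in code.items()) / sum(freqs.values())
print(avg)   # average codeword length in bits, lower-bounded by the entropy
```

The average codeword length printed at the end can be compared against the source entropy, which is the connection to optimal codeword lengths that these lectures develop.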