Channel capacity
Category: Applied sciences › Information engineering › Information theory
Related lectures (31)
Variational Formulation: Information Measures
Explores the variational formulation for measuring information content and the divergence between probability distributions.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Information Measures: Part 2
Covers information measures like entropy, joint entropy, and mutual information in information theory and data processing.
Mutual Information and Entropy
Explores the calculation of entropy and of mutual information between random variables.
Mutual Information: Understanding Random Variables
Explores mutual information, quantifying relationships between random variables and measuring information gain and statistical dependence.
Entropy Bounds: Conditional Entropy Theorems
Explores entropy bounds, conditional entropy theorems, and the chain rule for entropies, illustrating their application through examples.
Information Measures: Estimation & Detection
Covers information measures, entropy, mutual information, and data processing inequality in signal representation.
Conditional Entropy and Data Compression Techniques
Discusses conditional entropy and its role in data compression techniques.
Probability Theory: Joint Marginals and Granger Causality
Covers joint marginals and Granger causality in probability theory and their implications for predicting outcomes.
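Several of the lectures above revolve around the same handful of quantities: entropy, joint and conditional entropy, the chain rule, and mutual information. As a minimal sketch (the toy sample and function names below are illustrative, not drawn from any of the lectures), these identities can be checked numerically from an empirical joint distribution:

```python
import math
from collections import Counter

def entropy(pairs, index):
    """Shannon entropy H (in bits) of one coordinate of a joint sample."""
    counts = Counter(p[index] for p in pairs)
    n = len(pairs)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def joint_entropy(pairs):
    """Shannon joint entropy H(X, Y) (in bits) of a paired sample."""
    counts = Counter(pairs)
    n = len(pairs)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Empirical joint sample of (X, Y); values chosen only for illustration.
sample = [(0, 0), (0, 0), (1, 1), (1, 0)]

h_x = entropy(sample, 0)       # H(X)
h_y = entropy(sample, 1)       # H(Y)
h_xy = joint_entropy(sample)   # H(X, Y)

# Chain rule: H(X, Y) = H(X) + H(Y | X), so H(Y | X) = H(X, Y) - H(X).
h_y_given_x = h_xy - h_x

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y), always >= 0.
mi = h_x + h_y - h_xy
```

Conditioning never increases entropy, so `h_y_given_x` is at most `h_y`, and the gap between them is exactly the mutual information `mi`, which quantifies the statistical dependence discussed in several of the blurbs above.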