Maximal Correlation: Information Measures
Related lectures (32)
Page 3 of 4
Information Theory and Coding
Covers source coding, Kraft's inequality, mutual information, the Huffman procedure, and properties of typical sequences.
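Kraft's inequality, mentioned in this lecture summary, states that binary codeword lengths l_1, …, l_n admit a prefix-free code if and only if the sum of 2^(-l_i) is at most 1. A minimal sketch (function name `kraft_sum` is my own, not from the lecture):

```python
def kraft_sum(lengths):
    """Kraft sum for binary codeword lengths.

    A prefix-free binary code with these lengths exists
    iff this sum is <= 1.
    """
    return sum(2 ** -l for l in lengths)


# Lengths 1, 2, 3, 3 (e.g. codewords 0, 10, 110, 111)
# exactly saturate Kraft's inequality.
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# Three codewords of lengths 1, 1, 2 violate it:
# no prefix-free binary code can have these lengths.
print(kraft_sum([1, 1, 2]))  # 1.25
```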
Interpretation of Entropy
Explores the concept of entropy expressed in bits and its relation to probability distributions, focusing on information gain and loss in various scenarios.
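Entropy expressed in bits, as in this lecture's summary, is computed with base-2 logarithms: a fair coin carries exactly 1 bit of uncertainty, while a biased coin carries strictly less. A minimal illustration (function name `entropy_bits` is my own):

```python
import math


def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution.

    p is a list of probabilities summing to 1; terms with
    zero probability contribute nothing (0 * log 0 := 0).
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)


print(entropy_bits([0.5, 0.5]))  # 1.0  (fair coin)
print(entropy_bits([0.9, 0.1]))  # about 0.469 (biased coin: less uncertainty)
```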
Protein Residue Coevolution Analysis
Delves into analyzing residue coevolution in protein families to capture native contacts and predict spatial proximity and protein interactions.
Quantum Information: Density Matrices
Explores density matrices, quantum states representation, and entropy in quantum information.
Probability Theory: Joint Marginals and Granger Causality
Covers joint marginals and Granger causality in probability theory, explaining their implications in predicting outcomes.
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Information Theory: Quantifying Messages and Source Entropy
Covers quantifying information in messages, source entropy, common information, and communication channel capacity.
Quantifying Statistical Dependence: Covariance and Correlation
Explores covariance, correlation, and mutual information in quantifying statistical dependence between random variables.
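The three dependence measures this lecture compares can each be estimated from paired samples. A small stdlib-only sketch (function names are my own, not from the course; the mutual-information estimate is the simple plug-in estimator over empirical frequencies):

```python
import math
from collections import Counter


def covariance(xs, ys):
    """Empirical covariance of paired samples."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)


def correlation(xs, ys):
    """Pearson correlation: covariance normalized by both standard deviations."""
    return covariance(xs, ys) / math.sqrt(covariance(xs, xs) * covariance(ys, ys))


def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(
        (c / n) * math.log2(c * n / (px[x] * py[y]))
        for (x, y), c in pxy.items()
    )


# Perfectly dependent binary samples: correlation 1, and
# mutual information equals the 1-bit entropy of X.
xs = [0, 0, 1, 1]
print(correlation(xs, xs))         # 1.0
print(mutual_information(xs, xs))  # 1.0
```

Correlation only captures linear dependence, whereas mutual information is zero if and only if the variables are independent — the contrast that motivates information-theoretic measures like maximal correlation.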
Introduction: Course Structure and Fundamentals of Computing
Explores the role of computing in society and the basics of computing: algorithms, communication systems, and computer security.
Entropy: Examples and Properties
Explores examples of guessing letters, origins of entropy, and properties in information theory.