Information theory
Related lectures (31)
Quantifying Information: Probability, Entropy, and Constraints
Explores quantifying information based on probability, entropy, and constraints in communication systems.
Entropy and Mutual Information
Explores how entropy and mutual information quantify information in data science through probability distributions.
Information Measures
Covers information measures like entropy, Kullback-Leibler divergence, and data processing inequality, along with probability kernels and mutual information.
Information Coding: Size Impact
Explores information coding, emphasizing the impact of size on processing and transmission.
Information Theory and Coding: Source Coding
Covers source coding, encoder design, and error probability analysis in information theory and coding.
Huffman Coding
Explores Huffman coding by comparing it to organizing a kitchen for efficiency (a minimal coding sketch follows after this list).
Information Theory: Sampling, Quantization, and Communication Systems
Explores nonuniform sampling, quantization, noise challenges, and communication theories.
Achievable Rate & Capacity
Explores achievable rate, channel capacity, spectral efficiency, and fading channels in wireless communication systems (a worked capacity example follows after this list).
Information Theory: Basics and Applications
Covers the basics of information theory and its applications in various fields.
Information Measures: Part 2
Covers information measures like entropy, joint entropy, and mutual information in information theory and data processing.
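The entropy and mutual information entries above can be illustrated with a short numerical example. The sketch below is not material from any of the lectures; it assumes NumPy and a made-up 2x2 joint probability table, and computes Shannon entropy and I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
# A minimal sketch of entropy and mutual information for a discrete joint
# distribution; the 2x2 joint probability table is an invented example.
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    p_x = p_xy.sum(axis=1)   # marginal of X (rows)
    p_y = p_xy.sum(axis=0)   # marginal of Y (columns)
    return entropy(p_x) + entropy(p_y) - entropy(p_xy)

# Example joint distribution of two correlated binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print("H(X)   =", entropy(p_xy.sum(axis=1)))   # 1.0 bit
print("H(X,Y) =", entropy(p_xy))               # ~1.72 bits
print("I(X;Y) =", mutual_information(p_xy))    # ~0.28 bits
```

For this example table the two variables are correlated, so the mutual information comes out to roughly 0.28 bits rather than zero.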
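For the Huffman coding lecture above, here is a minimal sketch of the algorithm built on Python's heapq; the input string and the helper huffman_codes are illustrative assumptions, not the lecture's own code.

```python
# A minimal sketch of Huffman coding; repeatedly merge the two least
# frequent subtrees and extend their codewords by one bit.
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a prefix code from a {symbol: frequency} mapping."""
    # Each heap entry: (total frequency, tie-breaker, {symbol: code so far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, codes1 = heapq.heappop(heap)
        f2, _, codes2 = heapq.heappop(heap)
        # Prefix one subtree's codes with '0' and the other's with '1'.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(Counter(text))
encoded = "".join(codes[ch] for ch in text)
print(codes)
print(len(encoded), "bits vs", 8 * len(text), "bits uncompressed")
```

Symbols with higher frequency receive shorter codewords, so the average codeword length stays within one bit of the source entropy.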
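The achievable rate and capacity entry refers to channel capacity; as a worked example (the bandwidth and SNR figures here are invented for illustration), the Shannon–Hartley formula gives:

```latex
C = B \log_2\!\left(1 + \tfrac{S}{N}\right)
  = 10^{6}\,\text{Hz} \cdot \log_2(1 + 15)
  = 4 \times 10^{6}\ \text{bit/s}
```

Doubling the SNR term to 31 raises the capacity only to 5 Mbit/s, illustrating the logarithmic growth of capacity with signal-to-noise ratio.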