Information Theory and Coding: Source Coding

Related lectures
Random Coding: Achievability and Proof Variants
Explores random coding achievability and proof variants in information theory, emphasizing achievable rates and architectural principles.
Information Theory: Source Coding
Covers source coding, typical sequences, stationarity, and efficient encoding in information theory.
Quantum Information
Explores the CHSH operator, self-testing, eigenstates, and quantifying randomness in quantum systems.
Information Measures: Entropy and Information Theory
Explains how entropy measures uncertainty in a system based on possible outcomes.
Quantum Source Coding
Covers entropic notions in quantum sources, Shannon entropy, von Neumann entropy, and source coding.
Information Theory: Entropy and Capacity
Covers concepts of entropy, Gaussian distributions, and channel capacity with constraints.
Information Theory: Prefix-Free Codes
Covers prefix-free codes, the Kraft inequality, Huffman coding, and entropy (see the first sketch after this list).
Information Theory: Entropy and Information Processing
Explores entropy in information theory and its role in data processing and probability distributions.
Polar Codes: Wrapping and Decoding
Covers the wrapping and decoding process of polar codes, exploring the trade-off between quality and efficiency in lossy compression.
Mutual Information and Entropy
Explores the computation of mutual information and entropy for pairs of random variables (see the second sketch after this list).
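
As a quick illustration of the source-coding quantities these lectures revolve around (Shannon entropy, Huffman coding, the Kraft inequality), here is a minimal, self-contained Python sketch. The source distribution p is a hypothetical example, chosen so the Huffman code meets the entropy bound exactly; it is not taken from any of the lectures.

```python
import heapq
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_code_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap entries: (probability, unique tiebreaker, symbol indices in subtree).
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every codeword beneath them.
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

p = [0.5, 0.25, 0.125, 0.125]  # hypothetical source distribution
lengths = huffman_code_lengths(p)
avg_len = sum(pi * li for pi, li in zip(p, lengths))
kraft = sum(2 ** -li for li in lengths)

print(f"H(X)       = {entropy(p):.3f} bits")  # 1.750
print(f"avg length = {avg_len:.3f} bits")     # 1.750 (optimal for this p)
print(f"Kraft sum  = {kraft:.3f} (<= 1)")     # 1.000
```

For dyadic distributions like this one, the Huffman average length equals the entropy; in general it lies within one bit of it, and the Kraft sum confirms the code is prefix-free.

Likewise, for the mutual information entry, a minimal sketch of I(X;Y) = H(X) + H(Y) - H(X,Y), evaluated on a hypothetical joint distribution of two binary variables:

```python
import math

def H(dist):
    """Shannon entropy of a list of probabilities, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y) obtained by summing out the other variable.
px = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)]

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = H(px) + H(py) - H(list(joint.values()))
print(f"I(X;Y) = {mi:.3f} bits")  # about 0.278 bits for this joint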
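```

Here both marginals are uniform, so H(X) = H(Y) = 1 bit, and the correlation in the joint distribution is what makes I(X;Y) positive; an independent joint would give exactly 0 bits.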
