Information Theory: Channel Capacity and Convex Functions

Related lectures (47)
Information Theory: Entropy and Capacity
Covers the concepts of entropy, Gaussian distributions, and channel capacity under constraints (the standard power-constrained capacity formula is restated after this list).
Information Theory Basics
Introduces information theory basics, including entropy, independence, and the binary entropy function (a small numerical sketch follows the list).
Log-Concave Functions
Covers log-concave functions and their implications for probability distributions and the Gaussian correlation inequality (the defining inequality is restated after the list).
Information Measures: Entropy and Information Theory
Explains how entropy quantifies the uncertainty of a system in terms of its possible outcomes.
Lecture: Shannon
Covers the basics of information theory, focusing on Shannon's setting and channel transmission.
Eigenstate Thermalization Hypothesis
Explores the Eigenstate Thermalization Hypothesis in quantum systems, emphasizing random matrix theory and the behavior of observables in thermal equilibrium.
Universal Source Coding
Covers the Lempel-Ziv universal coding algorithm and invertible finite-state machines in information theory (a toy LZ78 parse is sketched after the list).
Communication Channels: Encoding and Decoding
Explores encoding and decoding techniques in communication systems, focusing on fundamental limits and mutual-information computations (a minimal example follows the list).
Entropy and Sampling Theory
Explores entropy, the exponential family, sampling theory, and statistical inference from samples.
Conditional Expectation
Explores the properties of conditional expectation and its extension to positive random variables.
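
The first entry mentions channel capacity under constraints. As a point of reference, and assuming the standard average-power-constrained Gaussian (AWGN) channel, which the entry's wording suggests but does not confirm, the capacity formula reads:

```latex
% Capacity of the AWGN channel Y = X + Z with Z ~ N(0, N),
% under the average-power constraint E[X^2] <= P (assumed setting):
C \;=\; \max_{p(x)\,:\,\mathbb{E}[X^2]\le P} I(X;Y)
  \;=\; \frac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right)
  \quad \text{bits per channel use.}
```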
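
For the binary entropy function named in "Information Theory Basics", a minimal numerical sketch; the helper name and the spot checks are illustrative, not taken from the lecture:

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits (hypothetical helper)."""
    if p in (0.0, 1.0):
        return 0.0  # convention: 0 * log2(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit: a fair coin is maximally uncertain
print(binary_entropy(0.11))  # ~0.5 bits: a biased coin carries less entropy
```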
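
For "Log-Concave Functions", the defining inequality in its standard multiplicative form; the lecture may equally use the equivalent statement that log f is concave:

```latex
% f >= 0 on a convex domain is log-concave if, for all x, y and \lambda \in [0,1]:
f\big(\lambda x + (1-\lambda)y\big) \;\ge\; f(x)^{\lambda}\, f(y)^{1-\lambda}.
% Equivalently, \log f is concave; Gaussian densities are the standard example.
```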
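
For "Universal Source Coding", a toy parse in the LZ78 variant of Lempel-Ziv coding. This is a sketch of the general dictionary-building idea under assumed conventions (index 0 for the empty phrase), not the lecture's exact algorithm:

```python
def lz78_parse(s: str) -> list[tuple[int, str]]:
    """Parse s into LZ78 phrases (prefix_index, next_char); a toy sketch."""
    dictionary = {"": 0}  # phrase -> index; 0 is the empty phrase
    phrases = []
    w = ""
    for c in s:
        if w + c in dictionary:
            w += c  # keep extending the current match
        else:
            phrases.append((dictionary[w], c))   # emit (longest prefix, new symbol)
            dictionary[w + c] = len(dictionary)  # register the new phrase
            w = ""
    if w:  # flush a trailing match as (its prefix, its last symbol)
        phrases.append((dictionary[w[:-1]], w[-1]))
    return phrases

# Phrases: a, b, ab, c, ba, bab, aa
print(lz78_parse("ababcbababaa"))
```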
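
For "Communication Channels: Encoding and Decoding", a minimal mutual-information computation from a joint pmf; the function and the example channel are assumptions for illustration:

```python
import math

def mutual_information(joint: list[list[float]]) -> float:
    """I(X;Y) = sum p(x,y) * log2(p(x,y) / (p(x) * p(y))), in bits (sketch)."""
    px = [sum(row) for row in joint]        # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]  # marginal of Y (columns)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0  # convention: 0 * log 0 = 0
    )

# A noiseless binary channel with a uniform input: I(X;Y) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```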
