In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.

For random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, we define the conditional mutual information as

$$I(X;Y|Z) = \int_{\mathcal{Z}} D_{\mathrm{KL}}\!\left(P_{(X,Y)|Z} \,\|\, P_{X|Z} \otimes P_{Y|Z}\right) dP_{Z}.$$

This may be written in terms of the expectation operator:

$$I(X;Y|Z) = \mathbb{E}_{Z}\!\left[D_{\mathrm{KL}}\!\left(P_{(X,Y)|Z} \,\|\, P_{X|Z} \otimes P_{Y|Z}\right)\right].$$

Thus $I(X;Y|Z)$ is the expected (with respect to $Z$) Kullback–Leibler divergence from the conditional joint distribution $P_{(X,Y)|Z}$ to the product of the conditional marginals $P_{X|Z}$ and $P_{Y|Z}$. Compare with the definition of mutual information.

For discrete random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information is

$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y,Z}(x,y,z) \log \frac{p_{Z}(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)},$$

where the marginal, joint, and/or conditional probability mass functions are denoted by $p$ with the appropriate subscript. This can be simplified as

$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} p_{Z}(z) \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\, p_{Y|Z}(y|z)}.$$

For (absolutely) continuous random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information is

$$I(X;Y|Z) = \int_{\mathcal{Z}} \int_{\mathcal{Y}} \int_{\mathcal{X}} p_{X,Y,Z}(x,y,z) \log \frac{p_{Z}(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)} \, dx \, dy \, dz,$$

where the marginal, joint, and/or conditional probability density functions are denoted by $p$ with the appropriate subscript. This can be simplified as

$$I(X;Y|Z) = \int_{\mathcal{Z}} p_{Z}(z) \int_{\mathcal{Y}} \int_{\mathcal{X}} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\, p_{Y|Z}(y|z)} \, dx \, dy \, dz.$$

Alternatively, we may write in terms of joint and conditional entropies as

$$I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) = H(X|Z) - H(X|Y,Z).$$

This can be rewritten to show its relationship to mutual information,

$$I(X;Y|Z) = I(X;Y,Z) - I(X;Z),$$

usually rearranged as the chain rule for mutual information,

$$I(X;Y,Z) = I(X;Z) + I(X;Y|Z),$$

or

$$I(X;Y|Z) = I(X;Y) - \left(I(X;Z) - I(X;Z|Y)\right).$$

Another equivalent form of the above is

$$I(X;Y|Z) = I(X;Y) + H(Z|X) + H(Z|Y) - H(Z|X,Y) - H(Z).$$

Another equivalent form of the conditional mutual information is

$$I(X;Y|Z) = I(X,Z;Y,Z) - H(Z).$$

Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:

$$I(X;Y|Z) = D_{\mathrm{KL}}\!\left[p(X,Y,Z) \,\|\, p(X|Z)\,p(Y|Z)\,p(Z)\right].$$

Or as an expected value of simpler Kullback–Leibler divergences:

$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} p_{Z}(z)\, D_{\mathrm{KL}}\!\left[p(X,Y|z) \,\|\, p(X|z)\,p(Y|z)\right] = \sum_{y\in\mathcal{Y}} p_{Y}(y)\, D_{\mathrm{KL}}\!\left[p(X,Z|y) \,\|\, p(X|Z)\,p(Z|y)\right].$$

A more general definition of conditional mutual information, applicable to random variables with continuous or other arbitrary distributions, will depend on the concept of regular conditional probability. Let $(\Omega, \mathcal{F}, \mathfrak{P})$ be a probability space, and let the random variables $X$, $Y$, and $Z$ each be defined as a Borel-measurable function from $\Omega$ to some state space endowed with a topological structure.
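As a concrete check of these formulas, the short Python sketch below evaluates the discrete triple-sum formula for $I(X;Y|Z)$ on a small joint pmf and compares it with the entropy identity $I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)$. The joint distribution p_xyz and the entropy helper are illustrative assumptions, not from the source.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a 1-D probability array (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical example: joint pmf p(x, y, z) over binary X, Y, Z (axes: x, y, z).
p_xyz = np.array([[[0.10, 0.05], [0.05, 0.10]],
                  [[0.15, 0.10], [0.10, 0.35]]])
assert np.isclose(p_xyz.sum(), 1.0)

# Marginals needed by the triple-sum formula.
p_z  = p_xyz.sum(axis=(0, 1))   # p(z)
p_xz = p_xyz.sum(axis=1)        # p(x, z)
p_yz = p_xyz.sum(axis=0)        # p(y, z)

# Direct evaluation of
# I(X;Y|Z) = sum_{z,y,x} p(x,y,z) log[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ].
cmi = 0.0
for x in range(2):
    for y in range(2):
        for z in range(2):
            pj = p_xyz[x, y, z]
            if pj > 0:
                cmi += pj * np.log2(p_z[z] * pj / (p_xz[x, z] * p_yz[y, z]))

# The same quantity via joint entropies:
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z).
cmi_entropy = (entropy(p_xz.ravel()) + entropy(p_yz.ravel())
               - entropy(p_xyz.ravel()) - entropy(p_z))

print(cmi, cmi_entropy)
assert np.isclose(cmi, cmi_entropy)  # the two expressions agree
```

Because the two expressions are algebraically identical, the assertion holds for any valid joint pmf; replacing p_xyz with a distribution of the form $p(x,y,z) = p(x|z)\,p(y|z)\,p(z)$ drives the value to zero, reflecting conditional independence of $X$ and $Y$ given $Z$.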

Related concepts
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
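As a minimal numerical illustration of this definition (the joint pmf below is a hypothetical example, not from the source), mutual information can be computed from the identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf p(x, y): X and Y are binary and agree with probability 0.8.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# I(X;Y) = H(X) + H(Y) - H(X,Y): the information Y carries about X.
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
print(mi)  # ~0.278 bits
```

Here observing $Y$ removes about 0.28 of the one bit of uncertainty in $X$; if the variables were independent, the joint entropy would equal $H(X)+H(Y)$ and the mutual information would be zero.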
