Summary
In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the value of a third.

For random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, we define the conditional mutual information as
$$ I(X;Y \mid Z) = \int_{\mathcal{Z}} D_{\mathrm{KL}}\big( P_{(X,Y)\mid Z} \,\|\, P_{X\mid Z} \otimes P_{Y\mid Z} \big) \, dP_Z . $$
This may be written in terms of the expectation operator:
$$ I(X;Y \mid Z) = \mathbb{E}_Z \big[ D_{\mathrm{KL}}\big( P_{(X,Y)\mid Z} \,\|\, P_{X\mid Z} \otimes P_{Y\mid Z} \big) \big] . $$
Thus $I(X;Y \mid Z)$ is the expected (with respect to $Z$) Kullback–Leibler divergence from the conditional joint distribution $P_{(X,Y)\mid Z}$ to the product of the conditional marginals $P_{X\mid Z}$ and $P_{Y\mid Z}$. Compare with the definition of mutual information.

For discrete random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information $I(X;Y \mid Z)$ is
$$ I(X;Y \mid Z) = \sum_{z\in\mathcal{Z}} p_Z(z) \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y\mid Z}(x,y\mid z) \log \frac{p_{X,Y\mid Z}(x,y\mid z)}{p_{X\mid Z}(x\mid z)\, p_{Y\mid Z}(y\mid z)} , $$
where the marginal, joint, and conditional probability mass functions are denoted by $p$ with the appropriate subscript. This can be simplified as
$$ I(X;Y \mid Z) = \sum_{z\in\mathcal{Z}} \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)} . $$

For (absolutely) continuous random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information $I(X;Y \mid Z)$ is
$$ I(X;Y \mid Z) = \int_{\mathcal{Z}} \left( \int_{\mathcal{Y}} \int_{\mathcal{X}} \log\!\left( \frac{p_{X,Y\mid Z}(x,y\mid z)}{p_{X\mid Z}(x\mid z)\, p_{Y\mid Z}(y\mid z)} \right) p_{X,Y\mid Z}(x,y\mid z) \, dx \, dy \right) p_Z(z) \, dz , $$
where the marginal, joint, and conditional probability density functions are denoted by $p$ with the appropriate subscript. This can be simplified as
$$ I(X;Y \mid Z) = \int_{\mathcal{Z}} \int_{\mathcal{Y}} \int_{\mathcal{X}} \log\!\left( \frac{p_Z(z)\, p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\, p_{Y,Z}(y,z)} \right) p_{X,Y,Z}(x,y,z) \, dx \, dy \, dz . $$

Alternatively, we may write $I(X;Y \mid Z)$ in terms of joint and conditional entropies as
$$ I(X;Y \mid Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) = H(X \mid Z) - H(X \mid Y,Z) = H(X \mid Z) + H(Y \mid Z) - H(X,Y \mid Z) . $$
This can be rewritten to show its relationship to mutual information,
$$ I(X;Y \mid Z) = I(X;Y,Z) - I(X;Z) , $$
usually rearranged as the chain rule for mutual information,
$$ I(X;Y,Z) = I(X;Z) + I(X;Y \mid Z) , $$
or
$$ I(X;Y \mid Z) = I(X;Y) - \big( I(X;Z) - I(X;Z \mid Y) \big) . $$
Another equivalent form of the above is
$$ I(X;Y \mid Z) = H(Z \mid X) + H(X) + H(Z \mid Y) + H(Y) - H(Z \mid X,Y) - H(X,Y) - H(Z) = I(X;Y) + H(Z \mid X) + H(Z \mid Y) - H(Z \mid X,Y) - H(Z) . $$
Another equivalent form of the conditional mutual information is
$$ I(X;Y \mid Z) = I(X,Z;\,Y,Z) - H(Z) . $$

Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:
$$ I(X;Y \mid Z) = D_{\mathrm{KL}}\big[\, p(X,Y,Z) \,\big\|\, p(X \mid Z)\, p(Y \mid Z)\, p(Z) \,\big] , $$
or as an expected value of simpler Kullback–Leibler divergences:
$$ I(X;Y \mid Z) = \sum_{z\in\mathcal{Z}} p(Z=z)\, D_{\mathrm{KL}}\big[\, p(X,Y \mid z) \,\big\|\, p(X \mid z)\, p(Y \mid z) \,\big] , $$
$$ I(X;Y \mid Z) = \sum_{y\in\mathcal{Y}} p(Y=y)\, D_{\mathrm{KL}}\big[\, p(X,Z \mid y) \,\big\|\, p(X \mid Z)\, p(Z \mid y) \,\big] . $$

A more general definition of conditional mutual information, applicable to random variables with continuous or other arbitrary distributions, depends on the concept of regular conditional probability. Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let the random variables $X$, $Y$, and $Z$ each be defined as a Borel-measurable function from $\Omega$ to some state space endowed with a topological structure.
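The discrete formulas above translate directly into a short computation. The following is a minimal sketch (not part of the original article) that, assuming NumPy and an arbitrary small joint pmf stored as a 3-D array `p_xyz`, evaluates the simplified triple-sum formula for $I(X;Y \mid Z)$ and cross-checks it against the entropy identity $I(X;Y \mid Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)$. All variable and function names here are illustrative choices, not taken from the source.

```python
# Illustrative sketch: discrete conditional mutual information from a joint pmf.
import numpy as np

rng = np.random.default_rng(0)
p_xyz = rng.random((3, 4, 2))          # arbitrary joint pmf p(x, y, z), indices (x, y, z)
p_xyz /= p_xyz.sum()

def entropy(p):
    """Shannon entropy (in nats) of a pmf given as an array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Marginals needed by the simplified formula.
p_z  = p_xyz.sum(axis=(0, 1))          # p(z)
p_xz = p_xyz.sum(axis=1)               # p(x, z)
p_yz = p_xyz.sum(axis=0)               # p(y, z)

# I(X;Y|Z) = sum_{x,y,z} p(x,y,z) * log( p(z) p(x,y,z) / (p(x,z) p(y,z)) )
cmi_sum = 0.0
for x in range(p_xyz.shape[0]):
    for y in range(p_xyz.shape[1]):
        for z in range(p_xyz.shape[2]):
            pxyz = p_xyz[x, y, z]
            if pxyz > 0:
                cmi_sum += pxyz * np.log(p_z[z] * pxyz / (p_xz[x, z] * p_yz[y, z]))

# Entropy form of the same quantity: H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z).
cmi_entropy = entropy(p_xz) + entropy(p_yz) - entropy(p_xyz) - entropy(p_z)

print(cmi_sum, cmi_entropy)            # the two values agree up to floating-point error
assert np.isclose(cmi_sum, cmi_entropy)
```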
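A second sketch, under the same assumptions (NumPy, a small joint pmf `p_xyz`, hypothetical helper names), illustrates the expected-Kullback–Leibler form $I(X;Y \mid Z) = \sum_z p(z)\, D_{\mathrm{KL}}[\, p(X,Y \mid z) \,\|\, p(X \mid z)\, p(Y \mid z)\,]$ and numerically checks the chain rule $I(X;Y,Z) = I(X;Z) + I(X;Y \mid Z)$.

```python
# Illustrative sketch: conditional mutual information as an expected KL divergence,
# plus a numerical check of the chain rule for mutual information.
import numpy as np

rng = np.random.default_rng(1)
p_xyz = rng.random((3, 4, 2))          # arbitrary joint pmf p(x, y, z)
p_xyz /= p_xyz.sum()

def kl(p, q):
    """D_KL(p || q) in nats for pmfs defined on the same finite set."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def mi(p_ab):
    """Mutual information I(A;B) from a 2-D joint pmf p(a, b)."""
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    return kl(p_ab, p_a * p_b)

p_z = p_xyz.sum(axis=(0, 1))

# Expected-KL form: average over z of D_KL( p(x,y|z) || p(x|z) p(y|z) ).
cmi = 0.0
for z in range(p_xyz.shape[2]):
    p_xy_given_z = p_xyz[:, :, z] / p_z[z]
    p_x_given_z = p_xy_given_z.sum(axis=1, keepdims=True)
    p_y_given_z = p_xy_given_z.sum(axis=0, keepdims=True)
    cmi += p_z[z] * kl(p_xy_given_z, p_x_given_z * p_y_given_z)

# Chain rule: I(X; Y,Z) = I(X;Z) + I(X;Y|Z).
i_x_yz = mi(p_xyz.reshape(3, -1))      # treat the pair (Y, Z) as a single variable
i_x_z  = mi(p_xyz.sum(axis=1))         # marginalize out Y to get p(x, z)
assert np.isclose(i_x_yz, i_x_z + cmi)
print(cmi, i_x_yz - i_x_z)
```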