In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information.
For simplicity, it will be assumed that all objects in this article are finite-dimensional.
The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution p(x, y) of two variables, the two marginal distributions are

p(x) = \sum_y p(x, y), \qquad p(y) = \sum_x p(x, y).
The classical mutual information I(X:Y) is defined by

I(X:Y) = S(p(x)) + S(p(y)) - S(p(x, y)),

where S(q) denotes the Shannon entropy of the probability distribution q.
One can calculate directly

S(p(x)) + S(p(y)) = -\sum_x p(x) \log p(x) - \sum_y p(y) \log p(y) = -\sum_{x,y} p(x, y) \log \big( p(x)\, p(y) \big),

while

S(p(x, y)) = -\sum_{x,y} p(x, y) \log p(x, y).
So the mutual information is

I(X:Y) = \sum_{x,y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)},

where the logarithm is taken in base 2 to obtain the mutual information in bits. But this is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be uncorrelated, mutual information is the discrepancy in uncertainty resulting from this (possibly erroneous) assumption.
It follows from the properties of relative entropy that I(X:Y) ≥ 0, with equality if and only if p(x, y) = p(x)p(y).
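The classical definition above can be checked numerically. The following is a minimal NumPy sketch (the function names are illustrative, not from any standard library) that computes I(X:Y) = S(p(x)) + S(p(y)) - S(p(x, y)) in bits from a joint distribution given as a matrix:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def mutual_information(p_xy):
    """I(X:Y) = S(p_x) + S(p_y) - S(p_xy) for a joint distribution matrix p_xy."""
    p_x = p_xy.sum(axis=1)  # marginal over y
    p_y = p_xy.sum(axis=0)  # marginal over x
    return shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)

# Perfectly correlated bits: p(0,0) = p(1,1) = 1/2, so I(X:Y) = 1 bit.
p_corr = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
# Independent uniform bits: p(x, y) = p(x) p(y) = 1/4, so I(X:Y) = 0.
p_ind = np.array([[0.25, 0.25],
                  [0.25, 0.25]])
print(mutual_information(p_corr))  # 1.0
print(mutual_information(p_ind))   # 0.0
```

The second example illustrates the equality condition: when p(x, y) factorizes as p(x)p(y), the relative entropy, and hence the mutual information, vanishes.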
The quantum mechanical counterparts of classical probability distributions are density matrices.
Consider a quantum system that can be divided into two parts, A and B, such that independent measurements can be made on either part. The state space of the entire quantum system is then the tensor product of the spaces for the two parts.
Let ρAB be a density matrix acting on states in HAB. The von Neumann entropy of a density matrix,

S(\rho) = -\operatorname{Tr}(\rho \log \rho),

is the quantum mechanical analog of the Shannon entropy.
For a probability distribution p(x, y), the marginal distributions are obtained by summing away the variable x or y. The corresponding operation for density matrices is the partial trace. So one can assign to ρAB a state on the subsystem A by

\rho^A = \operatorname{Tr}_B \, \rho^{AB},

where Tr_B is the partial trace with respect to system B. This is the reduced state of ρAB on system A. The quantum mutual information is then defined in direct analogy with the classical formula:

I(A:B) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}).
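The partial trace and the resulting quantum mutual information I(A:B) = S(ρA) + S(ρB) − S(ρAB) can be sketched in NumPy. This is an illustrative implementation, not a library API; the helper names are made up for this example. For the maximally entangled Bell state the mutual information comes out to 2 bits, twice the classical maximum for a pair of bits:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 log 0 is treated as 0
    return -np.sum(evals * np.log2(evals))

def partial_trace_B(rho_AB, dA, dB):
    """Reduced state rho_A = Tr_B rho_AB for rho_AB acting on C^dA ⊗ C^dB."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def partial_trace_A(rho_AB, dA, dB):
    """Reduced state rho_B = Tr_A rho_AB."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

def quantum_mutual_information(rho_AB, dA, dB):
    """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB), in bits."""
    rho_A = partial_trace_B(rho_AB, dA, dB)
    rho_B = partial_trace_A(rho_AB, dA, dB)
    return (von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B)
            - von_neumann_entropy(rho_AB))

# Bell state |Φ+> = (|00> + |11>)/sqrt(2): a pure, maximally entangled state.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_bell = np.outer(phi, phi)
print(quantum_mutual_information(rho_bell, 2, 2))  # ~2.0
```

Both reduced states of the Bell state are maximally mixed (ρA = ρB = I/2, each with entropy 1 bit), while the joint state is pure (entropy 0), so I(A:B) = 1 + 1 − 0 = 2 bits.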