In quantum information theory, quantum mutual information, or von Neumann mutual information, after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum-mechanical analog of Shannon mutual information. For simplicity, all objects in this article are assumed to be finite-dimensional.

The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution p(x, y) of two variables, the two marginal distributions are

p(x) = \sum_y p(x, y), \qquad p(y) = \sum_x p(x, y).

The classical mutual information I(X:Y) is defined by

I(X:Y) = S(p(x)) + S(p(y)) - S(p(x, y)),

where S(q) denotes the Shannon entropy of the probability distribution q. One can calculate directly

S(p(x)) + S(p(y)) = -\sum_{x,y} p(x, y) \log p(x) - \sum_{x,y} p(x, y) \log p(y) = -\sum_{x,y} p(x, y) \log p(x)\,p(y).

So the mutual information is

I(X:Y) = \sum_{x,y} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)},

where the logarithm is taken in base 2 to obtain the mutual information in bits. But this is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be uncorrelated, mutual information is the discrepancy in uncertainty resulting from this (possibly erroneous) assumption. It follows from the properties of relative entropy that I(X:Y) ≥ 0, with equality if and only if p(x, y) = p(x)p(y).

The quantum-mechanical counterparts of classical probability distributions are density matrices. Consider a quantum system that can be divided into two parts, A and B, such that independent measurements can be made on either part. The state space of the entire quantum system is then the tensor product H_{AB} = H_A \otimes H_B of the spaces for the two parts. Let \rho^{AB} be a density matrix acting on H_{AB}. The von Neumann entropy of a density matrix,

S(\rho) = -\mathrm{Tr}(\rho \log \rho),

is the quantum-mechanical analog of the Shannon entropy. For a probability distribution p(x, y), the marginal distributions are obtained by summing away the variables x or y. The corresponding operation for density matrices is the partial trace. So one can assign to \rho^{AB} a state on the subsystem A by

\rho^A = \mathrm{Tr}_B\, \rho^{AB},

where \mathrm{Tr}_B is the partial trace with respect to system B.
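The classical formula I(X:Y) = S(p(x)) + S(p(y)) - S(p(x, y)) can be checked numerically. The sketch below (function names are illustrative, not from any particular library) computes the marginals of a joint distribution given as a matrix and evaluates the mutual information in bits:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; terms with p = 0 are dropped (0 log 0 := 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X:Y) = S(p_x) + S(p_y) - S(p_xy) for a joint distribution matrix p_xy[x, y]."""
    p_x = p_xy.sum(axis=1)  # marginal over y
    p_y = p_xy.sum(axis=0)  # marginal over x
    return shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy.flatten())

# Perfectly correlated bits, p(0,0) = p(1,1) = 1/2: one full bit of mutual information.
p_corr = np.array([[0.5, 0.0],
                   [0.0, 0.5]])
print(mutual_information(p_corr))  # → 1.0

# A product distribution p(x, y) = p(x) p(y) gives I(X:Y) = 0.
p_prod = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(p_prod))  # → 0.0
```

The two test distributions illustrate the equality condition stated above: I(X:Y) vanishes exactly when the joint distribution factorizes.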
This is the reduced state of \rho^{AB} on system A.
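By analogy with the classical formula, the quantum mutual information is I(A:B) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}), with the von Neumann entropy in place of the Shannon entropy and the partial trace in place of marginalization. A minimal sketch, assuming a density matrix stored as a NumPy array on H_A \otimes H_B with the standard Kronecker-product ordering (function names are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues (0 log 0 := 0)
    return -np.sum(evals * np.log2(evals))

def partial_trace_B(rho_ab, dim_a, dim_b):
    """Tr_B of a density matrix on H_A (x) H_B with dimensions dim_a, dim_b."""
    rho = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    return np.einsum('ijkj->ik', rho)

def partial_trace_A(rho_ab, dim_a, dim_b):
    """Tr_A of a density matrix on H_A (x) H_B."""
    rho = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    return np.einsum('ijil->jl', rho)

def quantum_mutual_information(rho_ab, dim_a, dim_b):
    """I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB), in bits."""
    s_a = von_neumann_entropy(partial_trace_B(rho_ab, dim_a, dim_b))
    s_b = von_neumann_entropy(partial_trace_A(rho_ab, dim_a, dim_b))
    return s_a + s_b - von_neumann_entropy(rho_ab)

# Bell state (|00> + |11>)/sqrt(2): each reduced state is I/2 (entropy 1 bit)
# while the joint state is pure (entropy 0), so I(A:B) = 2 bits.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_bell = np.outer(phi, phi)
print(quantum_mutual_information(rho_bell, 2, 2))  # ≈ 2.0
```

Note that I(A:B) can exceed the classical maximum of min(log2 dim_A, log2 dim_B) bits: the Bell state reaches 2 bits on a pair of qubits, a signature of entanglement.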
