Concept

Joint quantum entropy

Summary
The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states ρ and σ, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(ρ,σ) or H(ρ,σ), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2. In this article, we will use S(ρ,σ) for the joint quantum entropy.

In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if X has a probability distribution concentrated at one point, the outcome of X is certain and therefore its entropy is H(X) = 0. At the other extreme, if X has the uniform probability distribution over n possible values, intuitively one would expect X to be associated with the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy H(X) = log₂(n).

In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, i.e. density matrices. For a state ρ, the von Neumann entropy is defined by S(ρ) = −Tr(ρ log₂ ρ). Applying the spectral theorem, or the Borel functional calculus for infinite-dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same: a maximally mixed state, the quantum analogue of the uniform probability distribution, has maximum von Neumann entropy, while a pure state, i.e. a rank-one projection, has zero von Neumann entropy. We write the von Neumann entropy S(ρ) (or sometimes H(ρ)).

Given a quantum system with two subsystems A and B, the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This is to distinguish it from the entropies of the subsystems.
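The two extreme cases above can be checked numerically from the definition S(ρ) = −Tr(ρ log₂ ρ). A minimal sketch using NumPy (the helper name `von_neumann_entropy` is my own, not from the source) computes the entropy from the eigenvalues of ρ, as the spectral theorem suggests:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits.

    By the spectral theorem this equals -sum_i p_i log2 p_i over the
    eigenvalues p_i of rho, with the convention 0 log 0 = 0.
    """
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop zeros (0 log 0 = 0)
    return float(-np.sum(eigvals * np.log2(eigvals)) + 0.0)

# A pure state (rank-one projector) has zero entropy.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(pure))   # → 0.0

# The maximally mixed qubit state I/2 has maximal entropy log2(2) = 1 bit.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))  # → 1.0
```

Diagonalizing first, rather than evaluating a matrix logarithm directly, sidesteps the singularity of log at zero eigenvalues.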
In symbols, if the combined system is in state ρ_AB, the joint quantum entropy is then S(A,B) = −Tr(ρ_AB log₂ ρ_AB). Each subsystem has its own entropy, computed from its reduced density matrix; the state of a subsystem is obtained from ρ_AB by the partial trace operation.
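A sketch of this in NumPy (helper names are my own) makes the distinction between joint and subsystem entropy concrete for the Bell state (|00⟩ + |11⟩)/√2, a standard example where the joint state is pure yet each subsystem is maximally mixed:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]  # 0 log 0 = 0
    return float(-np.sum(ev * np.log2(ev)) + 0.0)

def partial_trace_B(rho_ab, dA, dB):
    """Reduced state of subsystem A: trace out B from rho_AB."""
    return np.trace(rho_ab.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# Bell state |phi+> = (|00> + |11>)/sqrt(2) as a density matrix.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi.conj())

S_joint = von_neumann_entropy(rho_ab)  # joint state is pure: 0 bits
rho_a = partial_trace_B(rho_ab, 2, 2)  # reduced state is I/2
S_a = von_neumann_entropy(rho_a)       # maximally mixed: 1 bit
```

That S(A,B) = 0 while S(A) = 1 here is a purely quantum phenomenon: classically, the joint entropy is never smaller than the entropy of either marginal.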