Quantum notions of information and entropy.
We start with quantities of entropy, namely the von Neumann entropy and its derived quantities:
- Quantum conditional entropy, qConditionalEnt
- Quantum mutual information, qMutualInfo
- Coherent information, coherentInfo
- Quantum conditional mutual information, qcmi

and then prove facts about them.
The second half of the file is quantities of relative entropy, namely the (standard) quantum relative entropy, and generalizations.
The Quantum Conditional Entropy S(ρᴬ|ρᴮ) is given by S(ρᴬᴮ) - S(ρᴮ).
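Unlike its classical counterpart, the quantum conditional entropy can be negative. A standard illustrative example (not from the file itself): take a maximally entangled state on A ⊗ B with dim A = dim B = d.

```latex
% |\Phi\rangle = \tfrac{1}{\sqrt d}\sum_{i=1}^{d} |i\rangle_A |i\rangle_B
% The joint state is pure, so S(\rho^{AB}) = 0, while \rho^{B} = I/d gives S(\rho^{B}) = \log d, hence
S(A|B) = S(\rho^{AB}) - S(\rho^{B}) = 0 - \log d = -\log d < 0.
```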
The Quantum Mutual Information I(A:B) is given by S(ρᴬ) + S(ρᴮ) - S(ρᴬᴮ).
Equations
- qMutualInfo ρ = Sᵥₙ ρ.traceLeft + Sᵥₙ ρ.traceRight - Sᵥₙ ρ
The Coherent Information of a state ρ passing through a channel Λ is the negative conditional entropy of the image under Λ of the purification of ρ.
Equations
- coherentInfo ρ Λ = -qConditionalEnt ((Λ⊗ₖCPTPMap.id) (MState.pure ρ.purify))
The Quantum Conditional Mutual Information, I(A;C|B) = S(A|B) - S(A|BC).
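Unfolding the conditional entropies per the definition above, the QCMI can equivalently be written purely in terms of von Neumann entropies (a standard identity, stated here for reference):

```latex
I(A;C|B) = S(A|B) - S(A|BC)
         = \bigl(S(\rho^{AB}) - S(\rho^{B})\bigr) - \bigl(S(\rho^{ABC}) - S(\rho^{BC})\bigr)
         = S(\rho^{AB}) + S(\rho^{BC}) - S(\rho^{B}) - S(\rho^{ABC}).
```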
von Neumann entropy is nonnegative.
von Neumann entropy is at most log d.
von Neumann entropy of pure states is zero.
von Neumann entropy is unchanged under SWAP. TODO: All unitaries
von Neumann entropy is unchanged under assoc.
von Neumann entropy is unchanged under assoc'.
von Neumann entropies of the left and right partial traces of pure states are equal.
Weak monotonicity of quantum conditional entropy. S(A|B) + S(A|C) ≥ 0
Quantum conditional entropy is symmetric for pure states.
Quantum mutual information is symmetric.
"Ordinary" subadditivity of von Neumann entropy
Araki-Lieb triangle inequality on von Neumann entropy
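For reference, the Araki-Lieb inequality in its usual form (assuming the standard statement), which together with ordinary subadditivity gives a two-sided "triangle" bound on S(ρᴬᴮ):

```latex
\bigl\lvert S(\rho^{A}) - S(\rho^{B}) \bigr\rvert \;\le\; S(\rho^{AB}) \;\le\; S(\rho^{A}) + S(\rho^{B}).
```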
Strong subadditivity on a tripartite system
Strong subadditivity, stated in terms of conditional entropies. Also called the data processing inequality. H(A|BC) ≤ H(A|B).
Strong subadditivity, stated in terms of quantum mutual information. I(A;BC) ≥ I(A;B).
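These three statements are the standard equivalent forms of strong subadditivity; the conditional-entropy form follows from the tripartite form by unfolding definitions (a sketch):

```latex
% Tripartite form:
S(\rho^{ABC}) + S(\rho^{B}) \le S(\rho^{AB}) + S(\rho^{BC})
% Rearranged:
S(\rho^{ABC}) - S(\rho^{BC}) \le S(\rho^{AB}) - S(\rho^{B})
% which is exactly  H(A|BC) \le H(A|B).
```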
The quantum conditional mutual information QCMI is nonnegative.
The quantum conditional mutual information QCMI ρABC is at most 2 log dA.
The quantum conditional mutual information QCMI ρABC is at most 2 log dC.
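Both dimension bounds follow from writing QCMI as a difference of conditional entropies, each bounded in magnitude by the log-dimension of the conditioned system (a sketch of the standard argument):

```latex
I(A;C|B) = S(A|B) - S(A|BC), \qquad \lvert S(A|X) \rvert \le \log d_A
\;\Longrightarrow\; I(A;C|B) \le 2 \log d_A.
% The bound 2 \log d_C follows symmetrically, since I(A;C|B) = I(C;A|B).
```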
To do relative entropies, we start with the sandwiched Renyi Relative Entropy which is a nice general form. Then instead of proving many theorems (like DPI, relabelling, additivity, etc.) several times, we just prove it for this one quantity, then it follows for other quantities (like the relative entropy) as a special case.
We could even imagine restructuring the file so that relative entropy comes first, then (some) properties about other quantities can be derived, since they can pretty much all be expressed in terms of appropriate special cases of relative entropies.
The Sandwiched Renyi Relative Entropy, defined with ln (nits). Note that at α = 1 this definition switches to the standard Relative Entropy, for continuity.
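For reference, the standard definition of the sandwiched Renyi relative entropy in nits (assumed to match this library's, up to the handling of support conditions):

```latex
\widetilde{D}_\alpha(\rho \| \sigma)
  = \frac{1}{\alpha - 1}
    \ln \mathrm{Tr}\!\left[
      \left( \sigma^{\frac{1-\alpha}{2\alpha}} \, \rho \, \sigma^{\frac{1-\alpha}{2\alpha}} \right)^{\!\alpha}
    \right],
\qquad \alpha \in (0,1) \cup (1,\infty),
```

with the α = 1 case taken to be the standard relative entropy D(ρ‖σ) = Tr[ρ(ln ρ − ln σ)].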
The Sandwiched Renyi Relative entropy is additive when the inputs are product states
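That is (the standard statement, included for reference):

```latex
\widetilde{D}_\alpha(\rho_1 \otimes \rho_2 \,\|\, \sigma_1 \otimes \sigma_2)
  = \widetilde{D}_\alpha(\rho_1 \| \sigma_1) + \widetilde{D}_\alpha(\rho_2 \| \sigma_2).
```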
The Data Processing Inequality for the Sandwiched Renyi relative entropy.
Proved in https://arxiv.org/pdf/1306.5920. Seems kind of involved.
The quantum relative entropy is unchanged by MState.relabel
"Formula for conversion from operator inequality to quantum relative entropy". Proposition S17 of https://arxiv.org/pdf/2401.01926v2.
Relative entropy is lower semicontinuous (in each argument, actually, but we only need in the latter here). Will need the fact that all the cfc / eigenvalue stuff is continuous, plus carefully handling what happens with the kernel subspace, which will make this a pain.
Joint convexity of Quantum relative entropy. We can't state this with ConvexOn because that requires
an AddCommMonoid, which MStates are not. Instead we state it with Mixable.
TODO:
- Add the Mixable instance that infers from the Coe, so that the right hand side can be written as p [𝐃(ρ₁‖σ₁) ↔ 𝐃(ρ₂‖σ₂)].
- Define (joint) convexity as its own thing - a ConvexOn for Mixable types.
- Maybe, more broadly, find a way to make ConvexOn work with the subset of Matrix that corresponds to MState.
I(A:B) = 𝐃(ρᴬᴮ‖ρᴬ ⊗ ρᴮ)
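This identity follows from the product structure of the second argument, using ln(ρᴬ ⊗ ρᴮ) = ln ρᴬ ⊗ I + I ⊗ ln ρᴮ (a standard derivation, included for reference):

```latex
\mathbf{D}(\rho^{AB} \| \rho^{A} \otimes \rho^{B})
  = \mathrm{Tr}\bigl[\rho^{AB} \ln \rho^{AB}\bigr]
    - \mathrm{Tr}\bigl[\rho^{AB} \bigl(\ln \rho^{A} \otimes I + I \otimes \ln \rho^{B}\bigr)\bigr]
  = -S(\rho^{AB}) + S(\rho^{A}) + S(\rho^{B})
  = I(A:B).
```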