max-entropy


Idea

A notion of entropy.

In the context of probability theory, the max-entropy or Hartley entropy of a probability distribution on a finite set is the logarithm of the number of outcomes with non-zero probability.
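For concreteness, here is a minimal numerical sketch of this definition (assuming numpy; the function name `hartley_entropy` and the cutoff `tol` used to decide what counts as "non-zero" are illustrative choices, not part of this page):

```python
import numpy as np

def hartley_entropy(p, tol=1e-12):
    """Max-entropy (Hartley entropy): logarithm of the number of
    outcomes with non-zero probability. `tol` is an assumed numerical
    cutoff for treating a probability as zero."""
    p = np.asarray(p, dtype=float)
    support_size = np.count_nonzero(p > tol)
    return np.log(support_size)

# Example: the support has 3 outcomes, so the max-entropy is ln 3,
# regardless of how the probability is distributed among them.
print(hartley_entropy([0.7, 0.2, 0.1, 0.0]))  # ln 3 ≈ 1.0986
```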

In quantum probability theory, this means that the max-entropy of a quantum state is the logarithm of the rank of its density matrix.
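As a sketch of the quantum case, the rank may be read off from the eigenvalue spectrum of the density matrix (again assuming numpy; the function name `quantum_max_entropy` and the numerical tolerance are illustrative):

```python
import numpy as np

def quantum_max_entropy(rho, tol=1e-12):
    """Max-entropy of a density matrix: logarithm of its rank,
    estimated here as the number of eigenvalues above `tol`."""
    eigenvalues = np.linalg.eigvalsh(rho)  # rho is Hermitian
    rank = np.count_nonzero(eigenvalues > tol)
    return np.log(rank)

# Example: the maximally mixed qubit state has rank 2,
# hence max-entropy ln 2.
rho = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(quantum_max_entropy(rho))  # ln 2 ≈ 0.6931
```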

Properties

Relation to Shannon entropy

The ordinary Shannon entropy of a probability distribution is never greater than the max-entropy.
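One way to see this is via Jensen's inequality for the concave logarithm, writing $\mathrm{supp}(p)$ for the set of outcomes of non-zero probability:

$$
H(p)
\;=\;
\sum_{i \in \mathrm{supp}(p)} p_i \, \ln \frac{1}{p_i}
\;\leq\;
\ln
\left(
  \sum_{i \in \mathrm{supp}(p)} p_i \cdot \frac{1}{p_i}
\right)
\;=\;
\ln \left\vert \mathrm{supp}(p) \right\vert
\,.
$$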

Relation to Rényi entropy

The Hartley entropy of a finite probability distribution $(p_i)_{i = 1}^n$ is the special case of the Rényi entropy

$$
S_\alpha(p)
\;=\;
\frac{1}{1 - \alpha}
\ln
\left(
  \sum_{i = 1}^{n} (p_i)^\alpha
\right)
$$

for $\alpha = 0$.
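Concretely, evaluating at $\alpha = 0$ with the convention that the sum ranges only over the outcomes of non-zero probability (equivalently, $0^0 = 0$):

$$
S_0(p)
\;=\;
\ln
\left(
  \sum_{p_i > 0} (p_i)^0
\right)
\;=\;
\ln \left\vert \{ i \;\vert\; p_i > 0 \} \right\vert
\,,
$$

which is the Hartley entropy.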

| order | $0$ | $\to 1$ | $2$ | $\to \infty$ |
|-------|-----|---------|-----|--------------|
| **Rényi entropy** | Hartley entropy | $\geq$ Shannon entropy | $\geq$ collision entropy | $\geq$ min-entropy |

