nLab information metric

Contents

Idea

In information geometry, a (Fisher-)information metric is a Riemannian metric on a manifold of probability distributions over some probability space $X$ (the latter often assumed to be finite).

Definition

On a finite probability space $X \in Set$, a positive measure is a function $\rho : X \to \mathbb{R}_+$, and a probability distribution is a positive measure such that $\sum_{x \in X} \rho(x) = 1$.

This space is actually a submanifold of $\mathbb{R}_{\geq 0}^{\vert X \vert}$. For $\{\frac{\partial}{\partial x^i}\}$ the canonical basis of tangent vectors on this wedge of Cartesian space, the information metric $g$ is given by

$$ g\left(\frac{\partial}{\partial x^i}, \frac{\partial}{\partial x^j}\right)(\rho) = \frac{1}{\rho(x^i)} \delta_{i j} \,. $$
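
As a quick sanity check of this formula, here is a minimal Python/NumPy sketch (not part of the original entry; the function names are illustrative only). It builds the matrix $g_{i j} = \delta_{i j}/\rho(x^i)$ at a given distribution and compares it with the expectation form of the Fisher information, $\sum_{k} \rho(x^k)\, \partial_i \log\rho(x^k)\, \partial_j \log\rho(x^k)$, treating the values $\rho(x^i)$ themselves as unconstrained coordinates on the space of positive measures.

```python
import numpy as np

def information_metric(rho):
    """Metric matrix g_ij = delta_ij / rho_i at the measure rho."""
    rho = np.asarray(rho, dtype=float)
    return np.diag(1.0 / rho)

def fisher_information(rho):
    """Fisher information sum_k rho_k * d_i log rho(x^k) * d_j log rho(x^k),
    using the coordinates theta_i = rho(x^i) of the ambient orthant."""
    rho = np.asarray(rho, dtype=float)
    # score[i, k] = partial_{theta_i} log rho(x^k) = delta_{ik} / rho_k
    score = np.diag(1.0 / rho)
    # contract: sum_k rho_k * score[i, k] * score[j, k]
    return np.einsum('k,ik,jk->ij', rho, score, score)

rho = np.array([0.2, 0.3, 0.5])   # a probability distribution on a 3-point space
assert np.allclose(information_metric(rho), fisher_information(rho))
```

The agreement reflects that in these coordinates $\partial_j \log \rho(x^k) = \delta_{j k}/\rho(x^k)$, so the expectation collapses to $\delta_{i j}/\rho(x^i)$, which is the formula above.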
