A Markov chain (named for Andrey Markov) is a sequence of random variables taking values in the state space of the chain, with the property that the probability of moving to the next state depends only on the current state:

$$\Pr(X_{n+1} = x \mid X_1 = x_1, \ldots, X_n = x_n) = \Pr(X_{n+1} = x \mid X_n = x_n).$$
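As a concrete sketch of the definition, here is a minimal simulation of a finite Markov chain in Python; the states and transition probabilities are a made-up two-state example, not taken from the text. The point is that `step` samples the next state from a distribution that depends only on the current state.

```python
import random

# Hypothetical two-state chain; each row gives the distribution of the
# next state conditioned only on the current state (the Markov property).
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng=random):
    """Sample the next state; the distribution depends only on `state`."""
    states, probs = zip(*transitions[state])
    return rng.choices(states, weights=probs, k=1)[0]

def walk(start, n, seed=0):
    """Generate a length-(n+1) trajectory of the chain starting at `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5, seed=1))
```

Note that `walk` never inspects anything but the last state of the trajectory, which is exactly the property stated above.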
Ian Durham: I assume that a Markov chain can be represented as a directed graph in some way and thus can be used to generate a free category of some sort. Is this a correct assumption?
Eric: I’m pretty sure that is the case for finite Markov chains.
Ian Durham: Hmmm. I’ll have to think about that.
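For finite chains the directed-graph picture in the discussion above can be made explicit: take the states as vertices and draw an edge $i \to j$ whenever the transition probability from $i$ to $j$ is nonzero, weighting the edge by that probability. A small sketch, using a hypothetical three-state chain:

```python
# Hypothetical 3-state chain as a stochastic matrix, stored row by row.
P = {
    "a": {"a": 0.5, "b": 0.5, "c": 0.0},
    "b": {"a": 0.0, "b": 0.2, "c": 0.8},
    "c": {"a": 1.0, "b": 0.0, "c": 0.0},
}

# The weighted directed graph: one edge per nonzero transition probability.
edges = [(i, j, p) for i, row in P.items() for j, p in row.items() if p > 0]
print(sorted(edges))

# Each row sums to 1, as required of a stochastic matrix.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in P.values())
```

The free category generated by this graph has the states as objects and finite paths as morphisms; whether that construction interacts well with the probability weights is the question left open in the thread.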
Baez and his students recently defined a generalization, open Markov chains: