MARKOV CHAIN
| Collection | International Encyclopedia of Systems and Cybernetics |
|---|---|
| Year | 2004 |
| Vol. (num.) | 2(1) |
| ID | 2007 |
| Object type | Methodology or model |
A sequence of transitions in a system, from one state to any other, that is neither purely random nor purely deterministic.
Another interesting definition by K. KRIPPENDORFF: “The behavior of an informationally closed and generative system that is specified by transition probabilities between the system's states” (1986, p.47).
He adds: “The probabilities of a MARKOV chain are usually entered into a transition matrix indicating which state or symbol follows which other state or symbol. The order of a MARKOV chain corresponds to the number of states or symbols from which probabilities are defined to a successor. Ordinarily, MARKOV chains are state determined, or of the first order. Higher orders are history determined. An unequal distribution of transition probabilities is a mark of a MARKOV chain's redundancy, and a prerequisite of predictability” (Ibid).
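As a minimal sketch of the transition-matrix description quoted above (not taken from the encyclopedia: the states, probabilities, and the names `step` and `run_chain` are assumptions chosen for the example), a first-order, state-determined chain can be encoded as follows.

```python
import random

# Transition matrix of a first-order MARKOV chain: each row gives the
# probabilities of the successor states, conditioned only on the current
# state. Values below are illustrative assumptions.
TRANSITIONS = {
    "A": {"A": 0.1, "B": 0.6, "C": 0.3},
    "B": {"A": 0.4, "B": 0.4, "C": 0.2},
    "C": {"A": 0.5, "B": 0.25, "C": 0.25},
}

def step(state: str) -> str:
    """Draw the next state from the row of the transition matrix."""
    successors = list(TRANSITIONS[state].keys())
    weights = list(TRANSITIONS[state].values())
    return random.choices(successors, weights=weights, k=1)[0]

def run_chain(start: str, length: int) -> list[str]:
    """Generate one realization of the chain: neither purely random nor
    purely deterministic, since each step depends probabilistically on
    the current state alone (state-determined, first order)."""
    chain = [start]
    for _ in range(length - 1):
        chain.append(step(chain[-1]))
    return chain

print(run_chain("A", 20))
```

A higher-order (history-determined) chain would condition each row on the last two or more states rather than on the current state alone.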
I. PRIGOGINE and I. STENGERS state the three general characteristics of Markov chains: “Non-repetitivity, existence of long range correlations and spatial symmetry breaks” (1992, p.90).
Markov chains are “statistically reproductive” and correspond to deterministic chaos, “intermediary between pure randomness and redundant order” (Ibid).
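To make the “statistically reproductive” character concrete, the sketch below (an illustrative assumption, not material from the entry or from PRIGOGINE and STENGERS) runs an ergodic three-state chain from different starting states: individual realizations never repeat exactly, yet the long-run state frequencies stabilize toward the same distribution, which also connects with the cross-reference to ergodicity.

```python
import random
from collections import Counter

# Illustrative ergodic chain; states and probabilities are assumptions.
TRANSITIONS = {
    "A": {"A": 0.1, "B": 0.6, "C": 0.3},
    "B": {"A": 0.4, "B": 0.4, "C": 0.2},
    "C": {"A": 0.5, "B": 0.25, "C": 0.25},
}

def frequencies(start: str, length: int = 100_000) -> dict[str, float]:
    """Run the chain and return the empirical distribution of states."""
    state, counts = start, Counter()
    for _ in range(length):
        counts[state] += 1
        successors, weights = zip(*TRANSITIONS[state].items())
        state = random.choices(successors, weights=weights, k=1)[0]
    return {s: counts[s] / length for s in TRANSITIONS}

# Different starting states give approximately the same frequencies:
# the statistical sense in which the chain reproduces itself.
print(frequencies("A"))
print(frequencies("C"))
```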
See also: Ergodicity.