MARKOV CHAIN

From glossaLAB
Charles François (2004). MARKOV CHAIN, International Encyclopedia of Systems and Cybernetics, 2(1): 2007.
Collection: International Encyclopedia of Systems and Cybernetics
Year: 2004
Vol. (num.): 2(1)
ID: 2007
Object type: Methodology or model

A sequence of transitions, neither purely random nor purely deterministic, from one state of a system to any other.

Another interesting definition by K. KRIPPENDORFF: “The behavior of an informationally closed and generative system that is specified by transition probabilities between the system's states” (1986, p.47).

He adds: “The probabilities of a MARKOV chain are usually entered into a transition matrix indicating which state or symbol follows which other state or symbol. The order of a MARKOV chain corresponds to the number of states or symbols from which probabilities are defined to a successor. Ordinarily, MARKOV chains are state determined, or of the first order. Higher orders are history determined. An unequal distribution of transition probabilities is a mark of a MARKOV chain's redundancy, and a prerequisite of predictability” (Ibid.).
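
As an illustration (not part of the original entry), the following minimal Python sketch shows such a first-order, state-determined chain: an invented transition matrix whose rows give the probabilities of each successor state, and a sampler that generates a sequence by repeatedly drawing the next state from the current state's row. State labels and probabilities are hypothetical.

    import random

    # Hypothetical transition matrix: each row gives P(next state | current state).
    transition = {
        "A": {"A": 0.1, "B": 0.6,  "C": 0.3},
        "B": {"A": 0.4, "B": 0.4,  "C": 0.2},
        "C": {"A": 0.5, "B": 0.25, "C": 0.25},
    }

    def step(current):
        """Draw the next state from the current state's row of the matrix."""
        successors = list(transition[current])
        weights = [transition[current][s] for s in successors]
        return random.choices(successors, weights=weights, k=1)[0]

    def run(start, length):
        """Generate a first-order chain: each step depends only on the last state."""
        chain = [start]
        while len(chain) < length:
            chain.append(step(chain[-1]))
        return chain

    print(run("A", 10))   # prints a randomly generated sequence of 10 states

A higher-order (history-determined) chain would instead condition each row on the last two or more states rather than on the current state alone.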

I. PRIGOGINE and I. STENGERS state the three general characteristics of Markov chains: “Non-repetitivity, existence of long range correlations and spatial symmetry breaks” (1992, p.90).

Markov chains are “statistically reproductive” and correspond to deterministic chaos “intermediary between pure randomness and redundant order” (Ibid).
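
To make “statistically reproductive” concrete, the following sketch (reusing the invented matrix above; not the authors' own example) runs the chain twice from different starting states and compares long-run state frequencies. The two trajectories differ step by step, yet their frequencies converge to essentially the same stationary distribution.

    import random
    from collections import Counter

    transition = {
        "A": {"A": 0.1, "B": 0.6,  "C": 0.3},
        "B": {"A": 0.4, "B": 0.4,  "C": 0.2},
        "C": {"A": 0.5, "B": 0.25, "C": 0.25},
    }

    def long_run_frequencies(start, length):
        """Simulate the chain and return the fraction of time spent in each state."""
        state, visits = start, Counter()
        for _ in range(length):
            visits[state] += 1
            successors = list(transition[state])
            weights = [transition[state][s] for s in successors]
            state = random.choices(successors, weights=weights, k=1)[0]
        return {s: round(n / length, 3) for s, n in sorted(visits.items())}

    # Two independent runs from different starting states yield nearly
    # identical long-run frequencies, although the step-by-step paths differ.
    print(long_run_frequencies("A", 100_000))
    print(long_run_frequencies("C", 100_000))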

See also

(Ergodicity).
