Axiomatics for Algorithmic Information

Collection GlossariumBITri
Author Mark Burgin
Editor Mark Burgin
Year 2016
Volume 2
Number 1
ID 3
Object type Theory
Domain AIT, Coding Theory, Complexity Theory, Computer Science
es Axiomática de la Información Algorítmica
fr Axiomatique de l'information algorithmique
de Axiomatik für die algorithmische Information

The existence of a variety of algorithmic measures of information created the need for a unifying approach (Algorithmic information theory, Algorithmic information). This approach has been called axiomatic information theory.

Algorithmic information measures the information that is necessary to construct (build) a constructive object by means of some system of algorithms A. That is why algorithmic information is dual to a static complexity measure (static direct information size) α of a constructive object x and is called the (A, α)-information size of x. Dual information size is obtained from direct information size by the minimization operation. For instance, a natural direct information size for algorithms/programs is the length of their description (symbolic representation), and the same is true for data. Length can be measured in such natural units of information as bits and bytes. Taking the dual to this measure in the class of recursive algorithms, we obtain Kolmogorov complexity, or recursive information size (Kolmogorov complexity).
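
To make the minimization operation concrete, here is a minimal Python sketch, not Burgin's formalism: the algorithm A is assumed to be Python's eval restricted to a finite set of candidate programs, and the direct size α(p) is taken to be the character length of p. True Kolmogorov complexity minimizes over all programs of a universal machine and is not computable; the finite search below only illustrates the duality construction.

    # Toy illustration of a dual measure (assumptions: A = eval over a
    # finite program set, alpha = character length). This is not computable
    # Kolmogorov complexity, just the minimization pattern.
    def dual_size(x, candidates):
        """Minimal alpha(p) over programs p with A(p) = x; None if no p works."""
        sizes = [len(p) for p in candidates if eval(p) == x]
        return min(sizes) if sizes else None

    programs = ["2**10", "1024", "4*256", "sum(range(10))", "45"]
    print(dual_size(1024, programs))  # 4, realized by the program "1024"
    print(dual_size(45, programs))    # 2, "45" beats "sum(range(10))"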

The axiomatic description of dual information size uses axiomatic descriptions of direct complexity measures suggested by Blum (1967; 1967a) and further developed by Burgin (1982; 2005; 2010).

Direct information sizes

All kinds of direct information sizes are divided into three classes:

  1. Direct static information size depends only on the algorithm/program that is measured. It usually reflects information in the representation of the algorithm/program. For example, the length of the text of an algorithm measures its information in bits. If we say that a memory has a capacity of 1 gigabyte, it means that it is possible to store 8×10⁹ bits of information in this memory.
  2. Direct functional information size depends both on the algorithm/program that is measured and on its input. Examples of direct functional information sizes are such popular measures as the time of a computation or the space used in a computation.
  3. Direct processual information size depends on the algorithm/program, its realization, and the input. Examples of direct processual information sizes are the time it takes to process a given input or the number of data transactions between memories of different types used in this process. (A toy sketch of all three kinds is given after this list.)
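
The distinction among the three classes can be sketched in Python. This is a hedged toy, not a measure from the literature: the function names and the counting rules below are illustrative assumptions.

    # Toy sketch of the three kinds of direct information size for one
    # program computing 0 + 1 + ... + (n-1). All counting rules here are
    # illustrative assumptions, not standard definitions.
    SOURCE = "def f(n): return sum(range(n))"

    def static_size(source):
        # Static: depends only on the program text; here, its length in bits.
        return 8 * len(source.encode("utf-8"))

    def functional_size(n):
        # Functional: depends on the program and its input; here, the number
        # of additions sum(range(n)) performs on input n.
        return n

    def processual_size(n, cache_line=8):
        # Processual: depends also on one concrete realization of the run;
        # here, simulated memory transfers, one per cache line of accesses.
        return -(-n // cache_line)  # ceil(n / cache_line)

    print(static_size(SOURCE))    # program length in bits
    print(functional_size(1000))  # steps taken on this input
    print(processual_size(1000))  # transfers in this simulated run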

Dual complexity measures

Information size, or algorithmic complexity, can be defined for different classes of algorithms, resulting in different measures. However, all these measures are constructed by a similar technique, so it is possible to axiomatize this approach. The results of this axiomatization are called dual complexity measures (Burgin, 2005). As before, we call these measures dual information sizes, as they reflect the information necessary to compute (construct) a given object. These measures give many more opportunities to estimate the information size of words and infinite strings than conventional types of information size (Kolmogorov complexity).

Let A = {Ai ; i ∈ I} be a class of algorithms, let A be an algorithm that takes elements of I as inputs, and let α: I → N be a direct static information size of algorithms from the class A that satisfies the axioms from (Blum, 1967) or (Burgin, 2005, Ch. 5), where N is the set of natural numbers. Elements of I are usually treated as programs for the algorithm A.

The complexity measure dual to α, or the (A, α)-information size αAo, of an object (word) x with respect to the algorithm A is the function on the codomain (the set of all outputs) Y of A defined as

αAo(x) = min {α(p); p ∈ I and A(p) = x}

When there is no p ∈ I such that A(p) = x, the value of αAo at x is undefined.
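
For a finite setting, this definition can be transcribed directly into code. A sketch under stated assumptions: the index set I is a finite list, A is an ordinary Python function on I, and α is any direct size; all names below are illustrative.

    # Direct transcription of the definition (assumptions: finite I,
    # A a total Python function on I, alpha a direct static size).
    def dual_information_size(x, A, I, alpha):
        values = [alpha(p) for p in I if A(p) == x]
        return min(values) if values else None  # None encodes "undefined"

    # Example: programs are bit strings, A counts their '1' bits,
    # alpha is the bit length of the program.
    I = ["0", "1", "10", "11", "101", "111"]
    A = lambda p: p.count("1")
    alpha = len
    print(dual_information_size(2, A, I, alpha))  # 2, realized by "11"
    print(dual_information_size(5, A, I, alpha))  # None: no p with A(p) = 5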

When the class A contains universal algorithms, an invariance theorem can be proved, stating that αUo(x), where U is a universal algorithm in the class A, is an optimal (in a definite sense) measure in the class of all measures αAo(x) with A from the class A (Burgin, 2010). This allows one to take αUo(x) as the axiomatic complexity measure (axiomatic information size) in the class A.
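
For orientation, in the classical case of Kolmogorov complexity the optimality asserted by the invariance theorem is additive: for every algorithm A from the class A there is a constant cA, not depending on x, such that

    αUo(x) ≤ αAo(x) + cA   for all x from Y

The axiomatic invariance theorem establishes a statement of this kind once for the whole class, so it does not have to be re-proved for each particular measure.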

Origin and advantages of the axiomatic approach to algorithmic information

An axiomatic approach to algorithmic information theory was introduced by Burgin in a paper presented for publication by Kolmogorov (Burgin, 1982) and further developed in a subsequent paper (Burgin, 1990) and in books (Burgin, 2005; 2010). The axiomatic approach encompasses the other approaches in algorithmic information theory: different measures of algorithmic information can be treated as particular cases of axiomatically defined measures. Instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, it is possible to deduce all such results from a single corresponding theorem proved in the axiomatic setting. This is a general advantage of the axiomatic approach in mathematics.

References

  • BLUM, M. (1967). On the Size of Machines. Information and Control, v. 11, pp. 257–265.
  • BLUM, M. (1967a). A Machine-Independent Theory of the Complexity of Recursive Functions. Journal of the ACM, v. 14, No. 2, pp. 322–336.
  • BURGIN, M. (1982). Generalized Kolmogorov Complexity and Duality in Theory of Computations. Soviet Math. Dokl., v. 25, No. 3, pp. 19–23.
  • BURGIN, M. (1990). Generalized Kolmogorov Complexity and Other Dual Complexity Measures. Cybernetics, No. 4, pp. 21–29.
  • BURGIN, M. (2005). Super-Recursive Algorithms. Monographs in Computer Science. Heidelberg: Springer.
  • BURGIN, M. (2010). Theory of Information: Fundamentality, Diversity and Unification. Singapore: World Scientific Publishing.