INFORMATION (Amount of)

Charles François (2004). INFORMATION (Amount of), International Encyclopedia of Systems and Cybernetics, 2(1): 1654.
Collection International Encyclopedia of Systems and Cybernetics
Year 2004
Vol. (num.) 2(1)
ID 1654
Object type Discipline oriented
“A logarithmic measure of the statistical unexpectedness (reciprocal of probability) of the message concerned” (D. Mac KAY 1969, p.79).

Mac KAY states: “Since the unexpectedness of a message need not have direct connection with its semantic content or meaning, SHANNON wisely insisted that the concept of ‘meaning’ was outside the scope of his theory” (Ibid).

Insofar as SHANNON's work relates to information, it must be clearly understood that it deals with quantitative information and its measurement.

“As an example, let us suppose that we must select one object out of a collection, on the basis of a specified criterion: for example the letter O within the alphabet. The most economical method is to divide the alphabet into two and ask: is O in the first half or in the second one? The answer provides us with one bit of information, corresponding to the fact that O is in the second half. Proceeding in the same way, we now divide this second half in two and repeat our question. The answer, ‘O is in the first half of the second half’, provides us with a second bit of information. Finally, we will find out that the number of bits we need to ‘place’ O in the alphabet is the logarithm base 2 of 26, i.e. 5 when rounded up to the next whole number of bits, taking into account that bits are necessarily whole numbers” (Ibid.).
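The halving procedure described above can be sketched as follows (a minimal illustration in Python; the function name and the use of the standard `string` and `math` modules are my own, not part of the original text):

```python
import math
import string

def questions_to_locate(letter, collection):
    """Locate `letter` by repeatedly halving the collection,
    counting the yes/no questions asked. Each question
    ("is it in the first half?") yields one bit of information."""
    pool = list(collection)
    questions = 0
    while len(pool) > 1:
        half = (len(pool) + 1) // 2   # ceiling split, so the pool always shrinks
        first = pool[:half]
        pool = first if letter in first else pool[half:]
        questions += 1
    return questions

# Locating O among the 26 letters takes 5 questions,
# matching the theoretical bound ceil(log2 26) = 5.
print(questions_to_locate("O", string.ascii_uppercase))  # 5
print(math.ceil(math.log2(26)))                          # 5
```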

This is in accordance with SHANNON and WEAVER, who consider the amount of information as a basically technical feature, which: “… characterizes the whole statistical nature of the information, and is not concerned with individual messages” (1949, p.104).

In this quantitative sense, information is thus defined as the logarithm to the base 2 of the number of possible combinations C, i.e.: I = log₂C. As logarithms are exponents, quantitative information is an additive measure of complexity.

Obviously, apart from the quantitative aspect of information, as expressed for example in bits, the qualitative aspect is quite a different matter, as it relates to meanings, which “make sense” only in exchanges between sentient and thinking beings, and not, for example, for the telephone sets themselves.
