INFORMATION MEASUREMENT

From glossaLAB
Charles François (2004). INFORMATION MEASUREMENT, International Encyclopedia of Systems and Cybernetics, 2(1): 1682.
Collection International Encyclopedia of Systems and Cybernetics
Year 2004
Vol. (num.) 2(1)
ID 1682
Object type Methodology or model

J. de ROSNAY explains with great clarity what is meant by information measurement, why and how it is measured, and how the concept of information quantity is entirely independent of meaning (being purely technical):

“In order to define conveniently what a certain quantity of information represents, one must adopt the position of an observer seeking information about some poorly known system. It may be, for example, the number of possible answers to a question, or the number of solutions to a problem.
“Obtaining information about the unknown system allows the observer to reduce the number of possible answers. Total information would even lead immediately to a single possible answer: the correct one. Information is thus a function of the relation between the number of possible answers before reception of the message (P$_{0}$) and the number of possible answers remaining after reception (P$_{1}$).
“A very simple example: the unknown system is a pack of 32 cards. Question: what is the chance of drawing some predefined card?
“This question introduces an uncertainty, which can be measured by a ratio: the number of favorable cases relative to the number of possible ones (that is, the probability of drawing the right card). As there is only one favorable case (the chosen card), this probability is one chance in 32.
“Now, how is the quantity of information obtained by drawing one card to be measured? Before the draw, there are 32 equally probable cases (P$_{0}$). After the draw, two situations may arise:
“Either the right card was drawn. In this case there is only one possible answer (the card in hand). The quantity of information obtained corresponds to the ratio 32/1, and the information is total.
“Or a useless card was drawn, and 31 possible answers remain. The quantity of information obtained corresponds to the ratio 32/31. The information is partial.
“The information obtained in the first situation solves the problem definitively by reducing the number of possible cases to 1. In the second, the number of possible cases is only slightly reduced. Each such reduction lowers the denominator of the ratio P$_{0}$/P$_{1}$: the ratio increases, and so does the information. That is to say, information increases as uncertainty decreases, because uncertainty expresses the lack of information about the unknown system.
“Finally, to measure information and to define units, two conventions are adopted. First, information is preferably defined subtractively rather than as a ratio, since information is the difference between two uncertainties (before and after the message); the ratio P$_{0}$/P$_{1}$ is therefore replaced by the difference of their logarithms. Second, as the most practical and widely used code for transmitting a message consists of two signs, 0 and 1, binary notation and base-2 logarithms are adopted. Under these conventions, the quantity of information in a message is measured in bits (i.e. binary digits)…
“Quantitative information thus appears as an abstract entity, objective and devoid of any human significance. It is easier to understand what a quantity of information is by likening it to material units circulating in a channel: for example, water molecules in a pipe. The flow in the pipe is limited by its cross-section. The same holds for any transmission line, such as a telephone wire. This quantity of information is entirely independent of the meaning of the message: a song, horse-race results, or stock-market quotations” (1975, p. 170-71).
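De ROSNAY's two conventions yield the familiar formula I = log₂(P₀/P₁) = log₂ P₀ − log₂ P₁, in bits. A minimal Python sketch of the 32-card example (the function name is illustrative, not from the source):

```python
from math import log2

def information_bits(p0: int, p1: int) -> float:
    """Information gained, in bits, when the number of equally
    probable answers drops from p0 (before the message) to p1 (after)."""
    return log2(p0 / p1)  # equivalently log2(p0) - log2(p1)

# The right card was drawn: total information, 32 -> 1 possible answers.
print(information_bits(32, 1))        # 5.0 bits

# A useless card was drawn: partial information, 32 -> 31 possible answers.
print(round(information_bits(32, 31), 4))  # 0.0458 bits
```

Note how strongly the measure rewards reducing the answer set: drawing the right card yields 5 bits at once, while eliminating a single wrong card yields under a twentieth of a bit.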

From a different viewpoint, I. PRIGOGINE and I. STENGERS state: “We may use the algorithmic theory of information proposed by CHAITIN and KOLMOGOROV: the measure of information should be the length of the program that would have to be given to a computer… in order to make it able to achieve the structure that we wish (to obtain)” (1992, p. 83). Even in this case, however, information is still related to an observer.
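The Chaitin-Kolmogorov measure itself is uncomputable, but the compressed length of a string gives a crude, commonly used upper bound on it: a highly regular structure admits a much shorter description than an incompressible one. A hedged Python sketch using zlib as that stand-in (a proxy only, not the actual Kolmogorov complexity):

```python
import random
import zlib

# A highly structured string: describable by a very short program
# ("repeat 'ab' 500 times"), so it compresses drastically.
regular = b"ab" * 500

# A pseudo-random string of the same length (seeded for reproducibility):
# no short description is apparent, so it barely compresses at all.
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

print(len(zlib.compress(regular)))  # a handful of bytes
print(len(zlib.compress(noisy)))    # close to the original 1000 bytes
```

Both strings are 1000 bytes long, yet their compressed descriptions differ by orders of magnitude, which is exactly the distinction the algorithmic measure captures.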