Entropy or Amount of Information
| Collection | GlossariumBITri |
|---|---|
| Author | José María Díaz-Nafría |
| Editor | José María Díaz-Nafría |
| Year | 2016 |
| Volume | 2 |
| Number | 1 |
| ID | 6 |
| Object type | Concept |
| Domain | MTC |
| es | entropía/cantidad de información |
| fr | entropie/quantité d’information |
| de | Entropie/Informationsgehalt |
The entropy or amount of information of a discrete information source, characterised by the probabilities $p_j$ of sending each of its symbols $j$, is the statistical average:

$$H = -\sum_{j=1}^{N} p_j \log_2 p_j$$

being bounded within the limits $0 \leq H \leq \log_2 N$, where $N$ is the number of symbols.
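As a minimal sketch of this definition (illustrative Python, not part of the original entry; the example distribution is invented), the entropy of a discrete source can be computed directly from its symbol probabilities and checked against the bounds above:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_j p_j log2(p_j), in bits per symbol.

    Terms with p_j = 0 contribute nothing, since p*log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A four-symbol source with a skewed distribution.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
print(f"H = {H:.3f} bits/symbol")        # 1.750
assert 0 <= H <= math.log2(len(probs))   # bounds: 0 <= H <= log2(N)
```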
In case the source can adopt various states $i$, with $P_i$ the probability of state $i$ and $p_i(j)$ the probability of sending symbol $j$ when the source is in state $i$, the entropy is defined as the average of the entropies of each state:

$$H = \sum_i P_i H_i = -\sum_i P_i \sum_j p_i(j) \log_2 p_i(j)$$
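The state-dependent case can be sketched the same way (again illustrative Python; the two-state source and its probabilities are invented for the example), weighting each per-state entropy by its state probability:

```python
import math

def entropy(probs):
    # H = -sum_j p_j log2(p_j); terms with p_j = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two-state source: P[i] is the probability of state i,
# p[i][j] the probability of emitting symbol j while in state i.
P = [0.6, 0.4]
p = [[0.5, 0.5],   # state 0: H_0 = 1 bit
     [0.9, 0.1]]   # state 1: H_1 ≈ 0.469 bits

# H = sum_i P_i * H_i: the average of the per-state entropies.
H = sum(P_i * entropy(p_i) for P_i, p_i in zip(P, p))
print(f"H = {H:.3f} bits/symbol")   # ≈ 0.788
```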
According to Floridi (2005), the entropy H might designate three equivalent quantities in the ideal case of a noiseless channel: 1) “the average amount of information per symbol produced by the informer”; 2) the “average amount of data deficit (Shannon’s uncertainty) that the informee has before inspection of the output of the informer”; 3) “informational potentiality”.
Since the first two interpretations assume that a definite uncertainty corresponds to each symbol (whether at emission or at reception), they imply a certain tacit agreement regarding the alphabet or the informational game in which the agents are immersed. In both cases, the information can be quantified only on the condition that the probability distribution can be specified.
Concerning the third interpretation, entropy might be understood as a physical magnitude related to the amount of disorder in processes or systems conveying energy or information. The larger the entropy, the higher the number of physical states in which the system can be found and, consequently, the more information it can refer to. In other words, specifying the state of a given system requires more information as its entropy increases. Numerically, this is equivalent to the amount of information or data that has to be given in order to specify the state.
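A brief worked example of this numerical equivalence: a system with $N$ equally probable states has entropy $\log_2 N$, which is exactly the number of binary digits needed to specify one of its states (illustrative Python; $N = 8$ is chosen arbitrarily):

```python
import math

N = 8                                    # a system with 8 equiprobable states
probs = [1 / N] * N
H = -sum(p * math.log2(p) for p in probs)
print(H)                                 # 3.0 bits of entropy
print(len(format(N - 1, "b")))           # 3 binary digits label states 0..7
```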
References
- FLORIDI, L. (2005). Semantic Conceptions of Information. In E. N. Zalta (ed.), Stanford Encyclopedia of Philosophy. Stanford: The Metaphysics Research Lab. [online] http://plato.stanford.edu/entries/information-semantic/ [accessed: 12/11/09]