Information

From glossaLAB

[gL.edu] This article gathers contributions developed by Vedat Selimi and José María Díaz Nafría within the context of the Conceptual clarification about "Information, Knowledge and Philosophy", under the supervision of J.M. Díaz Nafría.

Teacher's Comments: This article requires the corrections indicated below:
  • This article has been extracted from the contribution developed by Vedat Selimi in Information ethics, which has been modified to avoid redundancies. In addition, some additions and corrections have been introduced by José María Díaz Nafría in order to offer a wider outlook.
  • It is worth enlarging the current article, addressing some of the understandings pointed out in the paper, and many others that have not even been mentioned.
  • Some of the works by Capurro, Floridi, Segal and Díaz-Nafría referred to in the article can be used to deepen the understanding of information.

Information is a term that has different meanings in different contexts and disciplines, but in its most common sense it is understood as something that conveys knowledge or reduces uncertainty in a communication process. Many of the scientific notions, however, cannot be reduced to this epistemological perspective: as soon as the interaction does not involve conscious agents, we cannot speak of knowledge or certainty, but we can still speak of information.

Historical development of the notion of information

The origin of the word "information" lies in Latin: "informatio" means concept or outline, and "informare" means to teach or mould, where the teaching goes beyond mere instruction and also refers to the shaping of behaviour, which obviously connects to ethics. The moulding, in turn, can be understood in both an epistemological and an ontological sense, i.e., as moulding the mind, the soul, or matter. These distinctions partially exhibit the variety and complexity of the term's understanding in antiquity, which takes up fundamental concepts of Greek philosophy such as eidos, idea, morphe or typos, and contains the seed of the modern understandings of the information concept, which are not restricted to the epistemological but also include the ontological ones.[1][2]

In modern language, information is often used in the sense of "communicating something to someone", but since the development of the scientific concept of information the ontological understanding has also been supported in the natural and technical sciences. A turning point in the understanding of information came with the development of its scientific concept, which was in turn linked to the emergence of several scientific disciplines, such as thermodynamics, statistical and quantum physics, genetics and telecommunications, at the beginning of the 20th century.[3] In its maturity, it was Claude Shannon's 1948 approach to the telecommunication problem that is often considered the foundation of the scientific concept of information.[4] However, despite the broad reception of his theory throughout the scientific realm, as Shannon himself later complained, it was not ready to embrace essential aspects of information.

The mathematical approach of Claude Shannon

Claude Shannon, a pioneer of information theory, offers a mathematical approach to defining information. He defines information in terms of the receiver's uncertainty about a message sent from a sender. Shannon gives information a numerical value based on probability, known as information entropy or Shannon entropy, which corresponds to the measure of the reduction of uncertainty for the receiver. According to Shannon, the information content of a message is high if the reported event is surprising, i.e. has a low probability of happening. If, on the other hand, everything is completely predictable, the message has no information value whatsoever (Shannon & Weaver, 1949)[5].
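The relation between probability and information value described above can be sketched with Shannon's entropy formula, H = -Σ p·log₂(p), which gives the average information content of a source in bits. The following is a minimal illustrative sketch, assuming a discrete source with known symbol probabilities:

```python
import math

def shannon_entropy(probabilities):
    """Average information content (in bits) of a source with the
    given symbol probabilities: H = -sum(p * log2(p))."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: 1 bit per outcome.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is almost predictable, so a message about
# its outcome carries little information on average.
print(shannon_entropy([0.99, 0.01]))  # ≈ 0.081

# A completely predictable source conveys no information at all.
print(shannon_entropy([1.0]))         # 0.0
```

The three cases mirror the text: the surprising (improbable) outcome contributes the most information, while the certain event contributes none.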

As a consequence, novelty represents a crucial aspect of information within Shannon's theory, one that has indeed been present in the different historical understandings of information. The newer the information, the more informative and relevant it becomes; information that is already widely known or taken for granted is considered less informative. This aspect implies a clear difference between information and knowledge while at the same time providing a connection between them, allowing information to be regarded as a source of new knowledge.[6]

However, Shannon's approach was focused, as he acknowledges at the beginning of his seminal work, on a single aspect, the syntactic one, which is the relevant one for the technical problem of telecommunication, while the semantic and pragmatic sides are also essential to a full understanding of information, as he himself states.[4] Weaver tried to cover these aspects in the co-authored work of 1949,[5] though he used the very same model to embrace the breadth of information, which did not suffice (see the final paragraph of "Entropy and Information" in glossariumBITri's article about Shannon).

Different definitions of information

In many scientific and philosophical discussions there is no standardised definition of information, as thoroughly discussed by Díaz-Nafría.[7] One significant difference separates the ontological and the epistemological understandings, and each of these is further divided into a variety of concepts highly dependent on the context of application. In ontology, information is seen as pertaining to being and the entities that exist, while in epistemology information is defined in connection to knowledge.[6] On the ontological side, the physical and chemical notions are more related to structure, the biological ones to function. In the cognitive sciences the focus is on epistemological and psychological processes, while in the social sciences social interaction is in the spotlight. Within this variety of understandings, it is the simple and fuzzy definition provided by the anthropologist and cybernetician Gregory Bateson that best reconciles the different understandings: "Information is a difference that makes a difference".[8]

At the epistemological pole there is also some consensus about the existing relation between data, information and knowledge (referred to in the article Data-Information-Knowledge-Wisdom Model), though the nature and extension of this relation is disputed.[9][6] The connection to data is relevant not only for the technical sciences but also of general and global concern due to the centrality of the data-driven economy.[10] This connection is stressed by Floridi in his General Definition of Information, in which information consists of well-formed and meaningful data (2010),[11] while data, according to Floridi's diaphoric definition, are not interpreted. For example, 100 questionnaires can provide data, but it is only when they are aggregated and analysed in a study that they become information. Hansson (2002) emphasises that the processing and presentation of data is necessary to make it accessible as information.

References

  1. Capurro, R. (1978). Information. Ein Beitrag zur etymologischen und ideengeschichtlichen Begründung des Informationsbegriffs. Munich: Saur, who also offers a briefer but sharp account in his co-authored work Capurro, R. & Hjørland, B. (2003). The Concept of Information. Annual Review of Information Science and Technology 37(8), 343-411. [Draft version available online] <http://www.capurro.de/infoconcept.html> [Accessed: 25/07/2024].
  2. Capurro, R. (2009). Past, present and future of the concept of information. TripleC, 7(2): 125-141. DOI: https://doi.org/10.31269/triplec.v7i2.113
  3. Segal, J. (2011). Le Zéro et le Un. Histoire de la notion d'information au XXe siècle. Paris: Éd. Metéorologiques.
  4. Shannon, C. E. (1948). A Mathematical Theory of Communication. The Bell System Technical Journal, 27, 379–423, 623–656.
  5. Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication (6th ed.). Urbana: University of Illinois Press.
  6. Díaz-Nafría, J.; Pérez-Montoro, M. (2011). Is Information a Sufficient Basis for Cognition? Part 1: Critique of Dretske's Approach. TripleC, 9(2): 358-366. DOI: https://doi.org/10.31269/triplec.v9i2.285
  7. Díaz-Nafría, J.M. (2010). What is Information? A multidimensional concern. TripleC, 8(1): 77-108. DOI: https://doi.org/10.31269/triplec.v8i1.76
  8. Bateson, G. (1972). Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. University of Chicago Press.
  9. Díaz-Nafría, J.M.; Zimmermann, R. (2013). Emergence and evolution of meaning. Triple C, 11(1): 13-35. DOI: https://doi.org/10.31269/triplec.v11i1.334
  10. Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30: 75–89.
  11. Floridi, L. (2010). Information. A Very Short Introduction. London: Oxford University Press.