== '''Information and Determinism''' ==
=== Information by Claude Shannon ===
As stated in the [[Information|article about information]], Claude Shannon defines information as the degree to which a message surprises the receiver it is sent to. He quantifies this surprise as the amount by which the message reduces the receiver's uncertainty, a probabilistic measure called information entropy (also Shannon entropy).
According to Shannon, the less likely an event is to occur, the higher the informative content of a message reporting it to the receiver, and vice versa.
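This inverse relationship between probability and informative content can be sketched in a few lines of Python. The function names below are illustrative, not Shannon's own notation; the quantities are the standard self-information −log₂ p and the entropy −Σ p log₂ p, both in bits:

```python
import math

def surprisal(p):
    """Self-information, in bits, of an event with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The less likely the event, the more informative the message:
assert surprisal(0.125) == 3.0   # a 1-in-8 event carries 3 bits
assert surprisal(0.5) == 1.0     # a coin flip carries 1 bit

# In a fully deterministic world every event has probability 1,
# so a message reporting it carries no information at all:
assert surprisal(1.0) == 0.0
assert entropy([1.0]) == 0.0     # a certain outcome has zero entropy
```

The last two assertions illustrate the article's point: once an outcome is certain, a message announcing it reduces no uncertainty, so its Shannon information is zero.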
In a deterministic world, however, any discussion of the probability of an event would become unnecessary: every event would happen with certainty, leaving no room for alternative events at the same place and time. By Shannon's definition, the role of information would therefore be drastically diminished.
How much Shannon's concept of information would lose its significance depends on the receiver's ability to deduce the content of a message independently, relying on knowledge of natural laws, logical reasoning, and familiarity with previous events and their causal relationships to the event in question.
In a deterministic world where humans could deduce any event given enough data, a message about an event would convey no information whatsoever.
Every outcome would be entirely predictable, rendering the concept of information, in Shannon's sense, obsolete.
== '''Knowledge and Determinism''' ==