Alphabet
| Collection | GlossariumBITri |
|---|---|
| Author | José María Díaz-Nafría |
| Editor | José María Díaz-Nafría |
| Year | 2010 |
| Volume | 1 |
| Number | 1 |
| ID | 3 |
| Object type | Concept Resource |
| Domain | ICT Transdisciplinary |
| es | alfabeto |
| fr | alphabet |
| de | Alphabet |
The term (from Latin alphabētum, and this from Greek ἄλφα, alpha, and βῆτα, beta) was originally used to refer to writing systems whose symbols (letters) stand in rough correspondence with the phonemes of the spoken language, in contrast to writings in which the correspondence is established with morphemes or syllables. The usage has since been extended to denote the set of symbols employed in a communication system. This is the sense normally used in communication theory, and particularly in models of information transmission (especially at the syntactic level, as in the case of the MTC), where it labels the finite set of symbols or messages making up the code, which must be known to both the emitter and the receiver.
Two fundamental features characterise an alphabet with regard to communication efficiency: 1) its adequacy to the constraints of the communication channel (e.g., whether the stroke can be continuous or not, or whether the spectral content must be limited to a given range); 2) the differentiability of its component symbols. The former matters because only what succeeds in crossing the channel is effective; the latter because reception quality in noisy environments depends on how distinguishable the symbols are. Indeed, Kotelnikov (1959) proved that the detection error probability is a function of such differences (measured in terms of energy relative to the noise spectral density).
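Kotelnikov's result can be illustrated numerically. The sketch below is a minimal illustration, assuming binary antipodal signalling over an additive white Gaussian noise channel (one standard case of the optimum-receiver analysis, not a reconstruction of Kotelnikov's own derivation): the error probability is P_e = Q(√(2E/N₀)), where E is the symbol energy and N₀ the noise spectral density, so less differentiable symbols (small E relative to N₀) are misdetected more often.

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def error_probability(energy: float, n0: float) -> float:
    """Detection error probability for binary antipodal signals in AWGN.

    The two candidate waveforms differ by an energy-related distance,
    giving P_e = Q(sqrt(2E/N0)): the smaller the symbol energy E is
    relative to the noise spectral density N0, the likelier the error.
    """
    return q_function(math.sqrt(2.0 * energy / n0))

# More symbol energy relative to noise -> fewer detection errors.
for e_n0 in (0.0, 1.0, 4.0):
    print(f"E/N0 = {e_n0}: P_e = {error_probability(e_n0, 1.0):.4f}")
```

At E/N₀ = 0 the two symbols are indistinguishable and the receiver can only guess (P_e = 0.5); as the energy difference grows, the error probability falls monotonically.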
Alphabets derived from natural languages exhibit two features relevant to efficient coding for transmission through artificial channels: 1) the statistical frequency of each symbol, and 2) the statistical dependence between a symbol and its adjacent ones (i.e., the transition probability to a symbol j when the previous symbol was i, or a given sequence). Alfred Vail's observation of the first feature in the development of the Morse code played a major role in the success of the Morse telegraph (Oslin 1992), and it probably also served an important heuristic role in the forging of the concept of a measure of information, especially in Hartley's and Shannon's work (Lundheim 2002, Segal 2003). Shannon accounts for both features in his famous “Mathematical Theory of Communication” in order to determine the entropy (or amount of information) of a source (Shannon 1948).
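The first feature feeds directly into Shannon's source entropy. The sketch below computes the zeroth-order entropy H = −Σ pᵢ log₂ pᵢ from symbol counts (the frequency values are made-up for illustration, not Shannon's English-letter statistics): a uniform alphabet maximises H, while skewed frequencies, of the kind Vail exploited in the Morse code, lower it and make shorter average encodings possible.

```python
import math

def entropy(freqs: dict[str, float]) -> float:
    """Zeroth-order Shannon entropy H = -sum p*log2(p), in bits per symbol."""
    total = sum(freqs.values())
    return -sum((p / total) * math.log2(p / total)
                for p in freqs.values() if p > 0)

# A uniform 4-symbol alphabet carries log2(4) = 2 bits per symbol...
uniform = {"a": 1, "b": 1, "c": 1, "d": 1}
# ...while skewed frequencies (hypothetical counts) carry less, which is
# what variable-length codes such as Morse exploit with shorter codewords
# for more frequent symbols.
skewed = {"a": 8, "b": 4, "c": 2, "d": 2}

print(entropy(uniform))  # 2.0 bits/symbol
print(entropy(skewed))   # 1.75 bits/symbol
```

Capturing the second feature, the dependence between adjacent symbols, would require conditional (Markov) entropies over symbol pairs, which lower the per-symbol information further, as Shannon (1948) shows.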
References
- KOTELNIKOV, V. A. (1959). The Theory of Optimum Noise Immunity (Russian original, 1956). USA: McGraw-Hill.
- LUNDHEIM, L. (2002). “On Shannon and ‘Shannon's Formula’”. Telektronikk (special issue on “Information theory and its applications”), vol. 98, no. 1, pp. 20-29.
- OSLIN, G. P. (1992). The Story of Telecommunications. Macon, GA, USA: Mercer University Press.
- SEGAL, J. (2003). Le Zéro et le Un. Histoire de la notion scientifique d'information. Paris: Syllepse.
- SHANNON, C. E. (1948). “A Mathematical Theory of Communication”. The Bell System Technical Journal, vol. 27 (July, October), pp. 379-423, 623-656.