PARALLELISM (Different types of)
| Collection | International Encyclopedia of Systems and Cybernetics |
|---|---|
| Year | 2004 |
| Vol. (num.) | 2(2) |
| ID | 2470 |
| Object type | Discipline oriented, General information, Methodology or model |
(Described hereafter are the various degrees of parallelism that may be conceived for computers. However, these notions seem very valuable for understanding any parallel process in very complex systems, such as the brain or human societies).
- “**Replicative parallelism**:… Each transputer… replicates and executes the same code, but on a different set of data. This form of independent parallelism (the processes do not communicate much) is quite trivial” (T. MUNTEAN, 1988, p.1319). Indeed, this is practically a “multi-sequentiality”.
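This “same code, different data” scheme can be illustrated with a minimal sketch (a hypothetical example, not from the source): several workers replicate one function over disjoint data subsets and never exchange messages.

```python
# Sketch of replicative parallelism: every worker runs the identical code
# on its own subset of the data; the partial results are only combined at
# the very end, so the processes need not communicate while computing.
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    """The identical 'code' each worker replicates: here, a sum of squares."""
    return sum(x * x for x in chunk)

data = list(range(12))
chunks = [data[i::4] for i in range(4)]  # four disjoint data subsets

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(work, chunks))  # same code, different data

total = sum(partials)  # the only point where results are brought together
```

Because the workers share nothing, the result is identical to the sequential computation, which is why the entry can call this form “practically a multi-sequentiality”.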
- “**Geometric parallelism**: or ‘data structure oriented’, maintains more or less the same code, carried out by each processor endowed with the responsibility of one data subset only… The price to be paid for geometrical optimization of the data in a machine is extensive communication. Indeed, local computing in an area may require data resulting from computing in nearby areas or on processors farther away” (Ibid.).
The word “resulting” shows that this is organized sequentiality built from multiple simultaneities, governed by what the author calls a “Principle of locality”.
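The locality principle can be sketched with a hypothetical one-dimensional stencil (the names and the two-region split are illustrative assumptions, not from the source): each owner runs the same code on its own region of the domain, but must first receive boundary values from its neighbours, which is the “extensive communication” the quotation mentions.

```python
# Sketch of geometric ("data structure oriented") parallelism: the domain
# is split into contiguous regions, one per processor; every processor runs
# the same stencil code, but each local update needs boundary values that
# result from its neighbours' computations.

def step(region, left_halo, right_halo):
    """One local update: each cell becomes the mean of itself and its neighbours."""
    padded = [left_halo] + region + [right_halo]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(padded) - 1)]

domain = [float(i) for i in range(8)]
regions = [domain[0:4], domain[4:8]]  # two "processors", one data subset each

# Halo exchange: each region hands its edge cell to its neighbour before
# computing (the outer edges simply reuse their own boundary value).
left = step(regions[0], left_halo=regions[0][0], right_halo=regions[1][0])
right = step(regions[1], left_halo=regions[0][-1], right_halo=regions[1][-1])
result = left + right
```

The combined result equals what a single processor would compute over the whole domain: the decomposition changes who computes what, not what is computed, at the cost of the boundary exchanges.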
- “**Algorithmic parallelism**: Each transputer carries out only a small part of the algorithm or program, and the data cross the net from transputer to transputer (data flow). The communication between processors must be elaborate and efficient, and the workload must be correctly distributed” (Ibid.).
Obviously, in this last case, communication must be constrained by the algorithm so that the net is sufficiently organized to be coherent and yet not totally overconnected. It remains to be seen whether it would be feasible to create self-organizing algorithms, able to develop themselves from some initial configuration, using the interactions between the system and its environment for internal learning.
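The data-flow scheme described in the quotation can be sketched as a pipeline (a minimal hypothetical example, assuming threads and queues stand in for transputers and their links): each stage holds only a small part of the program, and data items cross from stage to stage.

```python
# Sketch of algorithmic parallelism as a pipeline: each stage runs a small
# part of the overall program; data flow through the net of stages, and the
# queues play the role of the inter-processor communication links.
import queue
import threading

def stage(fn, inbox, outbox):
    """One 'transputer': apply its part of the program to each passing item."""
    while True:
        item = inbox.get()
        if item is None:          # sentinel: propagate shutdown downstream
            outbox.put(None)
            return
        outbox.put(fn(item))

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    threading.Thread(target=stage, args=(lambda x: x + 1, q0, q1)),
    threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)),
]
for t in stages:
    t.start()

for x in [1, 2, 3]:               # data enter the net...
    q0.put(x)
q0.put(None)

results = []
while (item := q2.get()) is not None:   # ...and leave it fully transformed
    results.append(item)
for t in stages:
    t.join()
```

Note how the workload balance the quotation demands appears here concretely: if one stage's function is much slower than the others, items pile up in its inbox queue and the whole net runs at the pace of that stage.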