<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.glossalab.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Antonio.Lischke</id>
	<title>glossaLAB - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.glossalab.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Antonio.Lischke"/>
	<link rel="alternate" type="text/html" href="https://www.glossalab.org/wiki/Special:Contributions/Antonio.Lischke"/>
	<updated>2026-04-30T20:19:54Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.43.6</generator>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30772</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30772"/>
		<updated>2026-01-09T11:14:03Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: changed the feedback link&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:In review&lt;br /&gt;
}}&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. An adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from [[gB:Cybernetics|cybernetics]], [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics one by one.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] cannot decrease. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - sealed off from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must keep adjusting this exchange to changing conditions over time: they need to be adaptive.&lt;br /&gt;
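Schrödinger's point can be written as a standard entropy balance. The decomposition below follows Prigogine's common convention (the notation is a textbook standard, not taken verbatim from the cited works):

```latex
% Entropy change of an open system: internal production plus exchange
\frac{dS}{dt} = \frac{d_i S}{dt} + \frac{d_e S}{dt},
\qquad \frac{d_i S}{dt} \ge 0 .
% For an isolated system d_e S/dt = 0, so S cannot decrease.
% An open system can hold or lower its local entropy whenever it
% exports enough entropy to its environment:
\frac{dS}{dt} \le 0
\quad\Longleftrightarrow\quad
\frac{d_e S}{dt} \le -\frac{d_i S}{dt} .
```

The second law is preserved because the exported entropy raises the entropy of the environment by at least as much as the system's local decrease.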
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This coupling is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for every i the rate of change depends not only on the system&#039;s own i-th variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
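The coupled structure behind Figure 1 can be sketched numerically. Below is a minimal Python illustration of a two-variable system in Bertalanffy's spirit; the coefficients are illustrative assumptions, chosen only so the system settles to a steady state:

```python
# Bertalanffy-style coupled rate equations: dQ_i/dt = f_i(Q_1, ..., Q_n).
# The coefficients here are illustrative, not taken from the cited texts.

def step(q1, q2, dt=0.01):
    """One Euler step of a two-variable coupled system."""
    dq1 = -0.5 * q1 + 0.3 * q2   # q1's rate also depends on q2
    dq2 = 0.2 * q1 - 0.4 * q2    # q2's rate also depends on q1
    return q1 + dt * dq1, q2 + dt * dq2

def simulate(q1, q2, steps=10_000):
    for _ in range(steps):
        q1, q2 = step(q1, q2)
    return q1, q2

# Both components decay toward 0.0, but along a trajectory shaped by
# the cross-terms (0.3 and 0.2), not by either variable alone.
print(simulate(1.0, 0.0))
```

Zeroing the cross-terms changes the trajectory, which is exactly why the parts cannot be understood in isolation.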
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[gB:Feedback|feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[gB:Feedback|feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
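The contrast, equifinality included, can be made concrete with a toy simulation (the rates below are illustrative assumptions, not from the cited sources): a closed system whose final state still carries its initial conditions, versus an open system that reaches the same steady state from different starting points.

```python
# Closed system: A <-> B with the total amount conserved, so the
# "final state" is fixed by the initial conditions.
def closed(a, b, steps=20_000, dt=0.01):
    for _ in range(steps):
        flow = 0.3 * a - 0.2 * b
        a, b = a - dt * flow, b + dt * flow
    return a + b  # conserved total

# Open system: dx/dt = inflow - k*x reaches the steady state inflow/k,
# whatever x(0) was -- equifinality through exchange with the environment.
def open_system(x, inflow=2.0, k=0.5, steps=20_000, dt=0.01):
    for _ in range(steps):
        x += dt * (inflow - k * x)
    return x

print(open_system(0.0), open_system(10.0))  # both settle near 4.0
print(closed(1.0, 0.0), closed(5.0, 0.0))   # totals stay near 1.0 and 5.0
```

The open system's steady state is set by the flows (inflow/k), not by history; the closed system never forgets where it started.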
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses the &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management [[gB:Cybernetics|cybernetics]] demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. [[gB:Cybernetics|Cybernetics]], as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[gB:Feedback|feedback]] mechanism. [[gB:Feedback|Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[gB:Feedback|feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[gB:Feedback|feedback]] occurs to reverse this. Positive [[gB:Feedback|feedback]], on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
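The thermostat example can be sketched as a bang-bang controller holding the critical variable between desired limits. All constants below (ambient temperature, heating power, switching limits) are illustrative assumptions:

```python
# A thermostat as a homeostat: a critical variable (temperature) held
# between desired limits by a self-regulatory on/off mechanism.
# Constants are illustrative, not physical measurements.

def run_thermostat(temp, low=19.0, high=21.0, steps=5000, dt=0.1):
    heater_on = temp < low
    history = []
    for _ in range(steps):
        if temp < low:
            heater_on = True       # too cold: switch heating on
        elif temp > high:
            heater_on = False      # too warm: switch heating off
        heating = 2.0 if heater_on else 0.0
        # Newtonian cooling toward an ambient of 10.0, plus heat input.
        temp += dt * (-0.1 * (temp - 10.0) + heating)
        history.append(temp)
    return history

hist = run_thermostat(15.0)
# After the initial transient the temperature cycles within the limits.
print(min(hist[1000:]), max(hist[1000:]))
```

The on/off rule is the self-regulatory mechanism: deviation beyond a limit triggers the action that reverses it.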
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
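A toy model makes Ashby's bound tangible. The setup below is an assumed illustration (not from the cited texts): outcomes are `(disturbance + response) % N`, the regulator wants the outcome held at 0, and the number of distinct outcomes it can force shrinks with its variety.

```python
# Toy illustration of the Law of Requisite Variety: a regulator with
# too few distinguishable responses cannot hold the outcome constant.
import math

def best_outcome_variety(n_disturbances, responses):
    """Smallest number of distinct outcomes a regulator limited to the
    given responses can force, picking the best response per disturbance."""
    outcomes = set()
    for d in range(n_disturbances):
        # Pick the response that drives the outcome closest to the goal 0.
        best = min((d + r) % n_disturbances for r in responses)
        outcomes.add(best)
    return len(outcomes)

# Requisite variety (as many responses as disturbances): full control.
print(best_outcome_variety(6, responses=range(6)))  # 1 outcome
# Insufficient variety: outcomes proliferate to at least ceil(V_D / V_R).
print(best_outcome_variety(6, responses=[0, 3]))    # 3 outcomes
print(math.ceil(6 / 2))                             # Ashby's lower bound
```

With only two responses against six disturbances, the best achievable outcome variety is three, exactly Ashby's ratio.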
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[gB:Feedback|feedback]] mechanism, demonstrates the elegant simplicity of [[gB:Feedback|feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
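The governor's proportional negative feedback can be sketched in a few lines. This is a hedged caricature, not a physical model: the gains, load and friction constants are illustrative assumptions.

```python
# Watt-governor-style proportional feedback: as speed rises above the
# setpoint, the valve closes in proportion, cutting the power admitted.
# All constants are illustrative assumptions.

def run_governor(speed, setpoint=100.0, load=5.0, gain=0.5,
                 steps=20_000, dt=0.01):
    for _ in range(steps):
        # Valve opening shrinks as speed exceeds the setpoint (clamped 0..1).
        valve = max(0.0, min(1.0, 0.5 + gain * (setpoint - speed) / setpoint))
        power = 20.0 * valve                     # energy admitted
        speed += dt * (power - load - 0.05 * speed)  # load + friction
    return speed

# Self-regulation: very different starting speeds settle at the same value,
# because the output (speed) adjusts the input (admitted power) itself.
print(run_governor(50.0), run_governor(150.0))
```

The loop closes exactly as Beer describes: output feeds back to throttle input, so the machine homes in on one speed.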
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System 1 variety accessible to System 3 at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions.&lt;br /&gt;
&lt;br /&gt;
Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
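The "Russian dolls" recursion can be expressed as a data structure. This is an assumed structural sketch, not Beer's formal specification: every viable system carries the same metasystemic functions, and each System 1 unit is itself a viable system.

```python
# Structural sketch of VSM recursion: viable systems contain viable
# systems, with the same Systems 1-5 pattern at every level.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ViableSystem:
    name: str
    # System 1: operational units, each itself a viable system.
    operations: List["ViableSystem"] = field(default_factory=list)
    # Systems 2-5 (the metasystem) are present at every recursion level.
    metasystem: Tuple[str, ...] = ("S2 coordination", "S3 control",
                                   "S4 intelligence", "S5 policy")

    def depth(self) -> int:
        """Number of recursion levels at and below this system."""
        if not self.operations:
            return 1
        return 1 + max(unit.depth() for unit in self.operations)

# The article's industrial example, nested like Russian dolls:
plant = ViableSystem("steel works")
sector = ViableSystem("iron and steel", operations=[plant])
industry = ViableSystem("heavy industry", operations=[sector])
print(industry.depth())  # three recursion levels, same five systems at each
```

The point of the sketch is purely structural: the metasystem tuple is identical at every level, while `operations` carries the recursion.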
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The 3-4 Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks the production of obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[gB:Feedback|Feedback]] loops connect action to consequences. Negative [[gB:Feedback|feedback]] stabilizes by reversing deviations; positive [[gB:Feedback|feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
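The first-order/second-order distinction can be illustrated with a deliberately small sketch (an assumed toy example, not from Beer): tuning a parameter inside a fixed model versus replacing the model itself when tuning hits its limit.

```python
# First-order learning: improve a parameter within a fixed framework.
# Second-order learning: change the framework (the model) itself.
# The models, data and learning rate are assumed for illustration.

def sse(model, param, data):
    """Sum of squared errors of a one-parameter model on the data."""
    return sum((model(x, param) - y) ** 2 for x, y in data)

def first_order(model, param, data, lr=0.005, steps=200, h=1e-3):
    """Gradient descent on one parameter, framework held fixed."""
    for _ in range(steps):
        grad = (sse(model, param + h, data)
                - sse(model, param - h, data)) / (2 * h)
        param -= lr * grad
    return param

linear = lambda x, a: a * x          # framework 1
quadratic = lambda x, a: a * x * x   # framework 2
data = [(x, 2.0 * x * x) for x in (1.0, 2.0, 3.0)]  # truth is quadratic

a_lin = first_order(linear, 0.0, data)
a_quad = first_order(quadratic, 0.0, data)  # the second-order move
# Tuning within the wrong framework plateaus at a large residual error;
# switching frameworks lets the error vanish.
print(sse(linear, a_lin, data) > 1.0, sse(quadratic, a_quad, data) < 1e-9)
```

First-order learning converges, but only to the best the linear framework allows; the second-order step (adopting the quadratic framework) is what eliminates the error.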
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor is created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30759</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30759"/>
		<updated>2026-01-08T15:20:49Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: added hyperlinks&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. An adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from [[gB:Cybernetics|cybernetics]], [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics one by one.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] cannot decrease. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - sealed off from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must keep adjusting this exchange to changing conditions over time: they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This coupling is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the system does not depend only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses the &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495 Dynamis - Metzler Lexikon]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
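&lt;br /&gt;
The constancy-through-change of a steady state can be illustrated with a minimal open-system sketch in Python (invented numbers, not from the cited sources): a stock with constant inflow and level-dependent outflow approaches the same steady state from very different starting conditions - a toy picture of equifinality.&lt;br /&gt;
```python
# Equifinality sketch (illustrative numbers): an open "stock" with
# constant inflow and outflow proportional to its level approaches
# the SAME steady state x* = inflow / k, whatever state it starts from.

def relax(x, inflow=2.0, k=0.5, dt=0.01, steps=5000):
    for _ in range(steps):
        x = x + dt * (inflow - k * x)   # continuous throughput, not stasis
    return x

x_from_empty = relax(0.0)    # path 1: start with nothing
x_from_excess = relax(20.0)  # path 2: start overfull
gap = abs(x_from_empty - x_from_excess)  # shrinks toward zero
```
Both runs end near the analytic steady state inflow / k = 4.0, by different routes - path-independence that a closed, trajectory-determined system cannot show.&lt;br /&gt;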
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management [[gB:Cybernetics|cybernetics]] demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. [[gB:Cybernetics|Cybernetics]], as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes how the influence of an element on other elements, through a chain of relationships, eventually feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, either produces contained replication, growth or contraction, or leads to uncontained and unstable growth or contraction. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
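&lt;br /&gt;
A minimal numeric sketch (hypothetical gain values, my own illustration) shows the two loop polarities: with negative feedback a deviation from the target shrinks each cycle, with positive feedback it compounds.&lt;br /&gt;
```python
# Loop polarity sketch: a deviation e from the target state evolves as
# e -> (1 + g) * e per cycle, where g is the overall loop gain.
# Gain values are invented for illustration.

def run_loop(gain, e=1.0, cycles=50):
    for _ in range(cycles):
        e = (1.0 + gain) * e
    return e

damped = run_loop(-0.2)    # negative feedback: 0.8**50, deviation dies out
amplified = run_loop(0.2)  # positive feedback: 1.2**50, deviation explodes
```
The same structure yields stabilization or runaway growth depending only on the sign of the loop gain.&lt;br /&gt;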
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is the regulation of the blood, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: birds feed on caterpillars, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
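&lt;br /&gt;
The thermostat example can be sketched as a small simulation (all constants are invented for illustration): a heater is switched off above the desired band and on below it, and the critical variable is held between the limits despite a colder environment.&lt;br /&gt;
```python
# Thermostat-as-homeostat sketch (hypothetical constants): on/off
# switching holds the temperature between desired limits.

LOW, HIGH = 19.0, 21.0   # desired band in degrees C
AMBIENT = 5.0            # outside temperature pulling the room down

def simulate(temp=15.0, steps=2000, dt=0.1):
    heater_on = True
    trace = []
    for _ in range(steps):
        if temp > HIGH:
            heater_on = False
        if LOW > temp:
            heater_on = True
        heating = 2.0 if heater_on else 0.0
        # the room gains heat from the heater, loses it to the outside
        temp = temp + dt * (heating + 0.1 * (AMBIENT - temp))
        trace.append(temp)
    return trace

settled = simulate()[500:]                 # discard the warm-up transient
t_min, t_max = min(settled), max(settled)  # stays near the 19-21 band
```
The temperature oscillates within (roughly) the desired band indefinitely - the signature of a homeostat rather than a fixed equilibrium.&lt;br /&gt;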
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
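&lt;br /&gt;
The law can be illustrated with a toy regulation game (my own construction, not a formal treatment): the environment produces one of N disturbances, the regulator answers with a response, and the outcome is their sum modulo N. Only a regulator whose variety matches the disturbance variety can hold the outcome to a single value.&lt;br /&gt;
```python
# Toy version of a regulation game in the spirit of Ashby's law:
# disturbance d in 0..N-1, regulator response r, outcome = (d + r) mod N.

N = 6  # variety of the disturbance: six distinguishable states

def outcome_variety(responses):
    """Greedy best effort: for each disturbance, reuse an already
    tolerated outcome if one is reachable, else accept a new one.
    Returns how many distinct outcomes the regulator ends up with."""
    outcomes = set()
    for d in range(N):
        reachable = {(d + r) % N for r in responses}
        if not reachable.intersection(outcomes):
            outcomes.add(min(reachable))
    return len(outcomes)

matched = outcome_variety(range(N))  # full variety: one outcome only
starved = outcome_variety([0, 1])    # 2 responses against 6 disturbances
```
With full variety the regulator pins the outcome to a single value; with only two responses against six disturbances, several distinct outcomes survive however the regulator chooses - only variety can absorb variety.&lt;br /&gt;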
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
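&lt;br /&gt;
The governor can be approximated as proportional feedback (a sketch with hypothetical constants, not an engineering model): the valve opening, and with it the power admitted, falls as the speed rises above the set point.&lt;br /&gt;
```python
# Proportional-feedback sketch of a Watt governor. All constants are
# invented; the point is only the feedback structure.

SET_POINT = 100.0  # desired engine speed
K_VALVE = 0.8      # how strongly the flying arms close the valve

def run_engine(speed=60.0, load=20.0, steps=3000, dt=0.01):
    for _ in range(steps):
        # valve closes in proportion as speed exceeds the set point
        valve = max(0.0, 1.0 + K_VALVE * (SET_POINT - speed) / SET_POINT)
        power = 30.0 * valve                      # steam admitted
        speed = speed + dt * (power - load - 0.1 * speed)
    return speed

regulated = run_engine()          # settles at the set point
heavier = run_engine(load=25.0)   # heavier load: small steady droop
```
Under a heavier load the same loop settles slightly below the set point - the steady-state offset characteristic of purely proportional control.&lt;br /&gt;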
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System 1 variety accessible to System 3 at one recursion equals the variety disposed by the metasystem at the next recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations: while each operational unit pursues its activities, higher-level functions for coordination, control, strategy and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
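&lt;br /&gt;
The recursive structure can be sketched as a small data model (purely illustrative; the names merely echo the industrial example above): every viable system carries the same five functions, and its System 1 units are themselves viable systems one recursion level below.&lt;br /&gt;
```python
# Toy sketch of VSM recursion: the same five functions at every level,
# with System 1 units that are themselves viable systems.

class ViableSystem:
    # the five functional systems present at EVERY recursion level
    FUNCTIONS = ("System 1", "System 2", "System 3", "System 4", "System 5")

    def __init__(self, name, units=()):
        self.name = name
        self.units = list(units)   # System 1: embedded viable systems

    def recursion_depth(self):
        if not self.units:
            return 1
        return 1 + max(u.recursion_depth() for u in self.units)

industry = ViableSystem("heavy industry", [
    ViableSystem("iron and steel", [
        ViableSystem("steel works", [ViableSystem("production process")]),
    ]),
])

levels = industry.recursion_depth()  # four nested recursion levels
```
Each level is a complete viable system in its own right, which is exactly what distinguishes recursion from a mere organizational chart.&lt;br /&gt;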
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Systems 1-3 manage daily operations, while System 4 manages the future across the different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The 3-4 Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
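&lt;br /&gt;
The distinction between the two orders of learning can be sketched as follows (an entirely illustrative toy model, not from the cited sources): a first-order loop tunes a parameter against a fixed goal, while a second-order change replaces the goal itself, after which the same loop re-converges to a new operating point.&lt;br /&gt;
```python
# First- vs second-order learning sketch. All numbers are invented.

WORLD = 5.0  # fixed response of the environment per unit of gain

def learn(gain, goal, cycles):
    """Crude hill-climbing INSIDE a fixed framework (the goal)."""
    for _ in range(cycles):
        step = 0.05 if goal > gain * WORLD else -0.05
        gain = gain + step
    return gain

# First-order learning: improve performance against the given goal of 10.
gain = learn(0.1, goal=10.0, cycles=100)
first_order_error = abs(10.0 - gain * WORLD)

# Second-order learning: the framework itself shifts (goal becomes 20),
# and the same loop re-converges to a different operating point.
gain = learn(gain, goal=20.0, cycles=200)
second_order_error = abs(20.0 - gain * WORLD)
```
First-order learning improves within the frame; second-order learning changes the frame and then lets first-order learning run again.&lt;br /&gt;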
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables organized through relational structures, requiring neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30751</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=30751"/>
		<updated>2026-01-08T11:25:56Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Adaptive Systems and Entropy */ fixed a wrong statement&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or structural properties. The adaptive system is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be considered is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] cannot decrease. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems, [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth need to adapt to changing conditions over time; in other words, they need to be adaptive.&lt;br /&gt;
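&lt;br /&gt;
The bookkeeping behind this argument is simple enough to sketch (the numbers are invented for illustration):&lt;br /&gt;
```python
# Entropy bookkeeping sketch for an open system: local entropy can
# fall only because the export to the environment outweighs internal
# production. All rates are illustrative, per unit time.

ds_internal = 0.5    # entropy produced inside (always positive)
ds_exchange = -0.8   # net entropy carried away by waste heat and matter

ds_system = ds_internal + ds_exchange   # -0.3: local order increases
ds_environment = -ds_exchange           # +0.8 dumped into the surroundings
ds_total = ds_system + ds_environment   # +0.5: the second law holds
```
The system grows more ordered while the universe as a whole still gains exactly the entropy produced internally.&lt;br /&gt;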
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly put, a [[IESC:SYSTEM (Dynamic)|dynamic system]] operates by processing external inputs and producing outputs. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; without such interaction, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. Mathematically, this is captured through systems of coupled differential equations, in which the rate of change of each variable depends explicitly on the values of several variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change of the i-th variable depends not only on that variable itself but also on the others. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses the &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495 Dynamis - Metzler Lexikon]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes how the influence of an element on other elements, through a chain of relationships, eventually feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, either drives contained replication and growth, or leads to uncontained, unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
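Ashby's law can be made concrete by counting states. The following Python sketch is illustrative (the disturbances and responses are invented, not from the source): a controller can hold its essential variable steady only if it has at least one distinct response per distinguishable disturbance.

```python
# Illustrative sketch of the Law of Requisite Variety: variety is measured as
# the number of distinguishable states. Regulation is possible only when the
# controller's variety is at least the variety of the disturbances it faces.

def variety(states) -> int:
    """Variety = number of distinguishable states."""
    return len(set(states))

disturbances = ["cold", "mild", "hot"]                            # environment variety = 3
responses_ok = {"cold": "heat", "mild": "idle", "hot": "cool"}    # response variety = 3
responses_poor = {"cold": "heat", "mild": "heat", "hot": "heat"}  # response variety = 1

can_regulate = variety(responses_ok.values()) >= variety(disturbances)   # sufficient variety
cannot_regulate = variety(responses_poor.values()) >= variety(disturbances)
```

The poor controller collapses three disturbances onto one response, so two of the three environmental states cannot be compensated.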
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
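The governor's self-regulation can be simulated with a toy model. All numbers here (set point, valve sensitivity, power scale) are made-up illustrative parameters, not Beer's: speed above the set point closes the valve, reduced power lowers the speed, and the loop settles at the set point from either direction.

```python
# Toy model of the Watt governor (illustrative parameters): the valve opening
# shrinks as speed exceeds the set point, so surplus speed cuts the engine's
# energy supply - negative feedback from output back to input.

def simulate_governor(speed: float, set_point: float = 100.0,
                      k_valve: float = 0.05, steps: int = 200) -> float:
    for _ in range(steps):
        # Valve opening falls linearly with excess speed, clamped to [0, 1].
        valve = min(1.0, max(0.0, 0.5 - k_valve * (speed - set_point)))
        power = 20.0 * valve           # energy admitted to the engine
        speed += 0.1 * (power - 10.0)  # speed rises on surplus power, falls on deficit
    return speed
```

At equilibrium the valve sits at half opening, admitting exactly the power the load consumes; any deviation is driven back, which is the homeostat behaviour described above.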
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
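The "Russian dolls" structure is naturally expressed as a recursive data type. The class and level names below are hypothetical illustrations, not part of the VSM literature; the point is that one template describes every level.

```python
# Illustrative sketch of recursion in nested viable systems: the same
# organizational template (a system containing subsystems) repeats at every
# level of the hierarchy, like Russian dolls.

from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    subsystems: list = field(default_factory=list)

    def depth(self) -> int:
        """Number of nested levels of recursion contained in this system."""
        if not self.subsystems:
            return 1
        return 1 + max(s.depth() for s in self.subsystems)

corporation = ViableSystem("corporation", [
    ViableSystem("division", [
        ViableSystem("department", [ViableSystem("team")]),
    ]),
])
```

Because `depth` calls itself on each subsystem, the same code serves the corporation, its divisions, and its teams, mirroring the claim that identical organizational principles apply at every recursion level.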
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System 1 variety accessible to System 3 at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its own activities, higher-level functions for coordination, control, strategy and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The 3-4 Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
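The distinction between the two orders of learning can be sketched as a controller that both corrects errors within its framework (first-order) and revises the framework itself when progress stalls (second-order). This is an invented illustration, not a model from Beer; the gain-revision rule and all numbers are assumptions.

```python
# Illustrative sketch of first- vs second-order learning: the control loop
# corrects errors within a fixed framework (first-order), and when the error
# stops shrinking fast enough, the framework itself - here the loop's own
# gain - is revised (second-order).

def adaptive_control(state: float, target: float, gain: float = 0.05,
                     steps: int = 100) -> tuple[float, float]:
    prev_error = abs(target - state)
    for _ in range(steps):
        state += gain * (target - state)   # first-order: act within the framework
        error = abs(target - state)
        if error > 0.9 * prev_error:       # second-order: progress too slow,
            gain = min(1.0, gain * 1.5)    # so change the framework (the gain)
        prev_error = error
    return state, gain
```

The loop starts with a gain too small to converge quickly, notices its own lack of progress, and raises the gain until the error decays fast enough, i.e. it changes how it learns, not just what it does.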
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29269</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29269"/>
		<updated>2025-12-27T15:02:27Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: corrected a wrong statement&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays neither a change of state nor a change in its structural properties. The adaptive system is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to consider is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems, [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must adjust to changing conditions over time: they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no such interaction, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. Mathematically, this is captured through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the system does not depend only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
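The coupled structure can be sketched numerically. The two-variable system below is an invented illustration (its coefficients are not the equation in Figure 1): each variable's rate of change depends on both variables, so neither part can be understood in isolation.

```python
# Illustrative coupled dynamic system (made-up coefficients): the rate of
# change of each variable depends on BOTH variables - genuine coupling, as
# opposed to a mere aggregation of independent parts.

def step(x: float, y: float, dt: float = 0.01) -> tuple[float, float]:
    dx = -0.5 * x + 0.3 * y   # x's change depends on y as well as on x
    dy = 0.2 * x - 0.4 * y    # and y's change depends on x: the parts are inseparable
    return x + dt * dx, y + dt * dy

# Simple Euler integration: perturbing only x still sets y in motion.
x, y = 1.0, 0.0
for _ in range(2000):
    x, y = step(x, y)
```

Even though `y` starts at zero, the coupling term immediately drives it away from zero; isolating either equation would mispredict both trajectories.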
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to the environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
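Equifinality can be demonstrated with a standard logistic growth model (the parameters below are illustrative, not from the source): trajectories started from very different initial conditions converge on the same steady state.

```python
# Sketch of equifinality with a logistic growth model (illustrative values):
# the same steady state K is reached from different initial conditions via
# different pathways - impossible for a closed system, whose endpoint is
# fixed by its starting point.

def logistic_steady_state(x0: float, r: float = 0.5, K: float = 100.0,
                          steps: int = 500, dt: float = 0.1) -> float:
    x = x0
    for _ in range(steps):
        x += dt * r * x * (1.0 - x / K)   # growth limited by carrying capacity K
    return x

a = logistic_steady_state(5.0)    # small initial population grows up to K
b = logistic_steady_state(180.0)  # overshooting population decays down to K
```

One path grows, the other shrinks, yet both end at the carrying capacity: the final state is a property of the system's organization, not of its starting point.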
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and yet subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining its organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
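A dynamic steady state maintained by throughput can be sketched with a simple stock-and-flow model (all numbers are illustrative assumptions): constant import and export proportional to the current content settle at a fixed level even though material flows through continuously.

```python
# Minimal sketch of a dynamic steady state (illustrative numbers): a system
# with constant inflow and outflow proportional to its content settles at
# level = inflow / k_out, even though matter flows through it the whole time.

def open_system_level(level: float, inflow: float = 4.0, k_out: float = 0.2,
                      steps: int = 500, dt: float = 0.1) -> float:
    for _ in range(steps):
        level += dt * (inflow - k_out * level)   # import minus export
    return level
```

Starting empty or overfull, the level converges to the same value (here 4.0 / 0.2 = 20), illustrating both the constancy-through-flux of the steady state and its equifinality.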
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which one element influences other elements and, through a chain of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, either drives contained replication and growth, or leads to uncontained, unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
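The mutual regulation of populations can be sketched with a damped predator-prey model. All coefficients below are invented for illustration (they do not come from Beer): birds grow by eating caterpillars, caterpillars grow on limited vegetation and are eaten by birds, and the mutual checks hold both populations near an equilibrium.

```python
# Illustrative predator-prey sketch of population homeostasis (made-up
# coefficients): each population's growth is checked by the other, so the
# coupled system settles near a stable equilibrium instead of exploding.

def populations(caterpillars: float, birds: float,
                steps: int = 10000, dt: float = 0.01) -> tuple[float, float]:
    for _ in range(steps):
        # Caterpillars grow logistically on vegetation, are eaten by birds.
        dc = 0.8 * caterpillars * (1 - caterpillars / 50) - 0.1 * caterpillars * birds
        # Birds grow by eating caterpillars, decline otherwise.
        db = 0.05 * caterpillars * birds - 0.5 * birds
        caterpillars += dt * dc
        birds += dt * db
    return caterpillars, birds
```

For these coefficients the equilibrium sits at 10 caterpillars and 6.4 birds (setting both rates to zero); trajectories spiral in toward it, which is the population homeostat described above.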
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System 1 variety accessible to System 3 at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its own activities, higher-level functions for coordination, control, strategy and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The 3-4 Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations, positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.​&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29268</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29268"/>
		<updated>2025-12-27T15:01:25Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Hierarchical Organization and Emergence */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. To understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems, [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases and the second law is never violated, but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This law alone implies that all living organisms on Earth must adapt to changing conditions over time; they therefore need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the i-th variable itself. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
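This coupling can be sketched numerically. The following is a minimal illustrative example of my own construction (not taken from the cited sources): two variables whose rates of change each depend on both values, integrated with simple Euler steps.

```python
# Illustrative coupled system: dx/dt = -x + 0.5*y, dy/dt = 0.5*x - y.
# Neither variable can be understood in isolation, because each rate of
# change depends on the other variable as well.
def simulate(x, y, dt=0.01, steps=1000):
    for _ in range(steps):
        dx = -x + 0.5 * y  # x's change depends on y
        dy = 0.5 * x - y   # y's change depends on x
        x, y = x + dx * dt, y + dy * dt
    return x, y

x, y = simulate(2.0, -1.0)
# Both variables decay toward zero, but along a trajectory that neither
# equation would produce on its own.
```

Removing either coupling term changes the path of both variables, which is exactly why the parts cannot be analyzed in isolation.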
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to the environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses the &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
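Equifinality can be made concrete with a toy open-system model (an illustrative assumption of mine, not drawn from the sources): a constant inflow a and an outflow proportional to the current state.

```python
# Toy open system: dx/dt = a - b*x. Whatever the starting state, the
# system converges to the same steady state x* = a/b, maintained by
# continuous throughput -- a minimal picture of equifinality.
def steady_state(x0, a=4.0, b=2.0, dt=0.01, steps=2000):
    x = x0
    for _ in range(steps):
        x += (a - b * x) * dt
    return x

# Different initial conditions, same final configuration (x* = a/b = 2.0).
results = [steady_state(0.0), steady_state(10.0), steady_state(-3.0)]
```

A closed system without the inflow term would simply decay to zero from wherever it started; the throughput is what sustains the non-trivial steady state.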
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
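The thermostat example can be sketched as a small simulation. All numbers here are illustrative assumptions of mine, not values from the sources: the point is only that a simple on/off rule holds the critical variable between desired limits.

```python
# A thermostat as a homeostat: bang-bang control keeps the critical
# variable (temperature) between the desired limits low and high.
def run_thermostat(temp, low=19.0, high=21.0, steps=200):
    heater_on = temp < low
    history = []
    for _ in range(steps):
        if temp < low:
            heater_on = True    # too cold: switch heating on
        elif temp > high:
            heater_on = False   # too warm: switch heating off
        temp += 0.3 if heater_on else -0.2  # heating vs. ambient heat loss
        history.append(temp)
    return history

h = run_thermostat(15.0)
# After an initial warm-up, the temperature oscillates inside the band.
```

The self-regulatory loop never computes anything sophisticated; the stability comes from the sign of the correction always opposing the deviation.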
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
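A toy counting example (my own construction in the spirit of Ashby's law, not taken from the sources) shows why a regulator with fewer distinguishable responses than there are disturbances cannot hold the outcome constant:

```python
# The outcome combines a disturbance with the regulator's response; a
# response equal to the disturbance cancels it completely.
def outcome(disturbance, response, n=5):
    return (disturbance - response) % n

def best_outcomes(disturbances, responses):
    # For each disturbance the regulator picks its best response; the set
    # of achievable outcomes measures how well it regulates.
    return {min(outcome(d, r) for r in responses) for d in disturbances}

matched = best_outcomes(range(5), range(5))  # controller variety = 5
starved = best_outcomes(range(5), [0, 1])    # controller variety = 2
# matched == {0}: the outcome is held constant.
# starved contains several outcomes: regulation fails for lack of variety.
```

With five disturbances and only two responses, at least two disturbances must share a response, so their outcomes cannot both be cancelled: only variety can absorb variety.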
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
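The governor's proportional feedback can be sketched as follows. This is an illustrative numerical model of mine, not Beer's: the valve closes linearly as speed exceeds the set point, cutting the power supplied.

```python
# Governor-style negative feedback: the valve opening shrinks as speed
# rises above the set point, so energy supply falls exactly when speed grows.
def governor(speed, set_point=100.0, gain=0.05, load=2.0, steps=500):
    for _ in range(steps):
        valve = min(1.0, max(0.0, 1.0 - gain * (speed - set_point)))
        speed += 0.5 * (5.0 * valve - load)  # power admitted minus load
    return speed

final = governor(speed=80.0)
# The speed settles where the power admitted balances the load. Note the
# classic proportional-control offset: the equilibrium sits somewhat
# above the set point rather than exactly on it.
```

With these numbers the equilibrium is the speed at which 5.0 * valve equals the load, i.e. valve = 0.4 and speed = 112; the feedback loop contracts deviations toward that point.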
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System 1 variety accessible to System 3 at one level of recursion equals the variety disposed by the metasystems at the next level of recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
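The recursive structure can be illustrated with a small data sketch. This is my own construction; the class and field names are assumptions, not Beer's notation. Every viable system carries the same metasystem and contains operational units that are themselves viable systems.

```python
from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5, the management metasystem, recur at every level.
    metasystem: tuple = ("S2 coordination", "S3 control",
                        "S4 intelligence", "S5 policy")
    # System 1: operational units, each itself a viable system.
    units: list = field(default_factory=list)

    def depth(self):
        # Number of recursion levels below and including this system.
        return 1 + max((u.depth() for u in self.units), default=0)

works = ViableSystem("steel works")
sector = ViableSystem("iron and steel", units=[works])
industry = ViableSystem("heavy industry", units=[sector])
# industry.depth() == 3: the same five-system pattern at every recursion.
```

The point of the sketch is structural: whichever level you inspect, you find the same shape, which is what lets the same diagnostic questions be asked of a works, a sector, or a whole industry.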
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models of both the system itself and its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The 3-4 Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29267</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29267"/>
		<updated>2025-12-27T14:54:41Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Cybernetic Principles: Adaptive Systems beyond Biology */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. To understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems, [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases and the second law is never violated, but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This law alone implies that all living organisms on Earth must adapt to changing conditions over time; they therefore need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements: if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system&#039;s own i-th variable but on the other variables as well. This has a profound implication: you cannot understand such a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
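This coupling can be made concrete with a short numerical sketch. The two-variable system below and its coefficients are purely illustrative assumptions (not Bertalanffy&#039;s own example): each rate of change depends on both variables, so neither part can be integrated in isolation.&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal sketch of a coupled system in the spirit of
# dQ_i/dt = f_i(Q_1, ..., Q_n). Coefficients are illustrative only.

def step(q1, q2, dt=0.01):
    """One Euler step; each rate depends on BOTH variables (coupling)."""
    dq1 = -0.5 * q1 + 0.3 * q2   # Q1 is driven partly by Q2
    dq2 = 0.2 * q1 - 0.4 * q2    # Q2 is driven partly by Q1
    return q1 + dt * dq1, q2 + dt * dq2

def simulate(q1, q2, steps=1000):
    for _ in range(steps):
        q1, q2 = step(q1, q2)
    return q1, q2

# Isolating a part changes its behaviour: without the cross-terms,
# Q1 would simply decay on its own; with coupling, its trajectory
# depends on Q2 at every step.
print(simulate(1.0, 1.0))
```
&lt;br /&gt;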
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
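The distinction can be sketched in a few lines of code. This is a minimal, hypothetical illustration (not drawn from the cited sources): the static system&#039;s parameter is fixed, while the adaptive system adjusts its parameter from feedback, so different initial parameters converge toward the same steady state - a toy picture of equifinality.&lt;br /&gt;
&lt;br /&gt;
```python
# Illustrative sketch: fixed (static) vs. time-varying (adaptive) parameters.

def static_response(x, gain=2.0):
    """Predetermined response: the parameter never changes."""
    return gain * x

class AdaptiveSystem:
    """Adjusts its gain so the output tracks a target (time-varying parameter)."""
    def __init__(self, gain=2.0, rate=0.1):
        self.gain = gain
        self.rate = rate

    def respond(self, x, target):
        y = self.gain * x
        error = target - y
        self.gain += self.rate * error * x  # structural adjustment from feedback
        return y

sys_a = AdaptiveSystem(gain=2.0)
sys_b = AdaptiveSystem(gain=0.5)   # different initial condition
for _ in range(200):
    sys_a.respond(1.0, target=5.0)
    sys_b.respond(1.0, target=5.0)

# Both trajectories end at the same gain despite different starting points.
print(round(sys_a.gain, 2), round(sys_b.gain, 2))
```
&lt;br /&gt;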
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;[[IESC:SYSTEM (Viable)|viable systems]]&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which the influence of an element impacts other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either achieves contained contraction or replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
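The two polarities can be illustrated with a minimal sketch (the gains and target values are assumptions for illustration only): the negative loop acts against the deviation and settles near its target, while the positive loop acts with the deviation and grows without bound.&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch of the two feedback polarities. Values are illustrative.

def negative_feedback(state, target, k=0.2, steps=50):
    """Error-correcting loop: deviations from the target are reversed."""
    for _ in range(steps):
        state += k * (target - state)   # act AGAINST the deviation
    return state

def positive_feedback(state, k=0.2, steps=50):
    """Self-reinforcing loop: deviations are amplified (growth/contraction)."""
    for _ in range(steps):
        state += k * state              # act WITH the deviation
    return state

print(round(negative_feedback(0.0, target=21.0), 3))  # settles near 21.0
print(positive_feedback(1.0))                         # grows without bound
```
&lt;br /&gt;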
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
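Counting distinguishable states makes the law tangible. The following sketch uses hypothetical environmental states and controller responses (they are illustrations, not examples from the cited sources) to check Ashby&#039;s condition that controller variety must at least match the variety it has to absorb.&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch of Ashby's condition V(controller) >= V(environment) via state counts.
from itertools import product

def variety(*components):
    """Variety = number of distinguishable states of a system."""
    return len(list(product(*components)))

disturbances = [("hot", "cold", "humid")]          # 3 environmental states
controller_a = [("heat_on", "heat_off")]           # 2 responses: insufficient
controller_b = [("heat", "cool", "dehumidify")]    # 3 responses: requisite

v_env = variety(*disturbances)
print(variety(*controller_a) >= v_env)  # False: control cannot be guaranteed
print(variety(*controller_b) >= v_env)  # True: regulation is possible
```
&lt;br /&gt;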
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystem at the next recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
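The recursive structure can be sketched as nested data: each viable system contains operational units (System 1) that are themselves viable systems. The industry/sector/works names follow the example above; the class itself is an illustrative assumption, not Beer&#039;s formalism.&lt;br /&gt;
&lt;br /&gt;
```python
# Sketch of VSM recursion: viable systems nested within viable systems.

class ViableSystem:
    def __init__(self, name, units=()):
        self.name = name
        self.units = list(units)   # System 1: operational sub-systems

    def depth(self):
        """Number of recursion levels below this system."""
        if not self.units:
            return 0
        return 1 + max(u.depth() for u in self.units)

works = ViableSystem("steel works")
sector = ViableSystem("iron and steel", [works])
industry = ViableSystem("heavy industry", [sector])

print(industry.depth())  # 2 recursion levels below the industry
```
&lt;br /&gt;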
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future at different environmental recursion levels. System 4 must maintain models of both the system itself and its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simplistic two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables organized through relational structures, requiring neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor is created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29266</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29266"/>
		<updated>2025-12-27T14:51:29Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Continuous Evolution and Steady State */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] does not display a change in state or in its structural properties. The adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term spans a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - systems sealed off from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must cope with changing conditions over time; in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements: if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system&#039;s own i-th variable but on the other variables as well. This has a profound implication: you cannot understand such a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either drives contained contraction, replication or growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
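The two feedback polarities can be sketched in a few lines of code (a minimal illustration with invented numbers, not an implementation from the cited sources):&lt;br /&gt;

```python
# Negative feedback (hedged sketch): the system repeatedly measures the gap
# between its current state and a target state and applies a correction
# proportional to that error, opposing the deviation.
def regulate(state, target, gain=0.5, steps=30):
    history = [state]
    for _ in range(steps):
        error = target - state   # measured deviation from the objective
        state += gain * error    # corrective action reverses the deviation
        history.append(state)
    return history

trace = regulate(state=0.0, target=10.0)
print(round(trace[-1], 3))  # converges on the target: 10.0

# Positive feedback, by contrast, reinforces deviations (growth or collapse):
x = 1.0
for _ in range(10):
    x *= 1.2  # each step amplifies the previous change
print(round(x, 2))  # unchecked exponential growth: 6.19
```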
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
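A thermostat of the kind Beer describes can be sketched as follows (assumed, highly simplified dynamics; the limits and rates are invented for illustration):&lt;br /&gt;

```python
# Thermostat as a homeostat (illustrative sketch): the critical variable
# (temperature) is held between desired limits by switching heating on or off.
def thermostat_step(temp, heating, low=19.0, high=21.0,
                    heat_rate=0.5, loss_rate=0.2):
    if temp < low:
        heating = True       # too cold: switch the heater on
    elif temp > high:
        heating = False      # too warm: switch the heater off
    temp += heat_rate if heating else -loss_rate
    return temp, heating

temp, heating = 15.0, False
for _ in range(200):
    temp, heating = thermostat_step(temp, heating)
print(round(temp, 1))  # held near the desired band
```

The switching rule is a pure negative-feedback homeostat: the critical variable is held between desired limits without any external intervention.&lt;br /&gt;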
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
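Ashby&#039;s law can be illustrated as a simple counting argument (a toy example with invented states, not Ashby&#039;s own formalism):&lt;br /&gt;

```python
# Requisite variety as counting (toy example): the controller needs at least
# as many distinguishable responses as the disturbances it must absorb.
disturbances = {"cold", "hot", "humid", "dry"}   # environment variety = 4
responses = {"heat", "cool", "dehumidify"}       # controller variety = 3

def has_requisite_variety(controller_states, system_states):
    # Regulation can only be guaranteed if controller variety >= system variety.
    return len(controller_states) >= len(system_states)

print(has_requisite_variety(responses, disturbances))  # False: under-varied
responses.add("humidify")
print(has_requisite_variety(responses, disturbances))  # True after adding one
```

Only after the controller gains a fourth response does its variety match the four environmental disturbances it must absorb.&lt;br /&gt;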
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
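The governor&#039;s behaviour can be sketched with a linearized toy model (all coefficients invented for illustration; not the historical device&#039;s actual parameters):&lt;br /&gt;

```python
# Watt-governor sketch (assumed, linearized dynamics): as engine speed exceeds
# the set point the spinning arms rise and close the valve proportionally,
# cutting the power admitted to the engine - the output regulates the input.
def run_governor(speed, set_point=100.0, k=0.05, dt=0.1, steps=500):
    for _ in range(steps):
        # Valve opening shrinks as speed climbs past the set point.
        valve = min(1.0, max(0.0, 1.0 - k * (speed - set_point)))
        power = 50.0 * valve                  # energy admitted by the valve
        speed += (power - 0.5 * speed) * dt   # acceleration minus friction load
    return speed

print(round(run_governor(40.0), 1))   # settles at the set point: 100.0
print(round(run_governor(180.0), 1))  # overspeed pulled back down: 100.0
```

From either side of the set point the speed is pulled back, exactly the homeostat behaviour described above.&lt;br /&gt;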
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
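The recursive structure can be sketched as a data type (an illustrative structure only, not Beer&#039;s formal notation):&lt;br /&gt;

```python
# Recursive sketch of the VSM (illustrative structure): every viable system
# carries the same metasystemic functions and may contain further viable
# systems as its System 1 operational units.
from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5: coordination, control, intelligence, policy (metasystem).
    functions: tuple = ("S2", "S3", "S4", "S5")
    system1_units: list = field(default_factory=list)  # each itself viable

    def depth(self):
        # Number of recursion levels from this system down.
        return 1 + max((u.depth() for u in self.system1_units), default=0)

corp = ViableSystem("corporation", system1_units=[
    ViableSystem("division", system1_units=[ViableSystem("department")])
])
print(corp.depth())  # 3 levels of recursion, same structure at each
```

Each nesting level carries the same metasystemic functions, mirroring the &#039;Russian doll&#039; recursion of the VSM.&lt;br /&gt;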
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models of both the system itself and its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks the products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
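The distinction between the two learning orders can be sketched as follows (an invented minimal example; the cap and rates are illustrative assumptions, not drawn from Beer):&lt;br /&gt;

```python
# Sketch of first- vs second-order learning (invented example): first-order
# learning improves an estimate within a fixed framework (a cap on admissible
# values); second-order learning revises the framework itself once
# improvement within it stalls.
def first_order_step(estimate, target, limit):
    improved = estimate + 0.5 * (target - estimate)
    return min(improved, limit)        # the framework constrains the outcome

estimate, limit, target = 0.0, 5.0, 12.0
for _ in range(40):
    new = first_order_step(estimate, target, limit)
    if abs(new - estimate) < 1e-6 and abs(new - target) > 1e-3:
        limit *= 2                     # second-order: change the framework
    estimate = new
print(round(estimate, 2))  # 12.0, reachable only after the framework changed
```

First-order steps improve the estimate inside the framework; only the second-order revision of the framework itself lets the system reach a target outside it.&lt;br /&gt;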
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor is created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29265</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29265"/>
		<updated>2025-12-27T14:47:28Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system, by contrast, is self-modifying. It exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases, the second law is never violated, but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must cope with changing conditions over time - in other words, they must be adaptive.&lt;br /&gt;
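The entropy bookkeeping can be made concrete with a toy calculation (all numbers invented for illustration):&lt;br /&gt;

```python
# Toy entropy bookkeeping (invented numbers): an open system lowers its own
# entropy by importing low-entropy input and exporting high-entropy waste,
# while the total entropy (system + environment) still rises each cycle.
system_S, environment_S = 100.0, 1000.0
for _ in range(10):
    system_S -= 1.0        # local order gained from low-entropy intake
    environment_S += 1.5   # exported waste plus dissipation raise env entropy
print(system_S, environment_S, system_S + environment_S)  # 90.0 1015.0 1105.0
```

The system&#039;s local entropy falls while the combined total still rises, so the second law is respected throughout.&lt;br /&gt;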
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, in which, for every i, the equation does not depend only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
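Such coupling can be sketched as a two-variable system in the spirit of Bertalanffy&#039;s equations (coefficients invented for illustration):&lt;br /&gt;

```python
# Bertalanffy-style coupled system (illustrative coefficients): dQi/dt is a
# function of all the Qj, so no element can be understood in isolation.
def derivatives(q1, q2):
    dq1 = 0.5 * q2 - 0.1 * q1    # q1's change depends on q2 as well
    dq2 = -0.4 * q1 + 0.05 * q2  # and q2's change depends on q1
    return dq1, dq2

q1, q2, dt = 1.0, 0.0, 0.01
for _ in range(100):
    d1, d2 = derivatives(q1, q2)
    q1, q2 = q1 + d1 * dt, q2 + d2 * dt  # Euler integration of the coupling
print(round(q1, 3), round(q2, 3))
```

Because each rate of change involves both variables, neither trajectory can be computed from its own equation alone.&lt;br /&gt;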
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
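The contrast can be sketched numerically (invented dynamics; the shifting parameter stands in for an environmental change that the adaptive system tracks):&lt;br /&gt;

```python
# Contrast sketch (invented dynamics): a system with fixed parameters relaxes
# to one predetermined equilibrium; a system whose parameter varies with time
# shifts its steady state as conditions change.
def step(x, a, b, dt=0.01):
    return x + (a - b * x) * dt       # dx/dt = a - b*x

x_static, x_adaptive = 0.0, 0.0
for t in range(3000):
    x_static = step(x_static, a=2.0, b=0.5)        # time-independent parameters
    a_t = 2.0 if t < 1500 else 4.0                 # environment shifts halfway
    x_adaptive = step(x_adaptive, a=a_t, b=0.5)    # parameter tracks the shift
print(round(x_static, 1), round(x_adaptive, 1))    # 4.0 8.0
```

The fixed parameterization stays at its predetermined equilibrium, while the system with a time-varying parameter settles into a new steady state after the shift.&lt;br /&gt;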
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and yet subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either drives contained contraction, replication or growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this is what enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
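The two polarities can be illustrated with a short Python sketch; the gain values, target and step count are illustrative assumptions, not taken from the cited sources:&lt;br /&gt;

```python
def step(state, target, gain):
    """One update: add feedback proportional to the deviation from target.
    A negative gain opposes the deviation (negative feedback, stabilizing);
    a positive gain amplifies it (positive feedback, growth or contraction)."""
    deviation = state - target
    return state + gain * deviation

state_neg = 25.0   # regulated by negative feedback
state_pos = 25.0   # driven by positive feedback
for _ in range(30):
    state_neg = step(state_neg, 20.0, -0.5)  # deviation halves each step
    state_pos = step(state_pos, 20.0, +0.5)  # deviation grows by half each step

print(round(state_neg, 4))  # back at the target of 20
print(round(state_pos, 1))  # far away from the target
```

The same starting deviation of 5 units shrinks to virtually nothing under the negative gain and explodes under the positive one, which is exactly the stabilization-versus-structural-change contrast described above.&lt;br /&gt;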
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29259</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29259"/>
		<updated>2025-12-27T14:36:44Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple. Adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems, [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continually adjust to changing conditions over time; in other words, they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, in which, for every i, the rate of change does not depend on its own variable alone. This has a profound implication: you cannot understand such a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
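The idea behind such coupled equations can be made concrete with a small numerical sketch, here a simple explicit-Euler integration of two mutually dependent variables (the coefficients are illustrative, not Bertalanffy&#039;s):&lt;br /&gt;

```python
# Sketch of Bertalanffy-style coupled equations dQi/dt = fi(Q1, ..., Qn):
# the rate of change of each variable depends on the other variable too,
# so neither can be understood in isolation. Coefficients are illustrative.

def coupled_step(q1, q2, dt=0.01):
    dq1 = 0.5 * q1 - 0.3 * q1 * q2   # q1 grows, but is damped by q2
    dq2 = -0.4 * q2 + 0.2 * q1 * q2  # q2 decays unless sustained by q1
    return q1 + dq1 * dt, q2 + dq2 * dt

q1, q2 = 2.0, 1.0
for _ in range(1000):  # integrate ten time units with explicit Euler
    q1, q2 = coupled_step(q1, q2)

print(round(q1, 3), round(q2, 3))  # both stay positive, cycling around an equilibrium
```

Setting either coupling coefficient to zero decouples the equations and the system degenerates into a mere aggregation of independent parts - the pile of stones from the paragraph above.&lt;br /&gt;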
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to the environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
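A toy model makes equifinality tangible: a single stock with a constant inflow and a proportional outflow settles to the same steady state from very different starting points (inflow and rate constant are illustrative assumptions):&lt;br /&gt;

```python
# Toy open system: constant import of material plus proportional export.
# dx/dt = inflow - k * x has the steady state inflow / k, and the system
# reaches it from any starting point: a simple picture of equifinality.
# Inflow, rate constant, time step and step count are illustrative.

def settle(x, inflow=4.0, k=0.2, dt=0.1, steps=2000):
    for _ in range(steps):
        x += (inflow - k * x) * dt   # import minus proportional export
    return x

print(round(settle(0.0), 3), round(settle(100.0), 3))  # both approach 20.0
```

Both runs end within rounding error of the same steady state of 20.0 (inflow divided by the export rate), even though one starts empty and the other overfull - the path-independence the paragraph above describes. A closed system, by contrast, would simply retain whatever it started with.&lt;br /&gt;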
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes the situation in which an element influences other elements and, through a chain of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either helps to achieve contained contraction, replication or growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
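A simple counting sketch illustrates the law: if, in the best case, each regulator response neutralizes a distinct subset of the disturbances, a regulator with less variety than the disturbances cannot hold the outcome to a single value (the counting model is an illustrative toy, not taken from Ashby):&lt;br /&gt;

```python
# Counting sketch of Ashby's law: the regulator can at best partition the
# disturbances evenly among its responses, so at least
# ceil(disturbance variety / regulator variety) distinct outcomes survive.
from math import ceil

def residual_outcomes(n_disturbances, n_responses):
    """Best-case number of distinct outcomes that still leak through."""
    return ceil(n_disturbances / n_responses)

print(residual_outcomes(8, 8))  # enough variety: one outcome, full control
print(residual_outcomes(8, 2))  # too little variety: four outcomes leak through
```

Only when the regulator disposes of at least as many responses as there are disturbances can the outcome be pinned to a single state - only variety can absorb variety.&lt;br /&gt;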
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Figure 2: Schematic representation of a Watt-Governor [Created with Microsoft Copilot]]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
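The governor&#039;s behaviour can be sketched as proportional feedback in a few lines of Python; all constants (set point, valve sensitivity, power and friction coefficients) are illustrative assumptions, not Watt&#039;s:&lt;br /&gt;

```python
# Sketch of the Watt governor as proportional feedback: the valve opening,
# and hence the power admitted, falls as speed rises above the set point,
# so the speed self-regulates toward an equilibrium.

def governor_step(speed, set_point=100.0, dt=0.1):
    # arms rise with speed, closing the valve in proportion (clamped to [0, 1])
    valve = min(1.0, max(0.0, 1.0 - 0.02 * (speed - set_point)))
    power = 50.0 * valve      # energy admitted to the engine
    friction = 0.5 * speed    # losses grow with speed
    return speed + (power - friction) * dt

speed = 60.0
for _ in range(500):
    speed = governor_step(speed)
print(round(speed, 2))  # settles at the governed set point of 100.0
```

The output (speed) adjusts the input (admitted power), and the engine speed converges to the set point without any external controller - self-regulation in the sense described above.&lt;br /&gt;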
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1-5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this is what enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29258</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29258"/>
		<updated>2025-12-27T14:35:13Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system, by contrast, is self-modifying: it exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - sealed off from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must cope with changing conditions over time; they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, where for each i the rate of change depends not only on the system&#039;s own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
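The coupled structure of Figure 1 can be sketched numerically. Below is a minimal illustration, assuming a simple two-variable linear coupling (the coefficients are invented for demonstration, not taken from Bertalanffy):

```python
# Euler integration of a coupled pair of ODEs:
#   dx/dt = -x + 0.5*y
#   dy/dt = -y + 0.5*x
# Each rate of change depends on BOTH variables - the hallmark of a
# system, as opposed to an aggregation of independent parts.

def simulate(x, y, dt=0.01, steps=1000):
    for _ in range(steps):
        dx = -x + 0.5 * y
        dy = -y + 0.5 * x
        x, y = x + dt * dx, y + dt * dy
    return x, y

# Starting from an asymmetric state, the coupling pulls x and y together:
x, y = simulate(1.0, 0.0)
print(abs(x - y) < 0.1)  # the difference between the variables decays
```

Isolating either variable (setting the 0.5 coupling terms to zero) destroys exactly the behaviour that makes the pair a system.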
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] with the special feature of possessing internal mechanisms to change its behavior based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
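Equifinality can be made concrete with a toy open-system model (a single-variable inflow/decay equation; the parameters are invented for illustration): different initial states converge to the same steady state.

```python
# Toy open system: dx/dt = inflow - k*x.
# Whatever the initial state, x settles at inflow/k - the same steady
# state is reached from different starting points (equifinality).

def steady_state(x0, inflow=2.0, k=0.5, dt=0.01, steps=5000):
    x = x0
    for _ in range(steps):
        x += dt * (inflow - k * x)
    return x

print(round(steady_state(0.0), 2), round(steady_state(10.0), 2))  # both 4.0
```

In a closed static system, by contrast, the end state would be strictly determined by the initial conditions.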
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps a system achieve defined objectives as set in its control parameters: if the system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, drives contained contraction, replication or growth - or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
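The thermostat described above can be sketched as a minimal homeostat. The temperature limits and heating/cooling rates below are invented for illustration:

```python
# Minimal thermostat homeostat: holds temperature between desired limits
# by switching a heater on and off (negative feedback with hysteresis).

def run_thermostat(temp, low=19.0, high=21.0, steps=200):
    heater_on = False
    for _ in range(steps):
        if temp < low:
            heater_on = True       # too cold: start heating
        elif temp > high:
            heater_on = False      # too warm: stop heating
        temp += 0.3 if heater_on else -0.2   # heating vs. ambient heat loss
    return temp

# From a cold start, the critical variable settles between the limits:
print(18.5 <= run_thermostat(5.0) <= 21.5)
```

The same loop structure describes Beer's biological homeostat: a critical variable, a sensor, and a corrective action that reverses deviations.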
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
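Ashby's law can be made concrete with a toy counting argument (the scenario is invented for illustration): a regulator with R distinct responses facing D equally likely disturbances can, at best, collapse the outcomes to ceil(D / R) distinguishable states, so full control requires R to be at least D.

```python
import math

# Toy illustration of the Law of Requisite Variety: pairing each
# disturbance with the best available response still leaves at least
# ceil(D / R) distinguishable outcomes.

def min_outcome_variety(d_variety, r_variety):
    return math.ceil(d_variety / r_variety)

print(min_outcome_variety(8, 8))  # 1: enough variety -> a single controlled outcome
print(min_outcome_variety(8, 2))  # 4: insufficient variety -> outcomes proliferate
```

Only variety in the controller can absorb variety in the disturbances; no cleverness in pairing can beat this bound.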
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Schematic representation of a Watt-Governor created with Microsoft Copilot]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
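The governor's self-regulation can be sketched as a proportional feedback loop. All constants below (set point, gain, power and load terms) are invented for illustration:

```python
# Watt-governor-style proportional feedback: the valve opening shrinks
# as speed exceeds the set point, so the energy supply is reduced
# exactly when the machine tends to run too fast.

def run_governor(speed, set_point=100.0, gain=0.05, steps=2000, dt=0.1):
    for _ in range(steps):
        valve = max(0.0, min(1.0, 0.5 - gain * (speed - set_point)))
        accel = 2.0 * valve - 1.0          # power admitted minus constant load
        speed += dt * accel
    return speed

# Whether started too slow or too fast, the speed settles near the set point:
print(abs(run_governor(50.0) - 100.0) < 1.0)
```

The output (speed) adjusts the input (valve opening), closing the loop exactly as Beer describes.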
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystems at the next level of recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its own activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
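The recursive nesting of viable systems can be sketched as a data type. The five-system decomposition follows Beer's model, but the class layout and example names are invented for illustration:

```python
from dataclasses import dataclass, field

# Sketch of VSM recursion: every viable system contains System 1
# operational units that are themselves viable systems, plus a
# metasystem (Systems 2-5) that coordinates them.

@dataclass
class ViableSystem:
    name: str
    metasystem: tuple = ("S2 coordination", "S3 control", "S4 intelligence", "S5 policy")
    system_one: list = field(default_factory=list)  # nested viable systems

    def depth(self):
        """Number of recursion levels at and below this system."""
        if not self.system_one:
            return 1
        return 1 + max(unit.depth() for unit in self.system_one)

corp = ViableSystem("corporation", system_one=[
    ViableSystem("division", system_one=[ViableSystem("department")]),
])
print(corp.depth())  # 3 recursion levels: corporation > division > department
```

Because the structure is self-similar, the same `metasystem` functions appear at every level, mirroring the Russian-doll image in the text.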
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations; System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
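The distinction between first- and second-order learning can be sketched in code. The toy gain-adaptation scheme below is invented for illustration (it is not taken from Beer); the point is only that one learner adjusts its estimate within a fixed framework, while the other also adjusts the framework itself:

```python
# First-order learning: correct the estimate with a fixed gain.
# Second-order learning: change the framework (here, the gain itself)
# when first-order corrections prove too slow.

def track(signal, gain=0.1, adapt_gain=False):
    estimate, gain_used = 0.0, gain
    for value in signal:
        error = value - estimate
        estimate += gain_used * error              # first-order correction
        if adapt_gain and abs(error) > 1.0:
            gain_used = min(1.0, gain_used * 1.5)  # second-order: revise the gain
    return estimate

signal = [10.0] * 20
fixed = track(signal)                    # learns within the fixed framework
adaptive = track(signal, adapt_gain=True)
print(abs(10.0 - adaptive) < abs(10.0 - fixed))  # framework-changer gets closer
```

The second-order learner converges faster precisely because it treats its own learning rule as something to be learned.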
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity|Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor was created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29254</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29254"/>
		<updated>2025-12-27T13:33:58Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Continuous Evolution and Steady State */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or structural properties. The adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. To understand the difference between static and adaptive systems we need to dive deeper and analyze these topics one by one.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - sealed off from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must cope with changing conditions over time; they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, where for each i the rate of change depends not only on the system&#039;s own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] with the special feature of possessing internal mechanisms to change its behavior based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
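The two feedback polarities can be sketched in a few lines of Python; the gains, target and step counts below are illustrative assumptions, not values from Flood and Carson:&lt;br /&gt;

```python
import math

# Sketch of negative vs. positive feedback in discrete time.
# Gains, targets and step counts are illustrative assumptions.

def negative_feedback(x, target=10.0, gain=0.2, steps=100):
    """Each step corrects a fraction of the deviation from the target."""
    for _ in range(steps):
        x += gain * (target - x)   # error-correcting: deviations shrink
    return x

def positive_feedback(x, rate=0.1, steps=100):
    """Each step amplifies the current state: growth feeds further growth."""
    for _ in range(steps):
        x += rate * x              # self-reinforcing: deviations grow
    return x

# Negative feedback pulls different starting states back to the target...
assert math.isclose(negative_feedback(0.0), 10.0, abs_tol=1e-6)
assert math.isclose(negative_feedback(25.0), 10.0, abs_tol=1e-6)
# ...while positive feedback grows without bound (1.1**100 is about 13781).
assert positive_feedback(1.0) > 10000.0
```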
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
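Beer&#039;s thermostat example can be sketched as a toy simulation. All temperatures and rates below are illustrative assumptions; the point is only that on/off switching holds the critical variable near the desired band:&lt;br /&gt;

```python
# A thermostat as a homeostat: it holds a critical variable (temperature)
# between desired limits by switching heating on and off.
# All temperatures and rates are illustrative assumptions.

def thermostat(temp, low=19.0, high=21.0, outside=5.0,
               heat_rate=1.0, loss_rate=0.05, steps=500):
    heating = False
    trace = []
    for _ in range(steps):
        if low > temp:
            heating = True       # too cold: switch the heater on
        elif temp > high:
            heating = False      # too warm: switch the heater off
        if heating:
            temp += heat_rate                  # heat input
        temp += loss_rate * (outside - temp)   # heat loss to environment
        trace.append(temp)
    return trace

trace = thermostat(12.0)
# After an initial transient, the variable stays near the desired band.
assert all(22.0 > t > 18.0 for t in trace[100:])
```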
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
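A toy &amp;quot;regulation game&amp;quot; in the spirit of Ashby&#039;s examples makes the law concrete; the outcome rule and the state counts are illustrative assumptions:&lt;br /&gt;

```python
# Toy regulation game in the spirit of Ashby: a disturbance d and a
# regulator response r combine into the outcome (d - r) % n_states, and
# regulation succeeds if the outcome can be held at 0 for every d.
# The outcome rule and state counts are illustrative assumptions.

def can_regulate(disturbances, responses, n_states=6):
    """True if every disturbance can be countered by some response."""
    return all(any((d - r) % n_states == 0 for r in responses)
               for d in disturbances)

disturbances = range(6)   # the environment has variety 6

# A regulator with matching variety keeps the system under control...
assert can_regulate(disturbances, responses=range(6))
# ...while a regulator with lower variety necessarily loses control.
assert not can_regulate(disturbances, responses=range(3))
```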
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
[[File:Watt-Governor AI.png|thumb|234x234px|Schematic representation of a Watt-Governor created with Microsoft Copilot]]&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing it in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
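The governor&#039;s proportional feedback can be sketched as follows; the set point, gain and load values are illustrative assumptions, not engineering data:&lt;br /&gt;

```python
# Sketch of the Watt governor as proportional negative feedback: the valve
# opening shrinks as speed rises, so the input to the machine is adjusted
# by the output itself. Set point, gain and load are illustrative values.

def governed_speed(speed, set_point=100.0, gain=0.05, load=2.0,
                   dt=0.1, steps=2000):
    for _ in range(steps):
        valve = max(0.0, gain * (set_point - speed))  # arms rise, valve closes
        power = 50.0 * valve                          # steam admitted
        speed += dt * (power - load)                  # net acceleration
    return speed

# From very different starting speeds the engine settles at one
# self-regulated speed close to the set point.
slow_start = governed_speed(20.0)
fast_start = governed_speed(150.0)
assert 1.0 > abs(slow_start - fast_start)
assert 5.0 > abs(slow_start - 100.0)
```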
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1-5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;[https://i2insights.org/2023/01/24/viable-system-model/ The Viable System Model]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystem at the next recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
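The recursive nesting described above can be sketched as a data structure in which every level carries the same five systems; the class layout and the example names are illustrative, not part of Beer&#039;s formalism:&lt;br /&gt;

```python
# Sketch of VSM recursion as a nested data structure: every viable system
# carries the same metasystem (Systems 2-5), and its System 1 operational
# units are themselves viable systems one recursion level down.
# The class layout and the industry names are illustrative, not Beer's own.

from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    metasystem: tuple = ("S2", "S3", "S4", "S5")    # coordination to policy
    operations: list = field(default_factory=list)  # System 1 units

    def depth(self):
        """Number of recursion levels at and below this system."""
        if not self.operations:
            return 1
        return 1 + max(unit.depth() for unit in self.operations)

works = ViableSystem("steel works")
sector = ViableSystem("iron and steel", operations=[works])
industry = ViableSystem("heavy industry", operations=[sector])

assert industry.depth() == 3
# The same organizational principles hold at every recursion level:
assert works.metasystem == industry.metasystem
```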
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
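The distinction between first- and second-order learning can be sketched with a toy adaptation loop; the task, step sizes and thresholds are illustrative assumptions:&lt;br /&gt;

```python
# Toy sketch of first- vs. second-order learning: first-order learning
# tunes a parameter within a fixed framework (an allowed range); when it
# stalls at the boundary, second-order learning changes the framework
# itself by widening the range. Task and thresholds are illustrative.

def error(gain, world_gain=3.0):
    """How far the controller currently is from what the world requires."""
    return abs(world_gain - gain)

def adapt(gain=0.0, limit=1.0, rounds=50):
    for _ in range(rounds):
        if 0.01 > error(gain):
            break                # good enough: stop adapting
        if limit >= gain + 0.1:
            gain += 0.1          # first-order: improve within the framework
        else:
            limit *= 2.0         # second-order: change the framework itself
    return gain, limit

gain, limit = adapt()
assert 0.11 > error(gain)        # behavior eventually matches the world,
assert limit > 1.0               # but only because the framework changed
```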
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simplistic two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables organized through relational structures, requiring neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course. The picture of the Watt-Governor is created with Microsoft Copilot.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=File:Watt-Governor_AI.png&amp;diff=29252</id>
		<title>File:Watt-Governor AI.png</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=File:Watt-Governor_AI.png&amp;diff=29252"/>
		<updated>2025-12-27T13:32:28Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Schematic representation of a Watt-Governor&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29249</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29249"/>
		<updated>2025-12-27T13:13:18Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Why this Matters for Understanding Complexity */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] does not display a change in state nor in its structural properties. The adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases, the second law is never violated, but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This law by itself implies that all living organisms on Earth need to cope with changing conditions over time; therefore, they need to be adaptive.&lt;br /&gt;
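A toy two-compartment model can illustrate this argument; the mixing rule and the &amp;quot;pump&amp;quot; standing in for metabolic import/export are illustrative assumptions, not a physical simulation:&lt;br /&gt;

```python
import math
import random

# Toy two-compartment model: random mixing drives a closed system toward
# its most probable, high-entropy macrostate, while an "open" system can
# hold a low-entropy state by continuously spending energy to pump
# particles back. The mixing rule and the pump are illustrative assumptions.

def mixing_entropy(n_left, n_total):
    """Shannon entropy (bits) of the left/right occupation fractions."""
    p = n_left / n_total
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def simulate(pump, n_total=1000, steps=20000, seed=1):
    rng = random.Random(seed)
    n_left = n_total // 2
    for _ in range(steps):
        # thermal motion: a randomly chosen particle hops to the other side
        if n_left / n_total > rng.random():
            n_left -= 1
        else:
            n_left += 1
        # an open system spends energy to keep particles sorted to the left
        if pump and n_total > n_left:
            n_left += 1
    return mixing_entropy(n_left, n_total)

assert simulate(pump=False) > 0.9   # closed: near maximal mixing entropy
assert 0.1 > simulate(pump=True)    # open: low entropy is maintained
```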
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of a [[IESC:SYSTEM (Dynamic)|dynamic system]] is the interaction between its elements; if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
Figure 1 shows a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change does not depend only on the i-th variable itself. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
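A minimal coupled system in the spirit of Bertalanffy&#039;s equations can be sketched as follows (the coefficient matrix is an illustrative assumption):&lt;br /&gt;

```python
# Minimal coupled system in the spirit of Bertalanffy's dQ_i/dt =
# f_i(Q_1, ..., Q_n): each variable's rate of change depends on the other
# variables, not only on itself. The coefficient matrix is illustrative.

def euler_step(q, a, dt=0.001):
    """One Euler step of dq_i/dt = sum_j a[i][j] * q_j."""
    n = len(q)
    return [q[i] + dt * sum(a[i][j] * q[j] for j in range(n))
            for i in range(n)]

# Two coupled variables: the change of each is driven by the *other* one.
a = [[0.0, -1.0],
     [1.0,  0.0]]
q = [1.0, 0.0]
for _ in range(1000):
    q = euler_step(q, a)

# Genuine coupling: the second variable was set in motion purely through
# its relationship to the first (it started at zero).
assert q[1] > 0.5
```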
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
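The contrast can be sketched numerically: a conservative model keeps a permanent memory of its initial state, while a dissipative open-system model reaches the same steady state from different starting points. Both equations and all parameter values below are illustrative assumptions:&lt;br /&gt;

```python
import math

# Contrast sketch: a conservative ("closed") model keeps a permanent memory
# of its initial state, while a dissipative open-system model reaches the
# same steady state from different starting points (equifinality).
# All equations and parameter values are illustrative assumptions.

def closed_energy(x, v, steps=10000, dt=0.001):
    """Undamped oscillator: its energy is fixed by the initial state."""
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x
    return x * x + v * v   # approximately conserved quantity

def open_steady_state(x, r=1.0, capacity=4.0, steps=20000, dt=0.001):
    """Logistic growth dx/dt = r*x*(1 - x/capacity): fixed point capacity."""
    for _ in range(steps):
        x += dt * r * x * (1.0 - x / capacity)
    return x

# Closed: different initial conditions stay distinguishable forever.
assert abs(closed_energy(1.0, 0.0) - closed_energy(2.0, 0.0)) > 1.0
# Open: different initial conditions end at the same steady state.
assert math.isclose(open_steady_state(0.5), open_steady_state(9.0),
                    abs_tol=1e-3)
```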
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses the &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient concept illuminates a fundamental property of open systems: their capacity to persist through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495 Dynamis - Metzler Lexikon]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements and, through a series of relationships, the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]], on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing it in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1-5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;[https://i2insights.org/2023/01/24/viable-system-model/ The Viable System Model]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
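A minimal sketch of this recursion, assuming an invented organization rather than any real case, expresses it as a nested data structure in which every viable unit carries the same five systems:&lt;br /&gt;

```python
# Sketch of VSM recursion: every viable unit carries the same five
# functional systems and may contain further viable units.
# The organization below is an invented illustration.

SYSTEMS = ("S1 operations", "S2 coordination", "S3 control",
           "S4 intelligence", "S5 policy")

def viable(name, subunits=()):
    return {"name": name, "systems": SYSTEMS, "subunits": list(subunits)}

def check_recursion(unit, depth=0):
    """Verify that the same five systems recur at every level of nesting."""
    assert len(unit["systems"]) == 5
    print("  " * depth + unit["name"])
    for sub in unit["subunits"]:
        check_recursion(sub, depth + 1)

corp = viable("Corporation", [
    viable("Division A", [viable("Department A1")]),
    viable("Division B"),
])
check_recursion(corp)
```

Each level of nesting is itself a complete viable unit, which is precisely the "Russian dolls" property described above.&lt;br /&gt;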
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments, which enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks clinging to obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
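The two loop types can be contrasted with two one-line update rules; the gains and the initial deviation below are arbitrary illustrative values.&lt;br /&gt;

```python
# Negative feedback reverses deviations; positive feedback amplifies them.
# Gains and initial deviation are arbitrary illustrative values.

def step_negative(x, gain=0.5):
    return x - gain * x     # correction opposes the deviation

def step_positive(x, gain=0.5):
    return x + gain * x     # reinforcement amplifies the deviation

neg, pos = 1.0, 1.0
for _ in range(10):
    neg, pos = step_negative(neg), step_positive(pos)

print(f"negative feedback after 10 steps: {neg:.4f}")  # decays toward 0
print(f"positive feedback after 10 steps: {pos:.1f}")  # grows without bound
```

The same deviation shrinks geometrically under the stabilizing rule and grows geometrically under the amplifying one, which is why unchecked positive [[feedback]] eventually forces structural change.&lt;br /&gt;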
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
The clarification of adaptive systems addresses the central challenge of &amp;quot;[[Understanding Complexity]]&amp;quot;: navigating Weaver&#039;s &amp;quot;[[Draft:Organised complexity|organized complexity]]&amp;quot;, the critical middle region between simple two-variable problems and disorganized statistical chaos. Adaptive systems exemplify this domain through their many strongly coupled variables, organized through relational structures that require neither reductionist analysis nor probabilistic averaging, but recursive regulation and transformation architectures.&lt;br /&gt;
&lt;br /&gt;
This conceptualization directly supports the interdisciplinary mission of &amp;quot;[[Understanding Complexity]]&amp;quot; by establishing conceptual bridges across domains:&lt;br /&gt;
&lt;br /&gt;
* From [[IESC:SYSTEM (Static)|static systems]] to [[IESC:SYSTEM (Dynamic)|dynamic systems]] to adaptive systems&lt;br /&gt;
* From [[IESC:ENTROPY|entropy]] in [[IESC:SYSTEM (Closed)|closed systems]] to [[IESC:SYSTEM (Open)|open systems]]&lt;br /&gt;
* From [[Draft:Organised complexity|organized complexity]] to the [[IESC:SYSTEM (Viable)|VSM]] &lt;br /&gt;
&lt;br /&gt;
For complexity education, adaptive systems serve as the ideal case study. They show concretely how [[IESC:VARIETY (Requisite) (Law of)|requisite variety]], recursive organization and multi-level learning solve [[Draft:Organised complexity|organized complexity]] challenges.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29247</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29247"/>
		<updated>2025-12-27T12:38:33Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: added hyperlinks&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays neither a change in state nor in its structural properties. The adaptive system, by contrast, is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an [[IESC:SYSTEM (Isolated)|isolated system]], [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with [[IESC:SYSTEM (Closed)|closed systems]] - isolated from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases - the second law is never violated - but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This implies that all living organisms on Earth need to adapt to changing conditions over time: they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a [[IESC:SYSTEM (Dynamic)|dynamic system]] is a system which operates by processing external inputs and producing an output. The central element of [[IESC:SYSTEM (Dynamic)|dynamic systems]] is the interaction between their elements: if there is no interaction between them, it is not a [[IESC:SYSTEM (Dynamic)|dynamic system]] &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the system does not depend only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
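A minimal numerical sketch of such coupling, using invented coefficients rather than any specific model from the literature, integrates two mutually dependent variables with the forward Euler method:&lt;br /&gt;

```python
# Forward-Euler sketch of a coupled system: each rate of change depends
# on BOTH variables, so neither part can be understood in isolation.
# Coefficients are invented for illustration.

def step(x, y, dt=0.01):
    dx = -0.5 * x + 0.3 * y     # the change of x depends on y as well
    dy = 0.2 * x - 0.4 * y      # the change of y depends on x as well
    return x + dt * dx, y + dt * dy

x, y = 1.0, 0.0
for _ in range(1000):
    x, y = step(x, y)
print(f"x={x:.3f}, y={y:.3f}")
```

Setting either cross-coefficient to zero decouples the equations and turns the system back into a mere aggregation of two independent variables, which is exactly the distinction drawn above.&lt;br /&gt;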
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a [[IESC:SYSTEM (Dynamic)|dynamic system]] but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A [[IESC:SYSTEM (Static)|static system]] maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: [[IESC:SYSTEM (Static)|static systems]] show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
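Equifinality can be illustrated with a small simulation, assuming a logistic growth rule as an illustrative stand-in for an open system: the same steady state is reached from very different initial conditions.&lt;br /&gt;

```python
# Equifinality sketch: the same steady state (carrying capacity K) is
# reached from different initial conditions. The logistic rule and its
# parameters are an illustrative stand-in, not a specific cited model.

def settle(x, r=0.5, K=10.0, dt=0.1, steps=2000):
    for _ in range(steps):
        x = x + dt * r * x * (1 - x / K)   # growth slows near K
    return x

for x0 in (0.1, 2.0, 25.0):
    print(f"start {x0:5.1f} reaches steady state {settle(x0):.2f}")
```

Whether the trajectory starts far below or far above the steady state, it settles at the same value, which is exactly the path-independence a closed equilibrium system cannot show.&lt;br /&gt;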
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike [[IESC:SYSTEM (Closed)|closed systems]] that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that [[IESC:SYSTEM (Dynamic)|dynamic systems]] involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes where the influence of an element impacts on other elements, but through a series of relationships the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The negative [[feedback]] helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse this. Positive [[feedback]] on the other hand, helps to achieve contained contraction or replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
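A thermostat of the kind Beer describes can be sketched as a bang-bang controller that holds a variable between desired limits; the limits and the heating and cooling rates below are invented illustrations.&lt;br /&gt;

```python
# Bang-bang thermostat sketch: hold temperature between desired limits
# by switching heating on below the band and off above it.
# Limits, heating and cooling rates are invented illustrations.

def thermostat(temp=15.0, low=19.0, high=21.0, steps=500):
    heating = False
    for _ in range(steps):
        if temp > high:
            heating = False     # too warm: stop admitting heat
        if low > temp:
            heating = True      # too cold: start admitting heat
        temp = temp + (0.3 if heating else -0.2)   # heat gain vs. loss
    return temp

final = thermostat()
print(f"temperature settles near the desired band: {final:.1f}")
```

The variable never converges to a single point; it oscillates inside the band, which is precisely the "held between desired limits" behaviour of a homeostat.&lt;br /&gt;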
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
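The counting argument behind the law can be sketched directly; the state counts below are arbitrary illustrations.&lt;br /&gt;

```python
# Requisite variety sketch: with v_disturbance disturbance states and
# v_regulator distinct responses, the number of distinguishable outcomes
# the regulator must let through is at least v_disturbance / v_regulator
# (the counting bound behind Ashby's law). State counts are arbitrary.

import math

def min_outcomes(v_disturbance, v_regulator):
    """Lower bound on the outcome variety the regulator cannot remove."""
    return math.ceil(v_disturbance / v_regulator)

# A regulator with variety 10 facing 100 disturbance states still leaves
# at least 10 distinguishable outcomes uncontrolled...
print(min_outcomes(100, 10))    # prints 10
# ...while matching variety can, in principle, pin down a single outcome.
print(min_outcomes(100, 100))   # prints 1
```

Only when the regulator's variety matches the disturbance variety can the bound fall to one outcome, which is the quantitative content of &quot;only variety can absorb variety&quot;.&lt;br /&gt;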
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise under centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments, which enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks clinging to obsolete products.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29246</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29246"/>
		<updated>2025-12-27T12:32:43Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays neither a change in state nor in its structural properties. The adaptive system, by contrast, is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], firstly formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases (the second law is never violated), but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continually adjust to changing conditions over time: they must be adaptive.&lt;br /&gt;
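The entropy bookkeeping described above can be sketched numerically. This is a toy illustration with arbitrary, illustrative numbers (not from Schrödinger): an open system lowers its own entropy while exporting more than enough entropy to its environment, so the total never decreases.&lt;br /&gt;

```python
# Toy entropy bookkeeping for an open system (illustrative numbers only).
system_S, environment_S = 10.0, 100.0   # arbitrary starting entropies
total_before = system_S + environment_S

for _ in range(5):
    system_S -= 1.0       # local ordering: the open system gains structure
    environment_S += 1.5  # exported waste entropy more than compensates

total_after = system_S + environment_S
print(system_S, total_after)  # local entropy fell, total entropy rose
```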
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly put, a dynamic system operates by processing external inputs and producing outputs. The central element of a dynamic system is the interaction between its elements: if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. Mathematically, this is captured by systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system's own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
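The coupled equations dQ&amp;#8321;/dt = f&amp;#8321;(Q&amp;#8321;, ..., Q&amp;#8099;) described above can be sketched in a few lines. This is a minimal toy example with illustrative coefficients (not taken from Bertalanffy): two variables whose rates each depend on both variables, integrated with a simple Euler step.&lt;br /&gt;

```python
# Toy coupled system: each variable's rate of change depends on BOTH
# variables, so neither can be understood in isolation.
def step(q1, q2, dt=0.01):
    dq1 = -0.5 * q1 + 0.3 * q2   # q1's change depends on q2 as well
    dq2 = 0.2 * q1 - 0.4 * q2    # q2's change depends on q1 as well
    return q1 + dq1 * dt, q2 + dq2 * dt

def simulate(q1, q2, steps=5000):
    # forward-Euler integration of the coupled pair
    for _ in range(steps):
        q1, q2 = step(q1, q2)
    return q1, q2

print(simulate(1.0, 1.0))
```

Deleting the coupling terms (0.3 * q2 and 0.2 * q1) would decouple the equations; each variable could then be analysed alone, which is exactly the "aggregation" case the text contrasts with a genuine system.&lt;br /&gt;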
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
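Equifinality can be sketched with a toy open system of my own construction (not from the cited sources): constant inflow a, outflow proportional to the state, dx/dt = a - b*x. Whatever the initial condition, the trajectory settles at the same steady state x* = a/b.&lt;br /&gt;

```python
# Equifinality sketch: dx/dt = a - b*x reaches x* = a/b from any start.
def relax(x, a=2.0, b=0.5, dt=0.01, steps=4000):
    for _ in range(steps):
        x += (a - b * x) * dt   # Euler step of the open-system balance
    return x

# Very different initial conditions, same final configuration:
final_states = [relax(x0) for x0 in (0.0, 1.0, 10.0, 100.0)]
print([round(s, 3) for s in final_states])
```

A closed static system has no such inflow/outflow balance; its endpoint is fixed by the initial state, which is exactly the path-dependence the paragraph contrasts with equifinality.&lt;br /&gt;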
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495 Dynamis - Metzler Lexikon]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] occurs when an element influences other elements and, through a chain of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative [[feedback]] helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, amplifies change: it can drive contained contraction, replication or growth, or lead to uncontained, unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
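The two feedback modes just described can be sketched side by side. The gains, target and rates below are my own illustrative choices, not values from Flood and Carson: a corrective loop shrinks the deviation from a target each step, while a reinforcing loop amplifies the current state.&lt;br /&gt;

```python
# Negative feedback: each step corrects a fraction of the deviation.
def negative_feedback(state, target=20.0, gain=0.2, steps=50):
    for _ in range(steps):
        state += gain * (target - state)   # deviation shrinks geometrically
    return state

# Positive feedback: each step reinforces the current state (exponential growth).
def positive_feedback(state, rate=0.1, steps=50):
    for _ in range(steps):
        state += rate * state
    return state

print(negative_feedback(10.0))   # settles at the target
print(positive_feedback(10.0))   # uncontained growth
```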
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
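Ashby's law can be illustrated with a toy regulator of my own construction (not Ashby's original formalism): the outcome is the disturbance minus the regulator's chosen response, and the regulator tries to hold the outcome at 0. Only a regulator whose variety (number of distinct responses) matches the disturbance variety can hold the outcome to a single state.&lt;br /&gt;

```python
# Toy demonstration of requisite variety: outcome variety cannot fall
# below disturbance variety divided by regulator variety.
def regulate(disturbances, responses):
    outcomes = set()
    for d in disturbances:
        # pick the response that brings the outcome closest to the goal 0
        best = min(responses, key=lambda r: abs(d - r))
        outcomes.add(d - best)
    return outcomes

full = regulate(range(6), range(6))   # regulator variety 6 vs disturbance variety 6
poor = regulate(range(6), [0, 3])     # regulator variety only 2
print(len(full), len(poor))           # full control vs residual outcome variety
```

With six responses every disturbance is cancelled (one outcome); with two responses at least 6/2 = 3 distinct outcomes survive, exactly the bound the law predicts.&lt;br /&gt;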
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
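The governor's proportional loop can be sketched numerically. The set point, valve gain and friction constant below are illustrative assumptions of mine, not Beer's figures: as speed rises above the set point the valve closes and power falls; as speed drops the valve opens.&lt;br /&gt;

```python
# Minimal sketch of Watt-governor feedback (illustrative constants).
def simulate_governor(speed=0.0, steps=2000, dt=0.1):
    for _ in range(steps):
        # arms rise with speed; valve closes in proportion to the excess
        valve = max(0.0, min(1.0, 0.5 + 0.05 * (100.0 - speed)))
        power = 10.0 * valve                  # steam admitted through the valve
        speed += (power - 0.05 * speed) * dt  # friction continuously drains speed
    return speed

print(simulate_governor(0.0))    # spins up to the set point
print(simulate_governor(150.0))  # overspeed is throttled back down
```

From either side of the set point the loop settles at the same speed: the output regulates its own input, which is the homeostat behaviour the paragraph describes.&lt;br /&gt;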
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystems at the next recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its own activities, higher-level functions for coordination, control, strategy and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
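The Russian-doll recursion described above can be sketched as a data structure. This is a structural schematic of my own, not Beer's formal notation: every viable system carries the same metasystem functions (Systems 2-5), and its System 1 operational units are themselves viable systems one recursion down.&lt;br /&gt;

```python
# Schematic of VSM recursion: viable systems contain viable systems.
from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5: coordination, control, intelligence, policy
    metasystem: tuple = ("S2", "S3", "S4", "S5")
    # System 1: operational units, each itself a viable system
    operations: list = field(default_factory=list)

    def recursion_levels(self):
        # depth of nesting: same structure repeats at every level
        if not self.operations:
            return 1
        return 1 + max(unit.recursion_levels() for unit in self.operations)

corp = ViableSystem("corporation", operations=[
    ViableSystem("division A", operations=[ViableSystem("plant A1")]),
    ViableSystem("division B"),
])
print(corp.recursion_levels())  # corporation -> division -> plant
```

Every node, from the corporation down to a single plant, exposes the same five-function shape, which is exactly what lets the same diagnostic questions be asked at any recursion level.&lt;br /&gt;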
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks the organization&#039;s products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
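The distinction between first-order and second-order learning can be made concrete with a toy example of my own construction (not from Beer): first-order learning tunes a parameter inside a fixed rule y = w*x, while second-order learning changes the framework itself by adding an intercept, something no amount of tuning within the old rule could achieve.&lt;br /&gt;

```python
# First-order learning: adjust w within the fixed framework y = w*x.
data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # actually generated by y = 2x + 1

def first_order(data, w=0.0, lr=0.01, epochs=500):
    for _ in range(epochs):
        for x, y in data:
            w += lr * (y - w * x) * x   # gradient step on the squared error
    return w

w = first_order(data)
err1 = max(abs(y - w * x) for x, y in data)   # framework cannot fit the data

# Second-order learning: change the framework to y = w*x + b
# (closed-form least squares over the new rule class).
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
w2 = sum((x - mx) * (y - my) for x, y in data) / sum((x - mx) ** 2 for x, _ in data)
b2 = my - w2 * mx
err2 = max(abs(y - (w2 * x + b2)) for x, y in data)
print(err1, err2)  # residual error vanishes only after the framework change
```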
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29245</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29245"/>
		<updated>2025-12-27T12:29:25Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: added hyperlinks&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive [[gB:System|system]], a [[IESC:SYSTEM (Static)|static system]] displays no change in state or structural properties. The adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle is the Viable System Model (VSM), first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, [[IESC:GENERAL SYSTEMS THEORY|general systems theory]], open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. To understand the difference between static and adaptive systems, we need to dig deeper and examine several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems [[gB:Entropy or amount of information|entropy]] relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low [[gB:Entropy or amount of information|entropy]] locally by continuously importing low-[[gB:Entropy or amount of information|entropy]] materials and exporting high-[[gB:Entropy or amount of information|entropy]] waste. The total [[gB:Entropy or amount of information|entropy]] of the universe still increases (the second law is never violated), but the local [[gB:Entropy or amount of information|entropy]] of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continually adjust to changing conditions over time: they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly put, a dynamic system operates by processing external inputs and producing outputs. The central element of a dynamic system is the interaction between its elements: if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. Mathematically, this is captured by systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system's own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system but with the special feature of possessing internal mechanisms to change its behavior, based on [[feedback]] and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on [[feedback]] from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495 Dynamis - Metzler Lexikon]&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of [[gB:Self-regulation vs. Automatic regulation|self-regulation]] are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the [[feedback]] mechanism. [[Feedback]] describes a situation in which an element influences other elements, and through a chain of relationships the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
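This error-correcting loop can be sketched in a few lines. The sketch below is a minimal illustration, not drawn from the cited sources; the function name and the gain value are invented:

```python
# Minimal sketch of a cybernetic feedback loop: the system measures the
# difference between its current state and the target state, then corrects
# a fixed fraction (the gain) of that error on every step.

def feedback_step(state, target, gain=0.5):
    error = target - state       # measured deviation from the goal
    return state + gain * error  # corrective action proportional to error

state, target = 0.0, 10.0
for _ in range(20):
    state = feedback_step(state, target)
# after repeated corrections the state has converged on the target
```

With a gain between 0 and 1 the remaining error shrinks geometrically, which is the simplest form of the continuous measure-and-correct cycle described above.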
&lt;br /&gt;
Negative [[feedback]] helps a system achieve defined objectives as set in its control parameters: if the system moves out of its steady state, either control action is taken or natural [[feedback]] occurs to reverse the deviation. Positive [[feedback]], on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
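The two regimes can be contrasted numerically. This is an illustrative toy model; the coefficient 0.5 is arbitrary:

```python
# Negative feedback reverses a deviation from the steady state, so the
# deviation decays; positive feedback reinforces it, so it compounds.

def step(deviation, k):
    return deviation + k * deviation  # k < 0 damps, k > 0 amplifies

neg = pos = 1.0  # identical initial deviations from the steady state
for _ in range(10):
    neg = step(neg, -0.5)  # negative feedback: deviation shrinks each step
    pos = step(pos, +0.5)  # positive feedback: deviation grows each step
```

After ten steps the damped deviation is below one thousandth of its starting value, while the amplified one has grown more than fifty-fold — the "uncontained and unstable" growth the text describes.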
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
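The thermostat example can be sketched as an on/off controller attached to a toy plant model. The heating and cooling rates below are invented for illustration, not taken from Beer:

```python
# A thermostat as a homeostat: switch heating on below the lower limit and
# off above the upper limit, holding the critical variable between them.

def thermostat_run(temp, low=19.0, high=21.0, steps=100):
    heater_on = False
    trace = []
    for _ in range(steps):
        if temp < low:
            heater_on = True    # too cold: heating on
        elif temp > high:
            heater_on = False   # too warm: heating off
        temp += 0.5 if heater_on else -0.3  # toy heating/cooling dynamics
        trace.append(temp)
    return trace

trace = thermostat_run(temp=15.0)
# after a warm-up phase the temperature cycles inside the desired band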
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
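Treating variety as a count of distinguishable states makes the law easy to check mechanically. The state names below are invented toy examples:

```python
# Ashby's Law of Requisite Variety, sketched as a simple state count: a
# controller can only guarantee regulation if it can distinguish and respond
# to at least as many states as the environment presents.

def variety(states):
    return len(set(states))  # number of distinguishable states

environment = ["cold", "hot", "humid", "windy"]
simple_controller = ["heat", "cool"]                         # variety 2
full_controller = ["heat", "cool", "dehumidify", "shelter"]  # variety 4

def has_requisite_variety(controller, environment):
    return variety(controller) >= variety(environment)
```

The two-response controller cannot guarantee regulation against four distinct disturbances; the four-response controller can.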
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived [[feedback]] mechanism, demonstrates the elegant simplicity of [[feedback]]. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
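The governor's self-regulation can be caricatured numerically. All coefficients below are invented; this is a sketch of the principle, not an engineering model:

```python
# Watt governor sketch: arm height rises with speed, the valve (and thus the
# power admitted) closes as the arms rise, so excess speed reduces its own
# energy supply and the speed settles at an equilibrium.

def governor_run(speed, steps=200):
    for _ in range(steps):
        arm_height = speed / 100.0          # arms rise with speed
        valve = max(0.0, 2.0 - arm_height)  # valve closes as arms rise
        power = 10.0 * valve                # energy admitted to the engine
        load = 0.1 * speed                  # friction/load slows the engine
        speed += 0.5 * (power - load)       # net effect on speed
    return speed
```

Whether the engine starts too slow or too fast, the output adjusts the input until both runs settle at the same regulated speed (100 in this toy parameterization).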
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystem at the next recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
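The recursive nesting can be represented directly as a data structure. The names and levels below are invented examples, not taken from Beer or Walker:

```python
# VSM recursion sketch: every viable system carries the same five functional
# systems and may contain further viable systems as its System 1 operations.

FUNCTIONS = ("System 1 operations", "System 2 coordination",
             "System 3 control", "System 4 intelligence", "System 5 policy")

def viable_system(name, units=()):
    return {"name": name, "functions": FUNCTIONS, "units": list(units)}

corporation = viable_system("Corporation", [
    viable_system("Division A", [viable_system("Department A1")]),
    viable_system("Division B"),
])

def recursion_depth(system):
    """Levels of nested viable systems, counting this one."""
    if not system["units"]:
        return 1
    return 1 + max(recursion_depth(u) for u in system["units"])
```

Every level, from the corporation down to a single department, carries the same five functions — the "Russian dolls" structure described above.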
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] occur through two mechanisms: regulation (maintaining a fixed transformation) and [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external [[IESC:ADAPTATION and ADAPTABILITY|adaptation]] through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
[[Feedback]] loops connect action to consequences. Negative [[feedback]] stabilizes by reversing deviations; positive [[feedback]] produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
[[IESC:ADAPTATION and ADAPTABILITY|Adaptation]] operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29244</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29244"/>
		<updated>2025-12-27T12:19:40Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: link&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] does not display a change in state nor in its structural properties. The adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms and [[Network|networks]] to differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with isolated systems - cut off from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, so the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continuously exchange matter and energy with their surroundings to cope with changing conditions over time - in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which operates by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the system does not depend only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
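A coupled system of this kind can be integrated numerically. The sketch below uses an invented two-variable predator-prey pair with arbitrary coefficients and forward-Euler integration; the point is that each rate of change depends on both variables:

```python
# Coupled differential equations: d(prey)/dt and d(predator)/dt each depend
# on BOTH variables, so neither element can be understood in isolation.

def simulate(prey, predator, dt=0.01, steps=1000):
    for _ in range(steps):
        d_prey = 1.0 * prey - 0.5 * prey * predator      # growth - predation
        d_pred = 0.2 * prey * predator - 0.4 * predator  # food - mortality
        prey += dt * d_prey      # each update mixes both variables
        predator += dt * d_pred
    return prey, predator

prey, predator = simulate(4.0, 2.0)
```

Dropping either cross-term would decouple the equations into two independent exponentials — precisely the "mere aggregation" case the text contrasts with a genuine system.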
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system but with the special feature of possessing internal mechanisms to change its behavior, based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and then a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
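Equifinality can be demonstrated with a minimal open-system model (rates invented for illustration): constant import plus export proportional to the current stock, run from different initial conditions:

```python
# Equifinality sketch: an open system with a steady inflow and an outflow
# proportional to the current stock reaches the same steady state regardless
# of where it starts.

def run_to_steady_state(x, inflow=5.0, outflow_rate=0.5, dt=0.1, steps=500):
    for _ in range(steps):
        x += dt * (inflow - outflow_rate * x)  # import minus export
    return x

a = run_to_steady_state(0.0)    # start empty
b = run_to_steady_state(40.0)   # start overfull
# both trajectories converge on the same steady state, inflow / outflow_rate
```

Note that the steady state is sustained only by continuous throughput: set the inflow to zero and the stock decays toward the static endpoint, the system's thermodynamic equilibrium.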
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback describes a situation in which an element influences other elements, and through a chain of relationships the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps a system achieve defined objectives as set in its control parameters: if the system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed a given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystem at the next recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29243</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29243"/>
		<updated>2025-12-27T12:17:53Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system is self-modifying. It exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with isolated systems - cut off from their environment, exchanging neither matter nor energy. In such systems, entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases - the second law is never violated - but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continually adjust to changing conditions over time; in other words, they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which operates by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system&#039;s own variable but also on the others. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;: a dynamic system with the additional feature of possessing internal mechanisms that change its behavior based on feedback and environmental change.&lt;br /&gt;
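The coupling described above can be made concrete with a tiny numerical sketch; the coefficients below are invented for illustration and do not come from Bertalanffy:

```python
# Two coupled variables in Bertalanffy's sense: each rate of change
# depends on BOTH variables (coefficients are illustrative). Dropping
# the cross terms would turn the system into a mere aggregation of
# independent parts.

def step(x1, x2, dt=0.01):
    dx1 = -0.5 * x1 + 0.3 * x2   # x1 decays, but is fed by x2
    dx2 = 0.2 * x1 - 0.4 * x2    # x2 decays, but is fed by x1
    return x1 + dt * dx1, x2 + dt * dx2

x1, x2 = 1.0, 0.0   # x2 starts at zero
for _ in range(1000):
    x1, x2 = step(x1, x2)

# x2 is now nonzero: its behaviour was shaped by its coupling to x1.
print(x2 > 0.0)
```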
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
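Equifinality can be illustrated with a minimal open-system model - constant inflow, proportional outflow - in which the steady state depends only on the system's parameters, not on where it started. The inflow and outflow rates below are illustrative assumptions:

```python
# Equifinality sketch: an open system with constant inflow a and
# proportional outflow b*x, i.e. dx/dt = a - b*x, reaches the same
# steady state x* = a/b from any initial condition. a and b are
# illustrative parameters.

def steady_state(x0, a=2.0, b=0.5, dt=0.01, steps=5000):
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x)
    return x

print(round(steady_state(0.0), 3))    # → 4.0 (= a/b)
print(round(steady_state(100.0), 3))  # → 4.0, despite the different start
```

A closed system has no such inflow/outflow terms, so its end state remains tied to its initial condition.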
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Stafford Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Beer explicitly argued that &#039;viable systems&#039; - whether biological organisms, factories, or economies - share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback arises when the influence of an element acts on other elements and, through a chain of relationships, the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse this. Positive feedback, on the other hand, drives contained contraction, replication or growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
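As a minimal sketch of a homeostat holding a critical variable between desired limits, here is a toy bang-bang thermostat; the temperature limits, heating power and heat-loss rate are invented for this illustration:

```python
# Toy homeostat: a bang-bang thermostat holds temperature between
# desired limits against heat loss to cooler surroundings. Limits,
# heating power and loss rate are invented for this sketch.

def thermostat(lo=19.0, hi=21.0, temp=15.0, steps=2000, dt=0.1):
    heater_on = True
    trace = []
    for _ in range(steps):
        if temp >= hi:
            heater_on = False      # too hot: switch heating off
        elif temp <= lo:
            heater_on = True       # too cold: switch heating on
        heating = 1.5 if heater_on else 0.0
        temp += dt * (heating - 0.1 * (temp - 10.0))  # loss toward 10 °C
        trace.append(temp)
    return trace

settled = thermostat()[1000:]      # discard the warm-up transient
print(min(settled) >= 18.5 and max(settled) <= 21.5)  # held in band
```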
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
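One common counting formulation of Ashby's law is that a regulator with V_R distinguishable responses facing V_D distinguishable disturbances can at best reduce the outcomes to about V_D / V_R distinct states, so holding the outcome to a single state requires V_R ≥ V_D. A minimal sketch, with illustrative numbers:

```python
# Counting formulation of requisite variety: with V_D distinguishable
# disturbances and V_R distinct regulator responses, the best
# achievable number of distinct outcomes is ceil(V_D / V_R); a single
# controlled outcome therefore needs V_R >= V_D. Numbers illustrative.
import math

def min_outcomes(v_disturbance, v_regulator):
    return math.ceil(v_disturbance / v_regulator)

print(min_outcomes(100, 10))   # → 10: not enough regulator variety
print(min_outcomes(100, 100))  # → 1: requisite variety achieved
```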
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation: the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
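A heavily simplified simulation of this governor principle shows the output regulating its own input; all constants are invented for the sketch, not taken from Beer:

```python
# Watt-governor-style homeostat, heavily simplified: engine speed rises
# with admitted power and falls with friction; the valve closes in
# proportion as speed exceeds the target, so the output adjusts its own
# input. All constants are illustrative.

def final_speed(governed=True, target=100.0, k=0.5, steps=5000, dt=0.05):
    speed = 0.0
    for _ in range(steps):
        if governed:
            # valve opening in [0, 1], closing as speed exceeds the target
            valve = min(1.0, max(0.0, 1.0 - k * (speed - target)))
        else:
            valve = 1.0  # no governor: valve stays fully open
        power = 10.0 * valve
        speed += dt * (power - 0.05 * speed)  # friction term
    return speed

print(round(final_speed(governed=False), 1))  # → 200.0: runs far too fast
print(round(final_speed(governed=True), 1))   # → 101.0: held near target
```

The small residual offset above the target is characteristic of purely proportional feedback.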
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;The Viable System Model https://i2insights.org/2023/01/24/viable-system-model/&amp;lt;/ref&amp;gt; &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystems at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;&lt;br /&gt;
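The recursive "Russian doll" structure can be sketched as a nested data type in which every level carries the same five functions; the organization names below are invented for illustration:

```python
# Recursion sketch: every viable system contains viable systems and is
# embedded in one, and the same five functions (Systems 1-5) recur at
# every level. The corporation/division/department names are invented.

VSM_FUNCTIONS = ["S1 operations", "S2 coordination", "S3 control",
                 "S4 intelligence", "S5 policy"]

class ViableSystem:
    def __init__(self, name, subsystems=()):
        self.name = name
        self.functions = list(VSM_FUNCTIONS)  # same functions at every level
        self.subsystems = list(subsystems)    # System 1 units, themselves viable

    def recursion_depth(self):
        """Number of nested recursion levels below and including this one."""
        if not self.subsystems:
            return 1
        return 1 + max(s.recursion_depth() for s in self.subsystems)

corp = ViableSystem("corporation", [
    ViableSystem("division A", [ViableSystem("department A1")]),
    ViableSystem("division B"),
])

print(corp.recursion_depth())                          # → 3 recursion levels
print(corp.subsystems[0].functions == corp.functions)  # same structure repeats
```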
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks producing obsolete products.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Why this Matters for Understanding Complexity ==&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Notes on Artificial Intelligence ==&lt;br /&gt;
I used Perplexity (Comet version 143.0.7499.110) for research purposes and fact-checking, and ChatGPT (Model GPT-5.2) to identify typographical and grammatical errors, as English is not my native language. The text itself is my own, based on the references and materials provided in the context of this course.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29242</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29242"/>
		<updated>2025-12-27T11:49:55Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: changing grammatical errors&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike an adaptive system, a [[IESC:SYSTEM (Static)|static system]] displays no change in state or in its structural properties. The adaptive system is self-modifying. It exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organization levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a wide range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with isolated systems - cut off from their environment, exchanging neither matter nor energy. In such systems, entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases - the second law is never violated - but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must continually adjust to changing conditions over time; in other words, they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which operates by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a system of differential equations inspired by Bertalanffy, in which, for every i, the rate of change depends not only on the system&#039;s own variable but also on the others. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;: a dynamic system with the additional feature of possessing internal mechanisms that change its behavior based on feedback and environmental change.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback arises when an element influences other elements and, through a chain of relationships, the effect of that initial influence returns to act on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
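This error-correcting loop can be sketched as a minimal proportional controller; the gain and target values below are illustrative assumptions of ours, not taken from Beer or from Flood and Carson.&lt;br /&gt;

```python
# Negative feedback sketch: the controller measures the error between
# the target state and the current state and applies a correction
# proportional to it, shrinking the deviation step by step.

def feedback_loop(state, target, gain=0.4, steps=30):
    history = [state]
    for _ in range(steps):
        error = target - state   # difference between target and current state
        state += gain * error    # corrective action proportional to the error
        history.append(state)
    return history

trajectory = feedback_loop(state=0.0, target=10.0)
print(round(trajectory[-1], 4))  # 10.0: the deviation has been regulated away
```

Each step reduces the remaining error by the gain factor, which is the discrete analogue of the continuous correction described above.&lt;br /&gt;
&lt;br /&gt;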
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
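The thermostat example can be made concrete with a toy simulation; the limits, heating rate and heat-loss rate below are illustrative assumptions, not values from the cited sources.&lt;br /&gt;

```python
# Thermostat sketch: a homeostat holding temperature between desired
# limits. The heater switches on below the lower limit and off above
# the upper limit; between the limits it keeps its previous state.

def run_thermostat(temp, lower=19.0, upper=21.0, steps=200):
    heater_on = False
    readings = []
    for _ in range(steps):
        if temp > upper:
            heater_on = False              # too warm: stop heating
        elif lower > temp:
            heater_on = True               # too cold: start heating
        temp += 0.3 if heater_on else -0.2  # heating vs. heat loss
        readings.append(temp)
    return readings

# After a settling phase the temperature oscillates within the limits.
settled = run_thermostat(temp=10.0)[50:]
print(min(settled) > 18.5 and 21.5 > max(settled))  # True
```

The critical variable is never pinned to a single value; it is held inside a band by switching the corrective action on and off, which is exactly what distinguishes a homeostat from a fixed setting.&lt;br /&gt;
&lt;br /&gt;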
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
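Counting variety as the number of distinguishable states, the law can be expressed as a simple comparison; the disturbance and response sets below are invented for illustration.&lt;br /&gt;

```python
# Requisite variety sketch: variety is the number of distinguishable
# states. Regulation can only be guaranteed when the controller
# disposes of at least as much variety as the disturbances it must
# counter (a simple count check, in the spirit of the law).

def variety(states):
    return len(set(states))

def can_regulate(disturbance_states, controller_states):
    # Each distinct disturbance needs a distinct counteraction.
    return variety(controller_states) >= variety(disturbance_states)

disturbances = ["cold", "hot", "humid", "dry"]          # variety 4
responses_weak = ["heat", "cool"]                        # variety 2
responses_adequate = ["heat", "cool", "dry", "humidify"] # variety 4

print(can_regulate(disturbances, responses_weak))      # False
print(can_regulate(disturbances, responses_adequate))  # True
```

With only two responses to four distinct disturbances, some disturbances necessarily go uncorrected; matching the variety restores the guarantee of control.&lt;br /&gt;
&lt;br /&gt;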
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve admitting power to the engine, closing it in proportion as the arms rise with growing speed. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
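A crude discrete model of this self-regulation can be written down directly; the set point, sensitivity and load coefficients are our own illustrative numbers, not from Beer.&lt;br /&gt;

```python
# Watt governor sketch: rising speed raises the arms, which
# proportionally close the valve; the reduced energy supply then
# lowers the speed again, holding it at the set point.

def governor(speed, set_point=100.0, sensitivity=0.02, steps=500):
    for _ in range(steps):
        # Valve opening shrinks as speed exceeds the set point,
        # clamped between fully closed (0) and fully open (1).
        valve = min(1.0, max(0.0, 1.0 - sensitivity * (speed - set_point)))
        power = 50.0 * valve   # energy admitted to the engine
        drag = 0.5 * speed     # load and friction grow with speed
        speed += 0.1 * (power - drag)
    return speed

print(round(governor(speed=60.0), 1))  # 100.0: speed settles at the set point
```

The input (admitted power) is adjusted by the output (speed) itself, which is the defining feature of the governor as a homeostat.&lt;br /&gt;
&lt;br /&gt;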
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
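The recursive structure can be sketched as a nested data layout; the five system names follow the VSM, while the corporation, division and department examples and the data layout itself are illustrative.&lt;br /&gt;

```python
# Recursion sketch: every viable system contains the same five
# functional systems (1-5), and each System 1 operational unit is
# itself a viable system at the next lower level of recursion.

FUNCTIONS = {1: "operations", 2: "coordination", 3: "control",
             4: "intelligence", 5: "policy"}

def viable_system(name, subunits=()):
    return {"name": name,
            "systems": dict(FUNCTIONS),
            # System 1 units are themselves viable systems (recursion).
            "system1_units": [viable_system(*u) for u in subunits]}

corp = viable_system("corporation", [
    ("division A", [("department A1",), ("department A2",)]),
    ("division B", []),
])

def recursion_depth(vs):
    units = vs["system1_units"]
    return 1 + max((recursion_depth(u) for u in units), default=0)

print(recursion_depth(corp))  # 3: corporation, division, department
```

At every depth the same five functions recur unchanged, which is the sense in which the model is scale-free.&lt;br /&gt;
&lt;br /&gt;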
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29241</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29241"/>
		<updated>2025-12-27T11:34:02Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: /* Understanding Adaptive Systems: A Framework for Complexity */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to [[Network|networks]], differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases - the second law is never violated - but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that living organisms on earth must continuously adjust to changing conditions over time; in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
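This entropy bookkeeping can be written out explicitly; the numbers below are purely illustrative, chosen only to show the sign pattern.&lt;br /&gt;

```python
# Open-system entropy bookkeeping sketch (illustrative numbers): the
# entropy change of the system is internal production (always
# positive, per the second law) minus the entropy exported to the
# environment. Exporting enough entropy lets the system entropy
# fall while the total for system plus environment still rises.

internal_production = 2.0  # entropy produced inside the system
entropy_exported = 5.0     # entropy carried out with heat and waste

d_system = internal_production - entropy_exported  # change in the system
d_environment = entropy_exported                   # dumped on the environment
d_total = d_system + d_environment                 # second law applies here

print(d_system, d_total)  # -3.0 2.0: local decrease, global increase
```

The local decrease is paid for by a larger increase outside the system boundary, which is the resolution of the paradox stated above.&lt;br /&gt;
&lt;br /&gt;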
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that processes external inputs and produces an output. The central element of dynamic systems is the interaction between their elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where, for all i, the rate of change of the i-th variable does not depend only on that variable itself. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
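The coupled form of such equations can be sketched numerically; the two-variable linear system and its coefficients below are illustrative choices of ours, inspired by but not taken from Bertalanffy.&lt;br /&gt;

```python
# Coupled system sketch: the rate of change of each variable depends
# on both variables (dx/dt = a*x + b*y, dy/dt = c*x + d*y), so the
# parts cannot be understood in isolation. Coefficients illustrative.

def simulate_coupled(x, y, dt=0.001, steps=5000):
    a, b, c, d = -1.0, 0.5, 0.5, -1.0  # b and c are the coupling terms
    for _ in range(steps):
        dx = (a * x + b * y) * dt
        dy = (c * x + d * y) * dt
        x, y = x + dx, y + dy
    return x, y

# With coupling, y ends up nonzero even though it started at zero:
x_end, y_end = simulate_coupled(x=1.0, y=0.0)
print(round(y_end, 3) != 0.0)  # True: x has driven y away from zero
```

Setting the coupling terms b and c to zero would decouple the equations into an aggregation of two independent decays, in which y would remain at zero forever.&lt;br /&gt;
&lt;br /&gt;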
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity to persist through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and yet subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining its organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback arises when an element influences other elements and, through a chain of relationships, the effect of that initial influence returns to act on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as the holding of a critical variable at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve admitting power to the engine, closing it in proportion as the arms rise with growing speed. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks products becoming obsolete.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29240</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29240"/>
		<updated>2025-12-27T11:31:49Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must adapt to changing conditions over time; in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for each i the rate of change of a variable depends not only on that variable itself but also on the others. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
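The coupling idea can be made concrete with a small numerical sketch. The following toy system (coupling matrix, rates and step sizes are invented for illustration, not taken from Bertalanffy) integrates dx_i/dt = f_i(x_1, ..., x_n) with a simple forward-Euler loop:

```python
# Minimal sketch of Bertalanffy-style coupled equations dx_i/dt = f_i(x_1, ..., x_n):
# each rate of change depends on several variables at once, so the parts
# cannot be understood in isolation. Coupling matrix and rates are illustrative.

def simulate(x, coupling, dt=0.01, steps=1000):
    """Forward-Euler integration of a linear coupled system dx/dt = A x."""
    for _ in range(steps):
        rates = [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in coupling]
        x = [x_i + dt * r_i for x_i, r_i in zip(x, rates)]
    return x

# Two coupled variables: each one damps itself and is driven by the other.
A = [[-1.0, 0.5],
     [0.5, -1.0]]
print(simulate([1.0, 0.0], A))
```

Because each rate mixes both variables, perturbing one element immediately reshapes the trajectory of the other: the behaviour of a part is only defined in the context of the whole.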
&lt;br /&gt;
Now one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system but with the special feature of possessing internal mechanisms to change its behavior, based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
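Equifinality can be illustrated with a toy open system (inflow, decay rate and step sizes are illustrative assumptions): with a constant inflow and a first-order outflow, the same steady state is approached from very different starting values.

```python
# Sketch of equifinality in an open system (parameters are illustrative):
# with a constant inflow and first-order outflow, dx/dt = inflow - k * x,
# the steady state x = inflow / k is reached from any starting value,
# unlike a closed system whose end state is fixed by its initial conditions.

def run_to_steady_state(x, inflow=2.0, k=0.5, dt=0.01, steps=5000):
    for _ in range(steps):
        x = x + dt * (inflow - k * x)
    return round(x, 3)

print(run_to_steady_state(0.0))    # both trajectories end at x = 4.0
print(run_to_steady_state(10.0))
```

The end state depends on the flow parameters, not on the starting point - the path-independence the text describes.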
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback describes a situation in which the influence of an element impacts other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
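A minimal sketch of the two regimes (the gain values are illustrative, not from the cited texts) shows how the sign of the feedback decides between stabilization and runaway growth:

```python
# Toy illustration: the same first-order update rule
# deviation_next = deviation + gain * deviation shows both feedback regimes.
# A negative gain pulls deviations from the set point back toward zero;
# a positive gain amplifies them without bound.

def iterate(deviation, gain, steps=50):
    for _ in range(steps):
        deviation = deviation + gain * deviation
    return deviation

print(iterate(1.0, -0.1))  # negative feedback: deviation shrinks toward 0
print(iterate(1.0, 0.1))   # positive feedback: deviation keeps growing
```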
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
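Ashby&#039;s argument can be sketched as a counting exercise (the function name and the numbers are illustrative): if the outcome depends on both a disturbance and the regulator&#039;s response, a regulator with R responses can at best collapse D disturbances into ceil(D / R) distinct outcomes, so holding the outcome constant demands at least as many responses as disturbances.

```python
# Counting sketch after Ashby: each disturbance d must be met by a response r,
# and the outcome depends on both. With R responses an optimal regulator can
# at best collapse D disturbances into ceil(D / R) distinct outcomes, so a
# single guaranteed outcome requires response variety at least equal to
# disturbance variety.

import math

def best_case_outcomes(num_disturbances, num_responses):
    """Lower bound on the number of distinct outcomes an optimal regulator allows."""
    return math.ceil(num_disturbances / num_responses)

print(best_case_outcomes(9, 3))  # only 3 responses: at best 3 outcomes remain
print(best_case_outcomes(9, 9))  # requisite variety met: a single outcome
```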
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed given speed, its energy supply is reduced. The desired output is attained by self-regulation, the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
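A rough simulation of the governor&#039;s self-regulation as proportional feedback (set point, gain and step sizes are invented for illustration, not taken from Beer):

```python
# Sketch of the Watt governor as proportional feedback: as speed exceeds the
# set point the feedback term turns negative, throttling the engine, and as
# speed falls below it the term turns positive. The "valve" here is a signed
# feedback signal rather than a physical valve opening; parameters are illustrative.

def governor(speed, set_point=100.0, gain=0.5, steps=200, dt=0.1):
    for _ in range(steps):
        valve = gain * (set_point - speed)  # arms rise and the valve closes as speed grows
        speed = speed + dt * valve          # admitted power adjusts the speed
    return speed

print(governor(60.0))   # starting below the set point
print(governor(140.0))  # starting above the set point
```

Either way the output settles near the set point: the input to the machine is adjusted by the output itself.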
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one recursion equals the variety disposed by the metasystem at the next recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
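The recursive structure described above can be sketched as a nested data structure (the names and helper functions are illustrative, not part of Beer&#039;s notation): each viable system carries the same five functions and contains further viable systems as its operational units.

```python
# Sketch of VSM recursion as a nested data structure (names are illustrative):
# every viable system carries the same five management functions and contains
# further viable systems as its System 1 operational units.

FUNCTIONS = ("S1 operations", "S2 coordination", "S3 control",
             "S4 intelligence", "S5 policy")

def viable_system(name, units=()):
    return {"name": name, "functions": FUNCTIONS, "units": list(units)}

# The industrial example from the text: works inside a sector inside an industry.
works = viable_system("steel works")
sector = viable_system("iron and steel sector", [works])
industry = viable_system("heavy industry", [sector])

def levels(system):
    """Depth of the recursion: the same pattern repeats at every level."""
    depth = 1
    for unit in system["units"]:
        depth = max(depth, 1 + levels(unit))
    return depth

print(levels(industry))  # 3 nested recursion levels
```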
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks obsolete products.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29239</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29239"/>
		<updated>2025-12-27T11:29:05Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: changed typos&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structure, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems, we need to dive deeper and analyze these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive Systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must adapt to changing conditions over time; in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The Core Definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously.&lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for each i the rate of change of a variable depends not only on that variable itself but also on the others. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system but with the special feature of possessing internal mechanisms to change its behavior, based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
At the heart of cybernetic systems lies the feedback mechanism. Feedback occurs where the influence of an element impacts other elements and, through a series of relationships, the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse this. Positive feedback, on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
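The two feedback modes can be sketched in a few lines (an illustrative toy with made-up numbers, not from the cited sources): a corrective term proportional to the measured error models negative feedback, while a self-amplifying term models positive feedback.

```python
# Negative feedback: the corrective action opposes the measured deviation,
# so the state converges on the target state.
state, target, gain = 0.0, 10.0, 0.2
for _ in range(60):
    error = target - state      # difference between target and current state
    state += gain * error       # corrective action reduces the difference
print(round(state, 3))          # settles at the target, 10.0

# Positive feedback: the deviation itself is amplified each step,
# producing uncontained growth (a factor below 1 would give contraction).
deviation = 1.0
for _ in range(10):
    deviation *= 1.5
print(round(deviation, 2))      # grew from 1.0 to about 57.67
```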
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
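Beer's thermostat example can be sketched as on/off control between two limits (the temperature values and heating rates below are invented for illustration):

```python
# A thermostat as a homeostat: switch heating on below a lower limit and
# off above an upper limit, holding temperature between desired limits.
def thermostat(temp, heating, low=19.0, high=21.0):
    if temp > high:
        return False            # too warm: switch heating off
    if low > temp:
        return True             # too cold: switch heating on
    return heating              # inside the band: keep the current state

temp, heating, history = 15.0, False, []
for _ in range(200):
    heating = thermostat(temp, heating)
    temp += 0.3 if heating else -0.2   # heat input vs. heat loss per step
    history.append(temp)

# After the initial transient the temperature cycles inside the band
# (with at most one step of overshoot on either side).
print(round(min(history[-100:]), 1), round(max(history[-100:]), 1))
```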
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
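Ashby's law can be made concrete with a small model (illustrative, not from the cited sources): the outcome depends jointly on the disturbance and the regulatory response, and regulation succeeds only if every disturbance has a neutralizing response.

```python
# Requisite variety sketch: the outcome depends on disturbance and response;
# outcome 0 is the desired (regulated) state.
def outcome(disturbance, response, n=4):
    return (disturbance - response) % n

def under_control(n_disturbances, responses):
    # Regulation succeeds only if every disturbance can be neutralized.
    return all(any(outcome(d, r) == 0 for r in responses)
               for d in range(n_disturbances))

print(under_control(4, range(4)))  # True: controller variety matches (4 vs 4)
print(under_control(4, range(2)))  # False: controller variety too low (2 vs 4)
```

With only two responses against four distinguishable disturbances, some disturbances cannot be mapped to the desired outcome, exactly as the law predicts.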
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, the valve closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
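The governor's self-regulation can be sketched numerically (all constants below are invented for illustration): speed raises the arms, the arms close the valve, and the reduced power pulls the speed back.

```python
# Watt governor sketch: engine speed raises the arms, the arms close the
# valve, and the reduced power admitted to the engine lowers the speed again.
def governor_speed(steps=4000, dt=0.01):
    speed = 0.0
    for _ in range(steps):
        arm_height = min(1.0, 0.01 * speed)  # arms rise with speed (capped)
        valve = 1.0 - arm_height             # valve closes as the arms rise
        power = 100.0 * valve                # power admitted to the engine
        drag = 0.5 * speed                   # load/friction on the engine
        speed += dt * (power - drag)
    return speed

# The speed self-regulates to the point where power equals drag:
# 100*(1 - 0.01*s) = 0.5*s  gives  s = 100/1.5, approx. 66.7
print(round(governor_speed(), 1))  # approx. 66.7
```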
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
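The recursive structure can be expressed directly as a nested data type (the class, method and example hierarchy below are invented for illustration): every viable system carries its own metasystem, and its System 1 units are themselves complete viable systems.

```python
# Recursion sketch: viable systems contain viable systems, like Russian dolls.
from dataclasses import dataclass, field

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5: the metasystem present at every recursion level.
    metasystem: tuple = ("S2 coordination", "S3 control",
                         "S4 intelligence", "S5 policy")
    system_one: list = field(default_factory=list)  # operational units

    def recursion_depth(self):
        # Each System 1 unit is itself a complete viable system.
        return 1 + max((u.recursion_depth() for u in self.system_one), default=0)

corp = ViableSystem("corporation", system_one=[
    ViableSystem("division A", system_one=[ViableSystem("department A1")]),
    ViableSystem("division B"),
])
print(corp.recursion_depth())  # 3 levels: corporation, division, department
```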
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models of both the system itself and its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks leaving the organization with obsolete products.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
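The distinction between first- and second-order learning can be sketched as follows (an illustrative toy, not from Beer): first-order learning tunes a parameter inside a fixed model, and when that framework cannot fit the environment, second-order learning replaces the model itself.

```python
# First-order learning: adjust a parameter within a fixed model (framework).
# Second-order learning: change the framework when tuning inside it stalls.
def fit(model, param, data, lr=0.05, steps=500, eps=1e-6):
    loss = lambda a: sum((model(x, a) - y) ** 2 for x, y in data)
    for _ in range(steps):
        grad = (loss(param + eps) - loss(param - eps)) / (2 * eps)
        param -= lr * grad                 # gradient descent on squared error
    return param

def error(model, a, data):
    return sum((model(x, a) - y) ** 2 for x, y in data)

linear = lambda x, a: a * x
quadratic = lambda x, a: a * x * x
data = [(x, 3 * x * x) for x in (0.5, 1.0, 1.5)]   # environment is quadratic

a_lin = fit(linear, 1.0, data)                     # first-order: tune the parameter
if error(linear, a_lin, data) > 1e-3:              # framework cannot fit the data...
    a_quad = fit(quadratic, 1.0, data)             # ...second-order: switch the model
    print(round(a_quad, 2))  # recovers the environment's coefficient, 3.0
```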
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29237</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29237"/>
		<updated>2025-12-27T10:51:58Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated; they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This implies that all living organisms on earth must adjust as their conditions change over time; in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which works by processing external inputs and producing an output. The central element of dynamic systems is the interaction between their elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where, for every i, the rate of change depends not only on the equation&#039;s own variable but on the others as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
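The idea of coupled equations can be sketched numerically; the Lotka-Volterra predator-prey system below is a standard textbook example (it is not the equation shown in Figure 1, and its coefficients are chosen arbitrarily) in which the rate of change of each variable depends on both variables at once.

```python
# Coupled differential equations: neither dx/dt nor dy/dt can be computed
# from its own variable alone - the elements genuinely interact.
def derivatives(x, y, a=1.0, b=0.5, c=0.5, d=0.2):
    dx = a * x - b * x * y   # prey grows, but is consumed by predators
    dy = d * x * y - c * y   # predators grow on prey, decline without it
    return dx, dy

x, y, dt = 4.0, 2.0, 0.001
for _ in range(10000):       # integrate 10 time units with Euler steps
    dx, dy = derivatives(x, y)
    x, y = x + dt * dx, y + dt * dy

# Both populations have moved, driven entirely by their mutual coupling.
print(round(x, 2), round(y, 2))
```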
&lt;br /&gt;
Now, one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and then a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
At the heart of cybernetic systems lies the feedback mechanism. Feedback occurs where the influence of an element impacts other elements and, through a series of relationships, the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters. If a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse this. Positive feedback, on the other hand, either drives contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, the valve closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. This extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems where parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion, for every recursive pair. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models of both the system itself and its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks leaving the organization with obsolete products.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29187</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29187"/>
		<updated>2025-12-25T17:43:12Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, so the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This implies that living organisms on Earth must continuously adapt to changing conditions over time; in other words, they must be adaptive.&lt;br /&gt;
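The entropy bookkeeping described above can be sketched numerically. The function below is a toy illustration with invented numbers, not a physical model; it only shows that an open system can lower its own entropy while the total entropy still rises.

```python
# Toy entropy bookkeeping for an open system (illustrative numbers only).
def entropy_balance(dS_internal, dS_exported):
    """Return (change in system entropy, change in total entropy).

    dS_internal: entropy produced inside the system (non-negative,
                 per the second law)
    dS_exported: entropy carried out to the environment as heat and waste
    """
    dS_system = dS_internal - dS_exported         # the system's local books
    dS_environment = dS_exported                  # dumped on the surroundings
    return dS_system, dS_system + dS_environment  # total = internal production

# Example: a cell produces 2 units of entropy internally but exports 5.
dS_sys, dS_total = entropy_balance(2.0, 5.0)
print(dS_sys, dS_total)  # -3.0 2.0: local order increases, total still rises
```

The total change always equals the internal production, so the second law is respected no matter how much the system exports.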
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for every i the rate of change depends not only on the system&#039;s own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
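The coupled form in Figure 1 can be made concrete with a short numerical sketch. The two-variable system below is an invented example (not Bertalanffy's own), integrated with a simple Euler step; the point is only that each rate of change depends on both variables at once.

```python
# Euler integration of a coupled system in Bertalanffy's general form
#   dx_i/dt = f_i(x_1, ..., x_n),
# where each rate depends on several variables, not just its own.
def simulate(f, x0, dt=0.01, steps=1000):
    x = list(x0)
    for _ in range(steps):
        rates = f(x)
        x = [xi + dt * ri for xi, ri in zip(x, rates)]
    return x

def coupled(x):
    x1, x2 = x
    return [-x1 + 0.5 * x2,   # the rate of x1 depends on x2 ...
            -x2 + 0.5 * x1]   # ... and the rate of x2 depends on x1

state = simulate(coupled, [1.0, 0.0])
# Both variables relax toward zero together; dropping either coupling
# term changes both trajectories, so the parts cannot be studied alone.
```

Removing the 0.5 coupling terms turns the system into a mere aggregation of two independent decays, illustrating the distinction drawn above.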
&lt;br /&gt;
Now one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential  steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
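Equifinality can be illustrated with a minimal sketch. The open system below, dx/dt = a - b*x, is a standard textbook form (the parameter values are arbitrary): it reaches the steady state a/b from any starting point, whereas a closed system's endpoint is fixed by its initial state.

```python
# Equifinality sketch: an open system dx/dt = a - b*x approaches the same
# steady state a/b regardless of its initial condition.
def final_state(x0, a=2.0, b=0.5, dt=0.01, steps=5000):
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x)   # inflow a, outflow proportional to x
    return x

ends = [final_state(x0) for x0 in (0.0, 10.0, -3.0)]
# Every trajectory ends at a/b = 4.0, whatever the starting point.
print([round(e, 3) for e in ends])  # [4.0, 4.0, 4.0]
```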
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback describes a situation in which the influence of an element impacts other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters: if a system moves out of its steady state, either control action is taken or natural feedback occurs to reverse this. Positive feedback, on the other hand, either helps to achieve contained contraction, replication and growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
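The thermostat described above can be sketched in a few lines. This is a toy model with invented heating and cooling rates, not a real controller: it shows bang-bang negative feedback holding a critical variable between desired limits.

```python
# A thermostat as a homeostat: negative feedback switches the heater so
# that temperature stays between desired limits (toy rates, not physics).
def thermostat(temp, heater_on, target=20.0, band=0.5):
    if target - band > temp:      # too cold: switch the heater on
        return True
    if temp > target + band:      # too warm: switch the heater off
        return False
    return heater_on              # inside the band: leave it alone

def run(temp=15.0, steps=200):
    heater, history = False, []
    for _ in range(steps):
        heater = thermostat(temp, heater)
        temp += 0.3 if heater else -0.1   # heating vs. ambient cooling
        history.append(temp)
    return history

h = run()
# After the initial warm-up the temperature oscillates inside the band,
# the signature behaviour of a homeostat.
```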
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or the environment to be dealt with. This must be achieved if the system is to have a guarantee of remaining under control. This principle applies universally to both biological and technical systems. Variety can be used as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system. A control system must have adequate variety to guarantee effective regulation. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
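Ashby's law can be illustrated as a counting argument. The setup below is invented for illustration: disturbances and responses are plain integers, and an outcome of zero means a disturbance has been fully neutralized.

```python
# Law of Requisite Variety as a counting argument: a regulator can hold
# the outcome constant only if it commands at least as many distinct
# responses as there are distinct disturbances.
def neutralized(disturbances, responses, outcome):
    """Count disturbances for which some response yields outcome 0."""
    return sum(
        1 for d in disturbances
        if any(outcome(d, r) == 0 for r in responses)
    )

outcome = lambda d, r: d - r          # zero means fully cancelled
disturbances = [0, 1, 2, 3]           # environmental variety: 4 states

full = neutralized(disturbances, [0, 1, 2, 3], outcome)  # regulator variety 4
poor = neutralized(disturbances, [0, 2], outcome)        # regulator variety 2
print(full, poor)  # 4 2: the low-variety regulator loses control
```

With only two responses against four disturbances, half the disturbances slip through: only variety can absorb variety.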
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is cut back. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures where similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether examining a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher level coherence. &lt;br /&gt;
&lt;br /&gt;
The law of cohesion states that, for every recursive pair, the System One variety accessible to System Three at one level of recursion equals the variety disposed by the metasystem at the next level of recursion. This ensures appropriate variety management at each hierarchical level without overwhelming higher management with operational detail. Industrial applications demonstrate this clearly: heavy industry contains sectors like iron and steel, which contain individual steel works, which contain production processes. Each level maintains viability through its own management structure while contributing to higher-level functions. Emergence occurs as autonomous subsystems interact, generating collective behaviors that transcend individual capabilities and cannot be predicted from knowledge of lower-level components alone. The management metasystem (Systems 2-5) emerges to coordinate operations. While each operational unit pursues its activities, higher-level functions for coordination, control, strategy, and policy arise that cannot be reduced to individual unit operations but represent genuinely emergent organizational capabilities. This hierarchical organization through recursion thus provides a powerful mechanism for managing complexity while preserving autonomy at each level, enabling both local adaptation and global coherence in viable systems.&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
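The recursive structure described above can be sketched as a nested data type. The class and field names below are illustrative, not Beer's notation; the point is only that every unit, at every level, carries the same five-system pattern.

```python
# Recursion in the Viable System Model as a nested data structure: each
# System 1 operational unit is itself a complete viable system at the
# next level of recursion (illustrative names, not Beer's notation).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5 form the metasystem: coordination, control,
    # intelligence and policy.
    metasystem: Tuple[str, ...] = ("S2", "S3", "S4", "S5")
    # System 1: operational units, themselves viable systems.
    operations: List["ViableSystem"] = field(default_factory=list)

    def levels(self):
        """Number of recursion levels this system spans."""
        if not self.operations:
            return 1
        return 1 + max(unit.levels() for unit in self.operations)

corp = ViableSystem("corporation", operations=[
    ViableSystem("division", operations=[ViableSystem("department")]),
])
print(corp.levels())  # 3: the same pattern repeats at every level
```

Corporation, division and department each carry a full metasystem, mirroring the Russian-doll nesting described in the text.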
&lt;br /&gt;
=== Learning and Adaptation Mechanisms ===&lt;br /&gt;
Learning and adaptation occur through two mechanisms: regulation (maintaining a fixed transformation) and adaptation (changing the transformation). Every system either regulates or adapts; otherwise it loses its identity.&lt;br /&gt;
&lt;br /&gt;
System 4 embodies the intelligence function for organizational learning. It manages two tasks: switching information between System 5 and the lower levels, and capturing all relevant environmental information. This enables feedforward control by integrating internal and external information, allowing anticipation rather than mere reaction.&lt;br /&gt;
&lt;br /&gt;
Inside-and-now versus outside-and-then: Systems 1-3 manage daily operations, while System 4 manages the future across different environmental recursion levels. System 4 must maintain models both of the system itself and of its embedded position in larger environments; this enables self-awareness.&lt;br /&gt;
&lt;br /&gt;
The Three-Four Homeostat balances internal stability against external adaptation through continuous variety management. The Second Axiom of Management states that the variety disposed by System 3 equals the variety disposed by System 4. Resources must be homeostatically balanced: too much in System 4 endangers daily operations; too much in System 3 risks the organization continuing to make obsolete products.&lt;br /&gt;
&lt;br /&gt;
Feedback loops connect action to consequences. Negative feedback stabilizes by reversing deviations; positive feedback produces growth or contraction, potentially leading to structural change. Requisite variety is fundamental: controllers need at least as much variety as the systems and environments they control.&lt;br /&gt;
&lt;br /&gt;
Adaptation operates on multiple timescales and encompasses first-order learning (improving within frameworks) and second-order learning (changing frameworks themselves). &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Diagnosing the System&amp;quot; 1985&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
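The distinction between first- and second-order learning can be sketched in code. Everything below is an invented toy: first-order learning tunes a parameter inside a fixed model by hill-climbing, and second-order learning switches the model (the framework) itself when tuning cannot reduce the residual error.

```python
# First-order learning: improve a parameter w within a fixed framework.
def fit(predict, data, steps=200, step=0.05):
    w, best = 0.0, float("inf")
    for _ in range(steps):
        for cand in (w - step, w, w + step):   # try a small move either way
            err = sum((y - predict(x, cand)) ** 2 for x, y in data)
            if best > err:
                best, w = err, cand
    return w, best

data = [(x, 2 * x) for x in range(5)]        # ground truth: y = 2x
frameworks = {
    "constant": lambda x, w: w,              # framework 1: y is constant
    "linear":   lambda x, w: w * x,          # framework 2: y grows with x
}

# Second-order learning: after tuning each framework, adopt the one whose
# residual error is smallest - changing the framework, not just w.
errors = {name: fit(f, data)[1] for name, f in frameworks.items()}
best_model = min(errors, key=errors.get)
print(best_model)  # linear
```

No amount of first-order tuning rescues the constant framework here; only the second-order switch removes the error.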
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29186</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29186"/>
		<updated>2025-12-25T16:10:33Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the [[IESC:SYSTEM (Viable)|Viable System Model (VSM)]], first formulated by [[Stafford Beer]] in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze these different topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, so the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. This implies that living organisms on Earth must continuously adapt to changing conditions over time; in other words, they must be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system which works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what systems theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for every i the rate of change depends not only on the system&#039;s own variable but on the other variables as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential  steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Cybernetic Principles: Adaptive Systems beyond Biology ==&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback describes a situation in which the influence of an element impacts other elements and, through a series of relationships, the effect of its initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters: if the system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, amplifies change - it can drive contained replication and growth, or lead to uncontained, unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
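The error-correcting loop just described can be sketched in a few lines of Python (a hypothetical illustration, not code from the cited sources): a controller repeatedly measures the gap between the current state and a target state and applies a proportional correction, which is negative feedback in its simplest form.

```python
# Minimal negative-feedback loop: a proportional controller nudges the
# system state toward a target by correcting a fraction of the error.
def regulate(state, target, gain=0.5, steps=20):
    for _ in range(steps):
        error = target - state        # difference between goal and current state
        state = state + gain * error  # corrective action shrinks the error
    return state

# Starting far from the target, the state converges toward it.
final = regulate(state=0.0, target=10.0)
print(round(final, 4))  # prints 10.0
```

With a gain between 0 and 2 the error shrinks geometrically at each step; a gain above 2 would turn the same loop into an unstable, positive-feedback-like runaway.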
&lt;br /&gt;
=== Homeostasis in Technical Systems ===&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
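The thermostat Beer describes can be made concrete with a toy model (an illustrative sketch; the limits and rates are invented): the device holds temperature between a lower and an upper limit by switching heating on and off.

```python
# Toy homeostat: a thermostat holds temperature between desired limits
# by switching a heater on at the lower limit and off at the upper one.
def thermostat_step(temp, heating, low=19.0, high=21.0,
                    heat_rate=0.5, loss_rate=0.25):
    if heating and temp >= high:       # upper limit reached: switch heater off
        heating = False
    if (not heating) and low >= temp:  # lower limit reached: switch heater on
        heating = True
    temp = temp + (heat_rate if heating else -loss_rate)
    return temp, heating

temp, heating = 15.0, True
history = []
for _ in range(100):
    temp, heating = thermostat_step(temp, heating)
    history.append(temp)
# After the initial warm-up, temperature cycles inside the desired band.
print(min(history[40:]), max(history[40:]))  # prints 19.0 21.0
```

The critical variable (temperature) is never fixed at a point; it oscillates within limits, which is exactly the behavior of Beer's homeostat.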
&lt;br /&gt;
=== The Law of Requisite Variety ===&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or of the environment to be dealt with; only then is the system guaranteed to remain under control. This principle applies universally to both biological and technical systems. Variety serves as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system, and a control system must command adequate variety to regulate effectively. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
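The law can be illustrated with a back-of-the-envelope calculation (a hypothetical sketch, not from the cited sources): with D equally likely disturbances and R distinct regulatory responses, the set of possible outcomes cannot be compressed below D divided by R.

```python
import math

# Ashby's law in numbers: a regulator commanding R distinct responses,
# facing D distinguishable disturbances, cannot reduce the number of
# distinguishable outcomes below ceil(D / R).
def min_outcome_variety(disturbances, responses):
    return math.ceil(disturbances / responses)

# A controller whose variety matches the environment's can, in
# principle, hold the outcome to a single state (full regulation).
print(min_outcome_variety(8, 8))   # prints 1
# With fewer responses than disturbances, several outcomes remain
# uncontrolled: only variety can absorb variety.
print(min_outcome_variety(8, 2))   # prints 4
```

In logarithmic measure this is Ashby's inequality: the variety of outcomes is at least the variety of disturbances minus the variety of the regulator.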
&lt;br /&gt;
=== Technical Implementation of Cybernetic Principles ===&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
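The governor paragraph above translates directly into a toy simulation (hypothetical numbers throughout): the valve opening falls as speed rises past the set point, cutting the admitted power until the speed settles.

```python
# Toy Watt governor: valve opening falls as speed exceeds the set point,
# so admitted power drops and the speed self-regulates.
def simulate_governor(set_speed=100.0, load=5.0, gain=0.1,
                      speed=0.0, steps=400):
    for _ in range(steps):
        # Arms rise with speed; the valve closes in proportion (clamped 0..1).
        valve = min(1.0, max(0.0, 1.0 - gain * (speed - set_speed)))
        power = 10.0 * valve                   # admitted power, arbitrary units
        speed = speed + 0.1 * (power - load)   # net power changes the speed
    return speed

print(round(simulate_governor(), 1))  # prints 105.0
```

Note the small steady-state offset above the set speed: a purely proportional regulator of this kind exhibits droop under load, a behavior real Watt governors also showed.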
&lt;br /&gt;
=== Hierarchical Organization and Emergence ===&lt;br /&gt;
Hierarchical organization represents a fundamental principle through which adaptive systems manage complexity and maintain viability. It extends beyond traditional organizational charts, embodying recursive structures in which similar patterns repeat at different scales. Recursion describes systems whose parts exhibit the same organizational principles as the whole: viable systems contain viable systems and are embedded in larger viable systems, creating nested structures like Russian dolls. The same organizational principles apply at all levels regardless of scale, from molecules to cells to organisms to ecosystems. &amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;&amp;lt;ref&amp;gt;J. Walker, &amp;quot;The Viable Systems Model&amp;quot; 1998&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The [[IESC:SYSTEM (Viable)|Viable System Model]] explicitly incorporates recursion as a management tool that enables organizations to cope with both vertical and horizontal interdependencies. Each recursion level exhibits the same five functional systems (Systems 1 - 5), whether one examines a multinational corporation, its divisions, companies or departments. System 1 operational units are themselves viable systems at the next lower recursion, meaning each division or department must possess complete management functions to maintain autonomy while contributing to higher-level coherence.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29185</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29185"/>
		<updated>2025-12-25T15:24:35Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. To understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with isolated systems - cut off from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on earth must continually adjust to their conditions as these change over time: they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that processes external inputs and produces an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, in which, for every i, the rate of change of the i-th variable depends not only on that variable itself but on the others as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
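The coupled-equation idea behind Figure 1 can be illustrated with a two-variable toy system (an invented predator-prey example in the spirit of Bertalanffy, not the exact equation in the figure), integrated with simple Euler steps:

```python
# Two coupled equations in Bertalanffy's sense: the rate of change of
# each variable depends on both variables, so neither part can be
# understood in isolation.
def step(prey, pred, dt=0.001):
    d_prey = 1.0 * prey - 0.1 * prey * pred    # prey growth minus predation
    d_pred = 0.05 * prey * pred - 0.5 * pred   # predator growth minus death
    return prey + dt * d_prey, pred + dt * d_pred

prey, pred = 20.0, 5.0
for _ in range(10000):
    prey, pred = step(prey, pred)
# The populations oscillate around an equilibrium; deleting the coupling
# terms would decompose the system into independent, non-systemic parts.
print(round(prey, 2), round(pred, 2))
```

The coupling terms (the products of prey and pred) are exactly what makes this a system rather than an aggregation: each equation contains the other variable.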
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
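Equifinality can be demonstrated with a minimal model (a hypothetical sketch, not from the cited sources): logistic growth toward a fixed carrying capacity reaches the same steady state from very different initial states.

```python
# Equifinality: an open-system growth model settles into the same
# steady state (the carrying capacity) regardless of where it starts.
def settle(x, rate=0.5, capacity=100.0, steps=200, dt=0.1):
    for _ in range(steps):
        x = x + dt * rate * x * (1.0 - x / capacity)  # logistic growth step
    return x

# Three very different initial states, one final state - impossible for
# a closed system whose trajectory is fixed by its initial conditions.
print(round(settle(5.0), 2), round(settle(60.0), 2), round(settle(150.0), 2))
```

All three runs converge to the carrying capacity of 100, illustrating path-independence of the final configuration.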
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity to persist through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Cybernetic Principles: Adaptive Systems beyond Biology ===&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of a cybernetic system is the feedback mechanism. Feedback arises when the influence of an element acts on other elements and, through a chain of relationships, the effect of that initial influence feeds back on the element itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps to achieve defined objectives as set in control parameters: if the system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, amplifies change - it can drive contained replication and growth, or lead to uncontained, unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Homeostasis in Technical Systems ====&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== The Law of Requisite Variety ====&lt;br /&gt;
A fundamental cybernetic principle is Ashby&#039;s [[IESC:VARIETY (Requisite) (Law of)|Law of Requisite Variety]]: the variety of the controller must be greater than, or equal to, the variety of the system to be controlled, or of the environment to be dealt with; only then is the system guaranteed to remain under control. This principle applies universally to both biological and technical systems. Variety serves as a measure of the number of possible distinguishable states of a system, an environment, or the control element of a system, and a control system must command adequate variety to regulate effectively. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Technical Implementation of Cybernetic Principles ====&lt;br /&gt;
Engineering systems routinely implement cybernetic principles. The Watt governor, regarded as the first deliberately contrived feedback mechanism, demonstrates the elegant simplicity of feedback. As the engine speed increases, weighted arms rise by centrifugal force and operate a valve which admits power to the engine, closing in proportion as the arms rise and the speed grows. This creates a homeostat: the more the machine tends to exceed the given speed, the more its energy supply is reduced. The desired output is attained by self-regulation; the input to the machine is adjusted by the output itself.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29184</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29184"/>
		<updated>2025-12-25T14:06:41Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to the understanding of [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly find that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. To understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with isolated systems - cut off from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases, the second law is never violated, but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on earth must continually adjust to their conditions as these change over time: they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that processes external inputs and produces an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, in which, for every i, the rate of change of the i-th variable depends not only on that variable itself but on the others as well. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity to persist through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium.&lt;br /&gt;
&lt;br /&gt;
Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed. &amp;lt;ref&amp;gt;N. Wiener &amp;quot;Cybernetics&amp;quot;, 1948&amp;lt;/ref&amp;gt;&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While biological systems provide the clearest examples of adaptive, irreversible processes, cybernetic theory recognizes that these principles extend far beyond organic life. Beer&#039;s work in management cybernetics demonstrated that industrial organizations, economic systems, and social institutions all exhibit the same fundamental characteristics: they are composed of numerous coupled elements, operate through nonlinear interactions, and undergo irreversible transformations as they adapt to their environments. Stafford Beer explicitly argued that &#039;viable systems&#039;—whether biological organisms, factories, or economies—share identical structural properties in their need to maintain homeostasis while responding to environmental perturbations. This universality of cybernetic principles across different domains established the theoretical foundation for applying biological models of regulation to non-biological complex systems. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Cybernetic Principles: Adaptive Systems beyond Biology ===&lt;br /&gt;
Adaptive systems are by no means exclusively of biological origin. Cybernetics, as the science of communication and control in living organisms and machines, demonstrates that the fundamental principles of self-regulation are universally applicable.&lt;br /&gt;
&lt;br /&gt;
The heart of cybernetic systems is the feedback mechanism. Feedback describes how the influence of an element impacts other elements and, through a series of relationships, the effect of its initial influence feeds back on itself. The system continuously measures the difference between its current state and the target state and corrects its actions accordingly. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Negative feedback helps a system achieve defined objectives as set in its control parameters: if the system moves out of its steady state, either control action is taken or natural feedback occurs to reverse the deviation. Positive feedback, on the other hand, drives contained contraction, replication or growth, or leads to uncontained and unstable contraction or growth. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
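The two feedback types can be sketched in a few lines of code (a hypothetical toy model, not from the cited sources): negative feedback shrinks the gap to a target state at every step, while positive feedback amplifies the current state.&lt;br /&gt;

```python
# Hypothetical illustration: negative feedback corrects toward a target,
# positive feedback amplifies whatever state the system is in.

def negative_feedback(state, target, gain=0.5, steps=20):
    for _ in range(steps):
        state = state + gain * (target - state)  # reduce the error each step
    return state

def positive_feedback(state, rate=0.5, steps=20):
    for _ in range(steps):
        state = state * (1.0 + rate)  # deviation reinforces itself
    return state

print(negative_feedback(10.0, 20.0))  # settles near the target 20.0
print(positive_feedback(10.0))        # grows without bound
```

The gain and rate values are arbitrary; the qualitative behaviour (convergence versus runaway growth) is the point.&lt;br /&gt;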
&lt;br /&gt;
==== Homeostasis in Technical Systems ====&lt;br /&gt;
The principle of homeostasis - the ability of a system to maintain stability - extends far beyond biological organisms. Beer describes homeostasis as a critical variable held at a desirable level by a self-regulatory mechanism. A thermostat is a machine for holding temperature between desired limits, while a homeostat is a control device for holding some variable between desired limits. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
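As a sketch, the thermostat's homeostatic rule can be written as a simple on/off controller with two limits (a toy model; the limits and heating/cooling rates are invented for illustration):&lt;br /&gt;

```python
# Toy thermostat: holds temperature between a lower and an upper limit
# by switching a heater on and off (illustrative values only).

class Thermostat:
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.heating = False

    def step(self, temp):
        if self.low > temp:      # too cold: switch the heater on
            self.heating = True
        elif temp > self.high:   # too warm: switch the heater off
            self.heating = False
        return self.heating

temp, stat = 15.0, Thermostat(18.0, 22.0)
for _ in range(200):
    temp += 0.5 if stat.step(temp) else -0.3
# after a short transient, temp oscillates in a narrow band around 18-22
```

The critical variable never settles to a single value; it is held between desired limits by the self-regulatory switching, which is exactly Beer's description of a homeostat.&lt;br /&gt;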
&lt;br /&gt;
While the classic biological example is blood regulation, the same mechanism can be implemented in technical systems. The homeostasis of animal populations demonstrates this principle in ecological systems: caterpillars feed birds, which controls the caterpillar population, while caterpillars eat vegetation, which controls vegetation growth. &amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&lt;br /&gt;
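The caterpillar-bird loop is a population homeostat. A classic way to sketch such mutual regulation is a Lotka-Volterra predator-prey model (used here as an illustrative stand-in; the equations and parameter values are not taken from Beer):&lt;br /&gt;

```python
# Illustrative Lotka-Volterra sketch: prey (caterpillars) and predators
# (birds) regulate each other; all parameters are arbitrary toy values.

def simulate(prey=10.0, pred=5.0, dt=0.001, steps=10000):
    a, b, c, d = 1.0, 0.1, 1.5, 0.075  # growth, predation, death, conversion
    history = []
    for _ in range(steps):
        dprey = (a - b * pred) * prey   # prey grow; predators eat them
        dpred = (d * prey - c) * pred   # predators grow by eating prey
        prey, pred = prey + dprey * dt, pred + dpred * dt
        history.append((prey, pred))
    return history

h = simulate()
# both populations stay positive and oscillate rather than settling
```

Neither population is held at a fixed value; each is kept within bounds by the other, mirroring the ecological homeostasis described above.&lt;br /&gt;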
&lt;br /&gt;
==== The Law of Requisite Variety ====&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29181</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=29181"/>
		<updated>2025-12-25T13:12:07Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. Adaptive systems exhibit self-organizing properties that enable them to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases - the second law is never violated - but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must adapt to conditions that change over time; they need to be adaptive.&lt;br /&gt;
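The entropy bookkeeping can be made explicit with a toy calculation (the numbers are invented for illustration): the system's local entropy falls only because it exports even more entropy to the environment.&lt;br /&gt;

```python
# Toy entropy bookkeeping for an open system (illustrative numbers).
dS_system = -2.0       # local order increases: system entropy falls
dS_environment = 5.0   # exported heat and waste raise environmental entropy

dS_total = dS_system + dS_environment
assert dS_total >= 0.0  # the second law holds for system plus environment
print(dS_total)
```

The local decrease is paid for by a larger increase outside, so the total never goes down.&lt;br /&gt;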
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for each i the rate of change depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
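A coupled system of this kind can be sketched directly in code (a hypothetical two-variable example in the spirit of Bertalanffy's equations, not taken from the book): each rate of change is a function of both state variables at once.&lt;br /&gt;

```python
# Hypothetical coupled system dQ1/dt = f1(Q1, Q2), dQ2/dt = f2(Q1, Q2):
# neither variable can be understood in isolation.

def rates(Q1, Q2):
    dQ1 = -1.0 * Q1 + 0.5 * Q2   # Q1 is driven by Q2 as well as by itself
    dQ2 = 0.5 * Q1 - 1.0 * Q2    # Q2 is driven by Q1 as well as by itself
    return dQ1, dQ2

Q1, Q2, dt = 1.0, 0.0, 0.01
for _ in range(2000):
    d1, d2 = rates(Q1, Q2)
    Q1, Q2 = Q1 + d1 * dt, Q2 + d2 * dt
# the coupling first pumps part of Q1 into Q2, then both decay together
```

Setting the off-diagonal 0.5 terms to zero would turn this into a mere aggregation of two independent decays; the coupling is what makes it a system.&lt;br /&gt;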
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;: a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
The fundamental distinction between static and adaptive systems lies in their response to environmental change and internal dynamics. A static system maintains a fixed structure and behavior, showing no significant alteration in state or organizational properties over time. Such systems exist in closed equilibrium, with predetermined responses to inputs and no capacity for learning or structural modification. Examples include simple mechanical devices, closed chemical systems at equilibrium, or rigid organizational structures that follow fixed protocols without adjustment. &amp;lt;ref&amp;gt;G. J. Klir, &amp;quot;Trends in General Systems Theory&amp;quot; 1972&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In contrast, adaptive systems possess internal mechanisms that enable them to modify their behavior and structure based on feedback from their environment and previous experiences. This capacity is not limited to biological organisms - it extends to organizations, ecosystems and technical systems. An adaptive organization, for instance, continuously adjusts its operational structures, resource allocation and strategic responses while maintaining its essential identity. Similarly, an ecosystem modifies species composition and energy flows in response to environmental pressures while preserving system stability. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The mathematical distinction is expressed through differential equations: static systems show time-independent parameters and converge to fixed equilibria, while adaptive systems exhibit time-varying parameters and multiple potential steady states. Critically, adaptive systems are characterized by equifinality: the capacity to reach the same functional state from different initial conditions through different pathways. This property, impossible in closed static systems, emerges from dynamic interaction and continuous exchange with the environment.&lt;br /&gt;
&lt;br /&gt;
=== Continuous Evolution and Steady State ===&lt;br /&gt;
Aristotle introduced the concept of &#039;&#039;dynamis&#039;&#039; to denote potentiality - the capacity for change inherent in matter. A seed, for example, possesses &#039;&#039;dynamis&#039;&#039; to become a tree. This ancient distinction illuminates a fundamental property of open systems: their capacity for change through continuous transformation. &amp;lt;ref&amp;gt;[https://www.spektrum.de/lexikon/philosophie/dynamis/495] Dynamis - Metzler Lexikon&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium.&lt;br /&gt;
&lt;br /&gt;
Unlike closed systems that inevitably reach thermodynamic equilibrium - a static endpoint where dynamis is exhausted - open systems exchanging matter and energy with their environment attain dynamic steady states. Here, the system&#039;s composition remains approximately constant despite continuous material throughput, embodying a paradoxical constancy through change. The organism continuously imports material and energy, transforms them internally and exports degraded products, maintaining organizational structure through metabolic flux. Over long periods of time, the vast majority of atoms in the human body are replaced through metabolism, food intake and respiration. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
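A toy turnover calculation makes this concrete (a hypothetical compartment model, not from Bertalanffy): the stock level stays constant while the original material is steadily replaced.&lt;br /&gt;

```python
# Toy steady state: constant total stock, continuous material throughput.
# Each step, 1% of the contents is exported and replaced by fresh inflow.

volume = 100.0    # total stock: stays constant (the steady state)
original = 100.0  # how much of the starting material is still present
turnover = 0.01

for _ in range(1000):
    original *= (1.0 - turnover)   # old material leaves with the outflow
    # inflow replaces exactly what left, so volume is unchanged

print(volume, original)  # volume still 100.0; original material nearly gone
```

The measured quantity (volume) is constant while almost none of the original material remains: constancy through change.&lt;br /&gt;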
&lt;br /&gt;
This steady state exhibits equifinality - the capacity to reach the same functional configuration from different initial conditions through different pathways. Such path-independence, impossible in closed equilibrium systems where trajectories are strictly determined by initial states, reflects the system&#039;s capacity to actualize multiple potentialities toward the same organizational form. Crucially, steady states exist far from thermodynamic equilibrium and require continuous energy dissipation. This distance from equilibrium enables work performance, adaptive response, and the preservation of dynamis - the unrealized potentials that permit future evolution. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This temporal process is fundamentally irreversible. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time. But this is not true for adaptive systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, Norbert Wiener explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. Consider biological development: a fertilized egg developing into an organism cannot reverse the process. The system has explored paths through state space that cannot be retraced - information has been dissipated, possibilities have been foreclosed.&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft_talk:Adaptive_System&amp;diff=28980</id>
		<title>Draft talk:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft_talk:Adaptive_System&amp;diff=28980"/>
		<updated>2025-12-23T20:43:45Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: edited bla bla bla out&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28979</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28979"/>
		<updated>2025-12-23T20:42:47Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. Adaptive systems exhibit self-organizing properties that enable them to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. This means they can maintain low entropy locally by continuously importing low-entropy materials and exporting high-entropy waste. The total entropy of the universe still increases - the second law is never violated - but the local entropy of the system can decrease and be maintained &amp;lt;ref&amp;gt;E. Schrödinger, &amp;quot;What is Life?&amp;quot; 1967&amp;lt;/ref&amp;gt;. It follows that all living organisms on Earth must adapt to conditions that change over time; they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for each i the rate of change depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;: a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Continuous evolution over time ===&lt;br /&gt;
The word &amp;quot;dynamic&amp;quot; itself derives from the Greek dynamis, meaning power or potential. But what makes a system truly dynamic is not merely the presence of change, but rather the continuous, structured, and observable transformation of measurable quantities over time. A dynamic system is not a static configuration but a process unfolding. This process is described through state variables - specific, measurable quantities that characterize the system at any given moment. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;N. Wiener, &amp;quot;Cybernetics&amp;quot; second edition 1948&amp;lt;/ref&amp;gt;. But this is not true for adaptive systems in general, and certainly not for living systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. This creates what is called irreversibility: the future state of the system depends on the entire history leading to the present state, not just the present moment. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
This has a profound consequence for how we understand evolution in time. A system at any given time &amp;lt;math&amp;gt;t_2&amp;lt;/math&amp;gt; cannot simply be reversed to return to its state at time &amp;lt;math&amp;gt;t_1&amp;lt;/math&amp;gt;, even if we reverse all the forces, because information has been dissipated, possibilities have been foreclosed, and the system has explored paths through state space that cannot be retraced. Consider biological development: a fertilized egg developing into an organism - you cannot reverse the process. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Yet this continuous evolution presents a paradox. An organism observed at one moment and again a week later appears unchanged: the same size, the same structure, the same composition. How can a system be in constant change and still subjectively remain stable and the same? The answer lies in understanding that living systems maintain themselves in what is called a steady state or dynamic equilibrium. In this state, measurable properties remain constant, yet material continuously flows through the system. Over long periods of time, the vast majority of the atoms in the human body are replaced through metabolism, food intake and respiration.  &lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28969</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28969"/>
		<updated>2025-12-23T19:27:00Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. Adaptive systems exhibit self-organizing properties that enable them to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze several of these topics.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for each i the rate of change depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
One step further lies the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;: a dynamic system with the special feature of possessing internal mechanisms to change its behavior based on feedback and environmental changes.&lt;br /&gt;
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Yet living systems maintain themselves in states of remarkable order, complexity and organization. How is this possible?&lt;br /&gt;
&lt;br /&gt;
The answer is simple: adaptive systems are not isolated, they are open systems. Classical thermodynamics deals with closed systems - isolated from their environment, exchanging neither matter nor energy. In such systems entropy relentlessly increases.&lt;br /&gt;
&lt;br /&gt;
Open systems, by contrast, exchange matter and energy with their environment. It follows that all living organisms on Earth must adapt to conditions that change over time; they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== Continuous evolution over time ===&lt;br /&gt;
The word &amp;quot;dynamic&amp;quot; itself derives from the Greek dynamis, meaning power or potential. But what makes a system truly dynamic is not merely the presence of change, but rather the continuous, structured, and observable transformation of measurable quantities over time. A dynamic system is not a static configuration but a process unfolding. This process is described through state variables - specific, measurable quantities that characterize the system at any given moment. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;N. Wiener, &amp;quot;Cybernetics&amp;quot; second edition 1948&amp;lt;/ref&amp;gt;. But this is not true for adaptive systems in general, and certainly not for living systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. This creates what is called irreversibility: the future state of the system depends on the entire history leading to the present state, not just the present moment. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
This has a profound consequence for how we understand evolution in time. A system at any given time &amp;lt;math&amp;gt;t_2&amp;lt;/math&amp;gt; cannot simply be reversed to return to its state at time &amp;lt;math&amp;gt;t_1&amp;lt;/math&amp;gt;, even if we reverse all the forces, because information has been dissipated, possibilities have been foreclosed, and the system has explored paths through state space that cannot be retraced. Consider biological development: a fertilized egg developing into an organism - you cannot reverse the process. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28962</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28962"/>
		<updated>2025-12-23T18:58:59Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays no change in state or structural properties, the adaptive system is self-modifying. Adaptive systems exhibit self-organizing properties that enable them to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Adaptive Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what an &amp;quot;adaptive&amp;quot; or &amp;quot;dynamic&amp;quot; system is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and adaptive systems we need to dive deeper and analyze different topics.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
For a better understanding we need to look at [[IESC:SYSTEM (Dynamic)|dynamic systems]] first. Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. The central element of a dynamic system is the interaction between its elements; if there is no interaction between them, it is not a dynamic system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the rate of change of each variable depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;&lt;br /&gt;
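As an illustration of such coupling, here is a minimal numerical sketch; the two-variable system and its coefficients are invented for illustration and are not taken from Bertalanffy. Each rate of change depends on both variables, so neither can be understood in isolation.

```python
# Hypothetical coupled system (coefficients invented for illustration):
#   dx1/dt = -0.5*x1 + 0.2*x2
#   dx2/dt =  0.3*x1 - 0.4*x2
# Each rate of change depends on BOTH variables, not just its own.

def f(x):
    x1, x2 = x
    return [-0.5 * x1 + 0.2 * x2,
            0.3 * x1 - 0.4 * x2]

def euler(x, dt=0.01, steps=1000):
    # Simple explicit Euler integration of the coupled system.
    for _ in range(steps):
        dx = f(x)
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
    return x

state = euler([1.0, 0.0])  # x2 is driven away from zero by the coupling alone
```

Starting from (1, 0), the second variable moves away from zero purely because of its coupling to the first; dropping the cross-terms would change the behaviour of both variables.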
&lt;br /&gt;
Now one step further, there is the &#039;&#039;&#039;adaptive system&#039;&#039;&#039;. An adaptive system is a dynamic system, but with the special feature of possessing internal mechanisms to change its own behaviour based on feedback and environmental changes.&lt;br /&gt;
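A toy sketch of that distinction (all names and numbers are invented, not part of the article): a loop that not only updates its state from feedback, but also modifies its own internal parameter along the way.

```python
# Hypothetical adaptive loop (parameters invented for illustration).
# A fixed dynamic system would keep `gain` constant; the adaptive one
# uses an internal mechanism to change its own behaviour over time.

def run(target=10.0, steps=200):
    state = 0.0
    gain = 0.01                       # internal, self-modifiable parameter
    for _ in range(steps):
        error = target - state        # feedback from the environment
        state += gain * error         # ordinary dynamic state update
        gain = min(0.5, gain * 1.05)  # adaptation of the system's own rule
    return state

final = run()  # the loop closes in on the target as the gain adapts
```

Deleting the last line of the loop turns it back into a plain dynamic system: the state still responds to feedback, but the rule producing that response never changes.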
&lt;br /&gt;
=== Adaptive systems and Entropy ===&lt;br /&gt;
At the heart of understanding adaptive systems lies a profound paradox that has occupied scientists for centuries. The second law of thermodynamics - one of the most fundamental laws of physics - states that in an isolated system, [[IESC:ENTROPY|entropy]] must always increase. Living organisms, however, are open systems: they maintain their internal order by exchanging energy and matter with their environment. To sustain this order under conditions that change over time, they must adjust their behaviour and structure - in other words, they need to be adaptive.&lt;br /&gt;
&lt;br /&gt;
=== Continuous evolution over time ===&lt;br /&gt;
The word &amp;quot;dynamic&amp;quot; itself derives from the Greek dynamis, meaning power or potential. But what makes a system truly dynamic is not merely the presence of change, but rather the continuous, structured, and observable transformation of measurable quantities over time. A dynamic system is not a static configuration but a process unfolding. This process is described through state variables - specific, measurable quantities that characterize the system at any given moment. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;N. Wiener, &amp;quot;Cybernetics&amp;quot; second edition 1948&amp;lt;/ref&amp;gt;. But this is not true for adaptive systems in general, and certainly not for living systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. This creates what is called irreversibility: the future state of the system depends on the entire history leading to the present state, not just the present moment. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
This has a profound consequence for how we understand evolution in time. A system at any given time &amp;lt;math&amp;gt;t_2&lt;br /&gt;
&amp;lt;/math&amp;gt; cannot simply be reversed to return to its state at time &amp;lt;math&amp;gt;t_1&lt;br /&gt;
&amp;lt;/math&amp;gt;, even if we reverse all the forces, because information has been dissipated, possibilities have been foreclosed, and the system has explored paths through state space that cannot be retraced. Consider biological development: a fertilized egg developing into an organism. You cannot reverse the process. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
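One way to see why no inverse exists (a made-up one-line dynamic, not from Wiener): once a dissipative update coarse-grains the state, two different pasts collapse onto the same present.

```python
# Hypothetical dissipative update (invented for illustration):
# damping plus coarse-graining destroys information about the past.

def step(x):
    return round(0.5 * x, 1)  # contract the state, keep only one decimal

a, b = 1.00, 1.04          # two distinct initial states...
sa, sb = step(a), step(b)  # ...map to the same successor state, so
                           # "running the film backwards" is ill-posed
```

Because two histories share one present, no function can recover which past actually occurred; this is the information loss the paragraph above describes.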
&lt;br /&gt;
=== Static vs Adaptive Systems ===&lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28961</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28961"/>
		<updated>2025-12-23T17:32:53Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a [[IESC:SYSTEM (Static)|static system]], which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding [[IESC:COMPLEXITY (Organized)|organized complexity]].&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Dynamic Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what a &amp;quot;dynamic system&amp;quot; is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and dynamic systems we need to dive deeper and analyze different topics.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. This means it adapts to, or reacts to, the circumstances around it over time. The central element of a system is the interaction between its elements; if there is no interaction between them, it is not a system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the rate of change of each variable depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
=== Continuous evolution over time ===&lt;br /&gt;
The word &amp;quot;dynamic&amp;quot; itself derives from the Greek dynamis, meaning power or potential. But what makes a system truly dynamic is not merely the presence of change, but rather the continuous, structured, and observable transformation of measurable quantities over time. A dynamic system is not a static configuration but a process unfolding. This process is described through state variables - specific, measurable quantities that characterize the system at any given moment. In classical Newtonian mechanics, the mathematical equations describing planetary motion are perfectly reversible in time &amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;N. Wiener, &amp;quot;Cybernetics&amp;quot; second edition 1948&amp;lt;/ref&amp;gt;. But this is not true for dynamic systems in general, and certainly not for living systems. If you filmed a cloud forming and then ran it backwards, you would see something physically impossible. The reason, [[Draft:Norbert Wiener|Norbert Wiener]] explains, is that dynamic systems involve enormous numbers of coupled elements and non-linear interactions. This creates what is called irreversibility: the future state of the system depends on the entire history leading to the present state, not just the present moment. &amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
This has a profound consequence for how we understand evolution in time. A system at any given time &amp;lt;math&amp;gt;t_2&lt;br /&gt;
&amp;lt;/math&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28959</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28959"/>
		<updated>2025-12-23T16:47:25Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a static system &amp;lt;ref&amp;gt;[[IESC:SYSTEM (Static)]]&amp;lt;/ref&amp;gt;, which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;R.L. Flood, E.R. Carson, &amp;quot;Dealing with Complexity&amp;quot;, 1993&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;S. Beer, &amp;quot;Cybernetics and Management&amp;quot; 1959&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding organized complexity &amp;lt;ref&amp;gt;[[IESC:COMPLEXITY (Organized)]]&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Dynamic Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what a &amp;quot;dynamic system&amp;quot; is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and dynamic systems we need to dive deeper and analyze different topics.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
[[File:Differential Equation.png|thumb|245x245px|Figure 1: Differential Equation]]&lt;br /&gt;
Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. This means it adapts to, or reacts to, the circumstances around it over time. The central element of a system is the interaction between its elements; if there is no interaction between them, it is not a system &amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;L. v. Bertalanffy; &amp;quot;General System Theory&amp;quot; 1968&amp;lt;/ref&amp;gt;. The critical point is that the behaviour of each element is fundamentally shaped by the behaviour of the other elements, not merely influenced by them in a linear fashion. This contrasts sharply with what system theorists call an aggregation - a mere collection of parts without genuine coupling. Consider a pile of stones: each stone is present, but its existence does not fundamentally alter the behaviour of the others. This is captured mathematically through systems of coupled differential equations, where the rate of change of each variable depends explicitly on the values of multiple variables simultaneously. &lt;br /&gt;
&lt;br /&gt;
In Figure 1 you can see a differential equation inspired by Bertalanffy, where for all i the rate of change of each variable depends not only on its own variable. This has a profound implication: you cannot understand a system by isolating its parts. &amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
=== Continuous evolution over time ===&lt;br /&gt;
A dynamic system is not a static configuration but a process unfolding in time. This process is described through state variables - specific, measurable quantities that characterize the system at any given moment.  &lt;br /&gt;
&lt;br /&gt;
== References and Material ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=File:Differential_Equation.png&amp;diff=28956</id>
		<title>File:Differential Equation.png</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=File:Differential_Equation.png&amp;diff=28956"/>
		<updated>2025-12-23T15:59:45Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Describes the dependence between elements&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=File:Figure_1-_Differential_Equation.png&amp;diff=28955</id>
		<title>File:Figure 1- Differential Equation.png</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=File:Figure_1-_Differential_Equation.png&amp;diff=28955"/>
		<updated>2025-12-23T15:57:16Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Describes the interaction betweeen elements&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28942</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28942"/>
		<updated>2025-12-23T14:17:37Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a static system &amp;lt;ref&amp;gt;[[IESC:SYSTEM (Static)]]&amp;lt;/ref&amp;gt;, which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience. &amp;lt;ref&amp;gt;Flood-Carson-1993: Dealing with Complexity&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Another core principle to be looked at is the Viable System Model (VSM), first formulated by Stafford Beer in 1959, which provides a normative framework for understanding how adaptive systems achieve viability through distributed decision-making and multi-level coordination &amp;lt;ref&amp;gt;Beer, Stafford 1959 - Cybernetics and Management&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
By clarifying how adaptive systems achieve viability in uncertain environments while maintaining functional integrity across organizational levels, this article contributes to understanding organized complexity &amp;lt;ref&amp;gt;[[IESC:COMPLEXITY (Organized)]]&amp;lt;/ref&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;br /&gt;
&lt;br /&gt;
== Understanding Dynamic Systems: A Framework for Complexity ==&lt;br /&gt;
If you search online for what a &amp;quot;dynamic system&amp;quot; is, you will quickly realize that the term covers a huge range of topics, from living organisms to networks, differential equations and vibration analysis. In order to understand the difference between static and dynamic systems we need to dive deeper and analyze different topics.&lt;br /&gt;
&lt;br /&gt;
=== The core definition ===&lt;br /&gt;
Briefly explained, a dynamic system is a system that works by processing external inputs and producing an output. This means it adapts to, or reacts to, the circumstances around it. The central element of a system is the interaction between its elements; if there is no interaction between them, it is not a system &amp;lt;ref&amp;gt;Bertalanffy-1968-General System Theory&amp;lt;/ref&amp;gt;. Bertalanffy describes this topic &lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28698</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28698"/>
		<updated>2025-12-21T17:24:24Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Abstract ==&lt;br /&gt;
{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;br /&gt;
Unlike a static system &amp;lt;ref&amp;gt;[[IESC:SYSTEM (Static)]]&amp;lt;/ref&amp;gt;, which displays a change neither in state nor in structural properties, the adaptive system is self-modifying. The adaptive system exhibits self-organizing properties that enable it to maintain viability and effectiveness through continuous learning and structural adjustment. Core principles include hierarchical recursion, requisite variety (Ashby&#039;s law), learning mechanisms and resilience.&lt;br /&gt;
&lt;br /&gt;
This article clarifies the conceptual foundations of adaptive systems across domains by integrating insights from cybernetics, general systems theory, open systems thermodynamics and complexity science.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28697</id>
		<title>Draft:Adaptive System</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Adaptive_System&amp;diff=28697"/>
		<updated>2025-12-21T15:27:40Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: Created page with &amp;quot;{{Proposal |Was created on date=2025-12-21 |Belongs to clarus=Understanding Complexity |Has author=Antonio Lischke (Antonio.Lischke) |Has publication status=glossaLAB:Open }}&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Proposal&lt;br /&gt;
|Was created on date=2025-12-21&lt;br /&gt;
|Belongs to clarus=Understanding Complexity&lt;br /&gt;
|Has author=Antonio Lischke (Antonio.Lischke)&lt;br /&gt;
|Has publication status=glossaLAB:Open&lt;br /&gt;
}}&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft_talk:Information&amp;diff=27291</id>
		<title>Draft talk:Information</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft_talk:Information&amp;diff=27291"/>
		<updated>2025-11-06T16:36:22Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: Created page with &amp;quot;I think information mustn&amp;#039;t be new.&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;I think information mustn&#039;t be new.&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=27284</id>
		<title>User:Antonio.Lischke</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=27284"/>
		<updated>2025-11-06T16:30:34Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Person&lt;br /&gt;
|Given name=Antonio&lt;br /&gt;
|Family name=Lischke&lt;br /&gt;
|Image filename=5D3_6397.jpg&lt;br /&gt;
|Sex=Male&lt;br /&gt;
|Country=Germany&lt;br /&gt;
|Institution=Hochschule München (HM) – University of Applied Sciences&lt;br /&gt;
|Professional category=Technicians and associate professionals&lt;br /&gt;
|Academic degree=High School Diploma (secondary)&lt;br /&gt;
|Current academic institution=Hochschule München (HM) – University of Applied Sciences&lt;br /&gt;
|Current academic level=Bachelor’s Degree&lt;br /&gt;
|Current academic degree=B.Sc.&lt;br /&gt;
|input language=EN (English)&lt;br /&gt;
}}&lt;br /&gt;
In 2022 I completed my apprenticeship as a technical modelmaker. Currently I&#039;m studying automotive engineering with a focus on expert services and vehicle body engineering.&lt;br /&gt;
[[Category:Person]]&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=File:5D3_6397.jpg&amp;diff=27282</id>
		<title>File:5D3 6397.jpg</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=File:5D3_6397.jpg&amp;diff=27282"/>
		<updated>2025-11-06T16:30:28Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=27276</id>
		<title>User:Antonio.Lischke</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=27276"/>
		<updated>2025-11-06T16:27:20Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Person&lt;br /&gt;
|Given name=Antonio&lt;br /&gt;
|Family name=Lischke&lt;br /&gt;
|Sex=Male&lt;br /&gt;
|Country=Germany&lt;br /&gt;
|Institution=Hochschule München (HM) – University of Applied Sciences&lt;br /&gt;
|Professional category=Technicians and associate professionals&lt;br /&gt;
|Academic degree=High School Diploma (secondary)&lt;br /&gt;
|Current academic institution=Hochschule München (HM) – University of Applied Sciences&lt;br /&gt;
|Current academic level=Bachelor’s Degree&lt;br /&gt;
|Current academic degree=B.Sc.&lt;br /&gt;
|input language=EN (English)&lt;br /&gt;
}}&lt;br /&gt;
In 2022 I completed my apprenticeship as a technical modelmaker. Currently I&#039;m studying automotive engineering with a focus on expert services and vehicle body engineering.&lt;br /&gt;
[[Category:Person]]&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=17602</id>
		<title>User:Antonio.Lischke</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=User:Antonio.Lischke&amp;diff=17602"/>
		<updated>2025-10-13T14:07:15Z</updated>

		<summary type="html">&lt;p&gt;Antonio.Lischke: create user page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Person}}[[Category:Person]]&lt;/div&gt;</summary>
		<author><name>Antonio.Lischke</name></author>
	</entry>
</feed>