Nonlinear Dynamics, aka “Chaos Theory”, is most often associated with the “Butterfly Effect”, but frankly that idea misses the point completely. Chaos may be unpredictable, but it is also endlessly creative.
Nonlinear Dynamics, aka Chaos Theory, studies the “Creative Power of Incompressible Dynamics”…
What is Chaos Theory?
The common misconceptions about Chaos Theory have not been helped by its popular association with the so-called “Butterfly Effect”, which has merely served to perpetuate the idea that chaos theory is about unpredictable chaos (a misconception further reinforced by the theory’s name).
The scientific term for the butterfly effect is “Sensitive Dependence on Initial Conditions” (SDIC). However, this phrase is extremely misleading, because it projects the illusion that everything is fundamentally predictable given sufficiently accurate “initial conditions”. The truth, as it turns out, is that only dynamics with minimal feedback are “compressible to linearity” and thereby predictable…
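To make SDIC concrete, here is a minimal Python sketch (a toy illustration of my own, not from the original text) using the logistic map, a standard textbook example of a chaotic recursive system. Two trajectories that start a hair apart soon bear no resemblance to one another:

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # differs by one part in ten billion

# Early on the trajectories are indistinguishable; within a few dozen
# steps the tiny initial difference has been amplified to order one.
print(abs(a[1] - b[1]))    # still microscopic
print(abs(a[-1] - b[-1]))  # no longer microscopic
```

The point is not that the system is random (it is fully deterministic), but that no finite measurement of the initial condition is accurate enough to predict it for long.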
Chaos Theory studies the dynamics of the stuff that is not compressible to linearity. So Chaos Theory is not the mathematics of pandemonium and disarray. Chaos Theory actually studies the various types of dynamics that can occur in recursive mathematical systems. Chaos Theory studies feedback in self-stabilizing systems.
Most people know that good shock absorbers can damp out a bumpy ride by dissipating excess vibrational energy, thus keeping the car at a smooth, stable equilibrium.
However, damping and stabilization do not occur only as a result of the dissipation of energy; they can also occur as a result of the natural redistribution of energy (or matter).
Many systems in nature are natural equilibrium seeking systems (i.e. left to themselves they will spontaneously mix). Thermal systems naturally seek thermal equilibrium which is the name given to the settled warm mixture of hot and cold fluids. And dynamic systems naturally seek dynamic equilibrium which is the name given to the settled end state of a self-mixing process (e.g. ink and water).
Now any system that is a spontaneous equilibrium seeking system (be it hot coffee and cold milk in a mug, or buyers and sellers in a market) can also be thought of as a self-stabilizing system. But just because a system is self-stabilizing does not mean it will always be able to find a completely stable thermal or dynamic equilibrium…
A self-stabilizing system is akin to a negative feedback system. In any naturally self-stabilizing system, it is the internal activity that damps down diversity and damps out internal differentials, ultimately micro-adjusting its way to a uniform and stable equilibrium. So in any such system, if the internal activity can actively “micro-adjust”, there will ultimately be no chaos, because the system will have been able to “fine-tune” its way to a uniform and stable equilibrium.
If, however, the system’s internal self-stabilizing activity is limited or restricted in its ability to micro-adjust, then it will be unable to fully damp down diversity and damp out internal differentials, and so the system’s behavior will ultimately remain unstable and unpredictable, as it futilely tries to “coarse-tune” its way to a uniform and stable equilibrium.
Chaos Theory tells us that “coarse adjustments” to feedback within recursive mathematical systems cause “coarse-stabilization”, and this “coarse damping to equilibrium” makes finding a uniform and stable equilibrium difficult, leaving residual incompressible diversity and instability in its stead. [And obviously, the coarser the process of adjustment and stabilization, the more unstable the residual equilibrium will be.]
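As a toy illustration of this fine-tuning versus coarse-tuning distinction (a hypothetical sketch of my own, not a model from the text), compare a system that can make proportional micro-adjustments toward equilibrium with one restricted to fixed-size coarse adjustments:

```python
# A toy self-stabilizing system relaxing toward an equilibrium at 0.
# "Fine" adjustment (proportional to the error) settles completely;
# "coarse" adjustment (fixed-size steps only) overshoots forever and
# leaves residual instability around the equilibrium.

def relax(x, step_fn, iterations=100):
    """Repeatedly apply a self-stabilizing adjustment rule to x."""
    for _ in range(iterations):
        x = x - step_fn(x)
    return x

fine = relax(1.0, lambda x: 0.5 * x)                    # halves the error each step
coarse = relax(1.0, lambda x: 0.3 if x > 0 else -0.3)   # fixed-size steps only

print(fine)    # essentially zero: fine-tuning reached equilibrium
print(coarse)  # stuck bouncing near zero, never settling on it
```

The fine-tuner converges to equilibrium; the coarse-tuner ends up oscillating around it indefinitely, a crude analogue of the residual instability described above.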
This inability to compress diversity and converge on a mathematical equilibrium suggests that there is a striking similarity between Chaos Theory and Information Theory. It suggests that “Chaotic Behaviour” is, in effect, the mathematical equivalent of “Information Entropy”. It suggests that
“Chaos” is simply “Information Entropy” generated by “Incompressible Dynamics”…
In the late 1940s, some 20 years before the birth of Chaos Theory, the American mathematician Claude Shannon developed “Information Theory”. This work turned out to be one of the cornerstones of all of computer science. His key idea was essentially to figure out how much actual information is contained within a stream of data.
As it turns out, most things that contain a lot of data actually contain very little real information. (Think of how often a 300-page book is simply a 3,000-word essay fleshed out to 100,000 words.) According to Shannon, most data in a data stream is “redundant” and can be “compressed” to its pure information content. Shannon called this pure compressed content “information entropy”, and the more incompressible a system is, the higher its information entropy will be.
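Shannon’s measure can be sketched in a few lines of Python. The character-level entropy below is only a crude stand-in for true compressibility (real compressors exploit far more structure), but it shows the idea: repetitive data has low information entropy, varied data has high information entropy.

```python
# Empirical Shannon entropy of a text, in bits per character:
# H = -sum(p * log2(p)) over the observed character frequencies.
import math
from collections import Counter

def entropy_bits_per_char(text):
    """Estimate the Shannon entropy of a string from character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

redundant = "ab" * 500           # highly repetitive: compresses well
varied = "abcdefghijklmnop" * 2  # 16 equally likely symbols

print(entropy_bits_per_char(redundant))  # 1.0 bit per character
print(entropy_bits_per_char(varied))     # 4.0 bits per character
```

The repetitive stream needs only 1 bit per character to describe; the varied stream needs 4. The less compressible the stream, the higher its information entropy.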
Information Entropy is, in effect, “the limit of compression of redundant information”; it is pure information undiluted by repetitive redundancy. For our purposes, we can use this concept of information entropy to think about the structure and dynamics of linear and nonlinear systems.
The description (mathematical or otherwise) of the structure or dynamics of a simple linear system will, after compression, have a low information entropy; while that of a more complex nonlinear system will, after compression, have a higher information entropy. This higher information entropy makes complex systems harder to mathematically express (and thus harder to predict).
So is any of this important? Absolutely it is. We live in a universe of nonlinear dynamics, some of it mathematically compressible, some of it not. For 350 years we studied the compressible stuff, but chaos theory was to be our first real introduction to the “Creative Power of Incompressible Dynamics”.
Unfortunately, Physics’ obsession with, and myopic focus on, linearity and prediction meant that when Chaos Mathematics was first discovered, we concentrated solely on the fact that all its internally generated nonlinearity makes a system difficult to predict.
But for too long now, Chaos Theory has been held back by its association with the idea of SDIC. Chaos Theory is more than just the Butterfly Effect. The time has come for the world to realise that
Nonlinear Dynamics are the “Incompressible Dynamics of Diversity”
and it is these dynamics, when combined with “Incompressible Mutual Reinforcement”, that are the universe’s chosen recipe for the “Emergence of Incompressible Complexity”…