We know from the Reverse Law of Large Numbers (RLLN) that as the ratio of sample-size to number-of-options gets smaller, a normally stable system becomes unstable. This statement can be rephrased to say that a normally stable system will diverge from stability as the ratio of number-of-options to sample-size gets larger. This has very interesting implications for all fluid-like systems…
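The ratio effect described above can be illustrated with a small simulation (a sketch under my own assumptions; the function name and the choice of total-variation distance as the "instability" measure are illustrative, not from the source). We hold the sample size fixed and watch the empirical distribution drift away from the stable uniform one as the number of options grows:

```python
import random

def tv_distance_from_uniform(num_options, sample_size, seed=0):
    """Draw `sample_size` uniform picks from `num_options` categories and
    return the total-variation distance between the empirical distribution
    and the true (uniform) one: near 0 = stable, near 1 = maximally off."""
    rng = random.Random(seed)
    counts = [0] * num_options
    for _ in range(sample_size):
        counts[rng.randrange(num_options)] += 1
    return 0.5 * sum(abs(c / sample_size - 1 / num_options) for c in counts)

# Same sample size throughout; only the number of options grows.
for k in (2, 20, 200, 2000):
    print(k, round(tv_distance_from_uniform(k, 100), 3))
```

With 100 samples over 2 options the empirical distribution sits close to the true one; with 100 samples over 2000 options it cannot possibly resemble it.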
PART 1 – THERMODYNAMIC ENTROPY
There is a concept in physics called the “Second Law of Thermodynamics” (SLOT). The SLOT is the law of physics that deals with the spontaneous behaviour of systems. In everyday terms the SLOT is simply the fact that cold milk and hot coffee, if left unstirred, will spontaneously mix themselves (in both composition and temperature). The SLOT states that, left undisturbed, all systems gravitate towards “Thermal Equilibrium”.
In 1850 the German physicist Rudolf Clausius introduced a theoretical quantity called “Entropy” and suggested that entropy would constantly increase until “Maximum Entropy” was reached at thermal equilibrium.
Some 20 years later the Austrian physicist Ludwig Boltzmann associated the overall state of a system with the internal activity of its “atoms”. He suggested that the spontaneous movement to maximum entropy is in fact nothing more than a system naturally gravitating towards its “most probable state”: on the macro level, the “most mixed together state”; on the micro level, the “state of maximum disorder”. This is often referred to as “the probabilistic version of the SLOT”.
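Boltzmann’s “most probable state” idea can be sketched with a toy count (my own minimal example, not from the source): put 20 distinguishable particles in a box and count, for each macrostate “k particles in the left half”, how many microstates realise it. The evenly mixed macrostate wins by a wide margin:

```python
from math import comb

def multiplicity(n_particles, n_left):
    """Microstate count for the macrostate: `n_left` of `n_particles`
    distinguishable particles sit in the left half of a box."""
    return comb(n_particles, n_left)

n = 20
counts = {k: multiplicity(n, k) for k in range(n + 1)}
most_probable = max(counts, key=counts.get)
print(most_probable, counts[most_probable])  # the even 10/10 split dominates
```

The 10/10 split (the “most mixed together state”) has 184,756 microstates, while the fully separated 20/0 split has exactly one; a system wandering at random overwhelmingly finds itself near the mixed macrostate.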
An interesting thing about the probabilistic version of the SLOT that is rarely, if ever, highlighted is the fact that it relies heavily on the damping effect of the “Law of Large Numbers” (LLN)…
Traditionally it is understood that maximum entropy and thermal equilibrium go hand in hand, but that is not necessarily the case.
The SLOT pulls all systems towards maximum entropy, and this has the effect of neutralising internal energy differentials within the system. Maximum entropy is therefore the limit of compression of energy differentials; however, this does not necessarily mean that thermal equilibrium is always the end result. Even at maximum entropy a system can still be held away from equilibrium by forces too strong for the SLOT to suppress.
In his original treatment of the probabilistic version of the SLOT, Boltzmann treated energy as being made up of bundles of “indivisible units of energy”. Thus a system with more energy was treated as a system with a larger number of units. More units of energy means that there are more options available to each particle in the system.
Think of it in terms of dollar bills. If there are $50 in a room of 100 people, each person has only 51 possible dollar amounts that he/she can have (i.e. $0, $1, $2, … $50). If however we introduce another $150 into the room, there are now 201 possible dollar amounts available to each person (of course it is highly unlikely that one person will have all $200, but it is nevertheless a possibility, even if a very remote one).
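The dollar-bill picture can be made precise with a standard stars-and-bars count (a sketch of my own; the function name is illustrative): the number of distinct ways to split a total of indivisible units among the people in the room.

```python
from math import comb

def num_distributions(total_units, num_people):
    """Stars-and-bars: number of ways to split `total_units` indivisible
    units among `num_people` people (any person may hold zero)."""
    return comb(total_units + num_people - 1, num_people - 1)

people = 100
print(num_distributions(50, people))    # $50 in the room
print(num_distributions(200, people))   # after adding another $150
```

Adding the extra $150 does not just quadruple the options; it multiplies the number of available configurations astronomically, which is exactly why more energy units mean more options per particle.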
In a fluid system there is constant interaction, and so energy amounts among particles are constantly changing. In a very large system the LLN damps out energy differentials and ensures that every region of the system has “on average” the same amount of energy. However, as we increase the energy in the system we make it harder and harder for the LLN to neutralise global diversity (so that all parts of the system have the same average energy), and at some point we will start to see this lack of uniformity appear on the surface.
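The damping role of the LLN can be sketched with a toy exchange model (my own construction, not from the source): particles randomly pass single units of energy around, and we then compare the average energy of the two halves of the system.

```python
import random

def region_spread(num_particles, units_each, steps, seed=0):
    """Randomly pass one energy unit between particle pairs, then return
    the gap between the average energies of the system's two halves."""
    rng = random.Random(seed)
    energy = [units_each] * num_particles
    for _ in range(steps):
        i, j = rng.randrange(num_particles), rng.randrange(num_particles)
        if energy[i] > 0:       # only a particle with energy can give a unit
            energy[i] -= 1
            energy[j] += 1
    half = num_particles // 2
    left = sum(energy[:half]) / half
    right = sum(energy[half:]) / (num_particles - half)
    return abs(left - right)

# The larger the system, the more tightly the LLN pins the two
# regional averages together.
print(region_spread(100, 5, 20000))
print(region_spread(10000, 5, 20000))
```

In the large system the regional averages stay almost identical despite the constant random exchanges; this is the damping effect the argument above relies on.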
So although maximum entropy in all fluid (and fluid-like) systems is always the limit of compression of differentials, the limit of this compression can manifest itself in different ways depending on the energy content of the system.
Thus for any fluid (or fluid-like) system, it is the energy content of the system which dictates the limits of physical suppression and mathematical compression, and it should thus be no surprise that excessive internal energy can act to resist the natural damping of diversity and thereby hold the system away from thermal equilibrium.
So the LLN may pull a system towards a central place, but that central place does not necessarily have to be “a calm nonvolatile equilibrium”. To truly understand a system’s behaviour at maximum entropy, we would need to be able to identify/measure the limits of compression. This raises the question: “is there any precedent for this way of thinking?”
Well yes! We can borrow from computer science the concept of “Information Entropy”…
PART 2 — INFORMATION ENTROPY
There are actually two types of “entropy” in science. The concept of thermodynamic entropy was developed in the mid 19th century, and the concept of information entropy in the mid 20th century. Information entropy is a concept used in computer science to measure how much actual information is contained inside a stream of data. According to Claude Shannon (the founder of Information Theory), most data in a data stream is redundant and can be compressed down to its pure information content. The limit of this compression of redundant information he called the system’s “information entropy”, and the more incompressible a system is, the higher its information entropy will be.
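Shannon’s measure is easy to compute directly (a minimal sketch; the function name is my own). It gives the average number of bits per symbol genuinely needed to encode a stream, which is the theoretical floor on lossless compression:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy, in bits per symbol, of a stream of symbols."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))   # totally redundant: 0 bits/symbol
print(shannon_entropy("abcdefgh"))   # 8 equally likely symbols: 3 bits/symbol
```

A stream of identical symbols carries no information and compresses to almost nothing; a stream where every symbol is equally likely is incompressible, i.e. already at its information-entropy limit.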
Related or Unrelated?
Despite the joint use of the term “entropy”, the two concepts of thermodynamic entropy and information entropy are generally considered to be unrelated; the name was simply borrowed by information theory because, in both cases, something was being maximised.
As it turns out, however, Thermodynamic Entropy and Information Entropy are in fact very closely related, in that they both have one significant thing in common: they are both “The Limit of Compression”…
The Limit of Compression
Maximum Thermal Entropy and Information Entropy are both the limit of compression; but just as the limit of data compression does not always mean reduction to a simple mathematical formula, so too the limit of compression of energy differentials does not always mean thermal equilibrium.
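The “limit of compression” can be seen concretely by running a real compressor over two streams (a sketch under my own assumptions; the helper name and the specific test data are illustrative): one highly patterned, one random.

```python
import os
import zlib

def compressed_ratio(data):
    """Compressed size over original size: roughly how close the data
    already is to its information-entropy limit (1.0 = incompressible)."""
    return len(zlib.compress(data, 9)) / len(data)

redundant = b"hot coffee cold milk " * 500   # highly patterned stream
random_ish = os.urandom(len(redundant))      # near-incompressible stream

print(compressed_ratio(redundant))
print(compressed_ratio(random_ish))
```

The patterned stream squeezes down to a tiny fraction of its size, while the random stream barely shrinks at all: it is already at its limit of compression, just as a maximum-entropy system has already squeezed out all the diversity it can.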
The reality is that maximum thermodynamic entropy simply means that the system has spontaneously compressed, or squeezed out, as much diversity as possible; and consequently even at maximum entropy any system with excessive energy can still exhibit turbulent, or chaotic, macroscopic behaviour…
Consequently, what we often see in fluid (and fluid-like) systems at maximum entropy is not equilibrium, but what remains despite the maximum compression of energy differentials (i.e. the maximum compression of diversity). When viewed through the lens of the limits of compressibility, there are, in fact, two ways to think about maximum thermal entropy.
Systems with high thermal entropy and high information entropy are ubiquitous: we see them in physics, in biology, in economies, in politics and even in religions. Systems with high information entropy are systems that have difficulty suppressing instability and diversity…
In Physics this sort of incompressible behaviour is called “Turbulence”; but in Economics, or Politics, or for that matter any fluid-like system that can experience excessive internal feedback, this incompressible behaviour is usually referred to as “Chaos”…
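A standard toy model of feedback-driven chaos (the logistic map, which is my own illustrative choice and is not mentioned in the source) shows how turning up internal feedback pushes a simple system from a calm equilibrium into behaviour that never settles:

```python
def logistic_tail(r, x0=0.2, burn=1000, keep=5):
    """Iterate the logistic map x -> r*x*(1-x) past a burn-in period,
    then return the next `keep` values, rounded for comparison."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

print(logistic_tail(2.8))   # weak feedback: settles to one fixed value
print(logistic_tail(3.9))   # strong feedback: never settles (chaos)
```

At low feedback the system is compressible to a single number (its equilibrium); at high feedback no such compression exists, which mirrors the incompressible, chaotic behaviour described above.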