You might not have heard of entropy, but according to some it will be the death of us all. Entropy is commonly defined as the natural tendency of all systems to evolve towards ever-increasing disorder. But where did this belief come from?
Well, as with so many other things, it all comes down to the story of how the concept got formulated in the very first place…
In 1690, the French inventor Denis Papin produced an interesting curiosity: a crude piston engine intended to be used to pump water. In 1697, the English engineer Thomas Savery constructed a rudimentary steam engine; and some 70 years later, in 1769, the Scottish inventor James Watt patented a design for the first truly effective, modern, working device.
The introduction of the steam engine was to be one of the key factors in the onset of the Industrial Revolution. Here, for the first time in history, was a contrivance that made possible the use of “heat” to produce “mechanical motion”.
Sadi Carnot
In the years that followed, many efforts were made to improve the efficiency of these new-fangled machines. In 1824 a young French engineer named Nicolas Léonard Sadi Carnot wrote a paper on the efficiencies of so-called heat engines (steam engines). In this paper, Carnot identified that for any engine to repeatedly convert “Heat” into “Mechanical Work”, a constant “Temperature Differential” must exist.
Carnot subscribed to the prevailing theory of the time, which held that heat was the external manifestation of the internal presence of some form of “hot fluid” (the so-called caloric). Consequently, Carnot assumed that heat must flow in much the same fashion as water flows. He believed that just as the flow of water can drive a water mill, so too could the flow of this hot fluid across a temperature differential (from a hot source to a so-called cold sink) cause Mechanical Work to be done. It was this line of thinking that led Carnot to equate the “Movement of Heat” with the “Flow of Heat” (from high to low), and from there to deduce the “One-Way Nature of the Movement of Heat”. Carnot declared that:
Heat always spontaneously flows in the direction from hot to cold and will continue to do so while any temperature differential exists.
This declaration was the original “seed idea” for what was later to become the now-famous “Second Law of Thermodynamics”. Carnot understood that cold bodies never export their coldness; it is always hot bodies that export their hotness, and they will do so until all things are at the same temperature.
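Although Carnot himself reasoned within the caloric picture, his argument leads directly to a result now written (in later notation, using the absolute temperature scale that Kelvin would eventually define) as the maximum possible efficiency of any heat engine working between a hot source at temperature $T_{\text{hot}}$ and a cold sink at temperature $T_{\text{cold}}$:

$$ \eta_{\max} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}} $$

With no temperature differential ($T_{\text{cold}} = T_{\text{hot}}$) the efficiency is zero and no work can be extracted at all, which is Carnot's requirement restated in modern dress.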
James Prescott Joule
In 1845, the English physicist James Prescott Joule showed how a certain amount of mechanical work, done by the vigorous stirring of a paddlewheel in water, could produce a directly proportional rise in the temperature of the water. With this work, Joule gave birth to the concept of the “Mechanical Equivalent of Heat”. It convinced Joule of an idea that had first been suggested by the English philosopher Francis Bacon as early as 1620: that the quantities “Heat” and “Work” are actually different “forms” of the same thing.
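In modern units, the proportionality Joule measured corresponds to roughly

$$ 1\ \text{calorie} \approx 4.2\ \text{joules of mechanical work}, $$

so warming a single kilogram of water by just 1 °C requires roughly 4,200 J, about the work needed to lift that same kilogram of water through a height of more than 400 metres. Joule's own paddlewheel measurements eventually came within about one percent of the modern figure.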
The “everyday concept of energy” is deeply ingrained in the human psyche. The concept of “liveliness” or “the energy of life” has been around for as long as mankind has been able to communicate. However, this abstract concept of “energy” only very gradually crept into scientific consciousness.
In the late 17th century, the German philosopher and mathematician Gottfried Wilhelm Leibniz had used the term “vis viva” (living force) to describe the “force” carried by a body in motion, a quantity close to what we now call kinetic energy. In 1807, the English physicist Thomas Young, best known for his work on the wave nature of light, was among the first to use the word “energy” in its modern scientific sense. Some time later, the brilliant English experimentalist Michael Faraday explored the “energy” associated with both electric and magnetic fields. Science was slowly beginning to realize that there were many forms of energy.
The concept of the “Mechanical Equivalent of Heat” was a major turning point in the history of our scientific understanding of the universe. It ultimately opened up a whole new area of study within physics: “Thermodynamics”, the branch of physics that studies the concept of “Energy” and its transformations.
Baron Kelvin of Largs
At a meeting of the British Association for the Advancement of Science in Oxford in 1847, Joule communicated his work and ideas to the Belfast-born mathematician and physicist William Thomson. Thomson (who was later raised to the peerage as Baron Kelvin of Largs) was astonished by Joule’s assertions, which seemed to contest a fundamental and widely held principle of science. Carnot’s declaration had been based on the principle of the “Conservation of Heat”; Joule was now refuting that principle.
Up until this point, Kelvin had been a strong believer that Newton’s concept of “Force” was the bedrock of the science of physics, and consequently that physics was the “Science of Force”. With the demonstration of the equivalence between heat and mechanical work, the concept of Energy now seemed to take center stage. Kelvin began to understand that physics is in fact the “Science of Energy”.
In the aftermath of the 1847 meeting, Kelvin set to work on reconciling the apparently conflicting theories of Carnot and Joule. He slowly began to realize that these two theories might not be fighting over the same territory. Instead of one fundamental principle, maybe there could be two.
Kelvin now understood that “Heat” and “Work” are just “different forms of energy”. He realized that in the operation of an engine, heat is not being conserved, but actually being “converted” into work. Thus, Carnot’s principle regarding the conservation of heat was effectively overthrown and replaced with the principle of “The Conservation of Energy”.
Now here, at what was virtually the dawn of our scientific understanding of the concept of energy, Kelvin made what has turned out to be a truly momentous evaluation. Carnot had stated that all engines require a temperature differential (consisting of both a hot source and a so-called cold sink) in order to operate. Kelvin assumed that the cold sink is necessary in order to absorb some heat that must be discharged. Based on this evaluation, Kelvin declared that “no process can occur where the sole result is the complete conversion of heat into work”. [This statement is sometimes referred to as the engine statement of the Second Law of Thermodynamics.]
Kelvin thus believed that there was a fundamental asymmetry in nature. Although it is possible to completely convert work into heat (by friction), Kelvin believed it impossible to completely convert heat into work without having some “useless” heat left over. This leftover heat, which Carnot had associated with the cold sink, was now being characterized by Kelvin as “Waste Heat”. Kelvin described this perceived inevitable loss of usable heat as the principle of “The Dissipation of Energy”.
These two principles of “The Conservation of Energy” and “The Dissipation of Energy” were to form the basis for the two great laws of thermodynamics. These principles taken together state that although energy is always conserved, the distribution of energy always occurs in an irreversible manner. And because of Kelvin’s evaluation, this irreversibility of the distribution of energy was associated with degraded energy loss or so-called waste energy. Kelvin’s explanation was that in the operation of an engine, energy is always conserved, but it is also always degraded. Kelvin’s momentous evaluation introduced the concept of “degraded or wasted energy”.
Rudolf Clausius
Although Kelvin had put forward the two principles of “The Conservation of Energy” and “The Dissipation of Energy”, it was the German physicist Rudolf Julius Emanuel Clausius who was the first to rigorously formulate the Laws of Thermodynamics. In 1850, Clausius published a paper in which he described the universal principle of the conservation of energy as the “First Law of Thermodynamics”, stating that:
Energy can be neither created nor destroyed; it can only be converted from one form to another.
The principle of the conservation of energy states that the total energy in the universe remains constant. [Some 60 years later, thanks to Einstein equating matter with energy, this principle was revised to state that the total of “matter and energy” in the universe remains constant.]
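In the modern notation of thermodynamics (which post-dates Clausius’s own wording), the First Law for a closed system is usually written as

$$ \Delta U = Q - W $$

where $\Delta U$ is the change in the system’s internal energy, $Q$ is the heat supplied to the system, and $W$ is the work done by the system: energy is never created or destroyed, only moved between these ledgers.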
Clausius also addressed the significance of the one-way restriction on the spontaneous movement of heat. He declared that “no process can occur where the sole result is the transfer of energy from a cooler to a warmer environment”. Since heat can only spontaneously move from hot to cold, the reverse process (sometimes known as the refrigeration process) requires work to be done. Clausius believed, as did Kelvin, that “in any thermodynamic process, there is always some degradation of the energy available”. This belief inspired Clausius to propose a new concept, which he defined in terms of the “Direction of Spontaneous Change”. This concept, which he called “Entropy”, he characterized as “the property of a system that measures the degradation of energy availability associated with spontaneous change”. This definition is sometimes known as the “Thermodynamic Definition of Entropy”.
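In the differential form Clausius eventually arrived at, this thermodynamic entropy $S$ is defined through the heat exchanged reversibly at absolute temperature $T$:

$$ dS = \frac{\delta Q_{\text{rev}}}{T} $$

The same quantity of heat therefore changes the entropy more at low temperature than at high temperature, which is why heat flowing from a hot body to a cold one always increases the total entropy and “degrades” the energy’s availability for doing work.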
Clausius explained that the entropy of an isolated system never decreases as a result of spontaneous change; it can only increase or stay the same. Consequently, spontaneous change will continue to occur until entropy reaches a maximum. So although Carnot’s original declaration had rested on the one-way nature of the movement of heat, and Kelvin’s reworked principle of “The Dissipation of Energy” had rested on the concept of wasted energy, when Clausius came to formally formulate the “Second Law of Thermodynamics” he used his own newly fashioned concept of entropy, and stated that:
All Spontaneous Change in a System is accompanied by an increase in the Entropy of the System.
This first formal statement of the Second Law, while essentially acknowledging Carnot’s original declaration, states that spontaneous change occurs in one direction only, a direction measurable as an increase in Clausius’s newly formulated theoretical quantity, entropy. Nevertheless, despite this ever-increasing universal quantity accompanying all change, this formulation of the Second Law does not rule out the possibility of change that is accompanied by a decrease in entropy; it merely states that such change will not occur “spontaneously”.
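Stated as an inequality, for any process in an isolated system, or for a system taken together with its surroundings,

$$ \Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0, $$

so a local decrease in entropy is perfectly permissible, provided it is paid for by an at-least-equal increase somewhere else; this is exactly the loophole that a refrigerator exploits when work is supplied to push heat “uphill” from cold to hot.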
This concept of a constantly increasing entropy would seem to imply that the universe is moving in an irreversible direction over time. Scientists were puzzled by this inference because Classical Newtonian Physics pays no regard to the direction of time. Newton’s equations of motion work equally well whether we move forward or backward in time; as a result, Newtonian motion is described as reversible. This apparent disconnect between the reversible laws of classical mechanics and the irreversible laws of classical thermodynamics eventually found an explanation through the application of “Statistical Mechanics”.
James Clerk Maxwell
The Scotsman James Clerk Maxwell was a brilliant theoretical physicist and a committed devotee of the Atomic Theory of Matter. Maxwell thought (as many had done before him) that the microscopic particles within any system must follow Newton’s Laws of Motion. In 1859, Maxwell developed his “Kinetic Theory of Gases”.
Maxwell brilliantly showed that “statistics” anchors thermodynamics to mechanics. He explained that while it is practically impossible to determine the movements of individual microscopic particles, because of their minute size and their enormous number, it is nevertheless possible, by using statistical averages, to determine the properties of the “system as a whole”.
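As a minimal illustration of Maxwell’s point (a sketch for this article, not anything from the original kinetic-theory papers), the short Python snippet below assumes an ideal monatomic gas of argon atoms at an arbitrarily chosen temperature, samples a large number of particle velocities from the Maxwell-Boltzmann distribution, and recovers the bulk temperature purely from a statistical average, without ever following any individual particle.

```python
import numpy as np

# Kinetic theory for an ideal monatomic gas links the *average* kinetic energy
# of the particles to the bulk temperature:  <(1/2) m v^2> = (3/2) k_B T.
# Here we sample velocities and invert that relation statistically.

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.6335e-26       # mass of an argon atom, kg (illustrative choice)
T_true = 300.0       # assumed "true" gas temperature, K
N = 1_000_000        # number of simulated particles

rng = np.random.default_rng(0)
# Each Cartesian velocity component is Gaussian with variance k_B * T / m.
sigma = np.sqrt(k_B * T_true / m)
v = rng.normal(0.0, sigma, size=(N, 3))            # particle velocities, m/s

mean_ke = 0.5 * m * np.mean(np.sum(v**2, axis=1))  # average kinetic energy
T_estimated = 2.0 * mean_ke / (3.0 * k_B)          # invert <KE> = (3/2) k_B T

print(f"estimated temperature: {T_estimated:.2f} K")   # close to 300 K
```

No individual trajectory is ever computed or needed; the temperature emerges purely as an average over the whole collection of particles, which is exactly the bridge Maxwell built between mechanics and thermodynamics.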
Ludwig Boltzmann
During the 1870s the Austrian physicist Ludwig Boltzmann, inspired by Maxwell’s Kinetic Theory, equated the overall state of a system with the internal activity of its atoms. He formulated an alternative to the thermodynamic definition of entropy, defined in terms of mathematical probability. Boltzmann stated that “Entropy is a measure of the ‘probability’ of finding any given system configuration”. This definition is known as the “Statistical Definition of Entropy”.
Along with this new definition of entropy, Boltzmann also proposed a formula for its mathematical calculation (given in modern notation after this passage). This formula meant that entropy could at last be quantified. According to Boltzmann’s statement, entropy will continually increase until it reaches a maximum. This maximum is reached at the point where the atoms of the system adopt the most random arrangement, since any additional activity could not increase the degree of randomization of the system any further. This maximum value of Boltzmann’s entropy can be calculated by feeding probabilities into Boltzmann’s formula. This definition of entropy led to yet another way of looking at the Second Law of Thermodynamics. This “new interpretation” implies that:
The State of any System will always move from the Less Probable to the More Probable and will continue to do so until the Most Probable State is reached.
It was this “interpretation” of the “Second Law” that ultimately led to the universal understanding that
“entropy increases as a system spontaneously moves from order to disorder!”
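The formula in question, now so famous that it is carved on Boltzmann’s tombstone, is

$$ S = k_B \ln W $$

where $W$ is the number of microscopic arrangements (microstates) consistent with the observed macroscopic state, and $k_B$ is the constant now named after Boltzmann. The more ways a state can be realized, the more probable it is and the higher its entropy; “disorder” in this picture simply means “realizable in vastly more ways”.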
This explanation of the Second Law of Thermodynamics would seem to imply that entropy is merely a statistical fact. The apparent irreversibility of all change in the universe could now be explained by supposing that the initial condition of the universe was an improbable state of high order. It could then be concluded that, as a consequence of this improbable initial state, the universe has ever since been moving towards a more probable state of disorder.
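To see why “more probable” ends up meaning “more disordered”, consider a toy model (an illustrative sketch, not drawn from the historical papers): a gas of N particles, each of which can sit in the left or right half of a box. The Python snippet below counts the microstates for two macrostates and shows how overwhelmingly the evenly mixed arrangement outnumbers the perfectly ordered one.

```python
from math import comb, log

# Toy model (illustrative only): N distinguishable particles, each of which can
# sit in the left or right half of a box. A "macrostate" is the count of
# particles on the left; the number of microstates realizing it is the binomial
# coefficient C(N, n_left), and every microstate is equally likely.

N = 100  # number of particles (chosen small for illustration)

ordered = comb(N, 0)      # all particles on the left: exactly 1 microstate
mixed = comb(N, N // 2)   # 50/50 split: the most probable macrostate

print(f"microstates, all-left macrostate: {ordered}")
print(f"microstates, 50/50 macrostate:    {mixed:.3e}")
print(f"probability ratio (mixed : ordered): {mixed / ordered:.3e}")

# Boltzmann entropy difference between the two macrostates, in units of k_B,
# using S = k_B * ln(W):
print(f"entropy increase / k_B: {log(mixed) - log(ordered):.2f}")
```

Even with only 100 particles the mixed macrostate is about 10^29 times more probable than the fully ordered one; with the roughly 10^23 particles in a real sample of gas, the imbalance is so overwhelming that a spontaneous return to order is, in practice, never observed.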
The Missing Arrow of Time
Although the concept of entropy is barely known outside the world of academia, ever since its initial conception it has been a heavyweight concept within the scientific community. Over the years, this fundamental concept has been adopted into many other scientific disciplines not directly related to physics. In all of these fields, the basic “interpretation” of the concept is always the same. This “interpretation” is summed up by the standard dictionary definition: “Entropy is a measure of disorder”.
This explanation of the ever-increasing entropy in the universe means that we have to view all natural or spontaneous processes as having a natural tendency to move from order to disorder. As a result, all order in the universe seems destined ultimately to disintegrate into chaos. Consequently, ever since the formulation of the concept of entropy, and its association with the one-way nature of the dissipation of energy, it has been assumed that the natural tendency of the universe is towards turmoil, disorder, and destruction.
This theory of irreversible decay, however, goes against our intuitive sense of everything going on around us. We do indeed sense that we live in an irreversible universe, but our intuitive irreversible universe is running in the opposite direction. When we look at our world, we see a world of progression: a world not decaying into chaos, but progressing towards ever-increasing complexity. Our universe seems to be constantly evolving, replacing old, worn-out structures with new, ever more complex ones. There seems to be a direct conflict between Boltzmann’s evolutionary regression towards bland uniformity and Darwin’s evolutionary progression towards rich diversity. This disagreement between physicists and evolutionary biologists has created a fundamental scientific paradox. Evolutionary biologists argue that the irreversible arrow of time (implied by the concept of entropy) is pointing in the wrong direction.
In 1977, the Belgian chemist Ilya Prigogine won the Nobel Prize for chemistry for his work on the “Theory of Dissipative Structures”. This theory would seem to go some way towards resolving the paradox of how complex ordered structures can come into being in a universe described as constantly moving towards turmoil, disorder, and destruction. Prigogine suggested that such structures can indeed come into existence, provided these entities have the capability of “exporting” their internal disorder to the external environment.
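Prigogine’s accounting can be summarized by splitting the entropy change of an open system into an internally produced term and an exchange term:

$$ dS = d_iS + d_eS, \qquad d_iS \ge 0 $$

The internally produced entropy $d_iS$ can never be negative, but the exchange term $d_eS$ can be, so an open system that exports enough entropy to its surroundings ($d_eS < -\,d_iS$) can see its own entropy fall and its internal order grow, even while the total entropy of system plus surroundings still rises.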
However, while this theory might seem to negate any evolutionary decay into chaos, it still does not answer the fundamental question of why the universe actually “progresses” to ever-greater complexity. If all progress must take place in the face of universal entropic decay, then mankind’s technological progress must be the result of our ability to tame nature; but then the question arises, how does nature tame itself?
There would still seem to be a progressive arrow of time missing!