# Planck entropy

A visual of the "Planck entropy", developed by Max Planck from 1898 to 1909, generally based on Boltzmann's model of entropy (see: Boltzmann entropy), but with modifications.

In thermodynamics, Planck entropy (LH:7) refers to a "disorder" model of entropy, developed by Max Planck in the years 1897 to 1909, which resulted from his attempt to find a heat equation for the brightness of light bulbs as a function of temperature, wherein he had recourse to the theory of probability and the hypothesis that elements and atoms, at the micro-scale, are, by nature, "disordered". The Planck model is confused in principle, in that Planck did not understand where exactly the "irreversibility" occurs in the heat engine. Specifically, irreversibility occurs in the "working body", at the end of a Clausius cycle, whereas Planck thought it occurred in the two heat reservoirs (i.e. the hot body and the cold body).

## Overview

### Early studies

In 1875, Max Planck studied physics at the University of Munich under the supervision of Philipp Jolly, who, to note, advised him not to go into physics, because everything in physics had already been discovered, give or take a few holes. Then, in 1877, he studied for a year in Berlin, at the Friedrich Wilhelms University, where he attended the lectures of Hermann Helmholtz, who, he said, was never prepared, spoke hesitantly, and kept making mistakes, and of Gustav Kirchhoff, who was extensively prepared, but gave dry and monotonous lectures. While there, he began to study the thermodynamics of Clausius on the side.

In Feb 1879, Planck, back at the University of Munich, submitted his dissertation on “The Second Law of the Mechanical Theory of Heat”; his reviewers were Jolly (physics), Bauer (mathematics), and Adolf Baeyer (chemistry), who said that he “achieved far more than is generally required of an inaugural dissertation.” In Jun 1879, after giving a public lecture on the “Development of the Concept of Heat”, he received his PhD. In Jun 1880, Planck, age 22, after submitting his habilitation thesis on the “Equilibrium of Isotropic Bodies at Different Temperatures”, wherein he used general knowledge from his dissertation to solve various physicochemical problems, and after giving a trial public lecture on the “Principles of the Mechanical Theory of Heat”, with subsequent discussion, became a private lecturer at Munich University. Over the next three years, he devoted his research to the field of heat theory, particularly entropy, examining changes in the state of aggregation, gas mixtures, and solutions.[1] In 1884, his monograph Principle of Conservation of Energy, based on the models of Helmholtz, won some kind of prize at the University of Kiel.

In Apr 1885, at the University of Kiel, he became associate professor of theoretical physics, during which time he proposed a thermodynamic basis for Svante Arrhenius' theory of electrolytic dissociation. In Apr 1889, Planck, at Friedrich Wilhelms University, Berlin, succeeded Gustav Kirchhoff as professor of theoretical physics, becoming a full professor in 1892.

### Entropy spook?

Throughout all of this, Planck seems never to have been able to get his head around the concept of entropy, later commenting:

“In those days [c.1889] I was essentially the only theoretical physicist there, whence things were not so easy for me, because I started mentioning entropy, but this was not quite fashionable, since it was regarded as a mathematical spook.”[2]
— Max Planck (c.1940), “Commentary on joining the local Physical Society, University of Berlin”[3]

Planck remained confused about entropy for the next twenty years, his subsequent work showing a misunderstanding of the "preference" in nature for the final state over the initial state, in respect to entropy, along with his probability model of entropy, which is erroneous. The reason for this comes out in his 1909 Columbia University Lectures on Theoretical Physics, discussed below.

### Bright light bulbs | Minimum energy

A visual of Planck (1897) thinking about the "light bulb power problem", i.e. how to formulate brightness as a function of temperature, so as to lower electricity costs.

In 1894, Planck, after being commissioned by electric companies to create maximum light from light bulbs with minimum energy, turned his attention to the problem of blackbody radiation. His starting point was the following:

“How does the electromagnetic radiation emitted by a blackbody (a perfect absorber, also known as a cavity radiator) depend on the frequency of radiation (i.e. the color of the light) and the temperature of the body?”
— Gustav Kirchhoff (1895), Publication[3]

Partial solutions had been proposed, e.g. Wien's law, which gave correct predictions at high frequencies, and the Rayleigh-Jeans law, which failed in the ultraviolet range.[4] In 1899, Planck decided that he could solve the problem by what he called the “hypothesis of elementary disorder” concerning the entropy of the "ideal oscillator".[3] In order to solve this problem, he had to change his model of entropy, particularly regarding the entropy of heat radiation[5] and black bodies, and to begin thinking about entropy in respect to the disorder of the microstates or probability distributions of the particles of the system.

### Boltzmann

See main: Boltzmann entropy, universe tends towards disorder

In 1896, Ludwig Boltzmann, in gas theory, began to state that his "minimum theory" (aka H-theorem or heat theorem), of gas theory, was tied to an "assumption of disorder".[6]

“It is not a defect that the minimum theorem [H-theorem] is tied to the assumption of ‘disorder’, rather it is a merit that this theorem has clarified our ideas so that one recognizes the necessity of this assumption.”
— Ludwig Boltzmann (1896), Lectures on Gas Theory (pg. 42) [7]

In 1898, Boltzmann, in his last chapter of his gas theory lectures, extrapolated his model to the entire universe:

“Assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered, therefore very improbable, state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state, and when left to itself it rapidly proceeds to the disordered most probable state.”
— Ludwig Boltzmann (1898), Lectures on Gas Theory (pg. 443)

Boltzmann and Planck, however, differed on certain things. Boltzmann thought a system could be reversed, whereas Planck did not believe in reversibility. Moreover, Boltzmann's own confusion about entropy increase, in respect to what he believed was the negative value of the decrease in his "H function", is a different subject.

### Formulations

Modern example of a black body (a TIRFI black body): light goes in the hole, bounces around, and is assumed to be absorbed 100% into the black walls of the inside of the container. This is the system Planck has in mind when he says "entropy depends on disorder", in respect to his solution to the ultraviolet catastrophe problem.

In 1897, Max Planck, in his Treatise on Thermodynamics (§2:86-), building on Clausius and Ostwald, derives the following formula for the entropy of unit mass of ideal gas:[8]

${\displaystyle \phi =c_{v}\log \theta +{\frac {R}{m}}\log v+{\text{const}}}$

in modern notation:

${\displaystyle S=c_{v}\log T+{\frac {R}{m}}\log V+{\text{const}}}$
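As a quick numerical sketch (all values below are illustrative assumptions, not from Planck's text), the additive constant in this formula cancels in any entropy difference, which is why only entropy changes are physically meaningful here:

```python
import math

# Planck's ideal-gas entropy formula, S = c_v*log(T) + (R/m)*log(v) + const,
# for a unit mass of gas. All numerical values are illustrative assumptions.
R = 8.314      # J/(mol*K), gas constant
m = 0.029      # kg/mol, molar mass (assumed: air)
c_v = 718.0    # J/(kg*K), specific heat at constant volume (assumed: air)

def entropy(T, v, const=0.0):
    """Entropy per unit mass; the additive constant cancels in differences."""
    return c_v * math.log(T) + (R / m) * math.log(v) + const

# Only entropy *changes* are defined: the constant drops out.
dS = entropy(350.0, 1.2) - entropy(300.0, 1.0)
dS_shifted = entropy(350.0, 1.2, const=42.0) - entropy(300.0, 1.0, const=42.0)
assert math.isclose(dS, dS_shifted)
```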

In 1901, Planck, in his “On the Law of Distribution of Energy in the Normal Spectrum”, derives the entropy of an "irradiated, monochromatic, vibrating resonator as a function of its vibrational energy U", shown below:[9]

${\displaystyle S_{N}=k\log W+{\text{const}}}$

where N is the number of resonators and W is the probability. The problem thus reduces, as he says, to finding the probability W such that the N resonators together possess the vibrational energy UN. To do this, he resorts to probability theory. This paper, to note, is where Planck, building on Boltzmann's earlier idea that energy could be quantized, introduces his "energy element"[10] model, which Einstein later expanded upon to make his "light quanta" hypothesis, which therein launched quantum mechanics. Success in this area, however, did little for Planck's understanding of the "tendency in nature for entropy to increase".

### What is entropy debate?

In 1903, Planck took part in the famous "what is entropy debate?"[11], which spanned from 1897 to 1907, via publishing two commentary articles in The Electrician. Specifically, in Jan 1903, Oliver Lodge wrote a short tutorial on entropy, printed in The Electrician.[12] This, however, satisfied no one. The debate had by this time grown so large, and involved so many people, that outside "expert opinion" was requested from Planck and Henri Poincare.

On 13 Feb 1903, Planck published a commentary article in The Electrician.[13] The following are excerpts:

“Astonishment at seeing a man so well-known and so eminent in science as Oliver Lodge putting forward ideas on thermodynamics which I have combated ever since the commencement of my studies in that science. Swinburne, conversely, has written one of the best and clearest expositions of the subject that has ever been written, especially when he points out that Nature never undertakes any change unless her interests are served by an increase in entropy.”
— Max Planck (1903), “Article”, The Electrician, Feb 13

On 20 Feb, Oliver Heaviside responded to Planck:

“I should like to ask professor Max Planck whether the view he expresses that: ‘nature never undertakes any change unless her interests are served by an increase of entropy’ is to be taken with or without any particular reservation or with any special interpretation of ‘her interests’. My thermodynamic ideas are somewhat old-fashioned—viz that there is invariably a dissipation of energy or loss of availability of energy due to imperfect or total want of reversibility in natural processes. This entirely agrees in effect with the way of expressing things in terms of increase of ‘entropy’, although that subtle quantity is certainly ‘ghostly’[2], and is somewhat too evasive to be regarded as a physical state even though it be a function of the physical state referred to a standard state. But the question is how the interests of nature are served by imperfect reversibility? Professor Planck’s words suggest a ‘choice’ on nature’s part, as if nature had any choice. Goethe said god himself could not alter the course of nature. That was truly scientific. Then, again, what are to be considered the interests of nature? Are we to take things exactly as we find them, and define the interests in that way? If so, it carries us no further. Or is there a ‘theorem of greatest entropy’, showing how any variation from the proper course of nature would tend to reduce the rate of increase of the entropy?”
— Oliver Heaviside (1903), “Article”, The Electrician, Feb 20

On 6 Mar 1903, Planck responded to Heaviside as follows:

“Whether entropy has any ‘ghostly’[2] attributes, is a question I will not open, but I am for the present quite content to know that it is a quantity which can be measured without ambiguity. I do emphatically deny, and always have combated the proposition adduced by Mr. Heaviside, of the universal dissipation of energy.”
— Max Planck (1903), “Article”, The Electrician, Mar 6

Planck, here, is stating his denial of the so-called "Thomson entropy" model (Thomson, 1852), i.e. that of a universal tendency in nature to the dissipation of mechanical energy.[14] The actual Thomson model of entropy, however, did not arise until May 1854.[15]

### Eight Lectures

In 1909, Planck, in his Eight Lectures on Theoretical Physics, given at Columbia University, says the following (§1:15-16):[16]

“An example will make this clear. Heat conduction is an irreversible process, or as Clausius expresses it: Heat cannot without compensation pass from a colder to a warmer body. What now is the work which in accordance with definition is lost when the quantity of heat Q passes through direct conduction from a warmer body at the temperature T1 to a colder body at the temperature T2? In order to answer this question, we make use of the heat transfer involved in carrying out a reversible Carnot cyclical process between the two bodies employed as heat reservoirs. In this process a certain amount of work would be obtained, and it is just the amount sought, since it is that which would be lost in the direct passage by conduction; but this has no definite value so long as we do not know whence the work originates, whether, e. g., in the warmer body or in the colder body, or from somewhere else. Let one reflect that the heat given up by the warmer body in the reversible process is certainly not equal to the heat absorbed by the colder body, because a certain amount of heat is transformed into work, and that we can identify, with exactly the same right, the quantity of heat Q transferred by the direct process of conduction with that which in the cyclical process is given up by the warmer body, or with that absorbed by the colder body. As one does the former or the latter, he accordingly obtains for the quantity of lost work in the process of conduction:
${\displaystyle Q{\frac {T_{1}-T_{2}}{T_{1}}}}$
or
${\displaystyle Q{\frac {T_{1}-T_{2}}{T_{2}}}}$
We see, therefore, that the proposed method of expressing mathematically the irreversibility of a process does not in general effect its object, and at the same time we recognize the peculiar reason which prevents its doing so. The statement of the question is too anthropomorphic. It is primarily too much concerned with the needs of mankind, in that it refers directly to the acquirement of useful work.”

Here, we see the error in Planck's conception of entropy, namely that he has discarded the "working body" (or working substance) of the standard three body model of the heat engine (or steam engine), defined by a hot body (e.g. boiler), a cold body (e.g. spray of cold water on the outside of the piston and cylinder), and a "working body" (e.g. water in the piston and cylinder), and thinks, incorrectly, that "irreversibility" is a function of the quantity of heat Q and the temperatures of the hot body T1 and the cold body T2; in other words, Planck believed the following:

${\displaystyle {\text{Irreversibility}}=f(T_{1},T_{2})}$

This formula, however, is incorrect. Correctly, the above formulas represent what is called "efficiency"[17], specifically "Carnot efficiency"[18] (Carnot, 1824); namely:

${\displaystyle {\text{Efficiency}}=f(T_{1},T_{2})}$

Carnot, in fact, when he formulated this expression, believed, based on Lavoisier's model of heat, that the working substance (or working body), i.e. the body of water expanded and contracted in the piston and cylinder, did so in a completely "reversible" manner, and that NO change occurred in the working substance, because he believed that "caloric" was a conserved, indestructible type of heat particle, an atom of heat, so to say (see: Carnot cycle).
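Planck's complaint in the lecture excerpt quoted earlier, that the "lost work" of conduction has no definite value, can be sketched numerically; the numbers below are illustrative assumptions:

```python
# Sketch of the ambiguity Planck objects to: the "lost work" when heat Q
# conducts directly from T1 to T2 differs depending on whether it is referred
# to the hot body or to the cold body. Illustrative (assumed) numbers.
Q, T1, T2 = 100.0, 500.0, 300.0           # J, K, K

carnot_efficiency = (T1 - T2) / T1        # Carnot (1824): work per unit heat input
lost_work_hot = Q * (T1 - T2) / T1        # identified with heat given up by hot body
lost_work_cold = Q * (T1 - T2) / T2       # identified with heat absorbed by cold body

assert lost_work_hot != lost_work_cold    # the two identifications disagree
```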

#### Property?

Planck, confused about his conceptual model of entropy, marches forward attempting to find a "property" that distinguishes state A from state B. Specifically, he says (§1:16-17):

“Let us consider any typical process occurring in nature. This will carry all bodies concerned in it from a determinate initial state, which I designate as state A, into a determinate final state B. The process is either reversible or irreversible. A third possibility is excluded. But whether it is reversible or irreversible depends solely upon the nature of the two states A and B, and not at all upon the way in which the process has been carried out; for we are only concerned with the answer to the question as to whether or not, when the state B is once reached, a complete return to A in any conceivable manner may be accomplished. If now, the complete return from B to A is not possible, and the process therefore irreversible, it is obvious that the state B may be distinguished in nature through a certain property from state A. Several years ago, I ventured to express this as follows: that nature possesses a greater ‘preference’ for state B than for state A. In accordance with this mode of expression, all those processes of nature are impossible for whose final state nature possesses a smaller preference than for the original state. Reversible processes constitute a limiting case; for such, nature possesses an equal preference for the initial and for the final state, and the passage between them takes place as well in one direction as the other. We have now to seek a ‘physical quantity’ whose magnitude shall serve as a general measure of the preference of nature for a given state.”

Here, firstly, Planck employs the "it is obvious" phrase. This, more often than not, tends to be a red flag in argument, a signal that the speaker is under-confident in their argument, and is compensating with overt zeal. Prigogine famously used the same red flag phrase during his Nobel Lecture on his entropy model (see: Prigogine entropy). Secondly, the preference in nature for certain states is true, but the way Planck understands this, in terms of his blurred conception of entropy, is incorrect. Planck continues (§1:16-17):

“Clausius actually found this ‘quantity’ and called it ‘entropy’. Every system of bodies possesses in each of its states a definite entropy, and this entropy expresses the preference of nature for the state in question. In accordance with it, the entropy of a system of bodies is simply equal to the sum of the entropies of the individual bodies, and the entropy of a single body is, in accordance with Clausius, found by the aid of a certain reversible process.”

Planck, at this point, in his phrase "every system of bodies possesses in each of its states a definite entropy", has completely lost conception of what is going on.

#### Planck model?

The following diagram shows the "Planck model", below right, as compared to the Papin model (1690), Clausius model (1823), and standard Clausius model (1865):

Here, in Planck's model, he speaks about "work", but the correlative discussion of the "working body" is absent? In other words, Planck has completely removed the working body from the picture, which is where the entropy changes occur, and instead only speaks about the entropy decrease or increase of the hot body and the cold body? Read the following slowly:

“Returning to the example mentioned above, in which the quantity of heat Q is conducted-from a warmer body at the temperature T1, to a colder body at the temperature T2 in accordance with what precedes, the entropy of the warmer body decreases in this process, while, on the other hand, the entropy of the colder increases, and the sum of both changes, that is, the change of the total entropy of both bodies, is:
${\displaystyle -{\frac {Q}{T_{1}}}+{\frac {Q}{T_{2}}}>0}$
This positive quantity furnishes, in a manner free from all arbitrary assumptions, the measure of the irreversibility of the process of heat conduction. Such examples may be cited indefinitely. Every chemical process furnishes an increase of entropy.”

Here, we see grand confusion! Planck here is speaking about the entropy change IN or OF the hot body and the entropy change IN or OF the cold body. These are of no concern. The formulas Planck cites above should refer to the entropy change IN the "working body", which is where the irreversibility occurs, related to the nature of the way the molecules of the body do work on each other. The discussion of the heat change in the so-called "reservoirs", i.e. the fire or the cold water, i.e. the heat changes in the "surroundings", is of no concern whatsoever in thermodynamics. Nevertheless, this is where Planck thinks irreversibility occurs?
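The total-entropy expression in the quoted passage can at least be checked numerically, whatever one makes of its interpretation; the values below are assumed for illustration:

```python
# Total entropy change for conduction of heat Q from hot body T1 to cold
# body T2, per the formula Planck quotes: -Q/T1 + Q/T2. Assumed numbers.
Q, T1, T2 = 100.0, 500.0, 300.0   # J, K, K

dS_hot = -Q / T1       # hot body loses heat Q
dS_cold = Q / T2       # cold body gains heat Q
dS_total = dS_hot + dS_cold

# Positive whenever T1 > T2, i.e. whenever heat flows "downhill":
assert dS_total > 0
```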

#### Reservoir | Entropy changes?

Planck directly says that the entropy change of the heat reservoirs is where irreversibility occurs. He says this directly in his statement that the "sum of the changes in the entropy of ALL the heat reservoirs must be positive or zero", as follows:

“We shall here consider only the most general case treated by Clausius: an arbitrary reversible or irreversible cyclical process, carried out with any physico-chemical arrangement, utilizing an arbitrary number of heat reservoirs. Since the arrangement at the conclusion of the cyclical process is the same as that at the beginning, the final state of the process is to be distinguished from the initial state solely through the different heat content of the heat reservoirs, and in that a certain amount of mechanical work has been furnished or consumed. Let Q be the heat given up in the course of the process by a heat reservoir at the temperature T, and let A be the total work yielded (consisting, e. g., in the raising of weights); then, in accordance with the first law of thermodynamics:
${\displaystyle \sum Q=A}$
In accordance with the second law, the sum of the changes in entropy of all the heat reservoirs is positive, or zero. It follows, therefore, since the entropy of a reservoir is decreased by the amount Q/T through the loss of heat Q that:
${\displaystyle \sum {\frac {Q}{T}}\leqq 0}$
This is the well-known inequality of Clausius.”
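The sign convention in the quoted inequality (Q counted as heat given up by each reservoir) can be sketched for the simplest two-reservoir case; the numbers are illustrative assumptions:

```python
# Clausius inequality sketch: with Q counted as heat *given up by* each
# reservoir, a reversible two-reservoir cycle gives sum(Q/T) = 0, and any
# irreversibility drives the sum negative. Illustrative (assumed) numbers.
T1, T2 = 500.0, 300.0     # K, reservoir temperatures
Q1 = 100.0                # J, heat given up by the hot reservoir

Q2_rev = -Q1 * T2 / T1    # reversible case: cold reservoir absorbs Q1*T2/T1
assert abs(Q1 / T1 + Q2_rev / T2) < 1e-12    # reversible: sum is zero

Q2_irrev = -80.0          # irreversible case: cold reservoir absorbs more heat
assert Q1 / T1 + Q2_irrev / T2 < 0           # irreversible: sum is negative
```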

We see Planck here speaking about the work done in the raising of the weight, but he has discarded the "working body" from the picture?

#### Atomic theory of matter

A summary of the Planck dice model of entropy.[19]

Planck, in his third lecture, "The Atomic Theory of Matter", building on the former confused basis, tells us (§3:44):

“The step to have completed the emancipation of the entropy idea from the experimental art of man and the elevation of the second law thereby to a real principle, was the scientific life's work of Ludwig Boltzmann. Briefly stated, it consisted in general of referring back the idea of entropy to the idea of probability. Thereby is also explained, at the same time, the significance of the above (p. 17) auxiliary term used by me; "preference" of nature for a definite state. Nature prefers the more probable states to the less probable, because in nature processes take place in the direction of greater probability. Heat goes from a body at higher temperature to a body at lower temperature because the state of equal temperature distribution is more probable than a state of unequal temperature distribution. Through this conception the second law of thermodynamics is removed at one stroke from its isolated position, the mystery concerning the preference of nature vanishes, and the entropy principle reduces to a well understood law of the calculus of probability.”

Here, Planck has basically invented his own conception of a thermodynamic system, wherein he has completely removed the "working body" from the system, which is where the "irreversibility" occurs in nature, and is now solely concerned with putting a hot body and a cold body in contact, so as to conclude that "entropy increase" means that nature tends to an "equal temperature" distribution, because that state is more "probable", and hence therein attempts to define all atomic movement in terms of probabilities.

On this faulty basis, Planck, to prove his point further, goes on to describe the "atoms" of a gas as being akin to "dice", and entropy as a function of playing craps, such that, e.g., the number of ways to roll a four with two dice is three different rolls, namely: 1 and 3, 2 and 2, and 3 and 1, which he calls "complexions" or configurations. He then argues that atoms have to be completely disordered, and of the same kind, in order for entropy to hold and for entropy increase to be realized.
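The complexion count in the dice example is easy to verify by brute-force enumeration; this sketch (function name and setup are illustrative only) formalizes the "three ways to roll a four" example from the text:

```python
from itertools import product

# Brute-force count of Planck-style "complexions": ordered dice rolls
# yielding a given total.
def complexions(total, dice=2, sides=6):
    """Number of ordered rolls of `dice` dice summing to `total`."""
    return sum(1 for roll in product(range(1, sides + 1), repeat=dice)
               if sum(roll) == total)

assert complexions(4) == 3   # (1,3), (2,2), (3,1), as in the text
assert complexions(7) == 6   # seven: the most probable two-dice total
```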

Then, later in the lecture (§3:51-54), he goes on to talk about his so-called "hypothesis of elementary disorder", referring to how atoms in a gas move according to the "law of accidents", and how one can only apply entropy to systems with more than 1,000 or so atoms, among other objectionable statements, and then says that the probability of two independent configurations, or dice rolls, W1 and W2, is defined as the product of the individual probabilities:

${\displaystyle W=W_{1}\times W_{2}}$

and that the total entropy is represented by the sum of the individual entropies:

${\displaystyle S=S_{1}+S_{2}}$

the values S1 and S2, in Planck's mind, presumably being the entropies of the hot body and the cold body, i.e. the entropies of the two heat reservoirs, which makes no sense? Then he says that the entropy is proportional to the logarithm of the probability:

${\displaystyle S=k\log W}$

and he says this applies for atomic configurations, which he calls configuration 1, and for radiation configurations, configuration 2. He then goes on to say that a single "moving material point" depends on "6 variables", which he likens to the six sides of a die, moving in 3 generalized coordinates. In the following lecture, he expands on this logic, using more dice arguments, so as to attempt to define the "probability of a thermodynamic state", for the case of radiant energy as well as for material substances. The entire argument is a confused mess, from the ground up.
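The link between the two formulas, probabilities multiplying while entropies add, is purely a property of the logarithm, which a one-line check makes plain (the complexion counts below are arbitrary illustrative values):

```python
import math

# Independent configurations multiply their probabilities (W = W1*W2), and
# S = k*log(W) turns that product into a sum (S = S1 + S2).
k = 1.380649e-23   # J/K, Boltzmann constant

def S(W):
    """Entropy of a configuration with probability (or complexion count) W."""
    return k * math.log(W)

W1, W2 = 120.0, 45.0   # arbitrary illustrative complexion counts
assert math.isclose(S(W1 * W2), S(W1) + S(W2))
```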

## General | Difficulties

### Conserved forces?

A photo of Boltzmann, Arrhenius, and Nernst.

Another salient problem in Planck's model is his statement that the ideal particles, i.e. single molecules in the gas phase, of his system are "subject to conservative forces"? Presumably he is employing the billiard ball model as his conceptual model. The actual forces of atoms, however, are the affinities, first outlined by Newton in his Query 31. These affinity forces, as Goethe said, are not subject to the rules of dice throwing or card playing:

“Crebillon treats the passions like playing cards, that one can shuffle, play, reshuffle, and play again, without their changing at all. There is no trace of the delicate, chemical affinity, through which they attract and repel each other, reunite, neutralize each other, separate again and recover.”
— Johann Goethe (1799), “Comment to Schiller on the works of Prosper Crebillon”, Oct 23

Moreover, in 1882, Helmholtz, under whom Planck formerly studied, proved that these affinity forces were correlated with the "free energy" of the system, not the entropy by itself. This affinity force A, using modern notation, from the characteristic function table, is formulated as follows:

${\displaystyle A=T\Delta S-\Delta H}$

whereby the change in entropy of a system ΔS, on going from an initial state 1 to a final state 2, reads correctly as follows:

${\displaystyle \Delta S={\frac {A+\Delta H}{T}}}$
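As a consistency check on the rearrangement (with assumed illustrative values for A, ΔH, and T):

```python
import math

# Helmholtz affinity relation A = T*dS - dH, rearranged as dS = (A + dH)/T.
# All numerical values below are assumed for illustration only.
T = 298.15        # K
dH = -50_000.0    # J, enthalpy change (assumed)
A = 20_000.0      # J, affinity (assumed)

dS = (A + dH) / T                     # entropy change from the rearranged form
assert math.isclose(A, T * dS - dH)   # reproduces the original relation
```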

Thereafter, as proved by Walther Nernst, the state to which the "preference in nature" is directed is not towards greater "probability", as Planck suggests, but rather, owing to the entropy increase, as defined by Clausius, not by Planck, towards a minimization of free energy in the system:

“Since every chemical process, like every process of nature, can only advance without the introduction of external energy only in the sense in which it can perform work; and since also for a measure of the chemical affinity, we must presuppose the absolute condition, that every process must complete itself in the sense of the affinity—on this basis we may without suspicion regard the maximal external work of a chemical process (i.e. the change of free energy), as the measure of affinity. Therefore, the clearly defined problem of thermochemistry is to measure the amounts of the changes of free energy associated with chemical processes, with the greatest accuracy possible. When this problem shall be solved, then it will be possible to predict whether or not a reaction can complete itself under the respective conditions. All reactions advance only in the sense of a diminution of free energy, i.e. only in the sense of the affinity.”
— Walther Nernst (1893), Theoretical Chemistry from the standpoint of Avogadro's rule and Thermodynamics (pgs. 586-88)[20]

Also, Nernst, shown pictured with Boltzmann adjacent, was more aligned with Boltzmann's views, it would seem, than with Planck? Nernst says that entropy is a function whose magnitude increases in all automatically occurring processes in nature. Moreover, he does not employ the term "disorder". Planck, in fact, is the only thermodynamicist who employs the term "disorder"?

### Universe | Tends towards disorder?

In the period after Planck began to speculate about entropy and dice, Einstein began to say: "god does not play dice with the universe". Supposedly, the reference was mainly directed at Heisenberg, but it could have been also directed at Planck and his entropy dice model?

In 1925, Planck's misunderstood model of entropy began to be regurgitated in the form of the claim that it is a law of nature that the "universe tends towards disorder"; the following is one example, found in a standard physics textbook:

“Entropy is closely related to another, somewhat more familiar quantity called ‘probability’. When the two cylinders just described are connected, the molecules of each wander back and forth between the cylinders. The most probable distribution of molecules occurs when the two pressures are equal. The original state signified a certain ‘order’ in the arrangement, the final uniform state, a greater ‘disorder’. The second law asserts that the universe is moving toward greater disorder and that no step in that direction may be retraced. Energy becomes progressively more degraded. It has been pointed out that a sound wave represents a certain order in the motion of the molecules. This tends to degenerate into the disorder of random thermal agitation for which the entropy, the probability, is greater. Such a process means one more step toward the complete ‘unavailability’ of the energy of the universe.”
— Harrison Randall (1925), Physics: Mechanics, Sound, and Heat (pg. 168)[21]

The following is another example, wherein Planck's "hypothesis of elementary disorder" is mis-attributed to Boltzmann and his gas theory:

“Such a typical random distribution of collisions is assumed in the second law of energetics. This appears in Boltzmann's gas theory in the ‘hypothesis of elementary disorder’[22]. I shall have occasion later to question this assumption that the individual molecules are entirely independent of each other’s motion. The doctrine that entropy tends to a maximum, according to Nernst’s interpretation of the second law of thermodynamics, is equivalent to the statement, with regard to the reactions which proceed at constant temperature and volume, the change goes on until the free energy of the system reaches a maximum.”
— Author (1926), “Article” (pgs. 428, 431), The Journal of Philosophy[23]

In 1937, Mark Zemansky, in his Heat and Thermodynamics, citing the Planck model, aka "statistical mechanics" as he crudely referred to it, championed the disorder tendency model:

“It is possible to regard all natural processes from this point of view, and in all cases the result is obtained that there is a tendency on the part of nature to proceed towards a state of greater disorder.”
— Mark Zemansky (1937), Heat and Thermodynamics (pg. 169)[24]

In 1943, Erwin Schrodinger gave his famous What is Life lecture, referring to this as the "Boltzmann order-disorder principle", and stating that, because of this, "life feeds on negative entropy", which only added to the confusion.

Subsequently, in the decades to follow, people began to think that entropy = disorder, that entropy increase means the tendency of systems towards disorder, and that the universe is headed towards disorder. All of this, in short, is but perpetuated confusion.

## Quotes

The following are quotes:

“The equation S = k log W + const appears without an elementary theory — or however one wants to say it — devoid of any meaning from a phenomenological point of view.”
— Albert Einstein (c.1910), Source [25]
“A stupid man's report of what a clever man says is never accurate, because he unconsciously translates what he hears into something that he can understand.”
— Bertrand Russell (1945), History of Western Philosophy (pg. 45)[26]
“But it must be noted that what is misleadingly called the state of ‘maximum disorder’ is in fact the state of ‘maximum equipartition’.”
— Lancelot Whyte (1949), The Unitary Principle in Physics and Biology (pg. 22)
“I have previously warned against an over-interpretation of entropy as a measure of disorder, and I stress that caution again. To be sure, an animal definitely seems more ordered than the sum of its atoms, loosely distributed, and it does probably have a lower entropy. But then, what is the entropy of an animal (see: entropy of a mouse)? Or let us ask the easier question: what is the entropy of a molecule like hemoglobin, one of the simpler proteins with only about 500 amino acids? Maybe molecular biologists can come up with the answer; if so, I do not know about it. But I do know that surely it must be a case of simplism when Schrodinger says that animals maintain their highly ordered state because they eat highly ordered food. Indeed, before the animal body makes use of the food, and sets about to create order, it breaks the food down to much less ordered fragments than those which it ingests.”
— Ingo Muller (2007), A History of Thermodynamics [27]
“And you know, I get a lot of grief out there. People say, ‘How can you be a scientist and believe that god created the earth? Obviously, you know [they say], we developed from a puddle of promiscuous biochemicals [?]. And if you believe in anything other than that, you’re a moron.’ I don’t criticize them. I say, ‘Can you tell me how something came from nothing?’ And of course they can’t. They say, ‘Well, we don’t understand everything.’ I say, ‘Ok, no problem. I’m just going to give you that there’s something. And now you’re going to tell me there’s a big bang, and it comes into perfect order? So that we can predict seventy years hence when a comet is coming, that kind of precision.’ And they say, ‘Well, yeah.’ And I say, ‘But don’t you also believe in entropy, that things move toward a state of disorganization?’ [They say] ‘Well, yah.’ [I say] ‘So how does that work?’ And they say, ‘We don’t understand everything.’ And I said, ‘I’m not sure you understand anything! But I’m not going to be critical of you, not a problem. You’re entitled to believe what you believe, even though it requires a lot more faith than what I believe. But everybody believe what you want to believe.’”
— Ben Carson (2015), “US Presidential Campaign Speech” (0:08-1:42), Liberty University, Nov 11 [28]
“Now, every seventh grader, even the dumb ones, know the second law of thermodynamics: All ordered systems tend toward disorder.”
— Ed Solomon (2016), Now You See Me 2 (character: Lula) [29]

## End matter

### References

1. Max Planck (German → English) – Wikipedia.
2. Entropy ghost – Hmolpedia 2020.
3. Strickland, Jeffrey. (2011). Weird Scientists: the Creators of Quantum Physics (pg. 22). Publisher.
4. Ultraviolet catastrophe – Wikipedia.
5. Gilman, Daniel; Peck, Harry; Colby, Frank. (1905). The New International Encyclopedia, Volume Seven (§: Dissipation of Energy, Irreversibility, Entropy, pg. 73). Dodd.
6. Boltzmann, Ludwig. (1896). Lectures on Gas Theory (disorder, pg. 42). Dover.
7. Boltzmann, Ludwig. (1896). Lectures on Gas Theory (translator: Stephen Brush) (disorder, pg. 42). Dover, 1964.
8. Planck, Max. (1897). Treatise on Thermodynamics (translator: Alexander Ogg). Longmans, 1903.
9. Planck, Max. (1901). “On the Law of Distribution of Energy in the Normal Spectrum” (pdf) (txt), Annalen der Physik, 4:553.
10. Energy element – Hmolpedia 2020.
11. What is entropy debate? – Hmolpedia 2020.
12. Lodge, Oliver. (1903). “Title”, The Electrician, 50: 560-63, Jan 23.
13. Planck, Max. (1903). “Article”, The Electrician, 50: 694, Feb 13.
14. Thomson, William. (1852). “On a Universal Tendency in Nature to the Dissipation of Mechanical Energy”, Proceedings of the Royal Society of Edinburgh, Apr 19; in: Philosophical Magazine, Oct 1852; in: Mathematical and Physical Papers (pgs. 511-14), 1:59, Publisher.
15. Thomson, William. (1854). “On the Dynamical Theory of Heat. Part V: Thermo-electric Currents”, The London, Edinburgh and Dublin Philosophical Magazine and Journal of Science (pgs. 214-25; quote, pg. 232), May 1; in: Transactions of the Royal Society of Edinburgh, 21:123-#. Royal Society, 1857.
16. Planck, Max. (1909). Eight Lectures on Theoretical Physics (translator: A.P. Willis). Columbia University, 1915.
17. Efficiency – Hmolpedia 2020.
18. Carnot efficiency – Wikipedia.
19. Multiplicity – Hyper Physics.
20. Nernst, Walther. (1895). Theoretical Chemistry: from the Standpoint of Avogadro’s Rule & Thermodynamics (697-pages) (section: The Measure of Affinity, pgs. 586-88; entropy, pg. 18). MacMillan and Co.
21. Randall, Harrison. (1925). Physics: Mechanics, Sound, and Heat (co-authors: Neil Williams, Walter Colby) (pg. 168). Edwards.
22. Boltzmann order-disorder principle – Hmolpedia 2020.
23. Author. (1926), “Article” (pgs. 428, 431), The Journal of Philosophy. Woodbridge.
24. Zemansky, Mark. (1937). Heat and Thermodynamics: an Intermediate Textbook for Students of Physics, Chemistry, and Engineering (disorder, universe, increase, entropy, pgs. 163-68). McGraw-Hill.
25. S = k ln W – Hmolpedia 2020.
26. Russell, Bertrand. (1945). History of Western Philosophy (pg. 45). Simon & Schuster.
27. Muller, Ingo. (2007). A History of Thermodynamics (§11: Metabolism, pg. 307). Springer.
28. Carson, Ben. (2015). “US Presidential Campaign Speech” (0:08-1:42) (YT), Liberty University, Nov 11.
29. (a) Solomon, Ed. (2016). Now You See Me 2 (txt). Lionsgate.