In 1988, Lloyd completed his PhD at Rockefeller University, under Heinz Pagels, with a thesis entitled "Black Holes, Demons, and the Loss of Coherence: How Complex Systems Get Information, and What They Do With It".
In 1988, Lloyd, in his “Complexity as Thermodynamic Depth”, coauthored with Heinz Pagels, introduced the idea of "thermodynamic depth", among other applications of Shannon entropy to complexity, such as mate selection.
In 1989, John Wheeler, during a talk at the Santa Fe Institute, coined the phrase “it from bit”, arguing that the universe could be boiled down to binary digits; this later inspired Seth Lloyd.
In 2011, Lloyd, in his talk “On Quantum Life”, on a subject he calls "quantum biology", revolving around recent symposiums he had organized at Harvard, attempted to argue that recent studies show that quantum mechanics, e.g. quantum entanglement and quantum coherence, plays a role in making light absorption efficient in photosynthetic bacteria (somehow), in how birds navigate in the earth’s magnetic field, and, hypothetically, in how the sense of smell operates.
Programming the Universe
In 2006, Lloyd, in his Programming the Universe, rode the "Shannon bandwagon" into published "objectionable nonsense" territory, par excellence; the following, e.g., is his model of how humans came to be:
- “It is certainly possible for one or another of these monkeys to type Hamlet. It is possible, as well, that the information defining the universe was created by similarly random processes. After all, if we identify heads with 1 and tails with 0, tossing a coin repeatedly will eventually produce any desired string of bits of a finite length, including a bit string that describes the universe as a whole. Quantum mechanics supplies the universe with "monkeys" in the form of random quantum fluctuations, such as those that seeded the locations of galaxies. The computer into which they type is the universe itself. From a simple initial state, obeying simple physical laws, the universe has systematically processed and amplified the bits of information embodied in those quantum fluctuations. The result of this information processing is the diverse, information-packed universe we see around us: programmed by quanta, physics gave rise first to chemistry and then to life; programmed by mutation and recombination, life gave rise to Shakespeare; programmed by experience and imagination, Shakespeare gave rise to Hamlet.”
- — Seth Lloyd (2006), Programming the Universe (pg. 57+61)
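The "typing monkeys" argument quoted above has a standard quantitative form, the infinite monkey theorem; the following sketch (background mathematics, not from Lloyd's text) shows both why any finite bit string eventually appears in an unbounded sequence of fair coin flips, and why the expected waiting time is astronomical:

```latex
% Probability that a fixed window of n fair coin flips
% reproduces a given n-bit target string:
P(\text{match}) = 2^{-n}

% Partition an infinite flip sequence into disjoint n-flip blocks.
% The probability that none of the first k blocks matches is
P(\text{no match in } k \text{ blocks}) = \left(1 - 2^{-n}\right)^{k}
\;\xrightarrow[k \to \infty]{}\; 0

% Hence the target appears with probability 1, but the expected
% number of blocks until the first match is 2^n, i.e.
\mathbb{E}[\text{flips}] \;\le\; n \cdot 2^{n}
```

For a Hamlet-sized bit string, 2^n dwarfs any physically available number of trials, which is the usual quantitative objection to arguments of this type.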
Lloyd here, in short, is employing a melting pot of ontic opening theories, namely: "chance theory" (Lucretius, 55BC), "quantum flapdoodle" (Gell-Mann, 1994), the Shannon "bandwagon music" (Shannon, 1956), programmed "mutations", randomness, fluctuations, etc., to argue for a grand "god = information" stylized, anti-determinism, free-will argument.
- “The conventional history of the universe pays great attention to energy: How much is there? Where is it? What is it doing? By contrast, in the story of the universe told in this book, the primary actor in the physical history of the universe is information. Ultimately, information and energy play complementary roles in the universe: Energy makes physical systems do things. Information tells them what to do. If we could look at matter at the atomic scale, we would see atoms dancing and jiggling every which way at random. The energy that drives this random atomic dance is called heat, and the information that determines the steps of this dance is called entropy. More simply, entropy is the information required to specify the random motions of atoms and molecules—motions too small for us to see. Entropy is the information contained in a physical system that is invisible to us.”
- — Seth Lloyd (2006), Programming the Universe (pgs. 40-41)
- “Energy and information (visible and invisible) are the two primary actors in the universal drama. The universe we see around us arises from the interplay between these two quantities, interplay governed by the first and second laws of thermodynamics. Energy is conserved. Information never decreases. It takes energy for a physical system to evolve from one state to another. That is, it takes energy to process information. The more energy that can be applied, the faster the physical transformation takes place and the faster the information is processed. The maximum rate at which a physical system can process information is proportional to its energy. The more energy, the faster the bits flip. Earth, air, fire, and water in the end are all made of energy, but the different forms they take are determined by information. To do anything requires energy. To specify what is done requires information. Energy and information are by nature (no pun intended) intertwined.”
- — Seth Lloyd (2006), Programming the Universe (pg. 44)
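The claim in the passages above that "the maximum rate at which a physical system can process information is proportional to its energy" is, in Lloyd's book, grounded in the Margolus-Levitin theorem; as background, the standard statement of that bound is:

```latex
% Margolus–Levitin theorem: a quantum system with mean energy E
% (above its ground state) requires a time of at least
t_{\perp} \;\ge\; \frac{\pi \hbar}{2E}
% to evolve to an orthogonal (i.e., distinguishable) state; hence
% the maximum number of elementary operations per second is
\nu_{\max} \;=\; \frac{2E}{\pi \hbar}
```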
The views presented herein, the gist of which is that the universal drama of events is "determined by information", and that energy and entropy are subordinate players to "information", have led many astray down the various "entropy pied piper" paths of delusional ridiculousness.
Historical | Misrepresentation
In 1949, Warren Weaver initiated the misrepresentation of history as follows:
- “Shannon’s work roots back, as von Neumann has pointed out, to Boltzmann’s observations, in some of his work on statistical physics (1894), that entropy is related to ‘missing information’, inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. Leo Szilard (Zeitschrift fur Physik, Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Mathematical Foundation of Quantum Mechanics, Berlin, 1932, Chap V) treated information in quantum mechanics and particle physics. Shannon’s work connects more directly with certain ideas developed some twenty years ago by Harry Nyquist and Ralph Hartley, both of Bell Laboratories; and Shannon has himself emphasized that communication theory owes a great debt to Norbert Wiener for much of its basic philosophy [cybernetics]. Wiener, on the other hand, points out that Shannon’s early work on switching and mathematical logic antedated his own interest in this field; and generously adds that Shannon certainly deserves credit for independent development of such fundamental aspects of the theory as the introduction of entropic ideas. Shannon has naturally been specially concerned to push the applications to engineering communication, while Wiener has been more concerned with biological applications (central nervous system phenomena, etc.).”
- — Warren Weaver (1949), footnote to Shannon’s The Mathematical Theory of Communication
Correctly, the roots of Claude Shannon's information theory do NOT trace back to Boltzmann's observations about the entropy of gas molecules. This was a Sokal-affair-style inside joke, invented by John von Neumann, to play a crude joke on the ignorance of humanity. Correctly, the fundamental formulas of information theory were devised by Ralph Hartley (1928), who used logarithms, which were invented by John Napier (1614).
Lloyd, in his over-zealous effort to sell his "information theory of everything", continues the Weaver-based historical misrepresentation of facts, as follows:
- “The great nineteenth-century statistical physicists James Maxwell, Ludwig Boltzmann, and Willard Gibbs derived the fundamental formulas of what would go on to be called ‘information theory’.”
- — Seth Lloyd (2006), Programming the Universe (§:Information and Physical Systems, pg. 163)
The work of Maxwell, Boltzmann, and Gibbs has no relation to Hartley-Shannon information theory.
Others, too, have noted that Lloyd writes in a way that misrepresents history:
- “Seth Lloyd writes in a way of telling the story that misrepresents the history.”
- — Terrence Deacon (2011), Incomplete Nature (pgs. 74-75)
Deacon, here, comments that Lloyd's misrepresentative writing style leads one down a slippery slope.
- “The language of thermodynamics [2.0] is ‘energy’ or its derivatives such as ‘entropy’ or ‘information’ which could be more fundamental than energy in time.”
- — Ram Poudel (2019), Thermodynamics 2.0 homepage, draft version, Jul
Thims told Poudel that he would not be attending the conference (see: discussion) if this quote remained on the homepage. Poudel explained that he was trying to lure Seth Lloyd to the conference with this quote, the reason being that he had, supposedly, seen a good video by Lloyd.
Quotes | Employed
The following are quotes employed by Lloyd:
- “Information is physical.”
- — Rolf Landauer (1991), “Information is Physical”; cited by Seth Lloyd (2006) in Programming the Universe (pg. 213)
Quotes | On
The following are quotes on Lloyd:
- “While I leave it to others to argue about the similarity (or not) between these two entropies, my intent in presenting it here is simply to point out that, for many, Shannon started a paradigm shift from thinking about energy to thinking about information, even going so far as to suggest that information is more fundamental than energy.”
- — Robert Hanlon (2020), Brick by Brick (§43: “Shannon: Entropy and Information Theory”, per Lloyd citation)
Quotes | By
The following are quotes by Lloyd:
- “Free will is safe. Even if the universe is completely deterministic, then we (and computers, and god knows who else) possess free will. At first, the deterministic nature of the laws of physics would seem to forbid free will: No choice is available. In fact, however, the computational nature of the universe actually guarantees free will.”
- — Seth Lloyd (2010), "An Interview with Seth Lloyd" 
- “There is a general tendency in the universe, in human societies, and actually everywhere, for entropy AND information to increase.”
- — Seth Lloyd (2018), “The Black Hole of Finance” 
- Pagels, Heinz and Lloyd, Seth. (1988). “Complexity as Thermodynamic Depth” (abs) (pg. 185), Annals of Physics, 188(1):186-213.
- Popova, Maria. (2016). “It from Bit”, Brain Pickings, Sep 2.
- Stenger, Victor. (2012). God and the Folly of Faith (pg. #). Publisher.
- Lloyd, Seth. (2011). “On Quantum Life” (YT), Perimeter Institute. Feb 3.
- Shannon bandwagon – Hmolpedia 2020.
- Objectionable nonsense – Hmolpedia 2020.
- Typing monkeys – Hmolpedia 2020.
- Quantum flapdoodle – Hmolpedia 2020.
- Lloyd, Seth. (2006). Programming the Universe: a Quantum Computer Scientist Takes on the Cosmos (to specify, pg. 144). Knopf Doubleday.
- Entropy pied piper – Hmolpedia 2020.
- Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (annotated review: pdf, by Robert Doyle, 2020), Journal of Human Thermodynamics (Ѻ), 8(1): 1-120, Dec 19.
- Deacon, Terrence W. (2011). Incomplete Nature: How Mind Emerged from Matter (Lloyd, pgs. 74-75). W.W. Norton & Co.
- JDNM peer review (2013) – Journal of Human Thermodynamics.
- (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
(b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
- Ross, Greg. (2009). "An Interview with Seth Lloyd" (WB), American Scientist, Dec 15.
- Lloyd, Seth. (2018). “The Black Hole of Finance: Creation and Destruction in Life, the Economy, and the Universe” (quote, 8:17-8:26), YouTube, Oxford Martin School, Feb 27.
- Seth Lloyd – Hmolpedia 2020.