Informationist

From Hmolpedia
A depiction of information theory (Shannon, 1948)[1] as the biggest currently-growing "weed" in the "garden of thermodynamics" (Hiebert, 1966)[2]; according to which "informationists" are "weed cultivators"[3], e.g. arguing, in scientific publication, that the mechanics of the sun, heat, and plant growth can be reduced to bits, Shannon information theory, and Boolean algebra; which amounts to cultural "information obesity".[4]

In terminology, informationist is one who believes that "information" is more fundamental, in the dynamics and operation of the universe, than matter, energy, or entropy; that entropy is equivalent to information, both being measured in bits; or who is a general advocate of "information theory"[1], typically of the Neumann-Shannon variety (1941).[5]

Overview

An informationist can be contrasted with a physicist, chemist, or thermodynamicist, in the sense that the informationist believes that physics, chemistry, and thermodynamics "reduce" to "information". Informationists also tend to be anti-determinists, free-will believers, and creationists or theists. An informationist, in short, more often than not, tends to be someone who uses information theory as an ontic opening, i.e. as a Sokal affair style cover for closet creationism, knowingly or not.

Szilard | 1922

In 1922, Leo Szilard, in his "On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings", building connectively on the 1905 thermodynamics of Brownian motion work of Albert Einstein, derived the result that the following equation gives the "entropy dissipation", symbol S, that occurs in the memory processing act of an "exactly informed" Maxwell's demon:[6]

$S = k \ln 2$

where k is the Boltzmann constant.
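
For scale, a minimal numeric sketch (in Python; the function name and the room-temperature figure are ours, not Szilard's): the entropy k ln 2 corresponds, at absolute temperature T, to an energy dissipation of kT ln 2 per one-bit act:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def szilard_dissipation(T: float) -> float:
    """Energy k*T*ln(2), in joules, corresponding to the entropy
    dissipation k*ln(2) per one-bit memory act at temperature T."""
    return k_B * T * math.log(2)

print(f"S = k ln 2       = {k_B * math.log(2):.3e} J/K")  # ~ 9.57e-24 J/K
print(f"k T ln 2 (300 K) = {szilard_dissipation(300):.3e} J")  # ~ 2.87e-21 J
```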

Hartley | 1928

In 1928, Ralph Hartley, in his "Transmission of Information", explained that the logarithm:[7]

$H = n \log s$

where s is the number of "symbol" options, e.g. high, no-signal, low (voltage or current), in a telegraph wire, n is the "number" of times a telegraph operator makes a selection, and H is the logarithm of the number of possible symbol sequences, is the best "practical measure of information" sent in a transmission.
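
As a hedged worked example (ours, not Hartley's): with s = 3 symbol options and n = 4 selections, there are 3^4 = 81 possible sequences, so H = log 81 = 4 log 3:

```python
import math

def hartley_information(s: int, n: int, base: float = 2.0) -> float:
    """Hartley's H = n * log(s): the logarithm of the number s**n of
    possible symbol sequences; base 2 gives the answer in bits."""
    return n * math.log(s, base)

# 3 options (high, no-signal, low), 4 selections:
# H = log2(3**4) = 4 * log2(3) ~ 6.34 bits
print(hartley_information(s=3, n=4))
```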

Neumann | 1930s

In the 1930s, John Neumann, an associate of Szilard, initiated the "informationist movement", via his spurious suggestions that (a) telegraph operators are like Maxwell demons, (b) the logic gates of computers can be thought of as "degrees of freedom", in the statistical thermodynamics sense, and (c) each elementary act or step in the central processor of a computer has a Boltzmann-like gas theory entropy change associated with it.

Shannon | 1940 / 1948

In 1940 to 1941, when Neumann and Claude Shannon were both at the Princeton Institute for Advanced Study, Shannon approached Neumann about what "name" he should assign to his new telegraph transmission of bits formula, vacillating between the terms "information" (of the Hartley type) and "uncertainty" (of the Heisenberg type), at which point Neumann advised Shannon to call it "entropy" (of the Szilard type), per the reasons that (a) the logarithm formulas are similar and (b) no one really knows what entropy is, meaning that should Shannon ever get into a debate, he could always bluff his way out of the argument, and win the debate; this so-called Shannon-Neumann anecdote is illustrated below:[5]

An artistic rendition of the 1941 Neumann-Shannon "what should I call my new formula" anecdote.
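
Regarding point (a) of the anecdote, the formal similarity can be sketched side by side (a hedged illustration, assuming the standard textbook forms of both formulas, not a derivation from this article): for W equiprobable outcomes, Shannon's H reduces to log2 W bits, while Boltzmann's S = k ln W, the two differing only by the constant factor k ln 2:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(W: int) -> float:
    """Boltzmann-Planck S = k * ln(W), in J/K, for W equiprobable microstates."""
    return k_B * math.log(W)

# Four equiprobable outcomes: H = log2(4) = 2 bits; S = k ln 4.
print(shannon_entropy([0.25] * 4))                  # 2.0 bits
print(boltzmann_entropy(4))                         # ~ 1.91e-23 J/K
print(boltzmann_entropy(4) / (k_B * math.log(2)))   # 2.0, i.e. S re-expressed in "bits"
```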

In 1948, Shannon, in his "A Mathematical Theory of Communication", took Neumann's advice to heart, and attempted to stylize, in namesake, his new communication formula on that of Ludwig Boltzmann's H-theorem. This was compounded when, in 1949, during the book-stage publication of the former article, Warren Weaver added the following "misinformation", in a footnote, about "missing information":[8]

“Shannon’s work roots back, as von Neumann has pointed out, to Boltzmann’s observations, in some of his work on statistical physics (1894), that entropy is related to ‘missing information’, inasmuch as it is related to the number of alternatives which remain possible to a physical system after all the macroscopically observable information concerning it has been recorded. Leo Szilard (Zeitschrift fur Physik, Vol. 53, 1925) extended this idea to a general discussion of information in physics, and von Neumann (Mathematical Foundation of Quantum Mechanics, Berlin, 1932, Chap V) treated information in quantum mechanics and particle physics.”
— Warren Weaver (1949), footnote to the book-version of Shannon’s The Mathematical Theory of Communication

In the years to follow, as illustrated below, Shannon's new "entropy"-named model of "information" became a conceptual "open sesame"[9] to a large segment of the intellectual population, most ignorant of thermodynamics, who gleaned the idea that, via one simple equation, namely the Hartley logarithm (1928)[7], $H = n \log s$, i.e. the "logarithm of the number of possible symbol sequences" (which Hartley defined as a "practical measure of information" sent in a transmission), the door to every field of knowledge could be opened:

[Image: Shannon (open sesame).jpg]

In 1955, things had ballooned so far out of control that Louis de Rosa, chairman of the newly-formed Professional Group on Information Theory, had to publish a memorandum, entitled "In Which Fields Do We Graze?", stating that, owing to Shannon's ill-advised terminology usage, the new information theory of their field, the new science of "communication by radio or wire", was being applied in fields such as: management, biology, psychology, and linguistic theory.

Bandwagon | 1956

A parody of the 1956-present Shannon bandwagon, from Libb Thims' “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (2012).[6]

In 1956, Shannon, owing to the fact that his publication had grown out of control, producing baseless applications, had to issue his recanting memo "The Bandwagon", illustrated adjacent, advising that his theory is not applicable outside of "communication by radio or wire" proper.[10] This memo, however, hardly curtailed things, and by 1964 over 32,000 copies of his book had been sold.

Wheeler | 1989

In 1989, John Wheeler, fueled by an NSF grant, gave four lectures worldwide attempting to argue the following view:

“Every ‘it’ – every particle, every field of force, even the spacetime continuum itself – derives its function, its meaning, its very existence entirely – even if in some contexts indirectly – from the apparatus-elicited answers to ‘yes-or-no’ questions, binary choices. Bits.”
— John Wheeler (1989), “Information, Physics, Quantum: the Search for Links – Can We Ever Expect to Understand Existence?”[11]

Wheeler, here, we note, only cites "information theory" indirectly, via citation to a few tertiary references, and does not cite Shannon directly (although by this time Shannon's influence had grown into many of the 179 publications Wheeler cites). Wheeler's interest here was to argue that black holes flout the second law of thermodynamics, and that a black hole can be quantified by a certain number of "bits" via a logarithmic formula. In this article, Wheeler cites Chauncey Wright, Josiah Royce, and Charles Peirce as his intellectual confidants, along with the following quote by Parmenides:

“What is, is identical with the ‘thought’ that recognizes it.”
— Parmenides (c.460BC), Fragment #

Wheeler's model, here, although very subtle, is accordingly an attempt at a Sokal affair[12] style, "ontic opening" based remake of "being", in a non-deterministic, seemingly closeted-theology sense of the matter.
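
As a hedged sketch of the kind of logarithmic "bit count" alluded to above (using the standard Bekenstein-Hawking entropy formula as our assumption of the intended arithmetic, not a formula quoted in Wheeler's article):

```python
import math

# Physical constants (SI)
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c    = 2.99792458e8     # speed of light, m/s

def black_hole_bits(M: float) -> float:
    """Bekenstein-Hawking entropy of a Schwarzschild black hole of mass
    M (kg), expressed as bits: S/(k ln 2) = 4*pi*G*M^2/(hbar*c*ln 2)."""
    return 4.0 * math.pi * G * M**2 / (hbar * c * math.log(2))

# A solar-mass (1.989e30 kg) black hole: ~ 1.5e77 bits
print(f"{black_hole_bits(1.989e30):.2e}")
```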

Other

A few of the instrument players that got caught up in "Shannon's bandwagon" and/or the "Wheeler wormhole" include: Myron Tribus (1948), who, during his PhD examination at UCLA, was queried about the difference between Shannon entropy and Clausius entropy; Edwin Jaynes (1957); Seth Lloyd (1988); Rolf Landauer (1991); and Arieh Ben-Naim (2007), to name a few.[6]

Quotes

Quotes | Pro

The following are quotes on the pro-side of the conjecture:

“Information is physical.”
— Rolf Landauer (1991), “Information is Physical”[13]; cited by Seth Lloyd (2006) in Programming the Universe (pg. 213)
“Information occupies the ‘ontological basement’ of mother nature.”
— Paul Davies (2010), Information and the Nature of Reality (pg. 82) [14]
“Historically, matter has been at the bottom of the explanatory chain, and information has been a sort of secondary derivative of it. There's increasing interest among at least a small group of physicists to turn this upside down and say, maybe at rock bottom, the universe is about information and information processing, and it's matter that emerges as a secondary concept.”
— Paul Davies (c.2015), Publication [15]
“The language of thermodynamics [2.0] is ‘energy’ or its derivatives, such as ‘entropy’ or ‘information’, which could be more fundamental than energy in time.”
— Ram Poudel (2019), Thermodynamics 2.0 homepage, draft version, Jul

Quotes | Agnostic

The following are quotes from the fence-sitter position on the conjecture:

“I don't see justification for the claim [that information is more fundamental]; although maybe I could be convinced in the future. Unless those bits are doing something different from the laws of physics, I don't really see that there's a question here. If two things are equivalent, I don't think there's any valid way to talk about which is more fundamental, and I see the two as equivalent.”
— Alan Guth (c.2015), Publication [15]
“I leave it to others to argue about the similarity (or not) between these two entropies (Boltzmann and Shannon), my intent in presenting it here is simply to point out that, for many, Shannon started a ‘paradigm shift’ from thinking about energy to thinking about information, even going so far as to suggest that information is more fundamental than energy.”
— Robert Hanlon (2020), Block by Block (§43: Shannon: Entropy and Information Theory) [16]

Quotes | Con

The following are quotes on the con-side of the conjecture:

“It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few ‘exciting’ words like information, entropy, redundancy, do not solve all our problems.”
— Claude Shannon (1956), “The Bandwagon”, Mar [17][10]
“Your proposal about Shannon information S — is nothing but another word that begins with S and refers to human excrement. You should be abjectly ashamed of yourself. Not only are your methods and your behavior nauseating but your ability in uniting information theory and statistical mechanics is erased by your ignorance in dismissing the needs of ordinary students, specifically beginning chemistry students to whom entropy and thermodynamics are enormous conceptual obstacles. Abandonment of the terribly flawed manuscript that you sent to and collaboration with him on melding your expertise in complex probability with our proved success in conceptual perceptions could result in a synthesis that would change education in entropy for this and succeeding generations. If this does not occur, I will literally fight to my death to show the world the fallacy of believing that your views are anything but those of a deceptive, half-truth promoting, dishonest person as you have shown yourself to me.”
— Frank Lambert (2009), review of Arieh Ben-Naim’s Entropy Demystified
“No doubt, Shannon and Neumann thought that this was a funny joke, but it is not! It merely exposes Shannon and Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost.”
— Ingo Muller (2007), A History of Thermodynamics (pg. 124) [18]
“The equations used in [Shannon's] communication theory have absolutely nothing to do with the equations used in thermodynamics.”
— Libb Thims (2012), “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pg. 1)[6]; cited by Robert Doyle (2020)[19]
“I disagree that information is fundamental. Even if all of reality emerges from ‘information’, this information is just knowledge about the universe’s basic quantum state.”
— Sean Carroll (c.2017), Publication [20]

End matter

References

  1. Information theory – Hmolpedia 2020.
  2. Garden of thermodynamics – Hmolpedia 2020.
  3. Weed theory – Hmolpedia 2020.
  4. Information obesity – Hmolpedia 2020.
  5. Shannon-Neumann anecdote – Hmolpedia 2020.
  6. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (annotated review: pdf, by Robert Doyle, 2020), Journal of Human Thermodynamics (Ѻ), 8(1): 1-120, Dec 19.
  7. Transmission of Information – Hmolpedia 2020.
  8. There is NO connection, historically, whatsoever, between Boltzmann's work on gas theory and Shannon's work on telegraphy and communication theory.
  9. Open sesame – Hmolpedia 2020.
  10. Shannon bandwagon – Hmolpedia 2020.
  11. Wheeler, John. (1989). “Information, Physics, Quantum: the Search for Links – Can We Ever Expect to Understand Existence?” (pdf); paper evolved from four presentations: Santa Fe Institute Conferences, 29 May to 2 Jun and 4-8 Jun 1989; Proceedings of the 3rd International Symposium on the Foundations of Quantum Mechanics in the Light of New Technology (pgs. 354-68), Tokyo; and Penrose Lecture, 20-22 Apr 1989 at annual meeting at Benjamin Franklin’s American Philosophical Society, held at Philadelphia for Promoting Useful Knowledge; and the Accademia Nazionale dei Lincei Conference on La Verita nella Scienza, Rome 12 Oct 1989; Publication assisted in part by NSF Grant PHY 245-6243 to Princeton University.
  12. Sokal affair – Hmolpedia 2020.
  13. (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
    (b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
  14. Davies, Paul. (2010). Information and the Nature of Reality (co-editor: Niels Gregersen) (pg. 82). Cambridge.
  15. Kuhn, Robert. (2015). “Forget Space-Time: Information May Create the Cosmos”, Space.com, May 3.
  16. Hanlon, Robert. (2020). Block by Block: the Historical and Theoretical Foundations of Thermodynamics (Illustrators: Robert Hanlon and Carly Sanker) (Bib) (§43:Shannon: Entropy and Information Theory). Oxford University Press.
  17. (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
    (b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
  18. Muller, Ingo. (2007). A History of Thermodynamics (§: Other Extrapolations. Information, pgs. 123-25). Springer.
  19. Libb Thims – InformationPhilosopher.com.
  20. Ananthaswamy, Anil. (2017). “Inside Knowledge: is Information the Only Thing that Exists? Physics suggests information is more fundamental than matter, energy, space and time – the problems start when we try to work out what that means”, New Scientist, Mar 29.

Videos

  • Campbell, MacGregor. (2017). “What is Information?” (YT), Explanimator, May 11.
