From Hmolpedia
A visual summary of how the "bit", the mathematical unit of "information", from Claude Shannon's 1948 "information theory" has become, in the confused minds of many, an "open sesame" (Dolloff, 1975) to unlock the secrets of the universe, and the thermodynamics of humans.[1]

In terms, information (TR:367) (LH:13) (TL:680|#88), from the Latin in- “into” + formare “to form, shape” (14th century), refers to the communication of knowledge, on some subject, topic, news, or thing, designed to shape, give form to, delineate, or instruct.[2]


In c.1550, information was being used to mean “knowledge communicated concerning a particular topic”.[3]

Information science

In 1928, Ralph Hartley, in his “Transmission of Information” article, explained how the "logarithm", in the form H = n log s (where n is the number of symbols transmitted and s is the size of the symbol alphabet), i.e. the logarithm of the "number of possible symbol sequences", is the best "practical measure of information", specifically in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission.
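Hartley's measure can be sketched numerically as follows; this is a minimal illustration (the function name is illustrative, not from Hartley's article), using base-2 logarithms so the measure comes out in bits:

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's measure H = n * log(s): the logarithm of the number
    of possible sequences (s ** n) of n symbols drawn from an
    alphabet of size s. Base 2 gives the result in bits."""
    return n_symbols * math.log2(alphabet_size)

# A telegraph message of 8 binary symbols (1s/HIs and 0s/LOs)
# can take 2**8 = 256 distinct forms, i.e. 8 bits:
print(hartley_information(8, 2))  # → 8.0
```

Note that the measure depends only on how many sequences are possible, not on which sequence is actually sent; this is the feature Shannon later generalized.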

In 1937, the term “information” was used in reference to television broadcast signals.[3]

In 1944, information was used in reference to punch-card operating systems.

Shannon | Information

In 1948, Claude Shannon, in his “A Mathematical Theory of Communication”, building on Hartley’s logarithmic model of “information” as the number of possible symbol sequences of 1s and 0s in a telegraph message, introduced “information theory”, amid which, per John von Neumann’s suggestion (c.1940), he named his new H formula, quantifying “information” (knowledge transmitted or stored), “choice” (telegraph-operator type), or “uncertainty” (Heisenberg type), by the name “entropy”, a term already employed in thermodynamics, presuming, incorrectly, that it was thematically equivalent to Ludwig Boltzmann’s H-function of statistical mechanics. This terminology mistake opened a Pandora’s box of confusion and misunderstanding.
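Shannon's H formula is H = −Σ pᵢ log₂ pᵢ, where pᵢ is the probability of the i-th symbol. A minimal sketch, using relative symbol frequencies as the probabilities (the function name is illustrative, not from Shannon's paper):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon's H = -sum(p_i * log2(p_i)), where p_i is the relative
    frequency of each symbol in the message; result in bits/symbol."""
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in Counter(message).values())

# Equiprobable 1s and 0s (four of each) give the maximum, 1 bit/symbol:
print(shannon_entropy("01100101"))  # → 1.0
```

When the symbols are equiprobable, Shannon's H reduces to Hartley's log of the number of possible sequences per symbol; for skewed frequencies it is strictly smaller.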


A picture of the Shannon bandwagon, which started tooting its binary-digit (0s and 1s) music in the 1950s; by 1956 the noise had grown so loud that Shannon had to publish his famous "The Bandwagon" article, to recant the mess he had made.[4]

In 1956, Shannon, in his article “The Bandwagon”, recanted, stating specifically that his theory of information is not applicable outside of communication engineering proper:

“Workers in other fields should realize that the basic results of this [information theory] subject are aimed at a very specific direction, one that is not necessarily relevant to fields such as psychology, economics, and other social sciences. It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few exciting words like information, entropy, redundancy, do not solve all our problems.”
— Claude Shannon (1956), “The Bandwagon”[4]

By the time of Shannon's 1956 article, an information "bandwagon" had gained traction, wherein people were attempting to argue for "information theories" of everything.[5]

In 1999, Shu-Kun Lin, an organic and inorganic physical chemist, founded the Entropy journal as a repercussion of his openly-stated confusion between the “entropy” of Shannon, on one hand, the “entropy” of Clausius, Gibbs, Boltzmann, and Planck, on another, and the “entropy” of Prigogine, on a third.[6]

This grand confusion continues, in growing amounts of ignorance, to this very day (e.g. see: Talk:IAISAE).[7]

Genetic information

In 1953, information, supposedly, was first used in reference to DNA, presumably in the context of “genetic information”, in the sense of traits that are passed from parent to offspring in genes.[3]


Quotes

The following are related quotes:

“It is misleading in a crucial way to view ‘information’ as something that can be poured into an empty vessel, like a fluid or even energy.”
— Anatol Rapoport (1956), “The Promise and Pitfalls of Information Theory”[8]
“Information is physical.”
— Rolf Landauer (1991), “Information is Physical”[9]; cited by Seth Lloyd (2006) in Programming the Universe (pg. 213)
“The whole universe may be a single hologram: The information about all of it is encapsulated in every part of it.”
— John Wiley (1993), Natural High (pg. #)[10]

End matter

See also

References

  1. Dolloff, Norman H. (1975). Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi). Exposition Press.
  2. (a) Information – Online Etymology Dictionary.
    (b) Inform – Online Etymology Dictionary.
  3. Information – Online Etymology Dictionary.
  4. (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions on Information Theory, 2(1):3, March.
    (b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
    (c) Shannon bandwagon – Hmolpedia 2020.
  5. Shannon bandwagon – Hmolpedia 2020.
  6. Shu-Kun Lin – Hmolpedia 2020.
  7. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
  8. Rapoport, Anatol. (1956). “The Promise and Pitfalls of Information Theory” (abs), Behavioral Science, 1:303-09; in: Modern Systems Research for the Behavioral Scientist (editor: Walter Buckley) (pgs. 137-42). Aldine, 1968.
  9. (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
    (b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
  10. Wiley, John. (1993). Natural High (pg. #) (abs). UPNE.
