From Hmolpedia
An annotated depiction of Ralph Hartley's four types of electrical transmissions, from his "Transmission of Information" (1927): low current (or voltage) levels shown as "0s", high current (or voltage) levels shown as "1s", and unreadable signals in between shown as "uncertainty" in the data, or signal, of the message, which Hartley quantified mathematically using logarithms. These 0s and 1s, aka "binary digits", were eventually renamed "bits", a term coined by John Tukey (c.1947).[1]

In terms, bit (TR:29) (LH:3) (TL:32), a portmanteau of “binary digit” (Tukey, 1948), is a 0 or 1, defined by a high or low voltage, current, or electromagnetic signal, employed as one of the two letters or numbers of a coding language, defined by Boolean algebra, and used in telecommunication, computer science, and information theory to store, transmit, and process information.


The Morse telegraph system, devised by Samuel Morse (1836), wherein letters and numbers were "coded", or defined, by a unique series of dots (short hold) and dashes (long hold), e.g. the letter "A" is one dot (·) followed by one dash (–), "B" is one dash (–) followed by three dots (· · ·), and so on. These messages could be sent along any wire that would serve as a conductor. In 1926, Claude Shannon, as a youth in Michigan, e.g., sent Morse code along a barbed wire fence to a friend several miles away, so as to send secret messages.

In 1836, Samuel Morse invented a communication device called the "telegraph system", shown below right, which could be used to send coded pulses — dots (short hold) or dashes (long hold) — of electric current along a wire, which controlled an electromagnet located at the receiving end of the telegraph system:

· (dot) = short hold (current) = 0 | zero binary digit
– (dash) = long hold (current) = 1 | one binary digit
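This dot/dash coding scheme can be sketched in code, a minimal sketch using the dot = 0, dash = 1 mapping described above; the function name and the partial code table are illustrative:

```python
# A minimal sketch of Morse coding, using the article's dot = 0, dash = 1 mapping.
MORSE = {"A": ".-", "B": "-...", "S": "...", "O": "---"}  # partial code table

def encode(text):
    """Translate text to Morse dots/dashes, then to the 0/1 binary digits above."""
    morse = " ".join(MORSE[ch] for ch in text.upper())
    bits = morse.replace(".", "0").replace("-", "1").replace(" ", " | ")
    return morse, bits

morse, bits = encode("SOS")
print(morse)  # ... --- ...
print(bits)   # 000 | 111 | 000
```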

In 1927, Ralph Hartley, in his "Transmission of Information" article, presented at the International Congress of Telegraphy and Telephony, explained how to quantify the capacity of a system to transmit information, mathematically, using logarithms; the four types of information pulses are those shown in the adjacent figure.
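Hartley's logarithmic measure, in modern notation H = n log s, for a message of n symbols drawn from an alphabet of s distinguishable symbols, can be evaluated as follows; this is a minimal sketch, with the function name and example numbers chosen for illustration:

```python
import math

def hartley_information(n_symbols, alphabet_size, base=2):
    """Hartley's measure H = n * log(s): the information capacity of a
    message of n symbols drawn from an alphabet of s distinguishable symbols."""
    return n_symbols * math.log(alphabet_size, base)

# A message of 8 binary pulses (s = 2) carries 8 units (bits):
print(hartley_information(8, 2))   # 8.0
# The same 8 pulses at 4 distinguishable current levels carry twice as much:
print(hartley_information(8, 4))   # 16.0
```

Doubling the number of distinguishable levels adds one bit per pulse, which is the logarithmic behavior Hartley's formula captures.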

In 1940, John Neumann joked to Claude Shannon that he should call his new Hartley-based information formula, for quantifying the transmission of 0s and 1s in telecommunication lines, by the name “entropy”, because similar logarithmic formulas had been used in thermodynamics for heat models of gas systems (Boltzmann, 1872; Planck, 1901) and for demons using their intelligence to gain information about the trajectories of gas particles in a two-compartment system (Szilard, 1922).

In 1945, Shannon, in his “Mathematical Theory of Cryptography”, taking Neumann’s joking advice to heart, began to call telecommunication “information” by the name “entropy”.

In c.1947, John Tukey, in discussion with Shannon, suggested that the term "bit", short for binary digit, be used as shorthand to mean a "0" or a "1", in coding or in communication theory.

The Hartley model, via Claude Shannon's "Mathematical Theory of Communication" (1948), became "information theory", and Hartley's bit-based information, mixed with uncertainty, became confusingly dubbed "information entropy", themed on a confused mixture of Boltzmann entropy and Heisenberg uncertainty, a result of the "John Neumann joke" (Neumann, 1940).

Theory | Abuse

In 1956, Shannon, in his "The Bandwagon", owing to the misuse and abuse of his theory, outside of telecommunications proper, had to publicly recant his use of the term "entropy", stating openly that his information theories are NOT applicable in psychology, economics, or social sciences.[2]

Unit | Confusion

In 1961, Rolf Landauer, in his “Irreversibility and Heat Generation in the Computing Process” article, proposed that there is an unavoidable cost in entropy whenever one erases information, namely that the entropy of the surroundings must increase by at least k ln 2 per bit of information erased, where k is the Boltzmann constant, equivalently a heat dissipation of at least kT ln 2 per bit at temperature T.[3]
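Landauer's bound can be evaluated numerically; the following is a minimal sketch, with the function name chosen for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact 2019 SI value)

def landauer_heat(bits_erased, temperature_kelvin):
    """Minimum heat (J) dissipated erasing bits at temperature T,
    per Landauer's bound Q >= k * T * ln(2) per bit."""
    return bits_erased * k_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K):
print(landauer_heat(1, 300))  # ≈ 2.87e-21 J
```

The minuscule magnitude, on the order of 10⁻²¹ joules per bit at room temperature, is why the bound is far below the dissipation of any practical computing device.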

In the 1970s, confusion over units began to arise:

“A situation characterized by only two possible choices has ‘unit’ information, or one bit. More generally, the unit of information is determined by the arbitrary scale factor K in Shannon’s formula. If one looks at the entropy equation from quantum mechanics, one finds S, the thermodynamic entropy, is in units of joules per degree.”
— Author (1975), “Article”[4]

The confusion has only increased; particularly following the "it from bit" theory of John Wheeler (1989).

Presently, there are many who believe, naively, that the universe, including all of chemistry, physics, and thermodynamics, can be boiled down to units of "bits", rather than joules (J) or joules per kelvin (J/K).
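The unit mismatch at issue can be made explicit: if one nonetheless insists on converting, the standard bridge is an entropy of k ln 2 (J/K) per bit. The following is a minimal sketch; the function name is illustrative:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_joules_per_kelvin(bits):
    """Convert a count of Shannon bits to thermodynamic entropy units (J/K),
    via S = k * ln(2) * (number of bits)."""
    return bits * k_B * math.log(2)

print(bits_to_joules_per_kelvin(1))  # ≈ 9.57e-24 J/K per bit
```

That a single bit maps to roughly 10⁻²³ J/K, some twenty-three orders of magnitude below everyday thermodynamic entropies, underlines how different the two quantities are in practice.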


In 2012, Libb Thims, in his “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”, suggested the terms "infotropy" or "bitropy" be used instead of "Shannon entropy" or "entropy" in ALL communication science publications and works.[1]

A diagram of semi-famous and or infamous statements, such as: "life feeds on negative entropy" (Schrodinger, 1943), "it from bit" (Wheeler, 1989), and "information is physical" (Landauer, 1991). Each phrase is made of "words", each word made of "characters", those shown being letters of the English alphabet; each character is defined, in computer science, by one "byte" of data, each byte comprised of eight "bits", one bit being either the number zero (0) or the number one (1), representative either of a high or low voltage, e.g. in the memory of a silicon chip, or of a signal or no signal in a transmission. A person is shown lost in the center, looking confused, as a result of the 1940 joke suggestion by John Neumann, made to Claude Shannon, to call "information", later termed bits (Tukey, 1947), by the name "entropy" of thermodynamics (Ben-Naim, 2015).[5]
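The character-to-byte mapping described in the caption can be sketched as follows, a minimal sketch using ASCII character codes; the function name is illustrative:

```python
def char_bits(phrase):
    """Show each character of a phrase as one byte (eight bits),
    using its ASCII code, as described in the caption."""
    return [(ch, format(ord(ch), "08b")) for ch in phrase]

for ch, bits in char_bits("bit"):
    print(ch, bits)
# b 01100010
# i 01101001
# t 01110100
```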

Bit god

See main: Bit god

In the wake of the coining of the term "bit" (Tukey, 1947), information theory (Shannon, 1949), the discovery of DNA (Crick, 1953) as the "information carrier" of the genetic code, the "Shannon bandwagon" (Shannon, 1956), and the growth of computer science, the premise of a god being behind all of this, connecting the dots, so to say, in the sense of the "logos" of theology, began to become an easy sell to many minds.

In 1995, Keith Devlin, in his Logic and Information, a book aiming to obtain a deeper understanding of the nature of intelligence and knowledge acquisition via the concept of logic, presented a cover, as shown above, with a god, coming out of the sun, who makes or creates the “bits” of the universe, which thus create the human mind, or something along these lines.

Hence, in the late 20th century, the bulk of information-based arguments tended to be closet god type arguments, coated with bits.


In 2007, Arieh Ben-Naim, an Israeli physical chemist, with some sort of closet god issues, in his Entropy Demystified (2007), Farewell to Entropy (2008), and others to follow, argued that the "entropy" of thermodynamics needs to be replaced by terms such as "information", "missing information", or "uncertainty", of the Claude Shannon variety, and that the joule should be replaced by some sort of bit-based unit system.

In 2015, Ben-Naim, in his Information, Entropy, Life and the Universe: What We Know and What We Do Not Know, presented a picture of himself, shown adjacent, in his early days, surrounded by all the various confused models of entropy.


Quotes

The following are related quotes:

“The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called ‘binary digits’, or more briefly ‘bits’, a word suggested by John Tukey.”
— Claude Shannon (1948), “A Mathematical Theory of Communication” (pg. #)
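Shannon's point, that the logarithm base fixes the unit, can be shown directly; a minimal sketch, with the function name chosen for illustration (base e gives "nats", base 10 gives "hartleys"):

```python
import math

def information(choices, base=2):
    """Information in a choice among `choices` equally likely options:
    log_base(choices). Base 2 gives bits; base e, nats; base 10, hartleys."""
    return math.log(choices, base)

print(information(8, 2))        # ≈ 3.0 bits
print(information(8, math.e))   # ≈ 2.079 nats
print(information(8, 10))       # ≈ 0.903 hartleys
```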
“Modern information theory allows us to verify the old maxim that "a picture is worth a thousand words". A thousand words of connected English, at 6 characters (5 letters and space) per word, and 1 bit per character (approximately the best current overall estimate, allowing for redundancy) [15,35], amounts to 6,000 bits of information, equivalent to a choice among 2^6000 equally probable patterns.”
— John Tukey (1961), “The Science of Statistical and Quantitative Methodology” (pg. 172)[6]
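Tukey's arithmetic can be checked directly; a minimal sketch of the quoted estimate:

```python
# Tukey's estimate: 1000 words * 6 characters/word * 1 bit/character.
words = 1000
chars_per_word = 6      # 5 letters plus a space
bits_per_char = 1       # Tukey's redundancy-adjusted estimate
total_bits = words * chars_per_word * bits_per_char
print(total_bits)       # 6000

# 6000 bits select one pattern out of 2**6000 equally probable ones,
# a number with 1807 decimal digits:
patterns = 2 ** total_bits
print(len(str(patterns)))  # 1807
```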
“The term ‘bit’ was coined in 1949 by computer scientist John Tukey during a lunchtime discussion of ‘binary digits’. Byte, on the other hand, entered the lexicon at IBM in 1956 and came to equal 8 bits, the unit of memory or data necessary to store [or encode] a single character of text in a computer.”
— Constance Hale (1996), Wired Style: Principles of English Usage in the Digital Age (pg. 41) [7]
“In 1947, the term ‘bit’, a single binary digit, was coined by John Tukey. Later, the term ‘byte’, or eight bits, was used in the first computer program for the first fully electronic stored-program computer.”
— Norris McWhirter (1999), McWhirter’s Book of Millennium Records (pg. 217)[8]

End matter

See also


References

  1. 1.0 1.1 Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (annotated review: pdf, by Robert Doyle, 2020) (infotropy, pgs. 94-95; bitropy, pgs. 95-96), Journal of Human Thermodynamics (Ѻ), 8(1): 1-120, Dec 19.
  2. (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
    (b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
  3. Landauer, Rolf. (1961). “Irreversibility and Heat Generation in the Computing Process” (abs) (pdf), IBM Journal of Research and Development, 5(3), Jul.
  4. Author. (1975). “Article” (pg. 12), WRRC Bulletin, 78-82:12.
  5. Ben-Naim, Arieh. (2015). Information, Entropy, Life and the Universe: What We Know and What We Do Not Know (Amz). World Scientific.
  6. Tukey, John. (1961). “The Science of Statistical and Quantitative methodology” (pg. #); in: The Collected Works of John W. Tukey: Philosophy and Principles of Data Analysis 1949-1964, Volume 3 (editor: L.V. Jones) (§7:143-86, pg. 172). Publisher.
  7. Hale, Constance. (1996). Wired Style: Principles of English Usage in the Digital Age (pg. 41). Publisher.
  8. McWhirter, Norris. (1999). McWhirter’s Book of Millennium Records (pg. 217). Publisher.


Further reading

  • Poudel, Ram; Thims, Libb; Haddad, Wassim; Kondepudi, Dilip; Matsoukas, Themis; Deacon, Terrence; and Nahum, Gerard. (2020). “Boltzmann entropy (J/K) vs Shannon entropy (bits)” (YT), Thermodynamics 2.0 Conference, Group Discussion, Jun 22.

External links

  • Bit – Hmolpedia 2020.