In terms, bit (TR:29) (LH:3) (TL:32), a portmanteau of “binary digit” (Tukey, 1948), refers to a 0 or a 1, realized physically as a high or low voltage, current, or electromagnetic signal, employed as the two letters or numerals of a coding language, defined by Boolean algebra, and used in telecommunication, computer science, and information theory to store, transmit, and process information.
In 1836, Samuel Morse invented a communication device called the "telegraph", shown below right, which could be used to send coded pulses — dots (short hold) or dashes (long hold) — of electric current along a wire, controlling an electromagnet located at the receiving end of the telegraph system.
In 1927, Ralph Hartley, in his "Transmission of Information" article, presented at the International Congress of Telegraphy and Telephony, explained how to quantify the capacity of a system to transmit information, mathematically, using logarithms, in terms of the types of information pulses available to the system.
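Hartley's logarithmic measure can be sketched numerically; the helper below is a hypothetical illustration, not Hartley's own notation: the information in a message of n pulses drawn from an alphabet of s distinguishable pulse types is H = n · log(s), in bits when the logarithm is base 2.

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int, base: float = 2.0) -> float:
    """Hartley information H = n * log_base(s) for n symbols
    drawn from an alphabet of s distinguishable symbols."""
    return n_symbols * math.log(alphabet_size, base)

# A 10-pulse Morse-like message over a 2-symbol alphabet (dot, dash):
print(hartley_information(10, 2))   # 10.0 bits
# The same 10 pulses over a 4-symbol alphabet carry twice the information:
print(hartley_information(10, 4))   # 20.0 bits
```

Doubling the alphabet adds only one bit per pulse, which is the point of using logarithms: capacity grows with the logarithm of the number of distinguishable signal levels.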
In 1940, John Neumann joked to Claude Shannon that he should call his new Hartley-based information formula, for mapping the transmission of 0s and 1s in telecommunication lines, by the name “entropy”, because similar logarithmic formulas had been used in thermodynamics for heat models of gas systems (Boltzmann, 1872; Planck, 1901) and for demons using their intelligence to gain information about the trajectories of gas particles in a two-compartment system (Szilard, 1922).
In 1945, Shannon, in his “A Mathematical Theory of Cryptography”, taking Neumann’s joking advice to heart, began to refer to telecommunication “information” by the name “entropy”.
In c.1947, John Tukey, in discussion with Shannon, suggested that the term "bit", short for binary digit, be used as shorthand to mean a "0" or a "1", in coding or in communication theory.
The Hartley model, via Claude Shannon's "A Mathematical Theory of Communication" (1948), became "information theory", and Hartley's bit-based information, mixed with uncertainty, became confusingly dubbed "information entropy", themed on a muddled mixture of Boltzmann entropy and Heisenberg uncertainty, a result of the "John Neumann joke" (Neumann, 1940).
Theory | Abuse
In 1956, Shannon, in his "The Bandwagon", owing to the misuse and abuse of his theory outside of telecommunications proper, had to publicly recant his use of the term "entropy", stating openly that his information theory is NOT applicable in psychology, economics, or the social sciences.
Unit | Confusion
In 1961, Rolf Landauer, in his ‘Irreversibility and Heat Generation in the Computing Process’ article, proposed that there is an unavoidable cost in entropy whenever one erases information, namely that the entropy of the surroundings must increase by at least k ln 2 per bit of information erased.
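The Landauer bound is the one place where bits and joules per kelvin meet on firm ground: erasing one bit raises the entropy of the surroundings by at least k ln 2 (J/K), equivalently dissipating at least kT ln 2 joules of heat at temperature T. A minimal sketch, with hypothetical helper names:

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K (exact 2019 SI value)

def landauer_entropy_per_bit() -> float:
    """Minimum entropy increase of the surroundings per erased bit, in J/K."""
    return K_BOLTZMANN * math.log(2)

def landauer_heat(temperature_kelvin: float, bits_erased: float = 1.0) -> float:
    """Minimum heat dissipated erasing `bits_erased` bits at temperature T, in joules."""
    return bits_erased * K_BOLTZMANN * temperature_kelvin * math.log(2)

print(landauer_entropy_per_bit())   # ~9.57e-24 J/K per bit
print(landauer_heat(300.0))         # ~2.87e-21 J per bit at room temperature
```

The smallness of these numbers is itself instructive: a single erased bit costs roughly twenty-three orders of magnitude less entropy than the everyday J/K quantities of thermodynamics.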
In the 1970s, confusion over units began to arise:
- “A situation characterized by only two possible choices has ‘unit’ information, or one bit. More generally, the unit of information is determined by the arbitrary scale factor K in Shannon’s formula. If one looks at the entropy equation from quantum mechanics, one finds S, the thermodynamic entropy, is in units of joules / degree.”
- — Author (1975), “Article”
Presently, there are many who believe, naively, that the universe, including all of chemistry, physics, and thermodynamics, can be boiled down to units of "bits", rather than joules (J) or joules per kelvin (J/K).
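The unit confusion can be made concrete: Shannon entropy in bits maps onto thermodynamic units (J/K) only by inserting Boltzmann's constant as the arbitrary scale factor K = k ln 2 in Shannon's formula. The sketch below (hypothetical helper names, assumed here for illustration) shows how numerically disjoint the two unit systems are:

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def shannon_entropy_bits(probs) -> float:
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bits_to_joules_per_kelvin(bits: float) -> float:
    """Rescale bits into thermodynamic units via the factor k * ln(2)."""
    return bits * K_BOLTZMANN * math.log(2)

# A situation with two equally likely choices carries one bit:
h = shannon_entropy_bits([0.5, 0.5])
print(h)                              # 1.0 bit
print(bits_to_joules_per_kelvin(h))   # ~9.57e-24 J/K
```

One bit rescaled this way is about 10⁻²³ J/K, some twenty-three orders of magnitude below ordinary laboratory entropies, which is part of why equating the two quantities outright invites confusion.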
In 2012, Libb Thims, in his “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”, suggested the terms "infotropy" or "bitropy" be used instead of "Shannon entropy" or "entropy" in ALL communication science publications and works.
- See main: Bit god
In the wake of the coining of the term "bit" (Tukey, 1947), information theory (Shannon, 1949), the discovery of DNA (Crick, 1953) as the "information carrier" of the genetic code, the "Shannon bandwagon" (Shannon, 1956), and the growth of computer science, the premise of a god being behind all of this, connecting the dots, so to say, in the sense of the "logos" of theology, began to become an easy sell to many minds.
In 1995, Keith Devlin, in his Logic and Information, a book aiming at a deeper understanding of the nature of intelligence and knowledge acquisition via the concept of logic, presented a cover, as shown above, depicting god, coming out of the sun, making or creating the “bits” of the universe, which in turn create the human mind, or something along these lines.
Hence, in the late 20th century, the bulk of information-based arguments, more often than not, tended to be closet god type arguments, coated with bits.
In 2007, Arieh Ben-Naim, an Israeli physical chemist, with some sort of closet god issues, in his Entropy Demystified (2007), Farewell to Entropy (2008), and others to follow, argued that the "entropy" of thermodynamics needs to be replaced by a term such as "information", "missing information", or "uncertainty", of the Claude Shannon variety, and the joule by some sort of bit-based unit system.
In 2015, Ben-Naim, in his Information, Entropy, Life and the Universe: What We Know and What We Do Not Know, presented a picture of himself, shown adjacent, in his early days, surrounded by all the various confused models of entropy.
The following are quotes:
- “The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called ‘binary digits’, or more briefly ‘bits’, a word suggested by John Tukey.”
- — Claude Shannon (1948), “A Mathematical Theory of Communication” (pg. #)
- “Modern information theory allows us to verify the old maxim that "a picture is worth a thousand words". A thousand words of connected English, at 6 characters (5 letters and space) per word, and 1 bit per character (approximately the best current overall estimate, allowing for redundancy) [15,35], amounts to 6,000 bits of information, equivalent to a choice among 2^6000 equally probable patterns.”
- “The term ‘bit’ was coined in 1949 by computer scientist John Tukey during a lunchtime discussion of ‘binary digits’. Byte, on the other hand, entered the lexicon at IBM in 1956 and came to equal 8 bits, the unit of memory or data necessary to store [or encode] a single character of text in a computer.”
- — Constance Hale (1996), Wired Style: Principles of English Usage in the Digital Age (pg. 41) 
- “In 1947, the term ‘bit’, a single binary digit, was coined by John Tukey. Later, the term ‘byte’, or eight bits, was used in the first computer program for the first fully electronic stored-program computer.”
- — Norris McWhirter (1999), McWhirter’s Book of Millennium Records (pg. 217)
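The thousand-words arithmetic in the "modern information theory" quote above can be checked directly; the figures below are the quote's own assumptions (1,000 words, 6 characters per word, 1 bit per character):

```python
# Checking the quote's arithmetic for "a picture is worth a thousand words":
words = 1000
chars_per_word = 6   # 5 letters plus a space, per the quote
bits_per_char = 1    # the quote's redundancy-adjusted estimate for English

total_bits = words * chars_per_word * bits_per_char
print(total_bits)    # 6000 bits

# 6000 bits single out one message from 2**6000 equally probable patterns;
# the pattern count has over 1800 decimal digits:
n_patterns_digits = len(str(2 ** total_bits))
print(n_patterns_digits)
```

The check confirms the quote is internally consistent: 1,000 × 6 × 1 = 6,000 bits, hence 2^6000 patterns.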
- Information entropy quotes
- Information theory
- Information is more fundamental than
- It from bit
- Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (annotated review: pdf, by Robert Doyle, 2020) (infotropy, pgs. 94-95; bitropy, pgs. 95-96), Journal of Human Thermodynamics (Ѻ), 8(1): 1-120, Dec 19.
- (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
(b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
- Landauer, Rolf. (1961). “Irreversibility and Heat Generation in the Computing Process” (abs) (pdf), IBM Journal of Research and Development, 5(3), Jul.
- Author. (1975). “Article” (pg. 12), WRRC Bulletin, 78-82:12.
- Ben-Naim, Arieh. (2015). Information, Entropy, Life and the Universe: What We Know and What We Do Not Know (Amz). World Scientific.
- Tukey, John. (1961). “The Science of Statistical and Quantitative methodology” (pg. #); in: The Collected Works of John W. Tukey: Philosophy and Principles of Data Analysis 1949-1964, Volume 3 (editor: L.V. Jones) (§7:143-86, pg. 172). Publisher.
- Hale, Constance. (1996). Wired Style: Principles of English Usage in the Digital Age (pg. 41). Publisher.
- McWhirter, Norris. (1999). McWhirter’s Book of Millennium Records (pg. 217). Publisher.
- Poudel, Ram; Thims, Libb; Haddad, Wassim; Kondepudi, Dilip; Matsoukas, Themis; Deacon, Terrence; and Nahum, Gerard. (2020). “Boltzmann entropy (J/K) vs Shannon entropy (bits)” (YT), Thermodynamics 2.0 Conference, Group Discussion, Jun 22.
- Bit – Hmolpedia 2020.