In existographies, Claude Shannon (39 BE-46 AE) (1916-2001 ACM) (IQ:170|#425) (CR:183) (LH:7) (TL:194|#48) was an American electrical engineer, mathematician, cryptographer, and telecommunication theorist, noted for his 1948 “A Mathematical Theory of Communication”, the founding paper of information theory, and for his contentious adoption of the thermodynamic term “entropy”.
In 1948, Shannon, in his “A Mathematical Theory of Communication”, and follow-up expanded 1949 book by the same title, building on the earlier work of Ralph Hartley (1928), derived some of the basic mathematics of signal transmission.
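To situate the mathematics at issue: Hartley's 1928 measure counts the information in a message of n equally likely symbols from an alphabet of S symbols as n log S, while Shannon's 1948 measure weights symbols by their probabilities, H = −Σ pᵢ log₂ pᵢ bits per symbol. A minimal sketch (function names are illustrative, not from either paper):

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley (1928): information in a message of n equally likely
    symbols drawn from an alphabet of S symbols, in bits."""
    return n_symbols * math.log2(alphabet_size)

def shannon_entropy(probs) -> float:
    """Shannon (1948): H = -sum(p * log2(p)), in bits per symbol.
    Zero-probability symbols contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# which is where Shannon's measure departs from Hartley's.
print(hartley_information(1, 2))    # 1.0
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

Note that for equiprobable symbols (pᵢ = 1/S) Shannon's formula reduces to Hartley's log S per symbol, which is why the article below calls it a “Hartley logarithm style” formula.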
Terminology | Confusion
In 1940, Shannon consulted John Neumann on what name he should give to his new Hartley-logarithm-style information formula, offering the following choices:
- Information | of the binary digits (0s and 1s) type
- Uncertainty | of the Heisenberg "uncertainty principle" type
Neumann, for whatever reason, told him to use neither of these, but instead to call his measure “entropy”, because, firstly, similar logarithms are used in statistical mechanics, e.g. as his friend Leo Szilard (1929) had done in respect to Maxwell's demon, and, secondly, according to Neumann, “nobody knows what entropy is”, so that if Shannon ever got into an argument, he could always bluff his way out.
In 1945, Shannon, in his “A Mathematical Theory of Cryptography”, testing Neumann’s advice, used the term “entropy” one time.
In 1948, Shannon, in his “A Mathematical Theory of Communication”, used the term “entropy” 152 times.
In 1948, Myron Tribus, during his PhD exam at UCLA, was asked to explain the connection between the entropy of Clausius and the entropy of Shannon; neither he nor the examination committee knew the answer.
In 1949, Shannon, in his The Mathematical Theory of Communication, an expanded book version of the former, employed the term “entropy” hundreds of times.
In 1950, at the first London symposium on information theory, 6 of 20 papers presented were on psychology and neurophysiology.
In 1951, at the second symposium on information theory, 8 papers were on psychology and neurophysiology.
In 1953, James Watson and Francis Crick elucidated the structure of DNA, the so-called “information” carrier of genetics.
In 1955, Louis de Rosa, the chair of the Institute of Radio Engineers' Professional Group on Information Theory (PGIT), published “In Which Fields Do We Graze?”, wherein he addressed the question of the growing proliferation of Shannon's theories, which apply only to the science of “communication by radio or wire”, OUTSIDE of radio engineering proper, into fields such as: management, biology, psychology, and linguistic theory.
In 1956, Shannon, in his "The Bandwagon", publicly recanted his use of the term "entropy", stating openly that his information theories are NOT applicable in psychology, economics, or social sciences; for example:
- “It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few exciting words like information, entropy, redundancy, do not solve all our problems.”
- — Claude Shannon (1956), “The Bandwagon”, Mar
In 1964, over 32,000 copies of Shannon's book had been sold.
Entropy | Journal
In 1999, Shu-Kun Lin, an organic and inorganic physical chemist, founded the Entropy journal as a repercussion of his openly-stated confusion between: the “entropy” of Shannon, on one hand, the “entropy” of Clausius, Gibbs, Boltzmann, and Planck, on another hand, and the “entropy” of Prigogine, on a third hand. The following, as of 2020, shows the resulting effect, in respect to its ranked sections and ranked authors, based on citation count:
wherein, as we see, “Shannon” has become the lead and representative author of “thermodynamics”.
Effect | Ramifications
In 2007, Ingo Muller, in his A History of Thermodynamics, argued that Shannon's adoption of terminology from thermodynamics has wrought “wanton obfuscation”, on a grand scale, across all areas of science, which continues to this day.
Infotropy | Bitropy
In 2012, Libb Thims, in his “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair”, suggested the terms "infotropy" or "bitropy" be used in place of "Shannon entropy" or "entropy" in ALL communication-science publications and works.
- Information entropy quotes
- Neumann-Shannon anecdote
- Shannon bandwagon 
- Shannon entropy
- Shannon information
- Shannon, Claude E. (1948). "A Mathematical Theory of Communication" (pdf), Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
- Szilard, Leo. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” (pdf) (Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen), Zeitschrift für Physik, 53, 840-56.
- De Rosa, Louis A. (1955). “In Which Fields Do We Graze?” (abs), IRE Transactions on Information Theory, 1(3):2, Dec.
- (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
(b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
- (a) Elias, Peter. (1958). “Two Famous Papers” (pg. 99) (pdf), IRE Transactions: on Information Theory, 4(3):99.
(b) Mitra, Partha, and Bokil, Hemant. (2008). Observed Brain Dynamics (Appendix A: The Bandwagon by C.E. Shannon, pg. 343; Appendix B: The Two Famous Papers by Peter Elias, pg. 345). Oxford University Press.
- Shu-Kun Lin – Hmolpedia 2020.
- Muller, Ingo. (2007). A History of Thermodynamics (pgs. 123-26). Springer.
- Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (annotated review: pdf, by Robert Doyle, 2020) (infotropy, pgs. 94-95; bitropy, pgs. 95-96), Journal of Human Thermodynamics (Ѻ), 8(1): 1-120, Dec 19.
- Shannon bandwagon – Hmolpedia 2020.
- Claude Shannon – Hmolpedia 2020.