In terms, information (TR:367) (LH:13) (TL:680|#88), from the Latin in- “into” + formare “to form, shape” (14th century), refers to the communication of knowledge about some subject, topic, news, or thing, designed to shape, give form to, delineate, or instruct.
In c.1550, information was being used to mean “knowledge communicated concerning a particular topic”. 
In 1928, Ralph Hartley, in his “Transmission of Information” article, explained that the logarithm of the “number of possible symbol sequences”, in the form H = n log s (a message of n symbols, each drawn from an alphabet of s possibilities), is the best “practical measure of information”, specifically in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission.
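Hartley’s measure can be sketched numerically; the helper below is illustrative (the function name and signature are not from the source), computing H = n log s, i.e. the logarithm of the number of possible sequences s^n:

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int, base: float = 10.0) -> float:
    """Hartley's measure: H = n * log(s), the logarithm of the number
    of possible sequences (s**n) of length n over an alphabet of s symbols."""
    return n_symbols * math.log(alphabet_size, base)

# A 3-symbol binary (HI/LO) telegraph message has 2**3 = 8 possible
# sequences; measured in base 2, that is 3 bits.
print(hartley_information(3, 2, base=2))  # → 3.0
```

Measured in base 2, Hartley’s measure for a binary telegraph message is simply the message length in bits.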
In 1937, the term “information” was used in reference to television broadcast signals.
In 1944, information was used in reference to punch-card operating systems.
Shannon | Information
In 1948, Claude Shannon, in his “A Mathematical Theory of Communication”, building on Hartley’s logarithmic measure of the “information” in the number of possible symbol sequences of 1s and 0s in a telegraph message, introduced “information theory”, amid which, per John von Neumann’s suggestion (c.1940), he named his new H formula for “information” (knowledge transmitted or stored), “choice” (telegraph-operator type), or “uncertainty” (Heisenberg type) with the name “entropy”, already employed in thermodynamics, presumed, incorrectly, to be thematically akin to Ludwig Boltzmann’s H-function of statistical mechanics. This terminology mistake opened a Pandora’s box of confusion and misunderstanding.
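Shannon’s H formula, for a source emitting symbols with probabilities p_i, is H = −Σ p_i log₂ p_i, measured in bits per symbol. A minimal sketch (illustrative function name, not from the source):

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p_i * log2(p_i)) in bits: the average
    information per symbol of a source with symbol probabilities p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (two equiprobable symbols) carries 1 bit per toss;
# a biased coin carries less, since its outcomes are more predictable.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([0.9, 0.1]))  # ≈ 0.469
```

Note that for n equiprobable symbols (p_i = 1/n), Shannon’s H reduces to log₂ n, recovering Hartley’s per-symbol measure.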
In 1956, Shannon, in his article “The Bandwagon”, recanted, stating specifically that his theory of information is not applicable outside of electronic communication science proper:
- “Workers in other fields should realize that the basic results of this [information science] subject are aimed at a very specific direction, one that is not relevant to fields such as psychology, economics, and other social sciences. It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few ‘excited’ words like information, entropy, redundancy, do not solve all our problems.”
In the wake of Shannon’s 1948 article, an information “bandwagon” had gained traction, wherein people were attempting to argue for “information theories” of everything; the 1956 article was Shannon’s response to this.
In 1999, Shu-Kun Lin, an organic and inorganic physical chemist, founded the journal Entropy amid his openly stated confusion between the “entropy” of Shannon, on one hand, the “entropy” of Clausius, Gibbs, Boltzmann, and Planck, on another, and the “entropy” of Prigogine, on a third.
In 1953, information, supposedly, was first used in reference to DNA, presumably in the context of “genetic information”, in the sense of traits passed from parent to offspring in genes.
The following are related quotes:
- “It is misleading in a crucial way to view ‘information’ as something that can be poured into an empty vessel, like a fluid or even energy.”
- — Anatol Rapoport (1956), “The Promise and Pitfalls of Information Theory”
- “Information is physical.”
- — Rolf Landauer (1991), “Information is Physical”
- “The whole universe may be a single hologram: the information about all of it is encapsulated in every part of it.”
- — Norman Dolloff (1975), Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi), Exposition Press
References
- (a) Information – Online Etymology Dictionary.
(b) Inform – Online Etymology Dictionary.
- Information – Online Etymology Dictionary.
- (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions on Information Theory, 2(1):3, March.
(b) Mitra, Partha; Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
(c) Shannon bandwagon – Hmolpedia 2020.
- Shannon bandwagon – Hmolpedia 2020.
- Shu-Kun Lin – Hmolpedia 2020.
- Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
- Rapoport, Anatol. (1956). “The Promise and Pitfalls of Information Theory” (abs), Behavioral Science, 1:303-09; in: Modern Systems Research for the Behavioral Scientist (editor: Walter Buckley) (pgs. 137-42). Aldine, 1968.
- (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
(b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
- Wiley, John. (1993). Natural High (pg. #) (abs). UPNE.