In terms, information, from the Latin in- “into” + formare “to form, shape” (14th century), refers to the communication of knowledge about some subject, topic, news, or thing, designed to shape, give form to, delineate, or instruct.
In c.1550, information was being used to mean “knowledge communicated concerning a particular topic”. 
In 1928, Ralph Hartley, in his “Transmission of Information” article, explained how the logarithm, in the form H = n log s, of the “number of possible symbol sequences” is the best “practical measure of information”, specifically in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission.
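Hartley’s measure can be sketched numerically; a minimal illustration (the function name and the choice of log base are our assumptions — Hartley left the base of the logarithm arbitrary):

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    # Hartley's practical measure: H = n log s, i.e. the logarithm
    # of the number of possible symbol sequences (alphabet_size ** n_symbols).
    # Base 10 is used here purely for illustration.
    return n_symbols * math.log10(alphabet_size)

# A 10-symbol telegraph message over the two symbols 1 (HI) and 0 (LO):
h = hartley_information(10, 2)
```

Doubling the message length doubles H, which is what makes the logarithm a “practical”, additive measure of information.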
In 1937, the term “information” was used in reference to television broadcast signals.
In 1944, information was used in reference to punch-card operating systems.
Shannon | Information
In 1948, Claude Shannon, in his “A Mathematical Theory of Communication”, building on Hartley’s logarithmic model of “information” as the number of possible symbol sequences of 1s and 0s in a telegraph message, introduced information theory, amid which, per John von Neumann’s suggestion (c.1940), he named his new H formula for “information” (knowledge transmitted or stored), “choice” (telegraph operator type), or “uncertainty” (Heisenberg type) by the name “entropy”, already employed in thermodynamics, presumed, incorrectly, to be thematic to Ludwig Boltzmann’s H-function of statistical mechanics. This terminological mistake opened a Pandora’s box of confusion and misunderstanding.
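Shannon’s H formula, H = −Σ pᵢ log₂ pᵢ over the symbol probabilities, can be sketched as follows (a minimal sketch; the function name is ours):

```python
import math

def shannon_entropy(probs):
    # Shannon's H = -sum(p * log2(p)) over the symbol probabilities,
    # giving the average information in bits per symbol.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source (equiprobable 1s and 0s): H = 1 bit per symbol.
h_fair = shannon_entropy([0.5, 0.5])

# A certain source (one symbol with probability 1): H = 0, no uncertainty.
h_certain = shannon_entropy([1.0])
```

For equiprobable symbols, Shannon’s H reduces to Hartley’s log-of-the-count measure.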
In the wake of Shannon’s article, an information “bandwagon” gained traction, wherein people attempted to argue for “information theories” of everything.
In 1975, James Lovelock, to cite one example, acting as an advisor for NASA, per citation of Shannon, argued that the “information” of a possible “alien biospheric system” on Mars could, in theory, be defined, in a “thermodynamic sense”, according to Lovelock, by the chemical thermodynamic entropy change ΔS of an alien; specifically, by the following formula:

where S0 is the “entropy of the system whose components are at equilibrium” and S is the “entropy of the system when assembled”.
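From the definitions of S0 and S given above, the intended relation is presumably the entropy difference on assembly (a reconstruction from context, not verified against the original paper):

$$\Delta S = S - S_0$$

Since an assembled (ordered) system has lower entropy than its components at equilibrium, ΔS is negative, and its magnitude would serve as the proposed thermodynamic measure of the alien biosphere’s “information”.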
In 1953, information, supposedly, was used in reference to DNA; presumably in the context of “genetic information”, in the sense of traits that are passed from parent to offspring in genes.
The following are related quotes:
- “Information is physical.” — Rolf Landauer (1991)
- “The whole universe may be a single hologram: The information about all of it is encapsulated in every part of it.”
- Dolloff, Norman H. (1975). Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi). Exposition Press.
- (a) Information – Online Etymology Dictionary.
(b) Inform – Online Etymology Dictionary.
- Information – Online Etymology Dictionary.
- Shannon bandwagon – Hmolpedia 2020.
- (a) Lovelock, James E.; Kaplan, I.R. (1975). “Thermodynamics and the Recognition of Alien Biospheres [and Discussion]”, Proceedings of the Royal Society of London, Series B: Biological Sciences, 189(1095):167-81.
(b) Skene, Keith. (2020). “In Pursuit of the Framework behind the Biosphere: S-curves, Self-assembly, and the Genetic Entropy Paradox” (abs) (pdf), BioSystems, Volume 190, Apr.
- Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
- (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
(b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
- Wiley, John. (1993). Natural High (pg. #) (abs). UPNE.