# Information

A visual summary of how the "bit", the mathematical unit of "information", from Claude Shannon's 1948 "information theory" has become, in the confused minds of many, an "open sesame" (Dolloff, 1975) to unlock the secrets of the universe, and the thermodynamics of humans.[1]

In etymological terms, information, from the Latin in- “into” + formare “to form, shape” (14th century), refers to the communication of knowledge, on some subject, topic, news, or thing, designed to shape, give form to, delineate, or instruct upon.[2]

In c.1550, information was being used to mean “knowledge communicated concerning a particular topic”.[3]

## Information science

In 1928, Ralph Hartley, in his “Transmission of Information” article, explained how the logarithm, in the form H = n log s, specifically of the “number of possible symbol sequences”, is the best “practical measure of information”, in particular in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission.
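Hartley's measure can be sketched numerically. The following is a minimal illustration (the function name and base-2 logarithm are assumptions for this sketch; Hartley left the log base unspecified) of H = n log s for a message of n symbols drawn from an alphabet of s possible symbols:

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's practical measure of information, H = n * log2(s):
    the logarithm of the number of possible symbol sequences, s**n."""
    return n_symbols * math.log2(alphabet_size)

# A telegraph message of 8 binary symbols (HIs and LOs) has 2**8 = 256
# possible sequences, hence log2(256) = 8 units (bits) of information:
print(hartley_information(8, 2))  # 8.0
```

Taking the logarithm makes the measure additive: doubling the message length doubles H, rather than squaring the count of possible sequences.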

In 1937, the term “information” was used in reference to television broadcast signals.[3]

In 1944, information was used in reference to punch-card operating systems.

### Shannon | Information

In 1948, Claude Shannon, in his “A Mathematical Theory of Communication”, building on Hartley’s logarithmic model of “information” as the number of possible symbol sequences of 1s and 0s in a telegraph message, introduced “information theory”, amid which, per John von Neumann’s suggestion (c.1940), he named his new H formula for “information” (knowledge transmitted or stored), “choice” (telegraph-operator type), or “uncertainty” (Heisenberg type) by the name “entropy”, already employed in thermodynamics, presumed, incorrectly, to be thematically related to Ludwig Boltzmann’s H-function of statistical mechanics. This terminology mistake opened a Pandora’s box of confusion and misunderstanding.
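Shannon's H formula itself, H = −Σ p log₂ p, is a straightforward computation; the sketch below (an illustration, not Shannon's original notation) shows why it reduces to Hartley's measure when all symbols are equiprobable:

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), in bits per symbol.
    Terms with p == 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin (two equiprobable symbols) carries 1 bit per symbol,
# matching Hartley's log2(2); a certain outcome carries 0 bits:
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # -0.0
```

For a biased source, e.g. probabilities [0.9, 0.1], H falls below 1 bit, which is exactly where Shannon's formula departs from Hartley's equiprobable count.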

### Bandwagon

In the wake of Shannon's article, an information "bandwagon" gained traction, wherein people attempted to argue for "information theories" of everything.[4]

In 1975, James Lovelock, to cite one example, acting as an advisor for NASA, citing Shannon, argued that the “information” of a possible “alien biospheric system” on Mars could, in theory, be defined in a “thermodynamic sense”, according to Lovelock, by the chemical thermodynamic entropy change ΔS of an alien; specifically, by the following formula:[5]

$I = S_0 - S$

where S0 is the "entropy of the system whose components are at equilibrium" and S is the "entropy of the system when assembled".
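Lovelock's formula reduces to a simple entropy difference; the sketch below uses hypothetical entropy values, chosen purely for illustration, to show the intended reading (an assembled, living system has lower entropy than its equilibrated components, so I is positive):

```python
def lovelock_information(s_equilibrium: float, s_assembled: float) -> float:
    """Lovelock's I = S0 - S: the entropy deficit of an assembled system
    relative to the same components fully mixed at equilibrium."""
    return s_equilibrium - s_assembled

# Hypothetical entropy values (arbitrary units), for illustration only:
print(lovelock_information(100.0, 75.0))  # 25.0
```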

This grand confusion continues, in growing measure, to this very day (e.g. see: Talk:IAISAE).[6]

## Genetic information

In 1953, information, supposedly, was used in reference to DNA, presumably in the context of “genetic information”, in the sense of traits passed from parent to offspring in genes.[3]

## Quotes

The following are related quotes:

- “Information is physical.” Rolf Landauer (1991), “Information is Physical”;[7] cited by Seth Lloyd (2006) in Programming the Universe (pg. 213).
- “The whole universe may be a single hologram: The information about all of it is encapsulated in every part of it.” John Wiley (1993), Natural High (pg. #).[8]

## References

1. Dolloff, Norman H. (1975). Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi). Exposition Press.
2. (a) Information – Online Etymology Dictionary.
(b) Inform – Online Etymology Dictionary.
3. Information – Online Etymology Dictionary.
4. Shannon bandwagon – Hmolpedia 2020.
5. (a) Lovelock, James E.; Kaplan, I.R. (1975). “Thermodynamics and the Recognition of Alien Biospheres [and Discussion]”, Proceedings of the Royal Society of London, Series B: Biological Sciences, 189(1095):167-81.
(b) Skene, Keith. (2020). “In Pursuit of the Framework behind the Biosphere: S-curves, Self-assembly, and the Genetic Entropy Paradox” (abs) (pdf), BioSystems, Volume 190, Apr.
6. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
7. (a) Landauer, Rolf. (1991). “Information is Physical” (pdf), Physics Today, 44:23-29.
(b) Landauer, Rolf. (1996). “The Physical Nature of Information” (abs) (pdf), Physics Letters A, 217:188-93.
8. Wiley, John. (1993). Natural High (pg. #) (abs). UPNE.