Information

From Hmolpedia
A visual summary of how the "bit", the mathematical unit of "information" from Claude Shannon's 1948 "information theory", has become, in the confused minds of many, an "open sesame" (Dolloff, 1975) for unlocking the secrets of the universe and the thermodynamics of humans.[1]

In terms, information, from the Latin in- “into” + formare “to form, shape” (14th century), refers to the communication of knowledge, on some subject, topic, news, or thing, designed to shape, give form, delineate, or instruct.[2]

In c.1550, information was being used to mean “knowledge communicated concerning a particular topic”.[3]

Information science

In 1928, Ralph Hartley, in his “Transmission of Information” article, explained how the "logarithm", in the form H = n log s, specifically of the "number of possible symbol sequences", is the best "practical measure of information", specifically in regard to a telegraph operator sending 1s (HIs) and 0s (LOs) in a telegraph transmission.
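Hartley's measure can be sketched as follows: a message of n symbols, each drawn from an alphabet of s possible symbols, has s^n possible sequences, so the logarithm of that count is n log s. A minimal illustrative sketch (the function name is ours, not Hartley's):

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's 1928 measure: the logarithm (base 2, giving bits) of the
    number of possible symbol sequences, log2(s ** n) = n * log2(s)."""
    return n_symbols * math.log2(alphabet_size)

# A telegraph message of 8 binary symbols (1s/HIs and 0s/LOs):
# 2 ** 8 = 256 possible sequences, hence 8 bits.
print(hartley_information(8, 2))  # → 8.0
```

Note that the measure depends only on the number of possible sequences, not on which sequence was actually sent.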

In 1937, the term “information” was used in reference to television broadcast signals.[3]

In 1944, information was used in reference to punch-card operating systems.

Shannon | Information

In 1948, Claude Shannon, in his “A Mathematical Theory of Communication”, building on Hartley’s logarithm model of “information”, i.e. the number of possible symbol sequences of 1s and 0s in a telegraph message, introduced “information theory”, amid which, per John Neumann’s suggestion (c.1940), he named his new H formula for “information” (knowledge transmitted or stored), “choice” (telegraph-operator type), or “uncertainty” (Heisenberg type) by the already-employed thermodynamics name “entropy”, presumed, incorrectly, to be thematic to Ludwig Boltzmann’s H-function of statistical mechanics.
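Shannon's H formula, the quantity he named "entropy", is H = −Σ pᵢ log₂ pᵢ, summed over the probabilities of the source's symbols. A minimal sketch (function name illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon's 1948 H formula: H = -sum(p * log2(p)), in bits per symbol.
    Zero-probability symbols contribute nothing (lim p*log p -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source (each symbol equally likely) has maximal
# "uncertainty": 1 bit per symbol; a biased source has less.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([0.9, 0.1]))  # ≈ 0.469
```

For equiprobable symbols (pᵢ = 1/s), H reduces to Hartley's log₂ s per symbol; the formula's mathematical resemblance to Boltzmann's H-function is what the naming confusion discussed here trades on, not a physical identity.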

This terminology mistake launched a Pandora’s box of confusion and misunderstanding; one example, of thousands, being James Lovelock (1975) advising NASA, per citation of Shannon, that the “information” of a possible “alien biospheric system” on Mars could, in theory, be defined, in a “thermodynamic sense”, according to Lovelock, by the chemical thermodynamic entropy change ΔS of an alien; the following formula specifically:[4]

ΔS = S − S0

where S0 is the "entropy of the system whose components are at equilibrium" and S is the "entropy of the system when assembled".

This grand confusion continues, in growing amounts of ignorance, to this very day (e.g. see: Talk:IAISAE).[5]

Genetic information

In 1953, information, supposedly, was used in reference to DNA, presumably in the context of “genetic information”, in the sense of traits that are passed from parent to offspring in genes.[3]

References

  1. Dolloff, Norman H. (1975). Heat Death and the Phoenix: Entropy, Order, and the Future of Man (pg. xvi). Exposition Press.
  2. (a) Information – Online Etymology Dictionary.
    (b) Inform – Online Etymology Dictionary.
  3. Information – Online Etymology Dictionary.
  4. (a) Lovelock, James E.; Kaplan, I.R. (1975). “Thermodynamics and the Recognition of Alien Biospheres [and Discussion]”, Proceedings of the Royal Society of London, Series B: Biological Sciences, 189(1095):167-81.
    (b) Skene, Keith. (2020). “In Pursuit of the Framework behind the Biosphere: S-curves, Self-assembly, and the Genetic Entropy Paradox” (abs) (pdf), BioSystems, Volume 190, Apr.
  5. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (pdf) (Ѻ), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
