Entropy (Wikipedia)

From Hmolpedia
A visual of how far the apple, namely "entropy" (2021) as defined by Wikipedia, has fallen from the tree (Clausius, 1865), as ranked below.

In thermodynamics, entropy (Wikipedia) (LH:1) refers to entropy as defined by Wikipedia; the top 15 language-edition definitions are shown below, ranked by definitional correctness.

Overview

The following, as of 25 Feb 2021, are the opening-sentence or opening-paragraph definitions of "entropy" from the top 15 Wikipedia language editions, specifically those with 1M+ articles and 2K+ active users[1], ranked by definitional correctness, the top entry being the most correct and the bottom entry the least correct:

1. Vietnamese (2,138 active users)
   Definition: "In thermodynamics, thermodynamic entropy (or simply entropy), denoted S, is a measure of the heat dQ dissipated or absorbed when a physical system transitions at a specified absolute temperature T."[2]
   Discussion: Good standard basic definition (the Clausius form of this relation is sketched after this list).
2. German (22,378 active users)
   Definition: "Entropy (from the ancient Greek ἐντροπία entropia, from ἐν 'into' and τροπή trope 'turn') is a fundamental thermodynamic state variable with the SI unit joules per kelvin (J/K)."[3]
   Discussion: Nothing incorrect (↑) here, although it says only that entropy is a "state variable" and nothing else.
Entropy, or "thermal deficiency" [thermal inertia], origin of the word is derived from the Greek meaning «transformation». It is an important concept in thermodynamics, especially for the second law that deals with the physical processes of large systems consisting of very numbered particles and examines their behavior as a process that occurs automatically or not. The second law of thermodynamics states a basic principle that says: Any change that occurs automatically in a physical system must be accompanied by an increase in the amount of its entropy.[4] Arabic 10,498 Good basic definition (↑); entropy, however, is not "thermal defficiency" or [thermal inertia].
4. Dutch (4,789 active users)
   Definition: "Entropy (S) is an important concept in thermodynamics. At the most fundamental level, it is a measure of the probability of a particular distribution of microstates (i.e. states of motion of elementary building blocks, such as atoms and molecules) within an isolated physical system."[5]
   Discussion: Nothing really wrong with this; it is the basic Boltzmann definition of entropy (sketched after this list), without any added frills. It is not, however, the "fundamental" level (↓), i.e. the Clausius (1865)[6] definition of entropy.
5. Spanish (17,754 active users)
   Definition: "In thermodynamics, entropy (symbolized as S) is a physical quantity for a thermodynamic system in equilibrium. It measures the number of microstates compatible with the equilibrium macrostate; it can also be said that it measures the degree of organization of the system, or that it is the ratio of an increase in internal energy to an increase in temperature of the thermodynamic system."[7]
   Discussion: The phrase "ratio of an increase in internal energy to an increase in temperature of the system" is blurry. (~)
6. Portuguese (10,494 active users)
   Definition: "Entropy (from the Greek εντροπία), unit J/K (joules per kelvin), is a quantity in thermodynamics that measures the degree of molecular freedom of a system and is associated with its number of configurations (or microstates), that is, with how many ways particles (atoms, ions, or molecules) can be distributed among quantized energy levels, including translational, vibrational, rotational, and electronic. Entropy is also generally associated with randomness, dispersion of matter and energy, and the 'disorder' (not in the common sense) of a thermodynamic system."[8]
   Discussion: First part correct (↑); second part incorrect (↓).
7. Polish (5,278 active users)
   Definition: "Entropy (s or S) is a thermodynamic state function that determines the direction of spontaneous processes in an isolated thermodynamic system. Entropy is a measure of the degree of disorder in a system and of energy dissipation. It is an extensive quantity. According to the second law of thermodynamics, if a thermodynamic system passes from one equilibrium state to another without the participation of external factors (i.e. spontaneously), its entropy always increases."[9]
   Discussion: Everything good here, except "degree of disorder" and "energy dissipation".
8. Italian (10,085 active users)
   Definition: "Entropy (from the ancient Greek ἐν en, 'inside', and τροπή trope, 'transformation') is, in statistical mechanics, a quantity (more particularly a generalized coordinate) that is interpreted as a measure of the disorder in any physical system, including, as an extreme case, the universe. It is generally represented by the letter S. In the International System it is measured in joules per kelvin (J/K)."[10]
   Discussion: Entropy is not a measure of the disorder (↓) of the universe.
9. Japanese (16,090 active users)
   Definition: "Entropy, in thermodynamics and statistical mechanics, is defined as an extensive state quantity. In thermodynamics it is introduced via irreversible processes under adiabatic conditions; in statistical mechanics it is a physical quantity representing the microscopic 'clutter' of a system (sometimes referred to as 'randomness'; the term 'random' here does not mean that it contains contradictions, mistakes, or is off-target, but that it is uncorrelated). From the results of statistical mechanics, it was pointed out that entropy is related to the information obtained from the system, and it has come to be applied in information theory. The physicist Edwin Jaynes argues that entropy in physics should rather be regarded as one application of information-theoretic entropy."[11]
   Discussion: Extensive property is good (↑). Entropy, however, has nothing (↓↓↓) to do with "information". The citation to Edwin Jaynes (↓) is a red flag.
10. Ukrainian (3,818 active users)
   Definition: "Entropy S is a physical quantity used to describe a thermodynamic system; it is one of the main thermodynamic quantities. Entropy is a function of the state of the thermodynamic system and is widely used in thermodynamics, including technical thermodynamics (analysis of the operation of heat engines and refrigeration systems) and chemical thermodynamics (calculation of the equilibrium of chemical reactions). The statement of the existence and growth of entropy, and the list of its properties, constitute the content of the second law of thermodynamics. The significance of this quantity for physics is due to the fact that, along with temperature, it is used to describe the thermal phenomena and thermal properties of macroscopic objects. Entropy is also called a measure of chaos."[12]
   Discussion: All good here, except the "measure of chaos" sentence.
11. Swedish (2,750 active users)
   Definition: "Entropy is a physical state function, designated S. In statistical mechanics it can be seen as a measure of the probability that a system will assume a certain state; in thermodynamics, rather, as a measure of how much of the heat energy in a system cannot be converted to work. The concept of entropy is also used in statistics, information theory, psychology, and theories of the mind."[13]
   Discussion: All good here, except "also used in information theory" (↓↓); this was a Sokal-affair-like joke suggested by John von Neumann in 1940.
12. Russian (12,460 active users)
   Definition: "Entropy (from the ancient Greek ἐν 'in' + τροπή 'turning; transformation') is a term widely used in the natural and exact sciences (first introduced within thermodynamics as a function of the state of a thermodynamic system); it denotes a measure of the irreversible dissipation of energy or of the uselessness of energy (because not all the energy of the system can be used to do useful work)."
   Discussion: "Dissipation" is not fully correct; this is the Thomson interpretation, which has religious underpinnings. The "uselessness" model is also not fully correct: first, it is an anthropism; second, a unit of heat, which is what entropy is, was never derived based on its "use" or non-use (by humans).
13. Chinese (8,415 active users)
   Definition: "In chemistry and thermodynamics, entropy is a measure of the part of the total energy that cannot do work; that is, as the overall entropy increases, the capacity to do work drops, so entropy is an indicator of energy degradation. It is used to calculate the disorder or degree of chaos of the system."[14]
   Discussion: Entropy is not a measure of chaos (↓).
14. French (23,276 active users)
   Definition: "The term entropy was introduced in 1865 by Rudolf Clausius from a Greek word meaning 'transformation'. It characterizes the degree of disorganization, or unpredictability, of the information content of a system."[15]
   Discussion: Note: the French version is so confused that the main entry on entropy has been turned into a disambiguation page.
15. English (146,245 active users)
   Definition: "Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty."[16]
   Discussion: This is the worst lead sentence of the group; it is an example of incorrect, dumbed-down thermodynamics par excellence. Clausius (1865) never uses the terms "disorder" or "randomness" in his published works on entropy, and only uses the term "uncertainty" three times, with respect to some numerical calculations.[6]
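
For reference (a minimal sketch, not part of any of the ranked definitions), the Clausius-type relation that entry #1 paraphrases, i.e. entropy as a measure of heat received or given up at a specified absolute temperature, is commonly written, in LaTeX form, as:

   dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S = \int_1^2 \frac{\delta Q_{\mathrm{rev}}}{T}

where \delta Q_{\mathrm{rev}} is an increment of heat exchanged reversibly, T is the absolute temperature at which the exchange occurs, and S accordingly carries the SI unit joules per kelvin (J/K), as entries #1 and #2 state.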
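
Likewise, the Boltzmann statistical form that entries #4 and #5 paraphrase (again a minimal sketch added for comparison, not part of the ranked definitions) is commonly written as:

   S = k_B \ln W

where W is the number of microstates compatible with the given macrostate and k_B ≈ 1.380649 × 10⁻²³ J/K is the Boltzmann constant.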

End matter

See also

References

  1. List of Wikipedians – Wikipedia.
  2. Entropy (Vietnamese → English) – Wikipedia.
  3. Entropy (German → English) – Wikipedia.
  4. Entropy (Arabic → English) – Wikipedia.
  5. Entropy (Dutch → English) – Wikipedia.
  6. Clausius, Rudolf. (1865). The Mechanical Theory of Heat (translator: Thomas Hirst) (entropy, 10-pgs; uncertainty, 3-pgs; disorder, 0-pgs; randomness, 0-pgs). Macmillan & Co, 1867.
  7. Entropy (Spanish → English) – Wikipedia.
  8. Entropy (Portuguese → English) – Wikipedia.
  9. Entropy (Polish → English) – Wikipedia.
  10. Entropy (Italian → English) – Wikipedia.
  11. Entropy (Japanese → English) – Wikipedia.
  12. Entropy (Ukrainian → English) – Wikipedia.
  13. Entropy (Swedish → English) – Wikipedia.
  14. Entropy (Chinese → English) – Wikipedia.
  15. Entropy (French → English) – Wikipedia.
  16. Entropy – Wikipedia.
  17. What is entropy debate – Hmolpedia 2020.
  18. Moriarty-Thims debate (subdomain) – Hmolpedia 2020.

External links
