Talk:negentropy

RFV discussion: July–August 2018
We have one citation that uses the term. Note: citations of "negative entropy" do not count. SemperBlotto (talk) 08:23, 19 July 2018 (UTC)
 * ✅: see "Citations:negentropy". Still unsure how to define the word, though. — SGconlaw (talk) 17:39, 19 July 2018 (UTC)

RFV-passed Kiwima (talk) 03:01, 5 August 2018 (UTC)

RFC discussion: July 2018
The only sense: this seems really encyclopedic. SURJECTION ·talk·contr·log· 16:30, 18 July 2018 (UTC)
 * Wow, someone went really overboard. Here is a somewhat shorter attempt: “The deficit in the actual entropy of a system compared to what might be expected.” Or is that overly simplistic? (As to the potential objection that “what might be expected” is vague, the specific meaning in various contexts does indeed depend on one’s notion of what might be expected, which will be different for different branches of science.) --Lambiam 00:19, 19 July 2018 (UTC)
 * If gravity is taken into account, then the actual entropy of the universe is quite expected. The first definition should be general, not context-specific. In the most general case, the number of degrees of freedom is proportional to the number of spatial dimensions. Gravitational spaghettification decreases the number of dimensions from 3 to 1 and thus decreases the number of degrees of freedom. The radius and thus the internal entropy of a single quantum (such as a proton) is inversely proportional to its gravitational mass. Without gravity, the universe has the maximal possible entropy and is a uniform blanket of enormously swollen protons (each proton miles across) whose gravitational fields cancel each other because the protons are uniformly distributed. This is the initial state of the universe. Then protons undergo a 13.8-billion-year-long hierarchization into a "world tree", within which their gravitational fields amplify each other. Thus gravity appears and makes the world negentropic. —91.122.5.137 02:59, 19 July 2018 (UTC)
 * This is a dictionary. We write concise definitions that summarize the meaning of the terms as used by speakers of the language. We don't explain the concepts, and we certainly don't enumerate them in comprehensive detail. Just as a start, I removed the illustration, which illustrated nothing, and its "caption", which was really a full-page essay. The quotes should also be severely pruned, because they aren't there to illustrate usage but rather to collectively tell a long, long L-O-O-O-N-N-N-G story. Chuck Entz (talk) 04:41, 19 July 2018 (UTC)
 * It is impossible to use a word properly without understanding it. That is why a dictionary definition should not be excessively concise. —91.122.5.137 05:26, 19 July 2018 (UTC)
 * I have removed the so-called quotations that didn't actually use the word. I'm not convinced that this is a word used in the scientific community. The "definition" doesn't make much sense to me. SemperBlotto (talk) 05:29, 19 July 2018 (UTC)
 * You are not convinced that the word "negentropy" is used in the scientific community? And you are a Wiktionary administrator? It is very telling. —91.122.5.137 05:53, 19 July 2018 (UTC)
 * There's no danger of being concise, let alone excessively so. As I said, this is a dictionary. We can't explain the meaning behind a word in the detail necessary for someone to write a paper on the subject; it's the wrong tool for what you want to do. As for "understanding", you're failing miserably there, too. Judging by the other comments, it would seem that only someone who was already quite familiar with the concepts would be able to figure out what you're trying to say, and they'd probably disagree with you. Chuck Entz (talk) 13:55, 19 July 2018 (UTC)
 * Based on the Wikipedia article, I would convert it into two definitions concerning biology and information theory: first: "(biology) The entropy exported by a living system in order to keep its own entropy low." and second: "(information theory) A measure of normality; the difference between the entropy of a distribution and that of the corresponding Gaussian distribution.". It might be possible to merge them, though... SURJECTION ·talk·contr·log· 11:02, 19 July 2018 (UTC)
 * For an organism's entropy to become lower, the organism must export entropy at a higher rate than it acquires it endogenously and exogenously. However, the net export of entropy cannot be used as a measure of negentropy, because it may be accompanied by a decrease in the organism's mass and thus in the maximum possible entropy:
 * Every type of constraint, every additional condition imposed on the possible freedom of choice immediately results in a decrease of information.
 * —Brillouin, Léon. Science and Information Theory. Courier Dover Publications, 2004, p. 8.
 * Therefore, negentropy can only be defined as "the difference between the entropy of a system and the entropy of the same system at equilibrium (i.e., when it has its maximum possible entropy)". —91.122.5.137 11:49, 19 July 2018 (UTC)
 * For physical systems that is not a bad definition. I suggest as a refinement, “... the same system at thermodynamic equilibrium with its environment”. It skirts the issue whether life is involved; it could be an artificial system, not necessarily biological, emulating life. This definition does not work for mathematical abstractions such as probability distributions. I dislike the definition Wikipedia gives for information theory; it is unnecessarily restrictive. In particular, as stated it cannot be applied to discrete distributions. It should be defined more generally, as “the negentropy of a distribution D with respect to a given family F of distributions of which D is a member is the difference between ...”. In the Wikipedia definition, that family is specialized to consist of the distributions over R with given mean and variance. I do not see immediately how to make this palatable for dictionary use, even with that particular specialization. To keep it sweet and simple, we could simply say, “a measure of the order in a probability distribution”. --Lambiam 20:49, 19 July 2018 (UTC)
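The information-theoretic definition discussed above (the entropy of the variance-matched Gaussian minus the entropy of the distribution itself) can be checked with a closed-form example. This is only an illustrative sketch, not part of the discussion: the choice of the uniform distribution, nat units, and the helper name `gaussian_entropy` are all assumptions made for the example.

```python
import math

def gaussian_entropy(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Uniform distribution on [0, 1]: variance 1/12, differential entropy 0.
uniform_var = 1.0 / 12.0
uniform_entropy = 0.0

# Negentropy: entropy of the variance-matched Gaussian minus the
# distribution's own entropy. It is non-negative because the Gaussian
# maximizes differential entropy for a fixed variance.
negentropy = gaussian_entropy(uniform_var) - uniform_entropy
print(round(negentropy, 4))  # ≈ 0.1765 nats
```

Because the Gaussian is the maximum-entropy distribution for a given variance, this quantity is always non-negative and is zero exactly when the distribution is itself Gaussian.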
 * The negentropy of a system does not depend on the system's environment. In Boltzmann's definition, the entropy of a system is directly proportional to the natural logarithm of the number of the system's microstates compatible with the system's macrostate. The macrostate having the highest possible number of compatible microstates is that of a uniformly spread-out and infinitely rarefied gas. Thus, the negentropy of a system is a measure of the system's hierarchic condensedness. —91.122.6.147 04:19, 20 July 2018 (UTC)
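Boltzmann's definition mentioned above (entropy proportional to the logarithm of the number of microstates compatible with a macrostate) can be illustrated with a toy two-state system. The 100-particle setup and setting Boltzmann's constant to 1 are assumptions made purely for illustration, not taken from the discussion.

```python
import math

def boltzmann_entropy(microstates):
    """Boltzmann entropy S = k * ln(W), with k set to 1 for illustration."""
    return math.log(microstates)

# Toy system: 100 two-state particles; a macrostate is the number of
# particles in the "up" state, and W(macrostate) = C(100, n_up).
entropies = {n_up: boltzmann_entropy(math.comb(100, n_up))
             for n_up in (0, 25, 50)}

# The evenly spread macrostate (n_up = 50) has the most microstates and
# hence the maximum entropy; the negentropy of another macrostate can be
# read as its entropy deficit relative to that maximum.
negentropy_25 = entropies[50] - entropies[25]
print(entropies[50] > entropies[25] > entropies[0])
print(round(negentropy_25, 2))
```

On this toy reading, a more ordered macrostate simply has a larger entropy deficit relative to the uniform one.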
 * The term is definitely used in various scientific communities; see this Google Scholar search. A quick glance over the results returned will also show that a negligible number of uses concerns cosmology. The current Wiktionary definition is exclusively concerned with a cosmological theory, and a fringe theory at that. In view of both the theory espoused and the IPs concerned I smell a connection with . --Lambiam 08:14, 19 July 2018 (UTC)
 * Thank you for smelling my socks. —91.122.5.137 08:37, 19 July 2018 (UTC)

As to the ODO definition “reduction in entropy (and corresponding increase in order)”, I think its main fault is that this may be interpreted as referring to a process rather than a quantity. And when interpreted the right way, as the difference between two amounts of entropy, the question arises: reduction with respect to what? For a reduction in salary, everyone will understand this is a reduction with respect to an earlier salary, but here we are not referring to a reduction in earlier entropy, but rather potentially achievable entropy. --Lambiam 09:30, 20 July 2018 (UTC)
 * I'd say just revert to last good entry. Stuff like "gravitoelectrical treeing" is jargon that doesn't appear to be in general use. Equinox ◑ 18:19, 19 July 2018 (UTC)
 * I’m not sure there ever was a good entry. The choice is between not good, pretty bad, and abysmally pathetic. --Lambiam 20:53, 19 July 2018 (UTC)
 * I’ve been bold and applied my suggestions to the lemma. --Lambiam 21:33, 19 July 2018 (UTC)
 * As a layperson I still find the current definitions mystifying. What about the definition of “negentropic” at Oxford Dictionaries Online? Is it too simplistic? — SGconlaw (talk) 23:01, 19 July 2018 (UTC)
 * Since your post my 23-word definition ballooned to 86 words; did that make it more accessible? I can imagine a merger between Wikipedia and Wiktionary, like a glorified , but at the moment the separation between the dictionary and encyclopedia functions is guardedly maintained. Even in an encyclopedia you cannot expect the exposition of a highly technical subject to be generally understandable without prior background knowledge. How likely is it that a user will understand our definition of, or of ? If you don't have a working idea of what is, or , you had better read up on them before attempting to understand the notion of negentropy – and by “reading up” I do not mean looking up their dictionary definitions. You may find then that you also need to grasp the  and then get confused by all the talk about degrees of freedom. There is no way you can get that from dictionary definitions.
 * Actually, the IP editor, who is seemingly extremely obsessed with making the entry precisely the way they see fit, has done most of the bloating. Is it already time for a block? SURJECTION ·talk·contr·log· 11:20, 20 July 2018 (UTC)