

In 1943 Erwin Schrödinger used the concept of “negative entropy” in his popular-science book What Is Life? [1] Léon Brillouin later shortened the expression to a single word, negentropy.[2][3] Schrödinger introduced the concept to explain how a living system exports entropy in order to keep its own entropy low (see entropy and life). The term negentropy let him express this fact in a more "positive" way: a living system imports negentropy and stores it.

In a note to What Is Life?, Schrödinger explained his use of the term:

Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.

In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy, a term that may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who attempted to construct a unified theory of the biological and physical worlds; the attempt gained little renown and bore little fruit. Buckminster Fuller tried to popularize this usage, but negentropy remains the common term.


Information theory

In information theory and statistics, negentropy is used as a measure of distance to normality.[4][5][6] Consider a signal with a certain distribution: among all distributions with a given mean and variance, the Gaussian (normal) distribution has the greatest differential entropy, so the entropy deficit of a signal relative to the matching Gaussian measures how non-Gaussian it is. Negentropy is always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.

Negentropy is defined as

J(p_x) = S(\phi_x) - S(p_x)

where S(\phi_x) is the differential entropy of the Gaussian density \phi_x with the same mean and variance as p_x, and S(p_x) is the differential entropy of p_x:

S(p_x) = - \int p_x(u) \log p_x(u) du
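As a quick numerical check (an illustrative sketch, not part of the original article), the differential-entropy integral above can be approximated by a Riemann sum for a standard normal density and compared against the known closed form 0.5 log(2πe):

```python
import math

# Differential entropy S(p) = -integral of p(u) * log p(u) du,
# approximated by a Riemann sum for a standard normal density p.

def p(u):
    # Standard normal density, mean 0, variance 1.
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

du = 1e-3
s_numeric = -sum(p(u) * math.log(p(u)) * du
                 for u in (i * du for i in range(-10000, 10000)))

# Closed form for a Gaussian with variance 1: 0.5 * log(2 * pi * e).
s_exact = 0.5 * math.log(2 * math.pi * math.e)

print(round(s_numeric, 4), round(s_exact, 4))  # both print 1.4189
```

The agreement reflects the fact that the Gaussian's entropy depends only on its variance, which is why the matching Gaussian \phi_x in the definition of J(p_x) needs only the mean and variance of p_x.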

Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in independent component analysis.[7][8] Intuitively, negentropy is the information that can be saved by representing p_x efficiently: if p_x were a Gaussian random variable with the same mean and variance, it would require the greatest data length to represent, even under the most efficient coding. Because p_x is less random, something about it is known in advance; it carries less unknown information and can be represented with less data.
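For a distribution whose differential entropy is known in closed form, J(p_x) can be evaluated directly from the definition above. The following sketch (illustrative values, not from the article) computes the negentropy of a uniform distribution on [0, 1]:

```python
import math

# Negentropy J(p) = S(phi) - S(p), where phi is the Gaussian with the
# same mean and variance as p. Example: Uniform(0, 1).

# Differential entropy of Uniform(a, b) is log(b - a); here log(1) = 0.
s_uniform = math.log(1.0)

# Matching Gaussian has variance sigma^2 = 1/12 and entropy
# 0.5 * log(2 * pi * e * sigma^2).
sigma2 = 1.0 / 12.0
s_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)

negentropy = s_gauss - s_uniform  # nonnegative; zero only for a Gaussian
print(round(negentropy, 4))  # prints 0.1765
```

The positive result shows the uniform distribution is less entropic (more "predictable") than the Gaussian with the same mean and variance, which is exactly what negentropy quantifies.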

Organization theory

In risk management, negentropy is the force that seeks to achieve effective organizational behavior and to lead to a steady, predictable state.[9]


  1. ^ Schrödinger, Erwin, What Is Life? The Physical Aspect of the Living Cell, Cambridge University Press, 1944
  2. ^ Brillouin, Léon (1953), "Negentropy Principle of Information", J. of Applied Physics, v. 24(9), pp. 1152–1163
  3. ^ Brillouin, Léon, La science et la théorie de l'information, Masson, 1959
  4. ^ Aapo Hyvärinen, Survey on Independent Component Analysis, node32: Negentropy, Helsinki University of Technology Laboratory of Computer and Information Science
  5. ^ Aapo Hyvärinen and Erkki Oja, Independent Component Analysis: A Tutorial, node14: Negentropy, Helsinki University of Technology Laboratory of Computer and Information Science
  6. ^ Ruye Wang, Independent Component Analysis, node4: Measures of Non-Gaussianity
  7. ^ P. Comon, Independent Component Analysis - a new concept?, Signal Processing, 36:287-314, 1994.
  8. ^ Didier G. Leibovici and Christian Beckmann, An introduction to Multiway Methods for Multi-Subject fMRI experiment. FMRIB Technical Report, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), Department of Clinical Neurology, University of Oxford, John Radcliffe Hospital, Headley Way, Headington, Oxford, UK.
  9. ^ Pedagogical Risk and Governmentality: Shantytowns in Argentina in the 21st Century (see p. 4).

This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Negentropy". A list of authors is available in Wikipedia.