Note on Entropy, Disorder and Disorganization1
1 INTRODUCTION
The occasion for this note was my viewing a recent TV programme in
which several scientists, physical as well as biological, took part. I
was surprised by their expression of the view that the spontaneous formation
and evolution of living systems is a phenomenon not easily reconcilable
with thermodynamics. Of course formerly this idea had been widespread;
but I had hoped that a number of studies, including some of my own, had
disposed of the notion that any actual contradiction is involved—i.e. between
entropy increase, on the one hand, and, on the other, certain less rigorously
definable processes such as might characterise biological evolution: for
example, increase of 'orderliness', of 'organization' or of 'complexity'.
These terms are far from meaning the same thing, but have all been supposed—erroneously
in my view—to be contraries to the process of entropy increase.
I shall argue that the Second Law of Thermodynamics, though having an
important scope, is actually much less strongly restrictive about 'what
may be going on in the cosmos' than is widely thought. As it happens the
notion of 'entropy' is now much used in literary and journalistic circles.
It has become a vogue word, displayed on T-shirts, and taken as a measure
of everything that is supposedly deteriorating or getting worse. These
far-fetched uses of the entropy concept will not be eradicated until scientists
themselves declare them to be mistaken. And indeed they amount to something
much more than a mere misuse of words for they have resulted in scientific
error as well, notably in biology.
2 OPEN SYSTEMS
A preliminary point, needing only a few paragraphs, is that the Second
Law does not apply directly to 'open systems'. Let the reader be reminded
that thermodynamics makes a distinction between four classes of systems2
according to the constraints imposed upon them. They are:
(a) Isolated systems. These do not allow the transfer either of matter
or of energy across their boundaries.
(b) Adiabatically isolated systems. Here the transfer of heat (and
also of matter) across the boundaries is excluded, but not the transfer
of other forms of energy.
(c) Closed systems. Here the constraints are further relaxed and only
the transfer of matter is excluded.
(d) Open systems. These are those systems, defined by their geometrical
boundaries, which allow the passage of energy together with the molecules
of some (but not necessarily all) chemical substances.3
An abbreviated statement of the Second Law, to the effect that a system's
entropy can only increase (or remain constant), is directly applicable
only to systems belonging to classes (a) and (b). Systems in classes (c)
or (d) may actually undergo decreases of entropy, due to an outwards passage
of heat and/or of matter across the boundary surface. But of course the
Law can still be made to apply to a sufficiently enlarged system—i.e. to
the sub-system of interest together with all that part of its environment
such that the total system satisfies those conditions which bring it into
classes (a) or (b).4 It can then occur that
a decrease of entropy in the sub-system is compensated by an equal or greater
increase of entropy of the environment. Such behaviour is fully consistent
with the Second Law.
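
In symbols, as a minimal sketch of the bookkeeping the Law requires (the decomposition is standard thermodynamics, though not written out in the original): if a sub-system and its environment together make up an isolated total system, then

\[
\Delta S_{\text{total}} = \Delta S_{\text{sub}} + \Delta S_{\text{env}} \geq 0 ,
\]

so that a decrease \(\Delta S_{\text{sub}} < 0\) is permissible whenever \(\Delta S_{\text{env}} \geq -\Delta S_{\text{sub}}\).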
Now the vast majority of naturally occurring systems belong to class
(d). 'Isolation' and 'closedness' are not natural states of affairs. Consider
a few examples of open systems. Celestial bodies very slowly lose their
atmospheres by diffusion of gaseous molecules into space; but they also
gain matter in the forms of dust and meteorites and they gain radiation
from hotter bodies. For these reasons it may be difficult to say precisely
whether their entropies are increasing or diminishing. Living creatures
are also a very significant sub-class of open systems. For instance an
individual cell continuously takes up metabolites through its enclosing
membrane and this material undergoes chemical reactions within the cell
interior resulting in a variety of low- and high-molecular-weight products.
Some of these pass out of the cell; others contribute to the cell's growth
and to its eventual division. The exchange of metabolites between a cell
and its environment is essential to the maintenance of cell function in
all of its aspects.
In fact quite complicated behaviour can be displayed by open systems
without such behaviour being in any way contrary to the Second Law. Particularly
striking is the occurrence of 'dissipative structures' as described by
Prigogine [1980, 1984] and his collaborators. They have described how an
otherwise improbable structure can become stabilised within an open system
at the expense of the compensating entropy production due to, say, an energy
flow through that system. Many experimentally studied examples, of the
type of the Belousov-Zhabotinskii reaction, have been described in the literature.

Returning to the case of living creatures it may first be remarked that
the making of an accurate entropy balance on an organism together with
its environment (i.e. the 'sufficiently enlarged system' referred to above)
is a matter of very considerable difficulty. Nevertheless such experimental
evidence as is available (e.g. Linschitz, 1953) has not revealed any contravention
of the Second Law.
What then is meant by those who maintain that the phenomenon of life
is not easily reconcilable with this law? I think that what they have in
mind is a rather vague assertion to the effect that (a) organisms are highly
'orderly' and/ or 'organized' systems; (b) 'orderliness' and 'organization'
are inversely related to entropy; and therefore (c) organisms are abnormally
low-entropy systems. They may then wish to go beyond (a), (b) and (c) and
to assert that organisms somehow avoid the Second Law, even after allowing
for their 'openness'. Or without actually going so far they may wish only
to say that highly orderly and/or organized systems are exceedingly improbable,
that improbability is related to low entropy and that Einstein's fluctuation
theory is sufficient to show that, even within an open system, a fluctuation
sufficient to produce a local increase of order and/or of organization
of sufficient magnitude is exceedingly unlikely.
If one or the other of these assertions is a fair statement of what
is meant, I shall argue that the whole argument is confused and fails because,
in fact, there is no necessary connection between entropy and either 'orderliness'
or 'organization'.
3 THE 'MEANING' OF ENTROPY
Consider a physico-chemical system subject to the constraint that it
is closed. Let it change during a period of time between its states numbered
1 and 2 which are determined by a sufficient number of macroscopic variables
such as energy, volume and composition. In thermodynamics its entropy change
is defined by the relation

\[
S_2 - S_1 = \int_1^2 \left( \frac{dq}{T} \right)_{\text{rev.}} \tag{1}
\]

where dq is the heat intake over any infinitesimal part of the change,
T is the corresponding temperature (absolute) and the subscript 'rev.'
denotes that the change is made to occur reversibly—i.e. through a sequence
of states none of which are displaced more than infinitesimally from equilibrium.5
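
As a worked illustration of (1) (not in the original; the figures are standard handbook values): for the reversible melting of one mole of ice at atmospheric pressure the temperature stays at \(T = 273.15\ \mathrm{K}\) and \(q_{\text{rev}} = \Delta H_{\text{fus}} \approx 6.01\ \mathrm{kJ}\), so the integral collapses to

\[
S_2 - S_1 = \frac{q_{\text{rev}}}{T} \approx \frac{6010\ \mathrm{J}}{273.15\ \mathrm{K}} \approx 22\ \mathrm{J\,K^{-1}\,mol^{-1}} .
\]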
Since (1) applies only to closed systems it leaves no scope for the
making of an immediate application of the entropy concept to open systems
such as living creatures. However, as noted above, open systems can often
be regarded as being embedded in a much larger closed system to which (1)
is indeed applicable. In fact one can define an entropy flow vector between
the open system and the closed system which contains it.
According to thermodynamics entropy is nothing other than what it is defined
to be by (1). Of course entropy has interesting and useful properties, but
it is (1) which supplies its meaning. On the other hand statistical mechanics
(whose objective is to link thermodynamics with atomic and kinetic theory)
produces several alternative definitions of statistical entropy. None of
them is precisely the same as the thermodynamic entropy and none of them
is identical with the others.6 Gibbs called
them 'entropy analogues' and this is a good usage because it helps to reduce
confusion (a confusion which has subsequently been greatly increased by
the emergence of various 'information entropies').
It remains the thermodynamic entropy to which the Second Law refers
and indeed it has been one of the long-standing problems of statistical
mechanics to achieve a rigorous proof that any of the analogues display
an increasing tendency.
In the present context the most perspicuous analogue is the one associated
with the names of Boltzmann and Planck:

\[
S_{BP} = k \ln W \tag{2}
\]

where k is Boltzmann's constant.7 S_BP is
the statistical entropy of a closed system constrained to a fixed energy,
volume and composition (and such other macroscopic variables as may be
relevant). It corresponds to what Gibbs called the 'micro-canonical ensemble'.
W is then the number of independent quantum states (energy eigenstates)
which are accessible to that system. S_BP is thus a logarithmic measure
of the extent to which the constraints limit the accessibility of the eigenstates
out of their otherwise infinite number; W is a measure of 'spread' (Guggenheim,
1949). (It may be added that it is a quite unnecessary gloss to assert
that either S_BP or S_G is a measure of our ignorance about the actual quantum
state momentarily occupied by the system.)8
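
By way of a hedged illustration (not in the original; the Einstein-solid model and all numbers are my own choices), a few lines of Python show W counting accessible eigenstates and S_BP = k ln W, the 'spread', growing as the energy constraint is relaxed:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def W(N, q):
    # Number of ways of distributing q energy quanta among N oscillators
    # (the 'Einstein solid'): a standard stars-and-bars count.
    return math.comb(q + N - 1, q)

def S_BP(N, q):
    # Boltzmann-Planck statistical entropy, equation (2): S = k ln W.
    return k * math.log(W(N, q))

# Raising the fixed energy (more quanta q) makes more eigenstates
# accessible, so W and the 'spread' S_BP both grow.
for q in (10, 100, 1000):
    print(f"q={q:5d}  W={W(50, q)}  S_BP={S_BP(50, q):.3e} J/K")
```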
If an 'interpretation' of entropy is desired, Guggenheim's notion of
'spread' seems to be the best candidate. Gibbs himself appears to have
toyed with the idea that entropy might be understood as 'mixed-up-ness'.9
Whether he would have gone ahead with the idea seems problematic, for he
was surely aware of the existence of counter-examples. Think for instance
of a finely divided emulsion of oil in water contained in an adiabatically
isolated container. As time goes on its entropy can only remain constant
or increase. Nevertheless the oil and water gradually become unmixed and
separate out as two liquid layers. The notion of entropy as mixed-up-ness
could then only be 'saved' by using the term in some non-obvious sense—e.g.
by resorting to the molecular rather than to the macro-level of description.
But if so one would surely do better to talk about Boltzmann and Planck's W.

In summary: if one wishes to substantiate a claim or a guess that some
particular process involves a change of thermodynamic or statistical entropy
one should ask oneself whether there exists a reversible heat effect, or
a change in the number of accessible energy eigenstates, pertaining to
the process in question. If not, there has been no change of physical entropy
(even though there may have been some change in our 'information').
4 ENTROPY, DISORDER AND DISORGANIZATION
Following these necessary preliminaries let us proceed to the 'interpretations'
which are the topics at issue—i.e. entropy regarded as disorder or as a
measure of disorganization. In my view these interpretations are erroneous,
or at best ambiguous, and in fact there are no pairwise relationships of
identity between any of the three terms appearing in the above heading.
Although I have made these points in previous publications I will here
present them in rather more formal terms.
It may be remarked first of all that neither the notion of disorder nor
that of disorganization is related at all obviously to the quantities on
the right hand sides of equations (1) and (2) which are defining equations
for thermodynamic and statistical entropy respectively. Indeed the 'interpretations'
at issue are dubious from the very beginning in view of the fact that 'entropy'
is a term which belongs firmly within science whereas the terms 'order'
and 'organization', and their negations, do not. Their meanings are very
broad and are subject to large variations according to context—the political
context, for example.

Let us first try to characterise 'order' and 'orderliness', and their
negations, in the restricted context of space and time, and then attempt
to do the same for 'organization'.
It is a question, I think, of whether or not the entities (objects
or events) under discussion are distributed in space and/or time according
to some rule. For example a set of three or more objects, A, B, C, etc.,
will display a certain kind of orderliness if they exist in a linear arrangement.
Thereby the objects obey the rule that B is to the right of A, that C is
to the right of B, etc., when viewed from anywhere on one side of the line.
The same objects will display an additional kind of orderliness if there
is also a relationship (e.g. of equality, of doubling, etc.) between successive
separations, AB, BC, etc. There is then a more comprehensive state of order.
Now natural objects will not usually lie exactly on a straight line,
or on any other simple geometric figure such as a circle or an ellipse.
The arrangement of the objects has to be recognized10
as approximating to that figure and the question therefore arises: how
large a standard deviation (or comparable measure) is permissible for a
given set of entities to qualify as orderly in some particular respect?
Indeed one might distinguish between 'order' and 'orderliness', the former
being taken as referring to an ideal state in which there is complete agreement
with the rule, and the latter being a measure of the extent to which some
particular set of real entities approximates to that ideal.
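
The author's question ('how large a standard deviation is permissible?') can be made concrete with a small sketch, not part of the original argument; the measure below is one hypothetical convention among many:

```python
import numpy as np

def rms_deviation_from_line(points):
    # Fit the least-squares straight line and return the RMS residual:
    # zero for complete 'order', growing as the arrangement degrades.
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return float(np.sqrt(np.mean(residuals ** 2)))

exact = [(0, 0), (1, 2), (2, 4), (3, 6)]          # complete 'order'
rough = [(0, 0.3), (1, 1.6), (2, 4.4), (3, 5.8)]  # mere 'orderliness'
print(rms_deviation_from_line(exact))  # ~0.0
print(rms_deviation_from_line(rough))  # > 0; how large is permissible?
```

Where to draw the cutoff between 'orderly' and 'not orderly' remains, as the text argues, a matter of convention.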
The foregoing points can now be considered in relation to entropy by
taking the example of a crystalline solid. There is certainly no difficulty
in appreciating that a crystal is an orderly arrangement of its component
particles in so far as these lie at positions close to the intersection
points of a geometric lattice. The question is: How orderly is it? For
the supposed interpretation of entropy as orderliness is useless unless
the concept of orderliness is capable of being made at least as quantitative
as is that of entropy itself, and this has not been achieved in any relevant way.

To see that this is so, recall that W in equation (2) is calculable,
at least to high approximation, for the case of a perfect lattice, and
it is also calculable after allowing for various kinds of crystal imperfections.
But there is no theory of orderliness independent of statistical mechanics
which would provide the means of calculating a crystal's degree of orderliness,
and which would therefore permit the claim: "Behold, we have shown
that W, and thus statistical entropy, is an exact measure of disorder!"11
The error or ambiguity which is involved in the identification of entropy
with disorder is well illustrated by an example I have used previously:
the spontaneous crystallisation of a super-cooled melt. Under adiabatic
conditions the entropy of this system increases, but it would involve special
pleading to substantiate a claim to the effect that its disorder also increases!
Similar considerations apply in other contexts. In my view some of the
many discussions in the literature on the evolution of the universe from
the Big Bang onwards have been weakened by attempts to apply the notions
of 'chaos' and 'disorder'—and also 'uniformity'—as if these were equivalent
to using the Second Law.
Let us turn to a consideration of 'organization'. It is the supposed
relation of 'disorganization' to entropy increase which is my primary interest
in this note because of its presumed biological significance.
In the biological literature the terms 'order' and 'organization' are
often used interchangeably as if they were synonyms. That this is not the
case is shown by the existence of counter-examples. For instance a patterned
wallpaper is surely much more orderly, in having almost exact repetition,
than is, say, a Cézanne, but is much less highly organized. Any great painting
displays organization to a high degree but its parts are not related to
each other by a rule, such as is characteristic of a state of order. Similarly
a living cell is more highly organized than is a crystal even though the
latter is much more orderly, at least in the spatio-temporal context. The
existence of these pairs of entities which display the qualities in question
in reverse order of ranking is sufficient to show that these qualities
cannot be the same.
So far so good; but the actual meaning of 'organized', as tacitly understood
in the last paragraph, is very difficult to specify, as has been recognised
and discussed by a number of philosophical biologists.12
Consider some examples where the concept 'organized' is manifestly applicable.
Just as the items depicted in a painting are spatially organized, so also
the notes and bars in a musical composition are temporally organized. Similarly
the component parts of a scientific theory or a mathematical treatise are
organized in a logical space.
Businesses and other sorts of institutions are exemplary cases and they
bring out once again the distinction between 'organized' and 'orderly'.
A diagram representing the structure of a business may be such that it
has none of the attributes of orderliness, such as symmetry and regularity;
and yet it clearly represents an organization in so far as it shows the
flow of work from one part of the business to another, and thus it relates
to time as well as to location.
There are many other instances of systems which are temporally, as well
as spatially, organized. For example in the living body the various processes
(chemical reactions, diffusions, etc) are clearly coordinated with each
other in time, as well as in space. Yet other examples of organized entities
are skilled performances, such as cycling, piano playing, using speech,
and so on, and also mechanical and electrical structures such as cars,
computers, etc. And indeed all things belonging to what Popper calls World 3.

In several of these examples, especially those relating to physical
structures, it may appear that the notion of 'organized' is much the same
as that of 'complexity'. Yet there is an important difference. A computer
is complex, but if a few of its internal connections were to be broken it
could no longer compute and, in my view, would then no longer be deemed
an organization, even though it remains complex.
In short I believe that the most essential aspect of what is meant by
an organized system is that the system in question has a 'function': it
can do something or can be used to do something. As examples a mathematical
treatise can be used to derive some further results and a legal system
can be used to achieve justice and orderliness in society (whilst not being
'orderly' in itself). Even so this notion of function remains ambiguous
when it is applied to living things. It would be altogether too vague to
claim that the function of an organism is to survive and to reproduce.
As I have remarked elsewhere (1981): '. . . an organism cannot be regarded
as an assembly of previously known components put together for the purpose
of achieving a specific objective . . . the "functions" of an
organism are read into it on the basis of hindsight . . .'.
Having thus briefly sketched what I believe to be the accepted usage
of the term 'organized' let me turn to the third possible equivalence—i.e.
that which might exist between entropy and disorganization. That this does
not in fact exist can be seen from the occurrence of counter-examples.
Think for instance of a fertile bird's egg inside an incubator. The latter
contains a sufficiency of air and was initially raised to a temperature
high enough for the hatching of the egg. The incubator was thereafter surrounded
by perfect thermal insulation with the consequence that its total entropy
can only increase or remain constant. However there remain two possibilities
concerning a different aspect of the system's temporal development: (1)
the egg dies; (2) the egg lives and eventually gives rise to a live chick.
Now it is true that in case (1) there is an entropy increase accompanied
by a process of disorganization, localised in the egg. But the opposite
is the situation in case (2). For although the egg is certainly a highly
organized system, the live chick must surely be deemed to be much more
so. Entropy again increases but now there is an increase in the degree
of organization as well. This example thus provides a clear instance of
its being false to suppose that entropy increase is equivalent to a process
of disorganization.

Notice that we habitually speak of some particular entity as being more
or less highly organized than another. This suggests that we have an intuitive
concept of amount or degree of organization, as was tacitly assumed in
the previous paragraph. If so, examples such as that of the egg dying or
hatching indicate that 'amount of organization' is not conserved—i.e. it
need not remain constant in time. Of course this is not to say that organisms
operate in a manner contrary to the Second Law. That is not the case at
all. The irreversible processes of metabolism, heat conduction etc., occurring
within organisms are entropy producing like any others. It is only to say
that changes in 'amount of organization' and of entropy can occur quite
independently of each other.
A similar conclusion was reached earlier in this note about changes
of 'orderliness' and of entropy being mutually independent. This bears
out what was claimed at the beginning—i.e. that the Second Law is a good
deal less restrictive than is commonly supposed. In addition to entropy
there may well exist other 'one-way functions'13
which add to the overall description of the world's temporal development.
REFERENCES

Bohm, D. [1980]: Wholeness and the Implicate Order. Routledge and Kegan Paul.
Bohm, D. [1987]: Foundations of Physics, 17, pp. 667-77.
Denbigh, K. G. [1981]: Three Concepts of Time. Springer-Verlag.
Denbigh, K. G. and Denbigh, J. S. [1985]: Entropy in Relation to Incomplete Knowledge. Cambridge University Press.
Fox, S. W. and Dose, K. [1972]: Molecular Evolution and the Origin of Life. Freeman.
Gibbs, J. W.: Collected Works, Vol. I, p. 418. Dover.
Guggenheim, E. A. [1949]: Research, 2, p. 450.
Linschitz, H. [1953]: in H. Quastler (ed.), Information Theory in Biology. University of Illinois Press.
Peacocke, A. R. [1983]: The Physical Chemistry of Biological Organization. Oxford University Press.
Penrose, O. [1970]: Foundations of Statistical Mechanics. Pergamon.
Prigogine, I. [1980]: From Being to Becoming. Freeman.
Prigogine, I. and Stengers, I. [1984]: Order Out of Chaos. Heinemann.
1This note is based on a paper given to a B.S.P.S. meeting at Sussex University
in September 1983, and on the book Entropy in Relation to Incomplete Knowledge
by K. G. Denbigh and J. S. Denbigh, Cambridge University Press, 1985.
2The term 'system'
here refers to whatever part of the physical world is being discussed.
For purposes of simplicity I have taken the terms 'heat' and 'energy' as
being understood prior to the defining of the four classes, even though
a systematic presentation of thermodynamics usually proceeds rather differently.
Perhaps it should be added that in the case of those systems such as the
stars in which there occurs a relativistic interconversion of matter and
other forms of energy the distinction between openness and closedness loses
its sharpness.
3In the case
of open systems the concept of 'heat' must be used with great caution,
due to the fact that when matter is transferred across the boundary of
a system it carries energy with it.
4It should be
mentioned that the constraints used for the definition of classes (a) and
(b) are not the only sorts of constraints used in thermodynamics. Instead
of talking about isolated systems we can discuss systems which are held
at constant volume and temperature by being contained in a rigid vessel
immersed in a sufficiently large thermostat. The system plus thermostat
is an instance of an 'enlarged system' as referred to above. Whereas the
entropy of the enlarged system tends to a maximum, the 'characteristic
function' of the sub-system held at constant volume and temperature is
the Helmholtz free energy, and this tends to a minimum. It is important
to notice however that the application of this criterion (or the corresponding
one in terms of the Gibbs free energy when the sub-system is held at constant
pressure and temperature) continues to be limited to closed systems. There
is no such criterion or tendency attributable to open systems.
5The fact that
this equation defines only a difference of entropy is of no relevance in
this note where the issue is whether entropy changes are related to changes
of 'orderliness' or of 'organization'.
6For a discussion
on the relationships between thermodynamic entropy and some of the statistical
entropies see, for example, Penrose [1970].
7A different analogue, the Gibbs entropy

\[
S_G = -k \sum_i p_i \ln p_i ,
\]

where p_i is the probability of the i'th energy eigenstate, is actually
more useful than is S_BP, even though it is less perspicuous. It corresponds
to the 'canonical ensemble' and the constraint of constant temperature
replaces that of constant energy.
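
A standard observation, not part of the original note: when the W accessible eigenstates are equally probable, \(p_i = 1/W\), the Gibbs analogue reduces to the Boltzmann-Planck form of equation (2):

\[
S_G = -k \sum_{i=1}^{W} \frac{1}{W}\,\ln\frac{1}{W} = k \ln W = S_{BP} .
\]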
8In information theory an 'informational entropy' is defined as being a
measure of the 'uncertainty' or of the 'missing information' relating to whatever entity
or event is under discussion. The definition has the same form as that
of S_G/k in the previous footnote, but in general the probabilities p_i do
not refer to the same events. Typically they are the probabilities of the
symbols occurring in a message, and in such cases the 'informational entropy'
has no bearing on the Second Law. Neither is it related to 'orderliness',
'organization' or 'complexity'. The matter is discussed by Denbigh and
Denbigh [1985].
9The phrase appears as the title 'On entropy as mixed-up-ness' in a list
of Gibbs' unpublished papers; see Gibbs, Collected Works, Vol. I, p. 418.
10That orderliness has to be recognised gives emphasis to the view that
it often has an essentially subjective character. This is also shown in instances where
orderliness depends on the choice of a convention—e.g. in the example of
that part of an 'ordered' sequence of playing cards which has been assigned
the values A>K>Q>J>10.
11A mathematization of the concept of order has been put forward by Bohm [1980]
but it does not provide what is needed in the present context. Nevertheless
in his [1987] paper he shows the extent to which a probabilistic treatment of
the implicate order links up with Prigogine's theory of entropy.
12For a review
with references see Peacocke [1983]. Bohm [1980, 1987] uses the term 'organized'
as meaning the 'working together', in a coherent way, of all aspects of a system.
13One might speculate, for instance, concerning the total amount of organization in the planetary
biospheres. Of course such possibilities do not imply vitalism. That there
can occur entirely natural processes of self-assembly (e.g. of collagen
stacks) was described by Fox and Dose [1972] and they went on to develop
their 'constructionist' theory of natural evolution. This supposes that
separate 'components' come together through the action of natural forces
and the resulting aggregate is then able to engage in certain physico-chemical
'functions' on which natural selection can operate.