Time’s Arrow and Entropy

K. Popper

Editor’s Note

Physicists since the late nineteenth century had speculated over the links between time, thermodynamics and entropy. A related question, discussed by Erwin Schrödinger, is how life on Earth seems to produce ever-increasing complexity, and therefore lower entropy. Here the philosopher Karl Popper suggests that the mystery of increasing biological order may actually have more to do with the cooling of the planet than with its perpetual warming by the Sun. Popper argues that there does not appear to be any special link between the second law of thermodynamics and either the nature of time or biological processes on Earth. Physicist Ilya Prigogine, mentioned here, went on to propose a non-thermodynamic, quantum origin for the “arrow of time”.

SEVERAL years ago1 I suggested that we should distinguish between two essentially different ways in which energy can be degraded or dissipated: “Dissipation in the form of increasing disorder (entropy increase) is one of them, and dissipation by expansion without increase of disorder is the other. For an increase of disorder, walls of some kind are essential: a sufficiently thin gas expanding in a ‘vessel without walls’ (that is, the universe) does not increase its disorder.” Reasons for this view were given in the place cited.

In order to explain this a little more precisely, I shall here introduce, following Prigogine2, the term “system” to denote the (energy and material) “contents of a well-defined geometrical volume of macroscopic dimensions” (so that, for example, an organism enclosed by its skin, or our solar system as enclosed by a sphere round the Sun with a radius of 10^5 light seconds, would be a “system”); and I shall speak of the “exterior” of a system X as a region of space (leaving it open whether or not this is in its turn a geometrically well-defined “system”) of which X forms a part.

Following Prigogine, I shall distinguish between (materially or at least energetically) “open” and “closed” systems. (An energetically closed system is called “isolated”.) Moreover, I shall call a system X “essentially open” if it is part of a system Y such that all geometrically convex systems of which Y is a part are (at least energetically) open. (This definition makes it possible even for an isolated system to be essentially open.)

I further call X “essentially open towards a cooler exterior” if X is enclosed by some convex system Y such that: (a) all elements of any sequence Z_i of convex systems of which Y is a part are essentially open and of a lower average temperature than Y, and that (b) for every such system Z_i there is a system Z_j which encloses Z_i and which is not of a higher average temperature than Z_i.

The terminology here introduced makes it possible to clarify a number of points in connexion with the second law of thermodynamics which seem in urgent need of clarification.

Again following Prigogine3, we can split the change of entropy dS_X in any system X into two parts: d_e S_X, or the flow of entropy due to interaction with the exterior of X, and d_i S_X, the contribution to the change of entropy due to changes inside the system X. We have, of course:

    dS_X = d_e S_X + d_i S_X        (1)

and we can express the second law by:

    d_i S_X ≥ 0        (2)


For an energetically closed (or “isolated”) system X, for which by definition d_e S_X = 0, expression (2) formulates the classical statement that entropy never decreases. But if X is open towards a cooler exterior,

    d_e S_X < 0        (3)

holds, and the question whether its total entropy increases or decreases depends, of course, on both its entropy production d_i S_X and its entropy loss d_e S_X.
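The bookkeeping in expressions (1)–(3) can be checked with a small numerical sketch. The figures below (a system at 300 K shedding 300 J of heat while producing 0.4 J/K of entropy internally) are invented for illustration only; they are not taken from the text:

```python
# Illustrative entropy bookkeeping for expressions (1)-(3).
# A system at temperature T_sys loses heat Q_out to a cooler exterior.
# By the Clausius relation, the entropy carried away is d_e S = -Q_out / T_sys
# (negative, as in expression (3)); internal production d_i S is non-negative
# (expression (2)); their sum is the total change dS (expression (1)).

def entropy_change(Q_out, T_sys, dS_internal):
    """Return (d_e S, d_i S, dS) for a system shedding heat Q_out at T_sys."""
    if dS_internal < 0:
        raise ValueError("second law: internal production d_i S must be >= 0")
    dS_e = -Q_out / T_sys            # entropy flow to the exterior, expression (3)
    dS_total = dS_e + dS_internal    # expression (1)
    return dS_e, dS_internal, dS_total

# A system radiating 300 J at 300 K while producing 0.4 J/K internally:
dS_e, dS_i, dS = entropy_change(Q_out=300.0, T_sys=300.0, dS_internal=0.4)
# d_e S = -1.0 J/K and d_i S = +0.4 J/K, so dS = -0.6 J/K: the system's
# total entropy decreases although d_i S >= 0 is never violated.
```

This makes Popper's point concrete: a net entropy decrease in an open system is entirely compatible with the second law in the form of expression (2).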

The fact that entropy can decrease in an open system X does not, of course, conflict with the second law as given by expression (2). But the second law is often formulated in a different way; for example, it is said that “if we … expand our system to include all the energy exchange, it would be found that in the larger system the entropy had increased. For example, to measure the entropy change taking place in living organisms as a whole, it would be necessary to include in our system the Sun and some additional portion of the universe, as well as the Earth itself”4. Thus it is suggested that for sufficiently large systems X of our universe, dS_X ≥ 0, so that the entropy always increases.

Yet, so far as our knowledge of the Universe goes, the precise opposite appears to be the case. With very few and short-lived exceptions, the entropy in almost all known regions (of sufficient size) of our universe either remains constant or decreases, although energy is dissipated (by escaping from the system in question). This is so, at any rate, if we assume that the law of conservation of energy is valid; and it is also so if we assume the “steady state” theory of the expanding universe. (It is not so on the assumption of a finite and non-expanding universe with non-zero energy density.)

In order to see this, all that is needed is to be clear about the empirical fact that in our universe we know only essentially open systems, and only systems X which, so far as they produce entropy at all, are essentially open towards a cooler exterior. (This is true even of all so-called “closed” or “isolated” systems.) But for all such systems, one of the following cases must hold: (a) they are (practically) stationary, like the solar system and most stars known to us, in which case their entropy production (practically) equals their entropy loss, at least temporarily; or (b) they are losing temperature, and thereby entropy; or (c) they are producing more entropy than they lose, in which case they are in process of getting hotter, a process which, whether we assume energy conservation or the steady state theory, can be only a comparatively rare and short-lived temporary process. (Even if the system in question should be one that collects matter from its environment until its gravitational field becomes so strong as to encapsulate and separate off the system from the rest of the universe, it would thereby presumably become stationary.) All we know about the universe points to (a) and (b) as being by far the most frequent and important cases: in almost all sufficiently large systems known to us, entropy production seems to be equalled, or even exceeded, by entropy loss through heat radiation.
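The trichotomy (a)–(c) amounts to comparing internal entropy production with the magnitude of the entropy lost to a cooler exterior. A minimal sketch of that comparison (the labels and tolerance are mine, added for illustration, not Popper's):

```python
# Classify a system by comparing internal entropy production d_i S (>= 0)
# with the magnitude of its entropy loss |d_e S| to a cooler exterior,
# following cases (a)-(c) in the text.  Labels are illustrative only.

def classify(dS_internal, dS_loss, tol=1e-9):
    """dS_internal = d_i S >= 0;  dS_loss = |d_e S| >= 0."""
    if abs(dS_internal - dS_loss) <= tol:
        return "(a) stationary: entropy production balances entropy loss"
    if dS_internal < dS_loss:
        return "(b) cooling: losing entropy on balance"
    return "(c) heating: producing more entropy than it loses (rare, short-lived)"
```

On Popper's reading of the evidence, almost all sufficiently large known systems fall under (a) or (b).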

This may be explained by the conjecture that every entropy-producing region is open towards some large (perhaps infinite) sinks of energy—regions the energy capacity or heat capacity of which, at least for heat in the form of radiation, is infinite (or approximately so for all practical purposes). The existence of such sinks seems to be strongly indicated by the darkness of the night sky. (We might represent this conjecture by the model of an infinite universe with zero energy density; or by that of an energy-conserving expanding—and therefore cooling and entropy-destroying—universe which tends towards zero energy density; or by that of an expanding steady-state universe with constant temperature, and entropy production equalled by entropy escape.)

So there do not seem to be theoretical or empirical reasons to attribute to expression (2) any cosmic significance or to connect “time’s arrow” with that expression; especially since the equality sign in expression (2) may hold for almost all cosmical regions (and especially for regions empty of matter). Moreover, we have good reason to interpret expression (2) as a statistical law; while the “arrow” of time, or the “flow” of time, does not seem to be of a stochastic character: nothing suggests that it is subject to statistical fluctuation, or connected with a law of large numbers.

As for the evolution of life, this seems to be connected, if at all, with a cooling rather than a heating process on Earth (or perhaps with periodic temperature fluctuations); that is, with increasing order and decreasing entropy. Yet it does not seem that “feeding on neg-entropy” has much to do with the preservation of life, as has been suggested, for example, by Schroedinger5. For during the incubation of birds’ eggs entropy rather than neg-entropy is supplied to them, though they are in a period of increasing organization; and while in an organism dying of heat or of fever entropy may increase, if it dies of cold—say, by deep-freezing—its entropy certainly decreases.

(207, 233-234; 1965)

Karl Popper: University of London.


References:

  1. Popper, K. R., Nature, 178, 381 (1956); 177, 538 (1956); 179, 1296 (1957); 181, 402 (1958); Brit. J. Phil. Sci., 8, 151 (1957).

  2. Prigogine, I., Introduction to Thermodynamics of Irreversible Processes, 3 (1955).

  3. Prigogine, I., Introduction to Thermodynamics of Irreversible Processes, 16 (1955).

  4. Blum, Harold F., Time’s Arrow and Evolution, 15 (1951). (Similar statements are to be found, for example, on pages 16, 24, 33, 201.) Compare also Planck, M., A Survey of Physics, 17, 27 (1925).

  5. Schroedinger, E., What is Life? 72 (1944).