In my view this kind of claim is mistaken, on two levels. For one thing, it involves some fallacies which have characterised attempts to explain the thermodynamic arrow of time since the subject's beginnings in the nineteenth century. Worse still--but again like many previous accounts--it misunderstands the nature of the problem, and simply addresses the wrong issue. As I say, both of these are old mistakes, but they are mistakes which are very poorly recognised, even today. My plan in this paper is to lay bare the mistakes in their classical (pre-chaos theory) manifestations, in the hope of making it clear that chaos theory cannot possibly do better.
By the mid-nineteenth century, however, the new science of thermodynamics provided a unifying framework for these familiar processes. They all exemplify the same general tendency--"a universal tendency in nature to the dissipation of mechanical energy," as William Thomson describes it in a note of that title in 1852, or "the universal tendency of entropy to increase," as Rudolf Clausius puts it in the paper in which the term "entropy" is first introduced, in 1865.
The fact that the Second Law is time-asymmetric was not initially seen as problematic. What eventually made it problematic was the success of the attempt to connect thermodynamics to microphysics. By the 1870s, it was already becoming clear that thermodynamic phenomena were a manifestation of the mechanical behavior of the microscopic constituents of matter.
Let me mention a couple of the crucial steps. In his 1867 paper "On the Dynamical Theory of Gases," James Clerk Maxwell shows how to derive a number of characteristics of the macroscopic behavior of gases from a theory concerning the statistical distribution of the velocities of their component molecules. In particular, he derives an expression for the distribution of molecular velocities in a gas in thermal equilibrium. This came to be called the Maxwell distribution. Maxwell's work was taken up by Ludwig Boltzmann, who makes the connection between statistical mechanics and the Second Law of thermodynamics absolutely explicit. Like Maxwell, Boltzmann considers the effects of collisions on the distribution of velocities of the molecules of a gas. He argues that no matter what the initial distribution of velocities, the effect of collisions is to make the distribution of velocities approach the Maxwell distribution.
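In modern notation (a standard statement of Maxwell's result, rather than his own), the distribution says that in equilibrium the proportion of molecules with speeds near $v$ is

    $$f(v) \propto v^2 \exp\!\left(-\frac{m v^2}{2kT}\right),$$

where $m$ is the molecular mass, $T$ the absolute temperature, and $k$ the constant we now name after Boltzmann.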
This result is called Boltzmann's H-theorem. Boltzmann defines a quantity, H, which takes its minimum possible value when the gas has the Maxwell distribution, and which always decreases with time as a result of collisions (if not already at its minimum value). Moreover, Boltzmann saw that there was a direct connection between the H-theorem and Clausius's principle. In the case of a gas already in equilibrium, Boltzmann's quantity H is equivalent to -E, where E is the entropy as defined by Clausius. Boltzmann suggested that H provides a generalized notion of entropy. In showing that H always decreases, the H-theorem thus amounted to a proof of a generalized Second Law for the case of gases: a proof that entropy always increases.
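In modern notation (again a standard formulation, not Boltzmann's own), with $f(v,t)$ the velocity distribution at time $t$, the quantity is

    $$H(t) = \int f(v,t)\,\ln f(v,t)\,d^3v,$$

and the theorem asserts that, given an assumption discussed below, collisions guarantee $dH/dt \le 0$, with equality precisely when $f$ is the Maxwell distribution.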
Where does the asymmetry of the Second Law come from, on this account? The significance of this issue was not noticed for another twenty years or so, until it was raised by E. P. Culverwell, of Trinity College, Dublin, who was puzzled by the fact that the H-theorem seemed to produce asymmetry out of thin air. Culverwell was quite right to be puzzled, but the discussion which his intervention generated, in the pages of Nature in the mid-1890s, does not really do justice to his main point: if we are interested in explaining where the thermodynamic asymmetry comes from--in particular, how it manages to arise in a world in which the underlying mechanical laws seem to make no distinction between past and future--we must be careful not simply to pass the buck; i.e., to shift the burden of explanation from one place to another, while leaving it no less puzzling.
In fact, the asymmetry of the H-theorem stems from an apparently innocuous assumption that Boltzmann had borrowed from Maxwell. Sometimes called the assumption of molecular chaos, it is the assumption that the velocities of the molecules of a gas are independent before they collide with one another. If this were assumed to hold after collisions, as well as before, the H-theorem would not yield an asymmetry. It would simply imply that entropy is non-decreasing in both directions; i.e., that it is always at its maximum value. But why should the assumption hold in one case but not the other? At best, then, the H-theorem simply replaces one puzzle about time asymmetry with another.
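In modern terms the assumption--Boltzmann's Stosszahlansatz--is a factorization condition: for any pair of molecules about to collide, the joint velocity distribution satisfies

    $$f_2(v_1, v_2) = f(v_1)\,f(v_2);$$

that is, the incoming velocities are uncorrelated. The asymmetry of the H-theorem enters at just this point, for the condition is imposed before collisions but not after; the collisions themselves correlate the outgoing velocities.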
Long before Culverwell raised this challenge to the H-theorem, however, Boltzmann himself had already arrived, rather tentatively, at a view of the nature of the problem which actually makes the H-theorem redundant. In a sense, then, Culverwell was flogging a dead horse. To be fair, however, few people then or since have noticed that the horse is dead. Even Boltzmann, who had dealt his own horse the fatal blow some fifteen years previously, entered the fray on the poor creature's behalf.
This point was already appreciated by Maxwell in the late 1860s. Maxwell's Demon is an imaginary creature who segregates the fast and slow molecules of a gas, thereby making it hot in one region and cold in another. The point of the story is that what the Demon does deliberately might happen by accident, and so the tendency of gases to reach thermal equilibrium can be nothing more than a tendency--it cannot be an exceptionless law. Even more tellingly, Maxwell and Thomson realised that for any sample of gas in the process of approaching equilibrium, there is another in the process of departing from equilibrium: namely, the sample we get by exactly reversing the motions of all the molecules of the original. The determinism of Newtonian mechanics implies that this new sample will simply retrace the history of the old one, so that if the gas was originally becoming more uniform, the reversed sample will become less uniform. (Don't make the mistake of thinking that the point depends on the practicality of such a reversal: what matters is that by reversing the motions on paper, we describe a physically possible state in which entropy decreases.) Again, the conclusion is that if the Second Law is to be grounded on a statistical treatment of the behavior of the microscopic constituents of matter, it cannot be an exceptionless principle.
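Put formally (a standard rendering of the point, not spelled out in the text): the Newtonian equations of motion are invariant under time reversal, so that if $(x(t), v(t))$ is a possible history of the molecules, so is the reversed history $(x(-t), -v(-t))$; and if entropy increases along the first, it decreases along the second.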
The idea of reversal of motions also occurred to Boltzmann's Viennese colleague, Franz Loschmidt, and it is by this route that the so-called "reversibility paradox" came to Boltzmann's attention. Loschmidt's argument convinced Boltzmann that the Second Law is of a statistical rather than an exceptionless nature, and he moved on to embrace a new conception of the nature of entropy itself. Applied to the case of gases, the crucial idea is that a given condition of the gas will normally be realizable in many different ways: for any given description of the gas in terms of its ordinary or "macroscopic" properties, there will be many different microstates--many different arrangements of the constituent molecules--which would produce the macrostate concerned.
If all possible microstates are assumed to be equally likely, the gas will spend far more of its time in some macrostates than in others--a lot of time in states that can be realized in many ways, and little time in states that can be realized in few ways. From here it is a short step to the idea that the equilibrium states are those of the former kind, and that the entropy of a macrostate is effectively a measure of its probability, in these microstate-counting terms. Why then does entropy tend to increase, on this view? Simply because from a given starting point there are very many more accessible microstates corresponding to higher-entropy macrostates than microstates corresponding to lower-entropy macrostates.
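This is the conception encapsulated in the relation $S = k \log W$ (carved, in Planck's notation, on Boltzmann's tombstone), where $W$ is the number of microstates realizing the given macrostate. A toy count shows the scale of the disparities involved. Distribute $N = 100$ molecules between the two halves of a box: the macrostate with $n$ molecules on the left can be realized in $\binom{100}{n}$ ways, so the even state, with $\binom{100}{50} \approx 10^{29}$ realizations, is about $10^{29}$ times more probable than the state with every molecule on one side, which can be realized in only one way.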
This account builds the statistical considerations in from the start. Hence it makes explicit the first lesson of the reversibility objection, viz. that the Second Law is not exceptionless. Moreover, it seems to bypass the H-theorem, by attributing the general increase of entropy in gases not to the effects of collisions as such, but to broader probabilistic considerations. Where does the asymmetry come from, however, if not from the assumption of molecular chaos?
The answer, as Boltzmann saw, is that there is no asymmetry in this new statistical argument. The above point about entropy increase towards (what we call) the future applies equally towards (what we call) the past. From a given starting point there are very many more possible histories for the gas that correspond to higher-entropy macrostates in its past than histories that correspond to lower-entropy macrostates. In so far as the argument gives us reason to expect entropy to be higher in the future, it also gives us reason to expect entropy to have been higher in the past. Suppose we find a gas sample unevenly distributed at a particular time, for example. If we consider the gas's possible future, there are many more microstates which correspond to a more even distribution than to a less even distribution. Exactly the same is true if we consider the gas's possible past, however, for the statistical argument simply relies on counting possible combinations, and doesn't know anything about the direction of time.
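The time-symmetry of the counting argument is easy to exhibit in a toy model. Here is a minimal sketch (my illustration, using the Kac ring model, a standard pedagogical device rather than anything discussed in this paper): a deterministic, exactly reversible dynamics under which an atypical "all one colour" state relaxes towards the 50/50 equilibrium mixture whether it is evolved forwards or backwards in time.

    import random

    # Kac ring: N sites on a circle, each holding a ball that is white (0)
    # or black (1). A fixed subset of the N edges between adjacent sites is
    # "marked". In one forward step every ball moves one site clockwise,
    # and a ball crossing a marked edge flips colour. The map is
    # deterministic and exactly reversible.

    random.seed(1)
    N = 2000
    marked = [random.random() < 0.1 for _ in range(N)]  # edge i joins sites i and i+1

    def forward(balls):
        # the ball arriving at site i came from site i-1, crossing edge i-1
        return [balls[i - 1] ^ marked[i - 1] for i in range(N)]

    def backward(balls):
        # exact inverse of forward: move counterclockwise, same flips
        return [balls[(i + 1) % N] ^ marked[i] for i in range(N)]

    start = [0] * N  # atypical, "low entropy" state: all balls white
    fwd = bwd = start
    for _ in range(10):
        fwd, bwd = forward(fwd), backward(bwd)

    # the mixture moves towards the equilibrium value 0.5 in BOTH
    # temporal directions from the atypical starting state
    print(sum(fwd) / N, sum(bwd) / N)

    # yet the dynamics is reversible: retracing our steps recovers the
    # atypical starting state exactly
    state = fwd
    for _ in range(10):
        state = backward(state)
    assert state == start

The model also makes the reversibility objection concrete: the very same rules that carry the atypical state towards equilibrium, run in the other direction, carry an equilibrium-looking state back to uniformity.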
Boltzmann seems to have been the first person to appreciate this point, and he notes it explicitly in his reply to Loschmidt.
Thus the reversibility objection--originally seen by Maxwell, Thomson and Loschmidt just as an argument against the exceptionless character of the Second Law--led Boltzmann to a more important conclusion. All the same, even Boltzmann doesn't seem to have seen how thoroughly this conclusion undermines the project of his own H-theorem, which is to explain why entropy increases towards the future.
Think of the problem with the temporal perspective reversed. From this perspective it seems that entropy is high in the past, and always decreases. The universal tendency to decrease towards what we now take to be the future looks puzzling, of course, but might be taken to be adequately explained if we could show that the laws of physics somehow impose an appropriate boundary condition in that direction. Why does entropy decrease? Because it is a consequence of certain laws that the entropy of the universe must be low at some point in the future. Each individual decrease would thus be explained as a contribution to the required general decrease.
Do we need to explain why entropy is high in the past, in this picture? No, for according to the statistical account, this is not an unusual way for the past to be. All we need to do is to note that in that direction the universe does not appear to be subject to the boundary constraint that imposes low entropy towards the future. From this reversed perspective, in other words, the real work of explaining why entropy shows a universal tendency to decrease is done by an account of why it is low at a certain point in the future, together with the remark that the past is not similarly in need of explanation.
However, if we accept that this is a satisfactory account of what we would see if we looked at our universe in reverse, it is hard to maintain that it is not a satisfactory explanation of what we actually do see--for the difference between the two views lies in our perspective, not in the objective facts in need of explanation. And so from the ordinary viewpoint, all the work is done by the account of why entropy is low in the past. The future needs no more than a footnote to the effect that no such constraint appears to operate there, and that what we foresee in that direction is not in need of explanation, for it is the normal way for matter to be.
This conclusion applies to the countless individual processes in which entropy increases, as well as to the Second Law in general. Consider what happens when we remove the top from a bottle of beer, for example: pressurized gas and liquid escape from the bottle. Traditionally it has been taken for granted that we need to explain why this happens, but I think this is a mistake. The beer escapes simply because its initial microstate is such that this is what happens when the bottle is opened. As the tradition recognizes, however, this isn't much of an explanation, for we now want to know why the initial microstate is of this kind. But the correct lesson of the statistical approach is that this kind of microstate doesn't need explanation, for it is (overwhelmingly) the most natural condition for the system in question to possess. What does need to be explained is why the microstate of the beer is such that, looked at in reverse, the beer enters the bottle; for it is in this respect that the microstate is unusual. And in the ordinary time sense, this is just a matter of explaining how the beer comes to be in the bottle in the first place.
Science often changes our conception of what calls for explanation and what does not. Familiar phenomena come to be seen in a new light, and often as either more or less in need of explanation as a result. One crucial notion is that of normalcy, or naturalness. Roughly, things are more in need of explanation the more they depart from their natural condition, but science may change our view about what constitutes the natural condition. The classic example concerns the change that Galileo and Newton brought about in our conception of natural motion. I think the lessons of the Second Law should be seen in a similar light. Thermodynamic equilibrium is a natural condition of matter, and it is departures from this condition that call for explanation.[1]
In my view, then, the problem the H-theorem addresses--that of explaining why entropy increases--has been vastly overrated. The puzzle is not about how the universe reaches a state of high entropy, but about how it comes to be starting from a low one. It is not about (what appears in our time sense to be) the destination of the great journey on which matter is engaged, but about the starting point.
Like many other approaches, then, chaos theory is flogging a dead horse in trying to account for the arrow of time by explaining why entropy increases. Even if the horse were alive, however, this use of chaos theory would still be vulnerable to Culverwell's century-old objection to Boltzmann's program, namely that a symmetric theory is bound to have the same consequences in both temporal directions.
A particularly powerful way to apply Culverwell's insight is as follows. Suppose that the proponents of the non-linear dynamical methods--or any other dynamical method, for that matter--claim that despite the fact that it is a symmetric theory, it produces asymmetric consequences in thermodynamics. To undermine their claim, we describe an example of the kind of physical system to which the new method is supposed to apply, specifying its state at some time t. We then ask our opponents to tell us the state of the system at another time, say t + 1, without being told whether t + 1 is actually earlier or later than t. (That is, without being told whether a positive time interval in our description corresponds to a later or an earlier time in the real world.) If our opponents are able to produce an answer without this extra information, then their theory must be time-symmetric, for it generates the same results in both temporal directions. If they need the extra information, on the other hand, this can only be because at some point their theory treats the two directions of time differently--like Boltzmann's original H-theorem, in effect, it slips in some asymmetry at the beginning. So in neither case do we get what the advocates of this approach call "symmetry-breaking": a temporal asymmetry which arises where there was none before. Either there is no temporal asymmetry at any stage, or it is there from the beginning.
For all their intrinsic interest, then, the new methods of non-linear dynamics do not throw new light on the asymmetry of thermodynamics. Writers who suggest otherwise have failed to appreciate the real puzzle of thermodynamics--Why is entropy low in the past?--and failed to see that no symmetric dynamical theory could yield the kind of conclusions they claim to draw.[2]
2. I discuss the issues raised in this paper at greater length in Price (1996), chapter 2. I am grateful to the participants in the ISST Conference in Sainte-Adèle for comments on an earlier version of the paper, and especially to John and Roma Henderson, for allowing me to spend the week of the Conference at the delightful Maison des Mouches, Lake Châtillon.
Brush, S., 1966: Kinetic Theory. Volume 2: Irreversible Processes, Oxford: Pergamon Press.
Clausius, R., 1865: Annalen der Physik, Series 2, 125, 426.
Coveney, P. and Highfield, R., 1990: The Arrow of Time, London: W. H. Allen.
Maxwell, J. C., 1867: "On the Dynamical Theory of Gases," Philosophical Transactions of the Royal Society, 157, 49; reprinted in Brush (1966), 23.
Penrose, R., 1989: The Emperor's New Mind, Oxford: Oxford University Press.
Price, H., 1996: Time's Arrow and Archimedes' Point, New York: Oxford University Press.
Prigogine, I. and Stengers, I., 1985: Order Out of Chaos: Man's New Dialogue with Nature, London: Flamingo.
Thomson, W., 1852: "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy," Proceedings of the Royal Society of Edinburgh, 3, 139.