{\rtf1\mac\deff2 {\fonttbl{\f0\fswiss Chicago;}{\f2\froman New York;}{\f3\fswiss Geneva;}{\f4\fmodern Monaco;}{\f13\fnil Zapf Dingbats;}{\f18\fnil Zapf Chancery;}{\f20\froman Times;}{\f21\fswiss Helvetica;}{\f22\fmodern Courier;}{\f23\ftech Symbol;}
{\f135\fnil I Courier Oblique;}{\f136\fnil B Courier Bold;}{\f137\fnil BI Courier BoldOblique;}{\f138\fnil I Helvetica Oblique;}{\f139\fnil B Helvetica Bold;}{\f140\fnil BI Helvetica BoldOblique;}{\f150\fnil I Times Italic;}{\f151\fnil B Times Bold;}
{\f152\fnil BI Times BoldItalic;}{\f2010\fnil Times New Roman;}{\f6415\fnil TektoMM_100 LT 250 CN;}{\f6416\fnil TektoMM_240 RG 250 CN;}{\f6417\fnil TektoMM_503 BD 250 CN;}{\f6418\fnil TektoMM_100 LT 564 NO;}{\f6419\fnil TektoMM_240 RG 564 NO;}
{\f6420\fnil TektoMM_503 BD 488 NO;}{\f6421\fnil TektoMM_100 LT 850 EX;}{\f6422\fnil TektoMM_240 RG 850 EX;}{\f6578\fnil TektoMM_503 BD 850 EX;}{\f14560\fnil AGaramond SemiboldItalic;}{\f14561\fnil AGaramond Semibold;}{\f14562\fnil AGaramond;}
{\f14563\fnil AGaramond Italic;}}{\colortbl\red0\green0\blue0;\red0\green0\blue255;\red0\green255\blue255;\red0\green255\blue0;\red255\green0\blue255;\red255\green0\blue0;\red255\green255\blue0;\red255\green255\blue255;}{\stylesheet{\s242 \f14562
\sbasedon0\snext0 page number;}{\s243\tqc\tx3969\tqr\tx8504 \f14562 \sbasedon0\snext243 footer;}{\s244\tqc\tx3969\tqr\tx8504 \f14562 \sbasedon0\snext244 header;}{\s245 \f14562\up6 \sbasedon0\snext0 footnote reference;}{
\s246\fi-680\li680\ri10\sl-480\tx680 \v\f14562 \sbasedon0\snext246 footnote text;}{\f14562 \sbasedon222\snext0 Normal;}{\s1\qj\li840\sb240\tx840 \f14562 \sbasedon0\snext1 quotation;}{\s2\fi-280\li280\sb240\sl-480 \f14562 \sbasedon8\snext2 NewListLeft;}{
\s3\sl480 \f20 \sbasedon0\snext3 huw;}{\s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 \sbasedon0\snext4 NewHead;}{\s5\qj\li567\ri567\sb360\sa360\sl-480 \f14562 \sbasedon0\snext5 NewQuote;}{\s6\fi567\sl-480 \f14562 \sbasedon0\snext6 NewBodyText;}
{\s7\sl-480 \f14562 \sbasedon6\snext7 NewFirstText;}{\s8\fi-280\li560\sb120\sl-480 \f14562 \sbasedon6\snext8 NewList;}{\s9\qc\sa240\sl480\keepn \f20\fs28 \sbasedon0\snext9 section heading;}{\s10\sb240\sl480 \b\f20\fs28 \sbasedon0\snext10 standard;}{
\s11\fi-580\li720\ri240\sb240\sa240\sl-480 \f14562 \sbasedon6\snext11 NewExample;}{\s12\sb240\sl360 \f20\fs28 \sbasedon0\snext12 Normal times;}{\s13\sl280\tx1162 \f20 \sbasedon0\snext13 top;}{\s14\qj\li708\sl360 \f20 \sbasedon0\snext14 quotes;}{\s15\sl480
\box\brdrs \posxc\dxfrtext180 \f20 \sbasedon0\snext15 picture;}{\s16\fi-280\li560\sb120\sa120\sl-480 \f14562 \sbasedon8\snext16 NewListLastItem;}{\s17\fi-280\li280\sb240\sa240\sl-480 \f14562 \sbasedon2\snext17 NewListLeftLast;}{
\s18\qj\fi280\li567\ri567\sl-480 \f14562 \sbasedon5\snext18 NewQuoteContinued;}{\s19\qj\li567\ri567\sb360\sl-480 \f14562 \sbasedon5\snext19 NewQuoteFirstOfSeveral;}{\s20\qj\fi280\li567\ri567\sa240\sl-480 \f14562 \sbasedon18\snext20
NewQuoteLastOf Several;}{\s21\qc\sb480\sl-480\keep\keepn \f14563\fs28\expnd2 \sbasedon4\snext21 NewStoryHead;}{\s22\ri7\sa960\sl-480 \f14562\fs36 \sbasedon0\snext22 NewChapterHead;}{\s23\qj\li311\sl360 \f20\fs20 \sbasedon0\snext23 z;}{
\s24\phmrg\posxc\posyc\dxfrtext567 \f20 \sbasedon0\snext24 NewPicture;}{\s25\fi-140\li680\sl-480\tqr\tx6280 \f14562 \sbasedon8\snext25 NewContentItem;}{\s26\qc\fi-720\li720\sb140\sl-280\keep\keepn\tx560\tqr\tx6260 \f14563\fs26\expnd2 \sbasedon21\snext26
NewContentHead;}{\s27\ri7\sa240\sl-360\pagebb \f14562\fs48 \sbasedon22\snext27 New Chapter Number;}{\s28\qc\fi-720\li720\sb280\sl-280\keep\keepn\tx560\tqr\tx6260 \scaps\f14563\expnd2 \sbasedon26\snext28 NewContentPartHead;}}
\paperw11900\paperh16840\margl1418\margr1418\margt1417\margb1417\deftab709\ftnbj \sectd \linemod0\linex0\headery851\footery851\cols1\endnhere {\footer \pard\plain \s243\qc\tqc\tx3969\tqr\tx8504 \f14562 {\fs20 \endash }{\fs20 \chpgn }{\fs20 \endash
\par
}}\pard\plain \s22\qc\ri7\sa160\sl-480 \f14562\fs36 {\i\f20\fs28 Chaos Theory and the Difference between Past and Future\par
}{\plain \f20 Huw Price\par
}\pard \s22\qc\ri7 {\plain \f20 School of Philosophy\par
}\pard \s22\qc\ri7\sa240 {\plain \f20 University of Sydney\par
\par
}\pard\plain \brdrt\brsp240\brdrhair \brdrb\brsp120\brdrhair \f14562 {\b\f20 Summary: }{\f20 Contemporary writers often claim that chaos theory explains the thermodynamic arrow of time. This paper argues that such claims are mistak
en, on two levels. First, they underestimate the difficulty of extracting asymmetric conclusions from symmetric theories. More important, however, they misunderstand the nature of the puzzle about the temporal asymmetry of thermodynamics, and simply addres
s the wrong issue. Both of these are old mistakes, but mistakes which are poorly recognised, even today. This paper aims to lay bare the mistakes in their classical (pre-chaos theory) manifestations, in order to make it clear that chaos theory cannot possi
bly do better. \par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 Introduction\par
}\pard\plain \s7\sl-480 \f14562 {\f20 Contemporary writers sometimes claim that chaos theory explains important differences between the past and the future. In their book }{\i\f20 The Arrow of Time,}{\f20
for example, Peter Coveney and Roger Highfield say that \ldblquote [t]his idea is beginning to lay the basis for the solution to a problem which has beset science since the time of Boltzmann.\rdblquote They suggest that because \ldblquote
dynamical chaos is the rule, not the exception, with our world, ... the past is fixed [but] the future remains open and we rediscover the arrow of time.\rdblquote (1990, pp. 37\endash
8). This view is particularly associated with the Brussels School, led by Ilya Prigogine, who is famous for his work on the behavior of systems far from thermodynamic equilibrium. In his book with Isabelle Stengers, Prigogine himself says that
\par
}\pard\plain \s5\qj\li567\ri567\sb360\sa360\sl-480 \f14562 {\f20\fs20
irreversibility emerges ... from instability, which introduces irreducible statistical features into our description. Indeed, what could an arrow of time mean in a deterministic world in which both future and pa
st are contained in the present? It is because the future is not contained in the present and that we go from the present to the future that the arrow of time is associated with the transition from present to future. (1985, p. 277)\par
}\pard\plain \s7\sl-480 \f14562 {\f20 Prigogine and Stengers also speak of the \ldblquote construction of irreversibility out of randomness,\rdblquote where by \ldblquote randomness\rdblquote
they mean the kind of irreducible unpredictability associated with chaos theory and non-linear dynamics. \par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 In my view this kind of claim is mistaken, on two levels. For one thing, it involves some fallacies which have characterised attempts to explain the thermodynamic arrow of time since the subject\rquote
s beginnings in the nineteenth century. Worse still\emdash but again like many previous accounts\emdash it misunderstands the nature
of the problem, and simply addresses the wrong issue. As I say, both of these are old mistakes, but they are mistakes which are very poorly recognised, even today. My plan in this paper is to lay bare the mistakes in their classical (pre-chaos theory) manifes
tations, in the hope of making it clear that chaos theory cannot possibly do better. \par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 The mechanical arrow\par
}\pard\plain \s7\sl-480 \f14562 {\f20
By the end of the eighteenth century, science was beginning to notice a range of phenomena with the interesting property that they do not occur in reverse: the diffusion of gases, the dissipative effects of frictional forces, and the flow of heat from warm
bodies to cold, for example. But irreversibility is }{\i\f20 so}{\f20 familiar, so ubiquitous, that it is difficult to see it as a proper concern of physics, and the idea that these processes embody some natural law was slow in coming. \par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 By the mid-nineteenth century, however, the new science of thermodynamics provided a unifying framework for these familiar processes. They all exemplify the same general tendency\emdash \ldblquote
a universal tendency in nature to the dissipation of mechanical energy,\rdblquote as William Thomson describes it in a note of that title in 1852, or \ldblquote the universal tendency of entropy to increase,\rdblquote
as Rudolf Clausius puts it in the paper in which the term \ldblquote entropy\rdblquote is first introduced, in 1865.\par
The fact that the Second Law is time-asymmetric was not initially seen as problematic. What eventually made it problematic was the success of the attempt to connect thermodynamics to microphysics. By the 1870s, it was already becoming clear that thermodynamic
phenomena were a manifestation of the mechanical behavior of the microscopic constituents of matter. \par
Let me mention a couple of the crucial steps. In his 1867 paper \ldblquote On the Dynamical Theory of Gases,\rdblquote James Clerk Maxwell sho
ws how to derive a number of characteristics of the macroscopic behavior of gases from a theory concerning the statistical distribution of the velocities of their component molecules. In particular, he derives an expression for the distribution of molecula
r velocities in a gas in thermal equilibrium. This came to be called the Maxwell distribution. Maxwell\rquote
s work was taken up by Ludwig Boltzmann, who makes the connection between statistical mechanics and the Second Law of thermodynamics absolutely explicit.
Like Maxwell, Boltzmann considers the effects of collisions on the distribution of velocities of the molecules of a gas. He argues that no matter what the initial distribution of velocities, the effect of collisions is to make the distribution of velocit
ies approach the Maxwell distribution. \par
This result is called Boltzmann\rquote s }{\i\f20 H}{\f20 -theorem. Boltzmann defines a quantity, }{\i\f20 H,}{\f20 which takes its minimum possible value when the gas has the Maxwell distribution, and always decreases with time as a result of collisions
(if not already at its minimum value). Moreover, Boltzmann saw that there was a direct connection between the }{\i\f20 H}{\f20 -Theorem and Clausius\rquote s principle. In the case of a gas already in equilibrium, Boltzmann\rquote s quantity }{\i\f20 H}{
\f20 is equivalent to minus the entropy, as defined by Clausius. Boltzmann suggested that }{\i\f20 H}{\f20 provides a generalized notion of entropy. In showing that }{\i\f20 H}{\f20 always decreases, the }{\i\f20 H}{\f20
-Theorem thus amounted to a proof of a generalized Second Law for the case of gases; in other words, it amounted to a proof that entropy always }{\i\f20 increases}{\f20 .\par
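In modern notation (my gloss; the paper itself gives no formulas), Boltzmann's quantity for a dilute gas with one-particle velocity distribution f is standardly written:

```latex
H(t) \;=\; \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,\mathrm{d}^3v ,
\qquad
\frac{\mathrm{d}H}{\mathrm{d}t} \;\le\; 0 ,
```

with equality just when f is the Maxwell distribution. For a gas in equilibrium the entropy is, up to constants, proportional to minus H, which is the sense in which H "is equivalent to minus the entropy" as defined by Clausius.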
Where does the asymmetry of the Second Law come from, on this account? The significance of this issue was not noticed for another twenty years or so, until it was raised by E.\~P.\~
Culverwell, of Trinity College, Dublin, who was puzzled by the fact that the }{\i\f20 H}{\f20
-theorem seemed to produce asymmetry out of thin air. Culverwell was quite right to be puzzled, but the discussion which his intervention generated, in the pages of }{\i\f20 Nature}{\f20 in the mid-1890s, does not really do justice to his mai
n point: if we are interested in explaining where the thermodynamic asymmetry comes from\emdash in particular, how it manages to arise in a world in which the underlying mechanical laws seem to make no distinction between past and future\emdash
we must be careful not simply to pass the buck; i.e., to shift the burden of explanation from one place to another, while leaving it no less puzzling. \par
In fact, the asymmetry of the }{\i\f20 H}{\f20 -theorem stems from an apparently innocuous assumption that Boltzmann had borrowed from Maxwell\emdash someti
mes called the assumption of molecular chaos, it is the assumption that the motions of the molecules of a gas are uncorrelated before they collide with one another. If this was assumed to hold after collisions, as well as before, the }{\i\f20 H}{\f20
-theorem would not yield an asymmetry. It would simply imply that entropy is non-decreasing in both directions; i.e., that it is always at its maximum value. But why should the assumption hold in one case but not the other? At best, then, the }{\i\f20 H}{
\f20 -theorem simply replaces one puzzle about time asymmetry with another.\par
Long before Culverwell raised this challenge to the }{\i\f20 H}{\f20 -theorem, however, Boltzmann himself had already arrived, rather tentatively, at a view of the nature of the problem which actually makes the }{\i\f20 H}{\f20
-theorem redundant. In a sense, then, Culverwell was flogging a dead horse. To be fair, however, few people then or since have noticed that the horse is dead. Even Boltzmann, who had dealt his own horse the fatal blow some fifteen years previously, entered
the fray on the poor creature\rquote s behalf.\par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 The reversibility objection\par
}\pard\plain \s7\sl-480 \f14562 {\f20 The fatal blow lies in Boltzmann\rquote s response to the so-called }{\i\f20 reversibility objection. }{\f20
The initial significance of this objection turns on the fact that the Second Law was thought of originally as an exceptionless principle, on a par with the other laws of physics. As statistical approaches to thermodynamics developed, however, it came to be
appreciated that the principles they implied would have a different character. Deviations from the norm would be highly unlikely, but not impossible. Far from excluding them, the statistical treatment makes them inevitable, eventually. \par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 This point was already appreciated by Maxwell in the late 1860s. Maxwell\rquote
s Demon is an imaginary creature who segregates the fast and slow molecules of a gas, thereby making it hot in one region and cold in another. The point of the story is that what the
Demon does deliberately might happen by accident, and so the tendency of gases to reach thermal equilibrium can be nothing more than a tendency\emdash
it cannot be an exceptionless law. Even more tellingly, Maxwell and Thomson realised that for any sample of gas in the process of approaching equilibrium, there is another in the process of departing from equilibrium: namely, the sample we get by exact
ly reversing the motions of all the molecules of the original. The determinism of Newtonian mechanics implies that this new sample will simply retrace the history of the old sample, so that if the gas had originally been becoming more uniform, the new stat
e will lead to its becoming less uniform. (Don\rquote t make the mistake of thinking that the point depends on the practicality of such a reversal: what matters is that by reversing the motions }{\i\f20 on paper, }{\f20
we describe a physically possible state in which entropy decreases.) Again, the conclusion is that if the Second Law is to be grounded on a statistical treatment of the behavior of the microscopic constituents of matter, it cannot be an exceptionless princ
iple.\par
The idea of reversal of motions also occurred to Boltzmann\rquote s Viennese colleague, Franz Loschmidt, and it is by this route that the so-called \ldblquote reversibility paradox\rdblquote came to Boltzmann\rquote s attention. Loschmidt\rquote
s argument convinced Boltzmann that the Second Law is of a statistical rather than an exceptionless
nature, and he moved on to embrace a new conception of the nature of entropy itself. Applied to the case of gases, the crucial idea is that a given condition of the gas will normally be realizable in many different ways: for any given description of the g
as in terms of its ordinary or \lquote macroscopic\rquote properties, there will be many different }{\i\f20 microstates}{\f20 \emdash many different arrangements of the constituent molecules\emdash which would produce the }{\i\f20 macrostate}{\f20
concerned. \par
If all possible microstates are assumed to be equally likely, the gas will spend far more of its time in some macrostates than others\emdash
a lot of time in states that can be realized in many ways, and little time in states that can be realized in few ways. From here it is a short step to the idea that the equili
brium states are those of the former kind, and that the entropy of a macrostate is effectively a measure of its probability, in these microstate counting terms. Why then does entropy tend to increase, on this view? Simply because from a given starting poin
t there are very many more microstates to choose from that correspond to higher entropy macrostates, than microstates that correspond to lower entropy macrostates. \par
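The counting argument can be made concrete with a toy model (my illustration, not in the original): take N molecules, each of which may sit in the left or right half of a box. The macrostate is the number k of molecules in the left half, and its multiplicity is the binomial coefficient C(N, k), so the even (equilibrium) macrostate dwarfs any strongly uneven one.

```python
from math import comb

N = 100  # toy "gas" of 100 molecules, each in the left or right half of a box

def multiplicity(k, n=N):
    """Number of microstates (assignments of individual molecules)
    that realize the macrostate 'k molecules in the left half'."""
    return comb(n, k)

# The even macrostate is realized by vastly more microstates than an uneven one:
print(multiplicity(50))   # ~1.0e29 microstates
print(multiplicity(10))   # ~1.7e13 microstates
print(multiplicity(50) / multiplicity(10))
```

Identifying entropy with the logarithm of this count makes "entropy tends to increase" a statement about where most of the microstates are, not about any preferred direction of time.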
This account builds the statistical considerations in from the start. Hence it makes explicit the first lesson of the reversibility objection, viz. that the Second Law is not exceptionless. Moreover, it seems to bypass the }{\i\f20 H}{\f20
-Theorem, by attributing the general increase of entropy in gases not to the effects of collisions as such, but to broader probabilistic considerations. Where does the asymmetry come from, however, if not from the assumption of molecular chaos?\par
The answer, as Boltzmann saw, is that there is }{\i\f20 no}{\f20 asymmetry in this new statistical argument. The above point about entropy increase t
owards (what we call) the future applies equally towards (what we call) the past. At a given starting point there are very many more possible histories for the gas that correspond to higher entropy macrostates in its past, than histories that correspond to
lower entropy macrostates. In so far as the argument gives us reason to expect entropy to be higher in the future, it also gives us reason to expect entropy to have been higher in the past. Suppose we find a gas sample unevenly distributed at a particular
time, for example. If we consider the gas\rquote s possible future, there are many more microstates which correspond to a more even distribution than to a less even distribution. Exactly the same is true if we consider the gas\rquote
s possible past, however, for the statistical argument simply relies on counting possible combinations, and doesn\rquote t know anything about the direction of time. \par
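That the bare counting argument is blind to the direction of time can be checked in a toy simulation (my construction, not Boltzmann's): let microstates be bitstrings, let the "dynamics" be an arbitrary fixed invertible map (here a random permutation of the microstates), start in a low-entropy macrostate, and compare the typical entropy one step forward with one step backward.

```python
import random
from math import comb, log

random.seed(0)
N_BITS = 12                       # microstate = 12-bit string; macrostate = number of 1-bits
STATES = list(range(2 ** N_BITS))

# A fixed invertible ("reversible") dynamics: a random permutation of microstates.
perm = STATES[:]
random.shuffle(perm)
forward = {s: perm[s] for s in STATES}    # state at t  ->  state at t + 1
backward = {perm[s]: s for s in STATES}   # state at t  ->  state at t - 1

def entropy(s):
    k = bin(s).count("1")         # macrostate: number of 1-bits
    return log(comb(N_BITS, k))   # Boltzmann-style log-multiplicity

# Low-entropy start: exactly one bit set (12 such microstates, each with
# multiplicity C(12,1) = 12, versus C(12,6) = 924 for the even macrostate).
low = [1 << i for i in range(N_BITS)]
start = sum(entropy(s) for s in low) / len(low)
fwd = sum(entropy(forward[s]) for s in low) / len(low)
bwd = sum(entropy(backward[s]) for s in low) / len(low)

print(start, fwd, bwd)  # entropy is typically higher in BOTH time directions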
Boltzmann seems to have been the first person to appreciate this point. In responding to Loschmidt he adds the following note: \par
}\pard\plain \s5\qj\li567\ri567\sb360\sa360\sl-480 \f14562 {\f20 I will mention here a peculiar consequence of Loschmidt\rquote
s theorem, namely that when we follow the state of the world into the infinitely distant past, we are actually just as correct in taking it to be very probable that we would reach a state in which all te
mperature differences have disappeared, as we would be in following the state of the world into the distant future. (Boltzmann 1877, at p. 193 in the translation in Brush 1966)\par
}\pard\plain \s7\sl-480 \f14562 {\f20 In fact, of course, entropy seems to have been even lower in the past than it is now. The statistical treatment of entropy thus makes it extremely puzzling why entropy is }{\i\f20 not}{\f20
higher in the past. At least to some degree, Boltzmann came to see that this represents the deep mystery of the subject. The mystery isn\rquote
t why entropy goes up as we look towards the future; it is why it goes down as we look towards the past. For it is this feature of the phenomena which looks exceptional, from the statistical point of view.\par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 Thus the reversibility objection\emdash originally seen by Maxwell, Thomson and Loschmidt just as an argument against the exceptionless character of the Second Law\emdash
led Boltzmann to a more important conclusion. All the same, even Boltzmann doesn\rquote t seem to have seen how thoroughly this conclusion undermines the project of his own }{\i\f20 H}{\f20
-theorem, which is to explain why entropy increases towards the future. \par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 Why the }{\f20 H}{\i\f20 -Theorem is a dead horse\par
}\pard\plain \s7\sl-480 \f14562 {\f20 At first sight the }{\i\f20 H}{\f20 -theorem might seem on solid ground. After all, the universal increase in entropy is the central asymmetry identified by thermodynamics. Isn\rquote
t it obvious that it requires explanation? But this misses the point. Suppose we had succeeded in explaining why entropy is low in the past\emdash
i.e., in effect, why the entropy curve slopes downwards in that direction. Is there a separate question as to why it slopes upwards in the other direction, or is this just another way of asking the same thing? \par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 Think of the problem with the temporal perspective reversed. From this perspective it seems that entropy is high in the past, and always decreases. The un
iversal tendency to decrease towards what we now take to be the future looks puzzling, of course, but might be taken to be adequately explained if we could show that the laws of physics somehow impose an appropriate boundary condition in that direction. Wh
y does entropy decrease? Because it is a consequence of certain laws that the entropy of the universe must be low at some point in the future. Each individual decrease would thus be explained as a contribution to the required general decrease.\par
Do we need t
o explain why entropy is high in the past, in this picture? No, for according to the statistical account, this is not an unusual way for the past to be. All we need to do is to note that in that direction the universe does not appear to be subject to the b
oundary constraint that imposes low entropy towards the future. From this reversed perspective, in other words, the real work of explaining why entropy shows a universal tendency to decrease is done by an account of why it is low at a certain point in the
future, together with the remark that the past is not similarly in need of explanation.\par
However, if we accept that this is a satisfactory account of what we would see if we looked at our universe in reverse, it is hard to maintain that it is not a satisfactory explanation of what we actually do see\emdash
for the difference between the two views lies in our perspective, not in the objective facts in need of explanation. And so from the ordinary viewpoint, all the work is done by the account of why entropy is low i
n the past. The future needs no more than a footnote to the effect that no such constraint appears to operate there, and that what we foresee in that direction is not in need of explanation, for it is the normal way for matter to be.\par
This conclusion applies to the countless individual processes in which entropy increases, as well as to the Second Law in general. Consider what happens when we remove the top from a bottle of beer, for example: pressurized gas and liquid escape from the b
ottle. Traditionally i
t has been taken for granted that we need to explain why this happens, but I think this is a mistake. The gas escapes simply because its initial microstate is such that this is what happens when the bottle is opened. As the tradition recognizes, however, t
his isn\rquote t much of an explanation, for we now want to know }{\i\f20 why}{\f20 the initial microstate is of this kind. But the correct lesson of the statistical approach is that this kind of microstate doesn\rquote
t need explanation, for it is (overwhelmingly) the most natural c
ondition for the system in question to possess. What does need to be explained is why the microstate of the gas is such that, looked at in reverse, the gas enters the bottle; for it is in this respect that the microstate is unusual. And in the ordinary tim
e sense, this is just a matter of explaining how the gas comes to be in the bottle in the first place.\par
Science often changes our conception of what calls for explanation and what does not. Familiar phenomena come to be seen in a new light, and often as either
more or less in need of explanation as a result. One crucial notion is that of normalcy, or naturalness. Roughly, things are more in need of explanation the more they depart from their natural condition, but science may change our view about what constitu
tes the natural condition. The classic example concerns the change that Galileo and Newton brought about in our conception of natural motion. I think the lessons of the Second Law should be seen in a similar light. Thermodynamic equilibrium is a natural co
ndition of matter, and it is }{\i\f20 departures}{\f20 from this condition that call for explanation.}{\f20\fs20\up6 1}{\f20 \par
In my view, then, the problem the }{\i\f20 H}{\f20 -Theorem addresses\emdash that of explaining why entropy }{\i\f20 increases}{\f20 \emdash
has been vastly overrated. The puzzle is not about how the universe reaches a state of high entropy, but about how it comes to be starting from a low one. It is not about (what appears in our time sense to be) the }{\i\f20 destination}{\f20
of the great journey on which matter is engaged, but about the }{\i\f20 starting point}{\f20 . \par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 Does chaos theory make a difference?\par
}\pard\plain \s7\sl-480 \f14562 {\f20 Finally, let\rquote
s go back to the suggestion that the key to the apparent conflict between thermodynamics and mechanics lies in chaos theory, and the application of non-linear methods in mechanics. In my view this suggestion rests, like many more traditional approaches to the
}{\i\f20 H}{\f20 -Theorem, on a failure to appreciate the real puzzle of thermodynamics. As I have argued, the puzzle consists in the fact that entropy is low in the first place, not in the fact that it later increases. The non-linear approach m
ay tell us how certain non-equilibrium systems behave, }{\i\f20 given that there are some\emdash }{\f20 it may play an important role in characterizing the nature of systems on the entropy gradient, for example\emdash but it doesn\rquote
t explain how the gradient comes to exist in the first place. This is the great mystery of the subject, and the theory of non-equilibrium systems simply doesn\rquote t touch it. \par
}\pard\plain \s6\fi567\sl-480 \f14562 {\f20 Like many other approaches, then, chaos theory is flogging a dead horse in trying to account for the arrow of time by explaining why entrop
y increases. Even if the horse were alive, however, this use of chaos theory would still be vulnerable to Culverwell\rquote s century-old objection to Boltzmann\rquote
s program, namely that a symmetric theory is bound to have the same consequences in both temporal directions. \par
A particularly powerful way to apply Culverwell\rquote s insight is like this. Suppose that the proponents of the non-linear dynamical methods\emdash or any other dynamical method, for that matter\emdash
claim that despite the fact that it is a symmetric theory, it pr
oduces asymmetric consequences in thermodynamics. To undermine their claim, we describe an example of the kind of physical system to which the new method is supposed to apply, specifying its state at some time }{\i\f20 t}{\f20
. We then ask our opponents to tell us the state of the system at another time, say }{\i\f20 t}{\f20 + 1, without being told whether }{\i\f20 t }{\f20 + 1 is actually earlier or later than }{\i\f20 t.}{\f20
(That is, without being told whether a positive time interval in our description corresponds to a later or an earlier time in the real world
.) If our opponents are able to produce an answer without this extra information, then their theory must be time-symmetric, for it generates the same results in both temporal directions. If they need the extra information, on the other hand, this can only
be because at some point their theory treats the two directions of time differently\emdash like Boltzmann\rquote s original }{\i\f20 H}{\f20
-Theorem, in effect, it slips in some asymmetry at the beginning. So in neither case do we get what the advocates of this approach call \ldblquote symmetry-breaking\rdblquote
: a temporal asymmetry which arises where there was none before. Either there is no temporal asymmetry at any stage, or it is there from the beginning.\par
For all their intrinsic interest, then, the new methods of non-linear dynamics do not throw new light on the asymmetry of thermodynamics. Writers who suggest otherwise have failed to appreciate the real puzzle of thermodynamics\emdash
Why is entropy low in the past?\emdash and failed to see that no symmetric dynamical theory could yield the kind of conclusions they claim to draw.}{\f20\fs20\up6 2}{\f20 \par
\par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 \page Notes\par
}\pard\plain \s2\fi-280\li280\sb240\sl360 \f14562 {\f20 1.\tab One of the very few contemporary writers who seems to have properly appreciated this point is Roger Penrose, who says \ldblquote The high-entropy states are, in a sense, the \lquote natural
\rquote states, which do not need further explanation.\rdblquote (1989, p. 317) \par
2.\tab I discuss the issues raised in this paper at greater length in Price (1996), chapter 2. I am grateful to the participants in the ISST Conference in Sainte-Ad\'8fle for comments on an earlier version of the paper, and especially to John an
d Roma Henderson, for allowing me to spend the week of the Conference at the delightful }{\i\f20 Maison des Mouches}{\f20 , Lake Ch\'89tillon.\par
\par
}\pard\plain \s4\qc\sb480\sa240\sl-480\keep\keepn \f14563\fs28\expnd2 {\i\f20 References\par
}\pard\plain \s2\fi-280\li280\sb240\sl360 \f14562 {\f20 Boltzmann, L. 1877: \ldblquote \'86ber die Beziehung eines allgemeinen mechanischen Satzes zum zweiten Hauptsatze der W\'8armetheorie,\rdblquote }{\i\f20
Sitzungsberichte, K. Akademie der Wissenschaften in Wien, Math.-Naturwiss.,}{\f20 }{\b\f20 75}{\f20 , 67\endash 73; English translation (\ldblquote On the Relation of a General Mechanical Theorem to the Second Law of Thermodynamics\rdblquote
) in Brush (1966), 188\endash 93.\par
Brush, S., 1966: }{\i\f20 Kinetic Theory. Volume 2: Irreversible Processes}{\f20 , Oxford: Pergamon Press. \par
Clausius, R. 1865: }{\i\f20 Annalen der Physik, Series 2,}{\f20 }{\b\f20 125}{\f20 , 426.\par
Coveney, P. and Highfield, R., 1990: }{\i\f20 The Arrow of Time}{\f20 , London: W. H. Allen.\par
Maxwell, J. C., 1867: \ldblquote On the Dynamical Theory of Gases,\rdblquote }{\i\f20 Philosophical Transactions of the Royal Society, }{\b\f20 157}{\f20 , 49; reprinted in Brush (1966), 23.\par
Penrose, R., 1989: }{\i\f20 The Emperor\rquote s New Mind}{\f20 , Oxford: Oxford University Press. \par
Price, H., 1996: }{\i\f20 Time\rquote s Arrow and Archimedes\rquote Point,}{\f20 New York: Oxford University Press. \par
Prigogine, I. and Stengers, I., 1985: }{\i\f20 Order Out of Chaos: Man\rquote s New Dialogue with Nature}{\f20 . London: Flamingo.\par
Thomson, W., 1852: \ldblquote On a Universal Tendency in Nature to the Dissipation of Mechanical Energy,\rdblquote }{\i\f20 Proceedings of the Royal Society of Edinburgh}{\f20 , }{\b\f20 3}{\f20 , 139.\par
}}