[Review of David Z Albert, Time and Chance, Harvard University Press, 2000, ISBN 0-674-00317-9. £20.50 cloth. 172 + xiv pp. Forthcoming in The Times Literary Supplement.]

 

In the early 1960s a well-informed British reader knew at least one thing about the second law of thermodynamics: that in The Two Cultures, C. P. Snow had compared ignorance of it to ignorance of Shakespeare. Snow later regretted his choice of example, on the grounds that the second law raised profound conceptual difficulties. Did anyone really understand it? Forty years on, the question is still worth asking, but one major new piece of the puzzle seems to have fallen into place. That piece is one of the greatest achievements of twentieth-century physics, and perhaps the most underrated. To understand its importance, you need a sense of the problem it solves, and David Albert’s new book is a good place to start.

Many common physical processes are one-way streets. Ice melts in warm water, coffee cools to room temperature, gas fizzes out of champagne, moving objects are slowed by friction, paint fades, and so on. None of these things happens in reverse. The second law of thermodynamics provides a very general way of characterizing something that these one-way processes have in common. Roughly speaking, they are all cases in which energy becomes more disordered, more dispersed. Thermodynamics provides a measure of this disorder, called entropy, and the second law stipulates that entropy never decreases.

Now the puzzle. To all intents and purposes, the underlying laws of physics show no such temporal bias. If they allow a process to happen one way, they allow it to happen the other way. Why, then, does nature discriminate? Why does entropy go up but not down?

The issue goes back to the late nineteenth century. The most popular proposed solution is due to the great Austrian physicist Ludwig Boltzmann (1844–1906). Boltzmann argued that, under a natural way of counting the possibilities, there are simply many, many more ways for systems to be disordered (high entropy) than to be ordered (low entropy). Compare tossing coins. There are many more ways to get a roughly even number of heads and tails than to get mostly heads, or mostly tails. (With only four tosses, for example, there are already six ways to get two heads, but only one way to get four heads.) In real systems, with vast numbers of microscopic constituents, there are hugely more high-entropy options than low-entropy options available. Left to its own devices, then, a system is almost bound to end up in a disordered state. Decreases in entropy are not impossible, just exceedingly unlikely.
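(The coin-tossing arithmetic is easily verified. A minimal illustrative sketch in Python, not part of the review itself, counts the arrangements with the binomial coefficient:

```python
from math import comb

# Number of distinct sequences of n coin tosses containing exactly k heads.
n = 4
for k in range(n + 1):
    print(f"{k} heads: {comb(n, k)} way(s)")

# comb(4, 2) = 6 sequences with two heads, comb(4, 4) = 1 with four heads:
# the roughly even "disordered" outcomes already outnumber the "ordered"
# extremes, and the imbalance grows explosively with n.
print(comb(100, 50))   # ways to split 100 tosses evenly: about 10**29
print(comb(100, 100))  # ways to get all heads: exactly 1
```

With a hundred tosses the even split outnumbers the all-heads outcome by a factor of roughly 10^29, which gives some feel for the "almost bound to" in Boltzmann’s reasoning.)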

On the face of it, this is just what we wanted–a reason why entropy increases. But wait. As Boltzmann himself realized, his way of counting possibilities is time-symmetric–if it implies that entropy is likely to increase towards the future, it equally implies that entropy is likely to increase towards the past. In the absence of other information, we have just as much reason to expect the fizz to have bubbled into our glass of champagne in the immediate past as to expect it to bubble out in the immediate future!

Albert gives a very nice account of this point, and especially of a profound skeptical challenge it contains (first noted, I think, by C. F. von Weizsäcker, in 1939). If Boltzmann’s probabilities are our guide, then it is much easier to produce fake records and memories than to produce the real events of which they purport to be records. For example, suppose God chooses to browse through possible world-histories, until he finds the complete works of Shakespeare, in all their contemporary editions. It is vastly more likely that he will hit upon a world in which the texts occur as a spontaneous fluctuation of modern molecules, than that he’ll find them produced by the Bard himself. (Versions typed by monkeys are somewhere in between.)

Why is this? Simply because entropy is much higher now than it was in the sixteenth century. According to Boltzmann, probability increases exponentially with entropy. So the higher-entropy twenty-first century–"Shakespearian" texts and all–is much, much more likely than the lower-entropy sixteenth century. In other words, almost all possible histories which include the former don’t include the latter! In Boltzmann’s terms, then, it is extremely unlikely that Shakespeare and his contemporaries ever existed. The same goes for the rest of history. All our "records" and "memories" are almost certainly misleading.
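(The exponential relation invoked here is Boltzmann’s own celebrated formula, in its standard textbook form, not a result special to the review: entropy $S$ is proportional to the logarithm of the number $W$ of microscopic arrangements compatible with a given macroscopic state,

```latex
S = k \log W, \qquad \text{so} \qquad W = e^{S/k},
```

where $k$ is Boltzmann’s constant. Since each arrangement is counted as equally probable, the probability of a macrostate grows as $e^{S/k}$–exponentially with its entropy.)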

To avoid this startling conclusion, Boltzmann’s statistical reasoning needs to be supplemented by what Albert calls the "past hypothesis"–the assumption that entropy was very low, early in the history of the universe. The big piece of the puzzle which has been added since the 1960s is an understanding of the particular way in which nature supplies this low entropy–the particular way in which the universe is ordered, early in its history. Everything seems to turn on the fact that the matter in the universe was distributed extremely smoothly, immediately after the Big Bang. (Albert himself doesn’t tell this part of the story, but it has been well told by Roger Penrose, in The Emperor’s New Mind.)

It may seem strange that a homogeneous distribution of matter should be said to be highly ordered, but the key lies in the properties of gravity. In a system dominated by an attractive force, a uniform distribution is highly unstable–the "natural" thing is for matter to clump together. (By analogy, think of sticky polystyrene foam pellets, or water droplets on a waxy leaf.)

So the smooth early universe is staggeringly odd, in terms of ordinary conceptions of how gravitating matter might be expected to behave. But it seems to be the sole basic "ordered" oddity needed to account for the vast range of low entropy systems we find in the universe. Initial smoothness seems to provide a vast reservoir of low entropy, on which everything else relies. The most important relevant mechanism is the formation of stars and galaxies. Smoothness is necessary for galaxy and star formation, and most familiar irreversible processes on earth (including humans) owe their existence to the sun.

Why is this such a big piece of the original puzzle? Partly because so much else depends on it, and partly because this is where all the time-asymmetry of thermodynamics goes, if something like Boltzmann's view is right. There is no additional one-way law forcing entropy to increase, just an even-handed principle to the effect that entropy is likely to be high, other things being equal. And there is one respect (so far as we know) in which things are not equal: the smoothness of the early universe.

Physics has thus found a plausible single candidate for the origin of all the time-asymmetry of the familiar world. This remarkable achievement is in no way diminished–quite the contrary, if anything–by the fact that it raises fascinating new questions. (Why is the early universe so smooth, for example? Could it be smooth again in the future, so that entropy would eventually begin to decrease?) But in practice, I think, the discovery is sadly underrated. To appreciate its significance, one needs to understand the importance of the questions it answers. In particular, one needs to understand the crucial role of the past hypothesis in Boltzmann’s statistical approach. In this respect especially, Albert’s book is a very welcome addition to the literature.

The importance of the issue of the low entropy past has also been obscured by the fact that many parties to the debate about the time-asymmetry of the second law have backed a different horse altogether–a rival to Boltzmann’s statistical approach, though also sired in large part (and earlier) by Boltzmann himself. Since the 1870s, much effort has gone into the search for a one-way law–a time-asymmetric reason why entropy is constrained to increase. According to this approach, there are really two temporal asymmetries in play when a cup of coffee cools down. One is the past hypothesis, which accounts for the fact that it is hot in the first place. The other is the one-way law, which "forces" the coffee’s temperature to equalize with its surroundings. (Boltzmann’s own early "H-Theorem" was an attempt to derive such a law, for gases.) In practice, in these approaches, it is the one-way law which hogs the limelight, and the importance of the initial low-entropy past of the coffee tends to be overlooked. Worse still, the two horses are rarely clearly distinguished. This makes it hard to see how far the one-asymmetry view (Boltzmann’s later choice) has raced ahead these past forty years or so.

Albert gives us an exemplary description of Boltzmann’s preferred horse, but his real money is on its rival. In the final chapter of Time and Chance he argues that a recently suggested solution to the infamous quantum measurement problem–the so-called GRW interpretation, after its authors, Ghirardi, Rimini, and Weber–might also provide a basis for a time-asymmetric law in thermodynamics. And if we needed a one-way law in thermodynamics, this would be an attractive idea. It would make a virtue of what otherwise seems a liability of the GRW proposal, the fact that it is also time-asymmetric. But the point cuts both ways. If we don’t need a one-way law in thermodynamics, it is no defence of the asymmetry of the GRW proposal to point out that time-asymmetry is familiar elsewhere in physics. If Boltzmann’s statistical approach is correct, in other words, the source of the familiar asymmetry lies entirely elsewhere.

According to the statistical approach, nothing forces a cup of coffee to cool. Everything depends on the initial arrangements of its microscopic constituents. Most arrangements are such that it cools, but not quite all. Albert suggests that this dependence on initial conditions is a disadvantage of this approach. In the rival view, he points out, the GRW mechanism introduces a kind of quantum shuffling, so that it doesn’t matter how the coffee molecules are arranged initially–whatever the arrangement, the coffee is almost certain to get cold.

But is there any work for this shuffling to do? To think so is to think that if we didn’t have it, things would happen differently. Coffee wouldn’t cool, or something of the kind. There’s no doubt that such things could happen, without quantum shuffling, if the initial configuration of the world were just right. However, that’s no reason to think that they would happen. What’s missing from the argument is a reason to think that the initial configuration actually was like that. Albert hasn’t given us such a reason, and nor, I think, has anyone else. Hence it is hard to avoid the suspicion that Albert’s proposal is trying to sell us something we simply don’t need–the physical equivalent of an expensive gadget for choosing our Lotto numbers, which makes no difference at all to how often we win.

The problem is not specific to Albert’s suggestion, but afflicts all attempts to find a one-way law in thermodynamics. No one has yet shown that there is work for such a law to do–that there is any remaining temporal lop-sidedness to be accounted for, once the role of the low entropy early universe is properly acknowledged. However, the issue has received little explicit discussion–in part, I think, because the distinction between one-asymmetry and two-asymmetry approaches to the puzzle of thermodynamics has not been clearly drawn. Ironically, the conceptual difficulties which led C. P. Snow to regret his choice of example may thus stem partly from the fact that thermodynamics itself has been afflicted with two cultures, not properly distinguished. Albert’s own heart is in the wrong camp, in my view, but his book is a welcome contribution to the project of clearing things up.

 

Huw Price

Department of Philosophy
University of Edinburgh.