Does entropy increase with time or does it make time?

In this week's Ask a Physicist, we're going to go old school and do some thermodynamics. I'll explore why disorder is such a big deal and whether it might explain the mysteries of how time works.

Photo by Jef Safi.

In our bi-weekly "Ask a Physicist" chats, I know we normally do cosmology, relativity, quantum mechanics, particle physics and whatnot, but I thought it'd be a nice change of pace to hit the 19th century for some good old-fashioned thermodynamics. Don't give me that face! Thermodynamics made the industrial revolution possible, and will ultimately be responsible for the death of the universe. I think it's earned your respect.

Our question this week comes from Andrew Cameron, who had the good sense to email me at askaphysicist@io9.com. He asks:

Alright, I get that entropy is a measure of disorder in objects, but really, why is it that important that it should be a law?

If you look at almost all of the laws of physics, the flow of time seems to be an afterthought. Make a movie of two electrons colliding with one another and then run that movie in reverse, and the time-reversed version will look just as normal and physically valid as the forward one. At the microscopic level, time seems to be completely symmetric. (Attention nitpickers: The exception is the weak force, but that doesn't matter for this discussion.)

On the macroscopic level, it's a whole different deal. You don't remember the future, for example, and you can't unscramble an egg or unmix a cocktail. And while I'll be happy (should someone have the foresight to send me a question) to talk about the possibility of time travel, in general, there is a single arrow of time.

There is one common denominator distinguishing the future from the past: everything seems to get messier. You may know this as "The Second Law of Thermodynamics." Or you may not. I won't judge.

The Second Law says, colloquially, that things fall apart, or that things get more and more disordered with time, but this isn't quite right. What it really says is that the total entropy of a closed system increases with time. Entropy is a measure of the number of ways that you could switch things around and still have all of the macroscopic quantities remain the same.
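
If you prefer symbols, this counting is exactly what Boltzmann's famous formula expresses (the standard textbook form, which this column otherwise describes in words):

$$ S = k_B \ln W $$

where W is the number of microscopic arrangements consistent with what you see macroscopically, and k_B is Boltzmann's constant.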

A very nerdly example

Let me give you an example, since it will help to make things concrete. Suppose you had three air molecules and you put them into the left side of a box. This is a very tidy, very low-entropy way to arrange things. Allow nature to do its business, and the molecules will fly around, each spending about half its time on the left and half the time on the right.

At any moment, you'll see a randomized snapshot of the three molecules. There are 8 different ways to arrange the molecules, but only 2 of them (LLL, RRR) have all three on one side of the container. That's only a 25% probability. The rest of the time, the molecules are (more or less) distributed uniformly. And a uniform distribution is a higher-entropy state than a concentrated one.
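
If you'd like to check the counting yourself, here's a minimal Python sketch (mine, nothing official) that enumerates every arrangement:

```python
# Enumerate every way of putting three molecules in the Left or Right
# half of the box.
from itertools import product

states = list(product("LR", repeat=3))               # all 2**3 = 8 arrangements
one_sided = [s for s in states if len(set(s)) == 1]  # just LLL and RRR

print(len(states))                   # 8
print(len(one_sided) / len(states))  # 0.25 -- the 25% from above
```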

You could play this same game by taking a handful of coins and tossing them into the air. For every head, it's as if a molecule were in the left-hand side of the room, and vice versa. Do this a bunch of times, and you'll see that the "molecules" almost always end up nearly equally distributed.
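
Here's a quick simulation of that coin game in the same sketchy spirit (the 40-60% window is just my arbitrary definition of "nearly even"):

```python
import random

def nearly_even_fraction(n_coins, n_trials=10_000):
    """Fraction of trials where heads make up 40% to 60% of the coins."""
    hits = 0
    for _ in range(n_trials):
        heads = sum(random.random() < 0.5 for _ in range(n_coins))
        if 0.4 <= heads / n_coins <= 0.6:
            hits += 1
    return hits / n_trials

for n in (10, 100, 1000):
    print(n, nearly_even_fraction(n))
# With 10 coins, lopsided splits are still common.
# With 1,000 coins, the split is "nearly even" essentially every time.
```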

Big numbers turn probabilities into a law

As you increase the number of air molecules to, say, 10²⁶ or more (the number that would fit in a washing-machine box), probability dictates that random motions will ultimately make the molecules spread out "evenly." To put numbers on it, the typical deviation from even is about 1 divided by the square root of the number of particles, so in this case, we're saying that the two halves of the box will typically be equally filled to about one part in 10¹³ (1 in 10 trillion).
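
To check that arithmetic (N here is just the round number quoted above):

```python
N = 1e26                        # air molecules in the box
typical_deviation = 1 / N**0.5  # the 1/sqrt(N) rule of thumb
print(typical_deviation)        # 1e-13: one part in ten trillion
```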

Because of quantum mechanics, there really is a fundamentally random component to all of this. For what it's worth, while it's technically possible that all of the air molecules randomly fly out of your bedroom (or wherever you're reading this from) in the next few minutes, it's not something you should stay awake at night worrying about.

The increase in entropy is really only a law because there are so many particles in the universe that it becomes staggeringly unlikely that things will spontaneously arrange themselves into a state of low entropy. We talk about this type of randomness with regard to gambling and weather prediction in chapter 3 of the "User's Guide to the Universe."

As an even geekier example: you wouldn't be surprised to flip 2 heads in a row, but you'd be very suspicious if someone flipped 100 heads in a row. To put things in perspective, if you did 10 flips every second, it would take roughly a trillion times the current age of the universe before you'd expect to get a run that long. To put this another way, at some point, systems become so large that it becomes not merely "unlikely," but "nearly brain-bendingly impossible" that entropy will decrease. That's why we call it "The Second Law." But in fact, it's really just a very, very good suggestion.
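
If you want to check that figure, here's the back-of-envelope version (the flip rate comes from above; the age of the universe in seconds is my plugged-in value, and 2^100 is a rough stand-in for the expected waiting time):

```python
flips_needed = 2**100                # rough expected flips for a 100-head run
seconds_waiting = flips_needed / 10  # at 10 flips per second
age_of_universe = 13.8e9 * 3.156e7   # ~4.4e17 seconds

print(seconds_waiting / age_of_universe)  # ~3e11: "roughly a trillion" universe-ages
```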

Those creationists among you may use this as evidence that complicated things (like people or dinosaurs) couldn't ever form in the first place. After all, you are a very highly ordered person, I'm assuming. If you are somehow a disordered homogeneous cloud of gas, then please accept my apologies. But supposing you are a human, it's not that strange that you exist as a little patch of high order.

The real rule is that entropy will increase over the whole universe. So, for example, if you make a nice refrigerator full of cold air, you'll do so at the expense of making a lot of high-entropy hot air. This is why your air-conditioner needs an exhaust, but your space heater doesn't. It's also the reason you can't build a perpetual motion machine. Some of the energy will always be converted into heat, so quit asking, already.
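
For the bookkeeping-minded, the standard textbook version of that trade-off goes like this: the refrigerator pulls heat Q_c out of its cold interior at temperature T_c and exhausts Q_h = Q_c + W into the room at T_h, and the Second Law insists that

$$ \Delta S_{\text{total}} = \frac{Q_h}{T_h} - \frac{Q_c}{T_c} \ge 0 $$

so the hot exhaust must dump at least as much entropy into the room as the cold interior loses.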

Entropy continuously increases with time. You sit in a hot bath in a cool room, and at first, you feel all warm and cozy, but then things take an unfortunate turn: The water in the tub starts to equilibrate with the air in the room, and you get cold and shriveled.
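
To put a toy number on it (my illustrative figures, nothing more):

```python
# Heat Q leaks from ~40 C bathwater into ~20 C room air.
Q = 1000.0                      # joules of heat transferred
T_bath, T_room = 313.0, 293.0   # temperatures in kelvin

delta_S = Q / T_room - Q / T_bath  # room's entropy gain minus bath's loss
print(delta_S)                     # +0.22 J/K: positive, as the Second Law demands
```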

The same holds for the future of the universe. As time goes on, heat gets more and more evenly distributed in the universe. Stars burn out, black holes ultimately evaporate, and everything goes dark and cold. Boo!

Time and the Second Law

One of the big debates going on in physics is whether the Second Law of Thermodynamics works the other way around. In other words, is the flow of time determined by the increase in entropy in the universe? Sean Carroll has written a very interesting book on exactly this subject.

Stephen Hawking famously related "psychological time," the way we remember things, to "entropic time." In other words, were the flow of entropy to reverse itself, then (as far as our brains are concerned) time would literally flow in the opposite direction.

One of the big reasons that these ideas have gained traction is an observational puzzle. The early universe seems to have been in a state of very high order, but there's no fundamental reason why that should have been the case. The universe could have been set up right after the big bang full of disorder, but instead, it was about as orderly as you could possibly get. High-entropy gravitating systems tend to be clumpy (producing stars, galaxies, and black holes), but the early universe was about as smooth as you can get. Why?

Others have gone even further. Erik Verlinde, for example, claims that phenomena like gravity come out of the second law of thermodynamics (and string theory).

I should point out that these are interesting ideas, but not the consensus view of physics. Most would say that time makes entropy increase, not that entropy creates time. That's the view I take. For my part, I think entropy is just something that happens. Or, at the very least, something that is very, very likely to happen.

Dave Goldberg is the author, with Jeff Blomquist, of "A User's Guide to the Universe: Surviving the Perils of Black Holes, Time Paradoxes, and Quantum Uncertainty." (follow us on twitter, facebook or our blog.) He is an Associate Professor of Physics at Drexel University. Feel free to send email to askaphysicist@io9.com with any questions about the universe.