The Last Question, Entropy, and the Direction of Time

Isaac Asimov’s short story “The Last Question” has an ostensibly arcane subject: entropy, that strange and dark force that will eventually destroy civilization and the universe with it. The titular question is how exactly to prevent this from happening – can we reverse the flow of entropy? I don’t want to spoil Asimov’s masterpiece (which you should definitely read), but suffice it to say that entropy is indeed reversed at the end, though we are never told exactly how it is done. But what exactly is this strange concept called entropy?

Entropy is a bit of a fuzzy word, denoting the “degree of randomness or disorder within a system”. This rather broad definition allows us to get away with a lot: I can call my messy desk a “high entropy work surface” or a heaping pile of nachos “a high entropy snack”. But this fun view of entropy obscures its darker meaning. Entropy stars in the haunting dreams of nihilists, a quasi-magical quantity that increases inexorably, homogenizing our universe into the blandness of heat death. Entropy is not conserved; any real process increases the total entropy of the universe (even if entropy decreases in some regions, that decrease is always counterbalanced by a larger gain elsewhere – this is the Second Law of Thermodynamics).

I think a better way to think of entropy is in terms of probabilities. Mathematically, entropy is a rather simple expression: for each possible outcome, entropy adds up the negative of the probability times the logarithm of the probability. You also need to multiply this total by a constant, but since this constant (called Boltzmann’s constant) just comes from the units we use to measure temperature and energy, it’s not that important.
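In symbols, that recipe reads S = −k_B Σ p_i ln(p_i), where the sum runs over every possible outcome and k_B is Boltzmann’s constant. Here is a minimal Python sketch of the same calculation – the function name and the example probabilities are just my own illustration:

```python
import math

# Boltzmann's constant in joules per kelvin (SI value); it only sets the units.
K_B = 1.380649e-23

def entropy(probabilities, k=K_B):
    """Add up -p * ln(p) over all outcomes, then scale by the constant k."""
    return -k * sum(p * math.log(p) for p in probabilities if p > 0)

# Two equally likely outcomes, in physical units (joules per kelvin):
print(entropy([0.5, 0.5]))        # ~9.57e-24 J/K
# The same distribution with k = 1 gives the bare number ln(2) ≈ 0.693:
print(entropy([0.5, 0.5], k=1))
```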

A better way to express entropy is in terms of bits, a unit of information that we’ve become accustomed to because of computers. As a simple example, the entropy associated with flipping a fair coin is exactly 1 bit. If the coin is unfair, the entropy is less than 1 bit, and the more lopsided the coin, the lower it gets. The entropy of flipping two fair coins is 2 bits, and so on.
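If you want to play with these numbers yourself, the only changes are using a base-2 logarithm and dropping the physical constant. A quick sketch of the coin examples (the 90/10 coin is just my own illustrative choice of “unfair”):

```python
import math

def entropy_bits(probabilities):
    """Entropy in bits: base-2 logarithm, no physical constant."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))                # fair coin: 1.0 bit
print(entropy_bits([0.9, 0.1]))                # lopsided coin: ~0.47 bits
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # two fair coins: 2.0 bits
```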

Entropy is therefore best imagined as a way to measure the number of possible outcomes, weighted by how fairly these outcomes occur. You can get a high entropy either by having a lot of possibilities or by having very fair outcomes that have an equal chance of happening. From this perspective, the heat death of the universe is really just about radical equality: if everything is evenly spread out everywhere, then all outcomes are equally likely to happen and therefore the entropy reaches its maximum.
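You can see this “radical equality” numerically by comparing a few distributions over the same four outcomes: the entropy only reaches its maximum of log2(4) = 2 bits when every outcome is equally likely. (The specific probabilities below are just my own illustrative picks.)

```python
import math

def entropy_bits(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four outcomes, made progressively more "equal" -- entropy climbs toward
# its maximum of 2 bits only when all four are equally likely.
print(entropy_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
print(entropy_bits([0.70, 0.10, 0.10, 0.10]))  # ~1.36 bits
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (the maximum)
```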

This definition should make it clear that you can’t really call your messy room “high entropy”. Your messy room is itself a single outcome in the set of all possible room arrangements, so it does not have any meaningful “entropy” on its own. On the other hand, “messiness” itself could be likened to entropy – an organized room has many fewer possible arrangements than an unorganized room, so “messiness” itself has a higher entropy than “tidiness”.

From this analogy, we see that entropy is only meaningful when you are talking about a distribution of possible outcomes rather than a single outcome. This notion is exactly why entropy is useful in thermodynamics: when you are talking about something macroscopic, with enormous numbers of molecules floating around, you can only ever talk about probability distributions. The second law of thermodynamics – the so-called “arrow of time” – emerges because of these enormous numbers.

Here’s an example to investigate this phenomenon: suppose you start with two barrels of colored marbles, one filled with green marbles and one with blue marbles. You mix them together and then pour half of the mixture back into the first barrel and half into the second. What is the chance that you put all the marbles back in their rightful barrels? If there were only one marble of each color, you would have a 50:50 chance of not mixing them up. As you add more marbles, the chance of separating them correctly decreases exponentially, creating what we think of as the one-directional flow of time: the chance that hundreds of marbles spontaneously unmix is unimaginably small, to the point where it just doesn’t happen. Entropy lets you keep track of this mixing, increasing whenever you mix and homogenize. (A rough sketch of these odds follows below.)
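To put rough numbers on it, assume the mixture is thorough, so that every way of splitting the marbles into two halves is equally likely. Then the chance of a perfect unmixing is 1 divided by the number of ways to choose which marbles end up in the first barrel:

```python
from math import comb

def chance_of_unmixing(n_per_color):
    """Probability that a random half/half split of a thorough mixture of
    n green and n blue marbles puts every marble back in its original barrel."""
    return 1 / comb(2 * n_per_color, n_per_color)

print(chance_of_unmixing(1))    # 0.5       -- one marble of each color
print(chance_of_unmixing(10))   # ~5.4e-06  -- ten of each
print(chance_of_unmixing(100))  # ~1.1e-59  -- a hundred of each: effectively never
```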

Something exactly like this mixing process occurs whenever a macroscopic change of any sort happens. Your muscles mix up atoms within molecules to move themselves forward. Engines mix hot and cold gases to turn themselves. Computers mix up chemicals inside their batteries to force electrons to flow in the right directions. Because of these probabilities, there is almost no chance that any of these processes can run in the wrong direction, creating the sensation of forward-moving, irreversible time. To be fair, we are still talking about probabilities at the end of the day, and it’s still possible for entropy to spontaneously reverse, given enough time. I like to think this is the solution thought up in “The Last Question” – you just wait out the heat death of the universe, and eventually, after a long enough time, BOOM, everything comes back!
