My fiancée always gets nervous when I keep my coffee cup too close to the edge of the desk while working. I’m not even sure why I do it; maybe it is an unhealthy cockiness towards the law of gravity or just “thrill-seeking” in everyday life. Either way, it is easy for her to imagine the cup falling off the table and shattering into pieces, spilling its contents all over the white carpet — a fear that is definitely not unfounded.
On the other hand, few people warn me against moving ceramic shards together too closely, fearing that they might spontaneously rearrange into a cup and unexpectedly jump onto a nearby desk.
Most people also don’t dread adding milk to their coffee to achieve that perfect shade of brown (or off-white, if you’re generous with your milk). After all, they probably haven’t encountered the problem of the liquids spontaneously un-mixing just as they take a sip, leading to a mouthful of pure milk followed by black coffee.
There seem to be certain processes in our everyday life that only ever happen in a single direction in time, as we all intuitively know. What’s fascinating is that this is quite unexpected from a physics point of view. Almost all interactions and laws that govern the behavior of particles on a microscopic level are reversible in time.*
In the case of the broken cup, if all the pieces moved with exactly the right speed and collided in precisely the right way, they could indeed reform into a perfect cup. Just like the carpet could un-stain itself, which would come in handy after hosting a dinner party featuring red wine.
If we want to understand why there is a distinction between forward and backward in time in our universe, we need to understand a concept that we don’t often talk about in everyday life. But while we rarely discuss it, this concept is actually playing out right in front of us. Those coffee cups and ceramic shards we just talked about are showing us one of the universe’s most fundamental principles: entropy.
Nature’s One-Way Street: Understanding Entropy
But first, let’s strip away all the complexity of coffee and ceramics and start with something simpler: Imagine throwing 10 particles in an empty box, as shown in the image below. We put all our particles into the left part of the box and prevent them from crossing into the right part by adding a solid barrier in the middle. If we give the box a good shake, the particles will keep bumping into each other and the walls, constantly changing where they are heading and at what speed.
Now imagine we can’t peek into the box anymore, but we can put our hand on one of the outer walls to feel how many particles strike it in any given amount of time. When we touch the left wall, we will feel a lot of collisions. The right wall? Nothing at all. There’s pressure on the left wall, while there’s zero pressure on the right — like in that bike tire you forgot about all winter.

If we now lift the barrier in the middle, the particles will spread out through the entire space, bumping into each other and all four walls. When we put our hand on different walls now, we notice the collisions feel basically the same everywhere. The particles have found their most natural arrangement.
You might be wondering why we care about particles in a box. Well, first, this simple example helps us understand everything from coffee cups to cosmic rays, since they’re all just particles in the end. Second, physicists have a deep-seated love for turning complex problems into boxes with spheres in them — and surprisingly often, it works!
Here’s the key insight: There are vastly more ways to arrange particles evenly throughout the box than to cram them all into one corner. Let’s look at just 4 particles to keep things simple. Put all 4 on the left side? There’s only one way to do that. Three on the left, one on the right? Four possible arrangements. But split them evenly? Now you’ve got six different ways to create the same balanced pressure. So when the particles are constantly changing their arrangements, they will most likely pick an arrangement from the middle column.
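This counting is just binomial arithmetic, and a quick sketch in Python (using the standard library’s `math.comb`) reproduces the numbers for our 4 particles:

```python
from math import comb

# comb(4, k) counts the ways to choose which k of our 4 (distinguishable)
# particles end up on the left side of the box.
for k in range(5):
    print(f"{k} on the left, {4 - k} on the right: {comb(4, k)} arrangement(s)")
```

Running this prints the counts 1, 4, 6, 4, 1 — the even split really does have the most arrangements.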

When we scale up to 1000 particles, this preference for balance becomes overwhelming. The chances of all particles spontaneously gathering in one corner become so tiny that you’d have better luck winning every lottery on Earth simultaneously. You can see this in the plot below: the further a distribution is from being perfectly balanced, the more astronomically unlikely it becomes.

Probability of seeing X particles in one of the halves. A count of 500 is by far the most likely, and deviations become vastly more unlikely the larger they are.
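A plot like this takes only a few lines to recompute. Assuming each of the 1000 particles independently ends up on either side with probability 1/2, the count in one half follows a binomial distribution:

```python
from math import comb

N = 1000  # total number of particles

def prob_left(k: int) -> float:
    """Probability that exactly k of the N particles sit in the left half."""
    return comb(N, k) / 2**N

print(prob_left(500))   # the balanced split: about 0.025
print(prob_left(600))   # a modest imbalance is already astronomically rare
print(prob_left(1000))  # all particles on one side: roughly 10**-301
```

Even a 600/400 split is already rarer than one in a billion — and that is with only 1000 particles, not the 10²⁵ or so in a real room.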
So what does this have to do with entropy? Simple: When there are lots of different ways to arrange particles and still end up with the same outcome (like pressure or temperature), we say that our system has high entropy. When there are only a few arrangements that give us that outcome, that’s low entropy.
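This counting idea is exactly what Boltzmann’s entropy formula, S = k_B ln W, makes quantitative: entropy is (up to a constant) the logarithm of the number of arrangements W that produce the same outcome. A minimal sketch, reusing the particles-in-a-box setup:

```python
from math import comb, log

def entropy(n_total: int, n_left: int) -> float:
    """Entropy (in units of Boltzmann's constant) of the macrostate with
    n_left of n_total particles in the left half: ln of the arrangement count."""
    return log(comb(n_total, n_left))

print(entropy(1000, 1000))  # all on one side: ln(1) = 0, the lowest entropy
print(entropy(1000, 500))   # balanced split: the highest entropy
```

The logarithm is why entropy behaves additively: doubling the number of arrangements only adds a constant to the entropy.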
We now know that systems naturally evolve from low entropy to high entropy because they’re simply finding their way to the most likely arrangement. This is the famous second law of thermodynamics: In any isolated system (meaning neither energy nor matter can get in or out), entropy never decreases over time!
Going back to our coffee cup example: There are countless ways to arrange ceramic shards into what we’d call a “broken cup.” But there are comparatively few arrangements that would give us an intact cup. Once the cup shatters into that high-entropy state, the chance of it randomly reassembling is essentially zero.
Similarly, there are thousands of ways to arrange stuff in your room that would qualify as “messy” enough to warrant an hour-long cleanup before your parents come to visit. But there are far fewer arrangements that look “tidy.” This is why rooms naturally get messier over time until you invest energy to impose order again.**
The Room Where Time Stands Still
We have seen that the direction of increasing entropy is a very handy way of defining our arrow of time. This, however, requires that there is always some way of increasing the entropy further. But what happens when we reach the maximum amount of entropy? Imagine slowly filling up a completely empty room with air. Once the air has spread through the room and is distributed everywhere, it is incredibly unlikely to crawl back into one corner of the room. This is like our particle box example from earlier, just on steroids, because there is such a gigantic number of molecules making up the air.
The room will reach its maximum entropy state, the state of equally spread-out air, rather quickly. Afterward, the molecules keep moving around, bumping into each other and the walls. Just like the particles in our box example, they keep shifting into different arrangements. But since the only likely arrangements are the ones showing equally (or close to equally) spread-out molecules, all of these arrangements have essentially the same (maximum) entropy.
Our room is definitely changing over time, since there is lots of air molecule movement, but the entropy stays at (or close to) its maximum value. Previously, we defined forward in time as increasing entropy — but suddenly, that no longer works. In a sense, the direction of time becomes meaningless in this room: There is no way of telling which direction time is heading anymore. In the case of this air-filled room, the problem is easily resolved: Just wait outside the room with a watch and stare at the air molecules through a window. You will be able to tell exactly how much time has passed and which molecule arrangement came before which other one.
But imagine for a second that the room is our universe and the air is all the different particles we can find in it. At some point in the future, our universe will likely reach its maximum entropy state of equally distributed particles. And in that case, there is no one to stand “outside” of it and keep track of time — the direction of time would actually cease to exist. This is more of a philosophical point, however, since a maximum entropy universe only has thinly spread-out particles in it. And I heard spreading out the particles making up humans usually kills them — or ends their metabolism, as my professor in chemistry liked to call it. So we don’t have to worry about this scenario too much.
But here’s something interesting: Just because a system as a whole is at maximum entropy, does not mean there cannot be small entropy fluctuations in parts of it. These fluctuations are actually very easy to observe using a light microscope and a speck of dust floating in water. The dust particle (blue) is surrounded by lots of water molecules (red) randomly bumping into it. If the particles were perfectly evenly spread out, the bumping from all directions should cancel out and the speck would stay in its original place. But that is not the case: Just by chance, some regions have fewer particles in them at a given time, leading to fewer collisions. The effect of this is that our speck of dust performs a random movement called Brownian motion, as you can see below.

A large particle (blue) like a speck of dust is surrounded by a bunch of smaller particles (red), like water molecules. The random movement of the small particles spontaneously creates regions of lower and higher particle density. The resulting collisions with the larger particle cause it to perform a movement called Brownian motion.
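Brownian motion is easy to imitate numerically. The following is a minimal sketch, not a physical simulation: each step of a 2D random walk stands in for the net effect of many molecular kicks on the dust speck, zero on average but never exactly zero at any given moment.

```python
import random

random.seed(42)  # make the run reproducible

x, y = 0.0, 0.0
for _ in range(10_000):
    # Net kick from the surrounding molecules in this instant:
    # drawn from a zero-mean Gaussian, so no direction is preferred.
    x += random.gauss(0.0, 0.01)
    y += random.gauss(0.0, 0.01)

# Despite the zero-mean kicks, the speck drifts away from where it started.
print(f"final position: ({x:.3f}, {y:.3f})")
```

Plot the intermediate positions and you get the familiar jittery Brownian path; the typical distance from the start grows like the square root of the number of steps.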
Therefore, on the scale of a dust particle, something rather unexpected becomes possible: random entropy decreases, which we could think of as tiny reversals in the direction of time. If cells had any form of perception, they might “experience” these bizarre moments where their local universe briefly runs backward. The only reason you as a human don’t experience these time reversals is that you are made of too many atoms (yes, even if you are below 1.6 m tall). So for you, these microscopic fluctuations average out completely. Think of it like being a cruise ship versus a tiny rowboat: while the rowboat gets tossed around by individual waves, the cruise ship barely notices them.
This brings us to an almost philosophical puzzle that would have kept ancient Greek thinkers up at night: Any complex organism capable of feeling and thinking must be large enough to rise above this sea of molecular chaos. If it weren’t, it would be like trying to write a novel while riding a mechanical bull — the random fluctuations would make coherent thought impossible. But here’s the irony: being large enough to think and understand also means being large enough to experience the march of entropy, which we experience as aging (through things like cellular damage accumulating over time).
In conclusion: The very capability that allows us to comprehend the concept of aging requires us to be large enough to experience aging. Nature, it seems, has a rather wry sense of humor about these things.
Time’s Hidden Rewind Button
Let us look at another interesting consequence of this arrow of time: I have told you before that for any system, the most likely state for it to be in is the one with the most entropy. This, of course, raises the question of why our universe started out in a state of low entropy and now evolves to higher and higher entropy. After all, wouldn’t it be much more likely to already start out in the most probable state? Of course, the concept of probability doesn’t make too much sense here since we don’t have a whole collection of universes to look at and compare. So how would we judge what initial state is more likely than another? It’s a bit like trying to calculate the odds of drawing a red card from a deck when you only have one card in total: the whole idea of probability breaks down when you don’t have multiple possibilities to compare against each other.
But for the sake of argument, let us imagine for a second a hypothetical universe that simply came into existence at a maximum entropy state of equally spread out particles and now just sits around for eternity (like that odd friend your flatmate invited and who now refuses to leave).
When we have infinite time at our disposal, even very unlikely events become a certainty. Just like how if you flip a coin forever, you’ll eventually get a million heads in a row. Since the idea that entropy always increases over time is “only” a statistical law (albeit a very certain one), it is conceivable that some part of the universe might spontaneously transform itself into a low entropy region — simply due to some cosmically unlikely coincidence. Think of it as all the air molecules in the previously discussed room suddenly deciding to cluster in one corner. It’s not impossible, just monumentally improbable.
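The “million heads in a row” claim is simple to quantify. The probability of a particular run of n heads is 2⁻ⁿ, which underflows ordinary floats for n = 1,000,000, so the little helper below (purely illustrative) expresses it as a power of ten instead:

```python
from math import log10

def heads_run_probability_exponent(n: int) -> int:
    """Base-10 exponent of the probability 2**-n of n fair-coin heads in a row."""
    return round(-n * log10(2))

print(heads_run_probability_exponent(1_000_000))  # -301030, i.e. ~10**-301030
```

Tiny, but not zero — and given unlimited flips, a run this long eventually appears with probability one.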
You can see this process in action in the graph below: The hypothetical universe spends most of its time at maximum entropy, as would be expected. But occasionally — much more occasionally than this graph implies (we’re talking about timescales that make the age of our current universe look like a blink of an eye) — it experiences a large decrease in entropy, creating what we might call a “pocket” of order in the chaos.

The hypothetical universe spends most of its time at maximum entropy. But simply due to chance, it changes into a state of lower entropy occasionally. Observer B would assume that time is moving “left”, while observer A sees time moving “right” — if we assume they both define increasing entropy as going forward in time. Illustration inspired by Ćirković.
Now here’s where things get truly weird: What would this look like for a person in this universe? Let us say that Alice is coincidentally at point A in this universe. From her perspective, she sees her local region of space becoming more disordered over time as it returns to its natural state of maximum entropy. She calls this direction of increasing disorder “forward” in time, just like we do in our universe.
But what about Bob, who happens to have been born at point B? We cannot know for certain what he experiences, of course. However, it does not seem unreasonable that he would experience time in the opposite direction from Alice — simply because his brain, like Alice’s, interprets rising entropy as meaning “forward” in time. From his perspective, his local region of space is also becoming more disordered, but in what Alice would call the “backward” direction.
If there is no such hardwired preference in the brain for interpreting rising entropy as “forward,” he could also interpret decreasing entropy as “forward.” But in either case, the key insight is that the direction of time is fundamentally tied to the concept of entropy. And if we use rising entropy as our arrow of time, different observers might actually disagree about which way time is flowing — a concept that makes my head spin every time I think about it.
The Cosmic Siblings: Time and Entropy
I sometimes think of entropy and time as close siblings — only that one sibling gets all the fame while the other is working behind the scenes. Time is the celebrity, the one we’re all familiar with, while entropy is the quieter sibling doing the essential work that makes time’s direction possible. They generally have the same tastes and move in the same direction: More time, more entropy. Like siblings growing up together, you rarely see one without the other.
They rely on each other in ways that are both beautiful and necessary: Without entropy and its steady march toward disorder, we would have no way to tell which way time flows — no arrow pointing from past to future. But without time itself, entropy would be frozen in place: there would be no stage for the cosmic dance from order to disorder, no way for systems to evolve from low entropy to high entropy states.
They can get into fights, but it is rather rare: A maximum of entropy, like in our hypothetical universe of perfectly spread-out particles, messes up the time arrow by making all directions equivalent. But a maximal amount of time (an infinite amount, to be precise) likewise messes up the certainty of entropy increase: given forever, even the most improbable decreases in entropy become inevitable. It’s like how even the closest siblings occasionally find ways to complicate each other’s lives.
To find our way back from these philosophical musings to the real world, I want to conclude this article with a practical intuition. If you want to figure out what types of processes increase entropy, try this simple mental exercise: picture any process playing backward in your head. If the reversed version seems absurd or impossible — like a broken cup reassembling itself or spilled coffee leaping back into its mug — then you’re looking at a process that creates entropy and has a definite direction in time. Nature’s preference for increasing entropy is what makes these reverse processes seem so ridiculous to us.
Thank you for following me on this journey through the mysteries of time and entropy. May your coffee cup always stay intact, may your room always be uncluttered, and may you never have to witness the laws of thermodynamics running in reverse!