Counting possibilities in a universe of order

TL;DR: Entropy is not a synonym for clutter or messiness.
reading time: 2^8 seconds
When a scientist uses the word “entropy,” many non-scientists roll their eyes. “Oh right,” they say, “that’s just a fancy word for messiness.” Popular culture has turned the Second Law of Thermodynamics into a grim prophecy: if entropy equals disorder, then the universe is doomed to chaos. At least that’s what they say.
But you know what, entropy is not a synonym for clutter; it’s a precise measure of how many microscopic ways a system can arrange itself without changing its macroscopic appearance. At its core, entropy is a count. Let me briefly take you back in time.
I promise it won’t take long.
A brief history
Entropy entered physics in the mid-19th century thanks to the German physicist Rudolf Clausius. While studying how steam engines convert heat into work, Clausius realised that a certain combination of variables behaved in a lawful way as heat flowed from a hot reservoir to a cold one. He defined the change in entropy as the heat transferred divided by the temperature at which the transfer happens, and gave the new quantity a name: “entropy”, which he also described as a body’s “transformation content”. Clausius did not claim to know what entropy was; he simply discovered a quantity that changes predictably in any thermal process.
It took Ludwig Boltzmann and Josiah Willard Gibbs to connect entropy with the microscopic world. They showed that the thermodynamic entropy of a system is proportional to the logarithm of the number of microscopic configurations, or microstates, that correspond to the same macroscopic state. In Boltzmann’s famous formula, S = k·ln(Ω), Ω is the multiplicity: the count of microstates. The more ways the molecules in a cup of coffee can rearrange themselves while still looking like “coffee,” the larger Ω and, hence, the larger the entropy.
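If you like to see the counting made concrete, here is a minimal Python sketch using a deliberately toy system of 100 coin flips, where the “macrostate” is just the number of heads. The coins and the numbers are my own illustration, not something from Boltzmann or Gibbs:

```python
# Toy illustration of Boltzmann's S = k * ln(omega) for 100 coin flips.
# The "macrostate" is the number of heads; omega counts how many distinct
# flip sequences (microstates) produce it.
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_coins: int, n_heads: int) -> float:
    """Entropy of the macrostate 'n_heads out of n_coins' via S = k*ln(omega)."""
    omega = comb(n_coins, n_heads)  # microstates compatible with this macrostate
    return K_B * log(omega)

for heads in (0, 10, 50):
    print(heads, "heads:", f"{boltzmann_entropy(100, heads):.3e} J/K")
# 50 heads has by far the most microstates, hence the highest entropy.
```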
This microscopic view clarifies the key point I opened with: entropy is not inherently disorder or chaos. It’s a tally. When there are many possible microstates, entropy is high; when there are few, entropy is low. A shuffled deck of cards and a neatly ordered one are each just a single arrangement out of the same 52! possibilities, so counted arrangement by arrangement, neither is special. Physicist Dan Styer and colleagues trace the disorder metaphor to early classroom analogies and argue that “disorder” is a psychological notion, not a physical one. Equating entropy with human messiness often only leads to confusion.
A quick one on thermodynamic entropy
Modern textbooks emphasise that thermodynamic entropy tracks thermal randomness and energy dispersal. Milivoje Kostic reminds readers that entropy is “related to thermal energy and its heat transfer only.” Extending the concept to any type of disorder, be it structural, informational or social, is, in Kostic’s words, “too general and overreaching.” In other words, thermodynamic entropy tells us how heat is shared among atoms and molecules. If you have ever watched a drop of cream swirl into coffee, you’ve witnessed energy dispersal and an increase in entropy: there are vastly more ways to arrange cream and coffee molecules when they are mixed than when they are separated, so entropy increases.
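To make the “more ways when mixed” claim concrete, here is a rough toy-lattice count. The numbers are mine, not Kostic’s: imagine 10 cream molecules that can sit either only in their own half of a 100-site grid, or anywhere on it.

```python
# Toy lattice picture of cream mixing into coffee (illustrative numbers only).
# 10 "cream" molecules on a lattice of 100 sites: confined to one half vs.
# free to sit anywhere. More available arrangements = higher entropy.
from math import comb, log

K_B = 1.380649e-23  # J/K

omega_separated = comb(50, 10)   # cream restricted to the 50 sites on its own side
omega_mixed = comb(100, 10)      # cream allowed on all 100 sites

delta_S = K_B * (log(omega_mixed) - log(omega_separated))
print(f"Omega separated: {omega_separated:e}")
print(f"Omega mixed:     {omega_mixed:e}")
print(f"Entropy change on mixing: {delta_S:.3e} J/K (positive, as expected)")
```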
The Second Law of Thermodynamics emerges naturally from this microscopic picture. In every real process, like rubbing your hands together, gasoline burning in your car, or electrons flowing through a circuit, some energy is dissipated into random motions of atoms. Because that random energy increases the number of accessible microstates, total entropy increases. Kostic stresses that entropy “is generated everywhere and always (and thus overall increased) at any scale” and “cannot be destroyed.” Even when entropy decreases locally, such as inside your refrigerator, it does so only by exporting greater entropy to the environment. A refrigerator makes its interior cold only by warming your kitchen.
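Here is a back-of-the-envelope sketch of that bookkeeping, with made-up but plausible numbers for the heat and work involved:

```python
# Entropy bookkeeping for a refrigerator (illustrative figures).
# The fridge pulls 1000 J of heat out of its 275 K interior and, using 200 J
# of electrical work, dumps 1200 J into a 295 K kitchen.
Q_removed = 1000.0   # J taken from the cold interior
W_input   = 200.0    # J of electrical work driving the compressor
T_inside  = 275.0    # K
T_kitchen = 295.0    # K

dS_inside  = -Q_removed / T_inside               # local decrease inside the fridge
dS_kitchen = (Q_removed + W_input) / T_kitchen   # heat rejected to the room

print(f"Inside the fridge: {dS_inside:+.3f} J/K")
print(f"In the kitchen:    {dS_kitchen:+.3f} J/K")
print(f"Total:             {dS_inside + dS_kitchen:+.3f} J/K  (never negative)")
```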
One reason the Second Law feels intuitive is that we experience it daily. Henning Struchtrup presents a non-equilibrium perspective in which five simple observations capture the essence of the law:
Manipulation - A closed system can change only through transfers of heat or work.
Relaxation - An isolated system tends toward a unique, stable equilibrium.
Uniformity - In equilibrium, temperature is uniform throughout the system.
Irreversibility - You can move objects or stir liquids in either direction, but friction ensures some work is always lost as heat.
Heat flow - Heat spontaneously flows from hot to cold and never the reverse (see the quick check right after this list).
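A quick numerical check of that last observation, using two arbitrary temperatures of my own choosing:

```python
# Move Q = 100 J between a 350 K block and a 300 K block and total up the
# entropy change in both directions. Illustrative numbers only.
Q, T_hot, T_cold = 100.0, 350.0, 300.0

dS_hot_to_cold = -Q / T_hot + Q / T_cold   # hot block loses Q, cold block gains Q
dS_cold_to_hot = +Q / T_hot - Q / T_cold   # the reverse direction

print(f"Hot -> cold: {dS_hot_to_cold:+.4f} J/K  (allowed: entropy increases)")
print(f"Cold -> hot: {dS_cold_to_hot:+.4f} J/K  (forbidden on its own: entropy would fall)")
```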
These statements describe why your coffee cools, why compressed gas warms when released, and why perpetual motion machines remain science fiction. They also explain why living organisms must constantly consume energy to maintain order. They import low-entropy energy from sunlight or food and export heat, which then increases the entropy of their surroundings.
Philosophically, entropy also underpins our very sense of time. Carlo Rovelli argues that the abundance of traces, like footprints in the sand, craters on the Moon, or photographs, is rooted in thermodynamics. He shows that three conditions are sufficient to produce macroscopic traces: first, the separation of systems; second, a temperature difference between them; and third, long thermalisation times. In our universe these conditions are ubiquitous, so traces of the past are plentiful. Crucially, traces “transform low entropy into available information.” The Second Law, which breaks time-reversal symmetry, ensures that we remember the past but not the future. When a ball sets a pendulum swinging in Rovelli’s illustrative model, the oscillation is a record of the ball’s earlier impact. To reverse that record would require an implausible transfer of energy from the pendulum back to the ball, which would decrease entropy (and is why you’ve never seen it happen).
Avoid these misconceptions
Entropy as disorder
The popular metaphor of entropy as disorder is misleading. It anthropomorphises a physical quantity and obscures the role of symmetry. Edward Bormashenko proposes replacing “disorder” with “lack of symmetry”. He demonstrates that introducing symmetry into a physical system reduces the number of accessible microstates and lowers entropy. For example, a row of magnetic spins can be arranged in many random combinations, but if you impose symmetry - every spin on the right mirrors the spin on the left - the number of valid configurations drops. Such symmetrisation “orders” the system and decreases its entropy. Equating entropy with disorder therefore conflates a subjective notion with a precise statistical count.
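Here is a tiny counting sketch of that spin example. The eight-spin row is my own toy choice, not taken from Bormashenko:

```python
# How a symmetry constraint shrinks the microstate count, in the spirit of
# the spin-row example above. 8 two-state spins: unconstrained, vs. forcing
# the right half to mirror the left half. Entropy is reported in units of k.
from math import log

N = 8
omega_free     = 2 ** N          # every spin chosen independently
omega_mirrored = 2 ** (N // 2)   # right half fixed by the left half

print(f"Unconstrained microstates: {omega_free}   -> S/k = {log(omega_free):.2f}")
print(f"Mirror-symmetric states:   {omega_mirrored}  -> S/k = {log(omega_mirrored):.2f}")
# Imposing the symmetry removes microstates, so entropy drops.
```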
Negative entropy and life
Popular writers sometimes claim that life feeds on “negative entropy” or negentropy. The phrase arises from information theory, where conditional entropy can be negative, indicating that knowledge of one variable reduces uncertainty about another. In thermodynamics, however, entropy counts microstates, and there is no such thing as a negative number of microstates. When the Second Law is applied to open systems like living organisms, it permits local decreases in entropy provided they are offset by equal or greater entropy increases elsewhere. Plants, animals and people do not violate the Second Law. They take in low-entropy energy (sunlight, the chemical bonds in food) and emit higher-entropy waste (heat and metabolic byproducts).
Entropy and information
The mathematical similarity between Boltzmann’s entropy and Shannon’s information entropy has inspired fruitful cross-disciplinary insights but also confusion. Shannon defined the entropy of a probability distribution as a measure of the average uncertainty in a message. It counts possible messages, not microstates of a physical system. Applying thermodynamic entropy to describe information can be useful (for example, Landauer’s principle shows that erasing one bit of information requires dissipating at least k_B*T*ln(2) of heat), but only if we remember that the two concepts refer to different kinds of counts. Bormashenko’s examination of magnet models demonstrates that erasing a bit by flipping a magnet carries an energy cost; imposing symmetry restrictions reduces entropy but does not violate Landauer’s bound. As Kostic warns, generalising thermodynamic entropy to any kind of disorder can be misleading.
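To keep the two counts visibly separate, here is a short sketch that computes a Shannon entropy in bits for a made-up four-message distribution and, independently, the Landauer bound at an assumed room temperature of 300 K:

```python
# Two different "counts" side by side: Shannon's average uncertainty of a
# message distribution (in bits) and Landauer's minimum heat cost of erasing
# one bit at room temperature. The message probabilities are illustrative.
from math import log2, log

def shannon_entropy(probs):
    """Average uncertainty of a probability distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

K_B = 1.380649e-23   # J/K
T_ROOM = 300.0       # K (assumed room temperature)

message_probs = [0.5, 0.25, 0.125, 0.125]
print(f"Shannon entropy: {shannon_entropy(message_probs):.3f} bits per message")

landauer_limit = K_B * T_ROOM * log(2)   # minimum heat to erase one bit
print(f"Landauer bound at {T_ROOM:.0f} K: {landauer_limit:.2e} J per erased bit")
```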
By reframing entropy as a measure of possibility rather than disorder, we gain a more accurate, and far more inspiring, picture of our universe. Far from heralding inevitable chaos, entropy explains why energy flows the way it does, why memories exist, and why efficient design matters. And in a world grappling with climate change, technological revolutions and information overload, understanding entropy helps us navigate complexity with clarity and, more importantly, with purpose.
For the love of mankind,
Krish
If you enjoyed this post or know someone who may find it useful, please share it with them and encourage them to subscribe: https://lessonslearned.beehiiv.com/p/counting-possibilities-in-a-universe-of-order
🚀 Cool things of the week
Some of my favourite videos I found on the internet this week…
The most misunderstood concepts in Physics
Reproducing the calculations from Interstellar
Lex Fridman’s podcast with Sundar Pichai (CEO of Google)
📥 Want to advertise in LessonsLearned? Send me an email.