Entropy — Definition
Definition
Imagine you have a perfectly organized stack of books. If you accidentally knock it over, the books scatter everywhere, creating a mess. It's highly unlikely they'll spontaneously re-stack themselves. This tendency towards messiness or disorganization is a simple way to think about entropy. In physics, entropy is a more precise concept, but the idea of 'disorder' or 'randomness' is a good starting point.
More formally, entropy is a measure of the number of ways energy and matter can be arranged within a system. When a system has more ways to arrange its particles and energy, we say it has higher entropy.
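This counting idea is usually made quantitative through Boltzmann's relation, where $\Omega$ (sometimes written $W$) is the number of microscopic arrangements consistent with the system's macroscopic state and $k_B$ is Boltzmann's constant:

$$S = k_B \ln \Omega$$

More available arrangements means a larger $\Omega$, and therefore a larger entropy.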
Think of it like this: if you have a gas confined to a small box, its particles have limited space to move. If you open a valve and let the gas expand into a larger, empty box, the particles now have many more places to go, many more ways to arrange themselves.
This expansion leads to an increase in entropy because the energy and matter are now more spread out, or 'dispersed'.
Entropy is also intimately connected to the 'quality' of energy. The Second Law of Thermodynamics tells us that heat flows spontaneously from hotter objects to colder objects, never the other way around.
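To put a standard number on this (assuming, for simplicity, an ideal gas, which the description above does not specify): if $n$ moles of gas expand from volume $V_1$ to fill a total volume $V_2$, the entropy change is

$$\Delta S = n R \ln\!\left(\frac{V_2}{V_1}\right)$$

so letting one mole of gas double its volume raises its entropy by $R \ln 2 \approx 5.76\ \mathrm{J/K}$.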
This process increases the total entropy of the universe. When heat flows from hot to cold, the energy becomes more dispersed and less available to do useful work. So, entropy can also be thought of as a measure of the energy in a system that is unavailable for doing work.
A system with high entropy has its energy spread out so much that it's difficult to harness it for a specific task.
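As a rough numerical sketch of this bookkeeping (the heat amount and reservoir temperatures below are illustrative assumptions, not values from the text): when heat $Q$ leaves a hot reservoir at $T_h$ and enters a cold reservoir at $T_c$, the hot side loses entropy $Q/T_h$ while the cold side gains $Q/T_c$, and the total is positive whenever $T_h > T_c$.

```python
# Entropy bookkeeping for heat flowing from a hot reservoir to a cold one.
# The specific numbers are illustrative assumptions, not values from the text.

Q = 1000.0      # heat transferred, in joules
T_hot = 500.0   # hot reservoir temperature, in kelvin
T_cold = 300.0  # cold reservoir temperature, in kelvin

dS_hot = -Q / T_hot          # hot reservoir loses entropy
dS_cold = Q / T_cold         # cold reservoir gains entropy
dS_total = dS_hot + dS_cold  # net change for the "universe"

print(f"Hot reservoir:  {dS_hot:+.3f} J/K")
print(f"Cold reservoir: {dS_cold:+.3f} J/K")
print(f"Total:          {dS_total:+.3f} J/K (positive, as the Second Law requires)")
```

With these numbers the hot reservoir loses 2.0 J/K, the cold one gains about 3.33 J/K, and the universe gains about 1.33 J/K: the energy ends up more dispersed and less available for work.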
Crucially, entropy is a 'state function,' which means its value depends only on the current state of the system (its temperature, pressure, volume, etc.), not on how it got there. If you start with a system in state A and end up in state B, the change in entropy ($\Delta S$) will always be the same, regardless of the path (reversible or irreversible) taken between A and B.
However, the *calculation* of $\Delta S$ often involves considering a hypothetical reversible path, because the definition of entropy change is based on reversible heat transfer: $\Delta S = \int \frac{\delta Q_{\text{rev}}}{T}$.
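To see how this recipe is used in practice (again assuming an ideal gas): the free expansion described earlier is irreversible and transfers no heat, but because entropy is a state function we can compute $\Delta S$ along a reversible isothermal expansion between the same initial and final states, for which $Q_{\text{rev}} = nRT\ln(V_2/V_1)$. This gives

$$\Delta S = \frac{Q_{\text{rev}}}{T} = n R \ln\!\left(\frac{V_2}{V_1}\right),$$

the same result quoted above, even though the actual expansion absorbed no heat at all.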
For irreversible processes, the entropy of the universe (system + surroundings) always increases, while for reversible processes, it remains constant. This fundamental principle helps us understand why certain processes occur spontaneously and in a particular direction.