During my third year as a physics undergraduate, in a course on statistical mechanics, I ran into a genuinely puzzling issue. I spent a lot of time thinking about it and discussing it with a few other students. That journey eventually led me to discover that the same issue had arisen more than 150 years ago, almost at the very moment the field was being invented. In this post I will dive into the paradox and its resolution, which is a great testament to the intrinsic beauty of thermodynamics.
Imagine a rectangular box with a partition that divides it into two equal volumes, which we will label \(L\) (Left) and \(R\) (Right). The walls of the box are assumed to conduct heat poorly, so no heat is exchanged between the inside and the outside. We then fill the left half with an ideal gas, which therefore occupies a volume \(V/2\). We wait until the gas reaches equilibrium and then remove the partition. Our intuition suggests that the gas will expand to fill the whole box.
Any undergraduate text will tell you that this free expansion produces an entropy change of \[ \Delta S = nR \log 2,\] independent of the nature of the gas.
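For completeness, here is the standard one-line derivation (a sketch, using the usual ideal-gas facts: the gas does no work expanding into vacuum and exchanges no heat, so its temperature is unchanged, and we evaluate \(\Delta S\) along a reversible isothermal path):

\[
\Delta S = \int \frac{\delta Q_{\text{rev}}}{T} = \int_{V/2}^{V} \frac{nR}{V'}\,dV' = nR \log\frac{V}{V/2} = nR \log 2 .
\]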
Now you could instead start with a different situation and fill both sides of the partition with the same gas in the same state, i.e. the same pressure, temperature, and density. What happens when you remove the partition in this case? There should be absolutely no change in the system, at least in the macroscopic state of the gas, because it is already at equilibrium. We can even verify this explicitly in the diagram.
What’s the entropy change here? Entropy is a state function, and since the state does not change, neither does the entropy: \[\Delta S = 0\]
All of this seems perfectly reasonable. Now here’s the issue. We can also imagine filling the two sides with two different gases. When we remove the partition, since the gases do not interact with each other, we can treat them as independent systems, each undergoing its own free expansion from \(V/2\) to \(V\). In that case, the total entropy change should just be
\[\Delta S = n R \log 2 + n R \log 2 = 2 n R \log 2\]
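As a cross-check (a sketch, assuming each side starts with \(n\) moles, so that \(n_{\text{tot}} = 2n\) and the mole fractions are \(x_A = x_B = 1/2\)), the textbook entropy-of-mixing formula gives the same number:

\[
\Delta S = -\,n_{\text{tot}} R \left( x_A \log x_A + x_B \log x_B \right)
= -\,2 n R \left( \tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{2}\log\tfrac{1}{2} \right)
= 2\,n R \log 2 .
\]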
Now here’s why this was weird to me at first sight. We never specify how different the gases are. The difference could be extremely slight, perhaps not even experimentally detectable. If you are like me, and always imagine a gas as balls bouncing around, the difference between the gases might be nothing more than the balls being coloured differently. But suppose a colour-blind person cannot tell them apart: should they ascribe an entropy change of \(0\) or of \(2 n R \log 2\)? Yes, most of us are not colour-blind, but maybe there exist features of the gas which we cannot “see”? Are we doing thermodynamics wrong, then? Or can we not do thermodynamics at all?
More learned readers will recognise this problem: it is well known as the Mixing Paradox.
A lot of people assume that the answer to the Mixing Paradox lies in quantum mechanics, where particles become truly indistinguishable. Two electrons are identical in every respect; there are no extra labels you can attach and no hidden degrees of freedom. Does that mean you can only ascribe \(\Delta S = 0\) when you have two electron gases? But the inventors of statistical mechanics knew essentially nothing about quantum mechanics. Were they doing thermodynamics wrong?
To understand the solution, I had to look around. Eventually, and I don’t remember how, I stumbled across Jaynes’s paper on the Gibbs Paradox. The resolution of this problem is deeper than it appears on the surface. It is not just that thermodynamics is universally applicable; it is applicable even with only partial knowledge of the world.