|
|
Entropy is a fucky one to understand.
It's a statistical thing. It doesn't exist in individual quantum-mechanical interactions between particles; it emerges only when you look at many interactions.
It's not a rigid law on the micro scale. The density of air in a box at thermodynamic equilibrium is not constant. Even after infinite time has passed, the particles will never reach zero speed. The moving particles tend to be evenly distributed in the box, but they are not rigidly confined to always be so. Tiny fluctuations in the positions of the particles briefly create local "hot spots." The entropy of each particle is constant, but the entropy of any finite volume of space is not: it fluctuates, both up and down.
I.e. on the micro scale, the law that entropy only increases in an isolated system doesn't really apply. I can imagine a finite volume over a finite time where nothing crosses the boundary of my imagined volume: it is isolated. Still, within that volume, there may briefly be slightly more particles in one half than in the other. This "clumping" (exquisitely loosely stated) is antithetical to entropy always increasing in isolated systems.
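You can see this clumping with a toy sketch (my own illustration, not anything from a textbook): drop N particles uniformly at random into a box and count how many land in the left half. The count hovers around N/2 but never sits exactly there, and the typical fluctuation scales like the square root of N.

```python
import random

def half_box_counts(n_particles, n_snapshots, seed=0):
    """Place n_particles uniformly in a unit box and count how many
    land in the left half, over n_snapshots independent snapshots.
    Returns the list of left-half counts."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_snapshots):
        left = sum(1 for _ in range(n_particles) if rng.random() < 0.5)
        counts.append(left)
    return counts

counts = half_box_counts(n_particles=1000, n_snapshots=500)
mean = sum(counts) / len(counts)
# standard deviation of the left-half count across snapshots
spread = (sum((c - mean) ** 2 for c in counts) / len(counts)) ** 0.5
print(f"mean left-half count: {mean:.1f}, typical fluctuation: {spread:.1f}")
```

With 1000 particles the mean sits near 500 and the fluctuation near sqrt(1000)/2 ≈ 16 — so any given snapshot is almost never perfectly "even," which is exactly the briefly-clumped isolated volume described above.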
Things get even weirder when we just trust the math and start applying entropy concerns to isolated molecules or particles. The math tends to hold up, and in many cases the entropy concern makes a good prediction. We can predict the amount of memory needed to compress data (like MP3s) using entropy concerns. The application of entropy to Information Theory is totally fucky to me. Even when I get the math, I can't wrap my head around why there would be physical laws for intangible notions like "information." As humans, we have to re-imagine what that word even means to use it in this context, but in so doing, we get a wealth of good predictions.
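The compression prediction is concrete, though: Shannon's source coding theorem says you can't losslessly compress a stream below its entropy, in bits per symbol. A minimal sketch (treating each byte as an independent draw from its empirical distribution, which real compressors refine considerably):

```python
import math
from collections import Counter

def shannon_entropy_bits(data: bytes) -> float:
    """Shannon entropy in bits per symbol, using the empirical
    byte frequencies of `data` as the probability distribution."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = b"aaaaaaaabbbbccdd"  # skewed distribution: 8 a's, 4 b's, 2 c's, 2 d's
h = shannon_entropy_bits(msg)
print(f"entropy: {h:.3f} bits/byte")
print(f"lossless floor: about {h * len(msg) / 8:.1f} bytes for {len(msg)} bytes of input")
```

For this message the entropy works out to exactly 1.75 bits per byte, so no lossless scheme can squeeze these 16 bytes below 28 bits on average. That a counting argument about "surprise" sets a hard physical-feeling limit on memory is the part that's hard to wrap one's head around.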
I'm not entirely sure any human understands why that is the case.
|