Poker Forum
Over 1,292,000 Posts!

**Ask a monkey a physics question thread**

Results 1 to 75 of 2535

  1. #1
    https://www.youtube.com/watch?v=qYSKEbd956M

    Latest PBS... gravity is an entropic force! What does that mean? That gravity is not a force, or even a law of physics; it's just statistically likely, in the same way it's statistically likely (ridiculously so, though not an absolute certainty) that tea and milk mix together nicely when stirred.

    I'll need to watch this again, because it's a bit much. There seems to be a deep connection between gravity and entropy.
    Quote Originally Posted by wufwugy View Post
    ongies gonna ong
  2. #2
    oskar's Avatar
    Join Date
    Apr 2008
    Posts
    7,019
    Location
    in ur accounts... confiscating ur funz
    Quote Originally Posted by OngBonga View Post
    My mistake was to not convert km to m when considering height of water, which accounts for the three orders of magnitude.
    As it so often does.

    Quote Originally Posted by OngBonga View Post
    I should learn about Entropy. I have like a high-school understanding of Entropy, but it seems like a thing where you have to get into the math to really get it.
    The strength of a hero is defined by the weakness of his villains.
  3. #3
    Entropy is for the most part really badly explained in layman's terms. I think the "disorder" description fails to adequately describe it.

    Entropy is better described imo as the tendency of useful energy to spread out. When you have a bunch of gas atoms with different velocities, all smashing into each other, they exchange energy in a way that pushes each atom toward the average. The differences in velocity are what make the energy useful: a faster atom does work on a slower atom, so the slow one gains kinetic energy and the fast one loses it. As the spread of velocities narrows, each collision transfers less energy on average, until the system reaches thermal equilibrium, where collisions no longer transfer any net energy. (Strictly, the atoms settle into a stable spread of speeds, the Maxwell-Boltzmann distribution, rather than all ending up with identical velocity.) The average work done per collision keeps dropping, which is basically the same as saying entropy is always increasing. It happens because useful energy is spreading out over the system. The total energy of the system of course remains constant.
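    The "spreading out" picture above can be sketched with a toy simulation. This is a hypothetical model of my own (not real kinetic theory: real collisions also conserve momentum and leave a spread of speeds): random pairs of particles "collide" by splitting their combined energy evenly. The total energy stays constant while the spread of energies collapses toward zero.

```python
import random

random.seed(0)
n = 1000
energies = [0.0] * n
energies[0] = 1000.0          # all the useful energy starts in one particle

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(f"start: total={sum(energies):.1f}  variance={variance(energies):.1f}")
for step in range(200_000):
    i, j = random.randrange(n), random.randrange(n)
    if i != j:
        # A "collision": the pair shares its energy equally.
        share = (energies[i] + energies[j]) / 2
        energies[i] = energies[j] = share
print(f"end:   total={sum(energies):.1f}  variance={variance(energies):.6f}")
```

    The variance here plays the role of "useful energy": once it hits zero, no collision transfers anything, even though the total never changed.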

    How this relates to gravity is not something I even remotely understand.
    Last edited by OngBonga; 04-28-2024 at 11:13 AM.
    Quote Originally Posted by wufwugy View Post
    ongies gonna ong
  4. #4
    MadMojoMonkey's Avatar
    Join Date
    Apr 2012
    Posts
    10,456
    Location
    St Louis, MO
    Entropy is a fucky one to understand.

    It's a statistical thing. It doesn't exist in individual quantum-mechanical interactions between particles. It's emergent when you look at many interactions.

    It's not a rigid law on the micro scale. The density of air in a box at thermodynamic equilibrium is not constant, and even after infinite time has passed, the particles will never reach zero speed. The moving particles tend to be evenly distributed in the box, but they are not rigidly confined to always be so. Tiny fluctuations in the positions of the particles will create local "hot spots," albeit briefly. Entropy isn't really a property of any single particle; it's a property of the collection, and the entropy of each finite volume of space fluctuates, both up and down.

    I.e. on the microscale, the law of entropy only increasing in an isolated system doesn't really apply. I can imagine a finite volume over a finite time where nothing crosses the boundary of my imagined volume - it is isolated. Still, within that volume, there may be slightly more particles in 1 half of it than the other, however briefly. This "clumping" (exquisitely loosely stated) is antithetical to Entropy always increasing in isolated systems.
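    The fluctuation argument can be illustrated numerically. A toy sketch of my own construction (not anything from the thread): drop n particles uniformly into a box and watch how much the left-half fraction wanders across repeated trials. The relative fluctuations shrink roughly like 1/sqrt(n), which is why entropy only looks like a rigid law at macroscopic particle counts.

```python
import random

random.seed(1)

def left_half_fraction(n):
    """Place n particles uniformly in a box; return the fraction in the left half."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

spreads = {}
for n in (100, 10_000, 1_000_000):
    samples = [left_half_fraction(n) for _ in range(10)]
    spreads[n] = max(samples) - min(samples)
    print(f"n={n:>9}: left-half fraction varies over a range of {spreads[n]:.5f}")
```

    With 100 particles the "clumping" is easily visible; with a million it is already tiny, and a real box of air has ~10^23 particles.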

    Things get even weirder when we just trust the math and start applying entropy concerns to isolated molecules or particles. The math tends to hold up, and the entropy concern makes good predictions. We can predict the amount of memory needed to compress data (like mp3's) using entropy concerns. The application of entropy to Information Theory is totally fucky to me. Even when I get the math, I can't wrap my head around why there would be physical laws for intangible notions like "information." As humans, we have to re-imagine what that word even means to use it in this context, but in so doing, we get a wealth of "good" predictions.
    I'm not entirely sure any human understands why that is the case.
    Normalize Inter-Community Sense-Making
  5. #5
    Quote Originally Posted by mojo
    It's a statistical thing. It doesn't exist in individual quantum mechanics interactions between particles. It's emergent when you look at many interactions.
    At what point does entropy emerge? We can say with confidence that in a one-particle universe there is no entropy. What about a two-particle universe? Given infinite time, will the two particles collide more than once? Will they collide an infinite number of times? Entropy seems to emerge when there are an infinite number of collisions over an infinite period of time, allowing energy to spread out evenly in the system. If gravity demands that the two particles will continue to interact and collide, then surely entropy exists in the two-particle universe. Maybe that's the link between gravity and entropy in its most basic form.
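    One way to see entropy "emerging" with particle count is Boltzmann's counting picture, S = k ln W. A toy sketch (my assumption: distinguishable particles split between two halves of a box, with k = 1): for small N no arrangement is much likelier than any other, while for large N almost every microstate sits near the even split, so "entropy increases" becomes a near-certainty rather than a rule.

```python
import math

def fraction_near_even(N, tol=0.05):
    """Fraction of all 2**N microstates with the left-half count
    within tol of an even split. W(m) = C(N, m) microstates put
    m of the N particles in the left half."""
    lo, hi = (0.5 - tol) * N, (0.5 + tol) * N
    good = sum(math.comb(N, m) for m in range(N + 1) if lo <= m <= hi)
    return good / 2 ** N

for N in (10, 100, 1000, 10000):
    print(f"N={N:>6}: P(within 5% of even split) = {fraction_near_even(N):.4f}")
```

    With two particles, nothing about the counting favors any configuration; the one-way "arrow" only appears as N grows, which fits the intuition that entropy is statistical rather than fundamental.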
    Quote Originally Posted by wufwugy View Post
    ongies gonna ong
