

Topic: On the idea of Entropy.  (Read 2900 times)


Offline Ajathashatru

  • New Member
  • **
  • Posts: 5
  • Mole Snacks: +0/-0
On the idea of Entropy.
« on: May 12, 2020, 03:12:16 AM »
Hi all,

Sorry if this sounds stupid. I just want to get my concept right.

If I add energy to a system, it will spread out in such a way that the entropy of the resulting state is maximized.

For example, if I heat up a certain portion of the air molecules in a room, then with time that heat energy will be dispersed evenly throughout the room.

Now, the explanation I've heard for this phenomenon is that the system 'tries out' all the possible combinations, and since the most spread-out state, the one with the highest entropy, corresponds to the largest number of arrangements, it has the highest probability of occurring.

But how exactly is this 'Trying out' happening? What is the physical meaning of this 'trying out'?

Is this analogous to the example of a flowing stream, where the water tries to go every possible way, but the flow of the stream is defined by the direction in which most of the water molecules move: downstream?

Please explain.

Offline pm133

  • Regular Member
  • ***
  • Posts: 47
  • Mole Snacks: +5/-0
Re: On the idea of Entropy.
« Reply #1 on: May 12, 2020, 08:49:34 AM »
I try to think of this in terms of what is happening to molecules.

Molecules have a ton of energy in them in various forms (rotation, vibration, translation, etc.) and as such are in constant motion.
If you start with a perfect cube full of molecules and release them, they'll naturally spread out because they are moving. Over time they'll all head off in different directions and the localised densities of these molecules will reduce to an average.

The permutations-and-combinations thing is a mathematical model to help explain this. Molecules are not "trying out" anything; they don't know anything. That's science over-simplifying things to get an intuitive picture into your head. The molecules don't go back to their cube because, out of countless billions of possible arrangements, only one corresponds to the original cube. In comparison, there are countless equivalent ways to realise a spread-out arrangement, so it's far more likely to find the molecules in one of those states.

That's how I view it.
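To put numbers on that counting argument, here is a rough Python sketch (the 50-molecule box, the 50/50 split and the 10% tolerance are arbitrary toy choices, not from anyone's post):

```python
import random

# Toy model: N gas molecules, each independently found in the left or
# right half of a box. "Back in the cube" = every molecule on one side.
N = 50
trials = 100_000
all_left = 0
roughly_even = 0  # within 10% of a 50/50 split

random.seed(0)
for _ in range(trials):
    left = sum(random.random() < 0.5 for _ in range(N))
    if left == N:
        all_left += 1
    if abs(left - N / 2) <= 0.1 * N:
        roughly_even += 1

print(f"all molecules on one side: {all_left} of {trials} trials")
print(f"roughly even split:        {roughly_even} of {trials} trials")
# The exact probability of "all left" is (1/2)**50, about 9e-16,
# so it essentially never occurs; near-even splits dominate.
```

The lopsidedness only gets more extreme with realistic molecule counts: at N near Avogadro's number, the all-on-one-side state is unobservable on any timescale.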

Offline Ajathashatru

  • New Member
  • **
  • Posts: 5
  • Mole Snacks: +0/-0
Re: On the idea of Entropy.
« Reply #2 on: May 12, 2020, 12:37:57 PM »
I try to think of this in terms of what is happening to molecules.

Molecules have a ton of energy in them in various forms (rotation, vibration, translation, etc.) and as such are in constant motion.
If you start with a perfect cube full of molecules and release them, they'll naturally spread out because they are moving. Over time they'll all head off in different directions and the localised densities of these molecules will reduce to an average.

The permutations-and-combinations thing is a mathematical model to help explain this. Molecules are not "trying out" anything; they don't know anything. That's science over-simplifying things to get an intuitive picture into your head. The molecules don't go back to their cube because, out of countless billions of possible arrangements, only one corresponds to the original cube. In comparison, there are countless equivalent ways to realise a spread-out arrangement, so it's far more likely to find the molecules in one of those states.

That's how I view it.

Isn't that more or less analogous to the example of flowing water?

How should we see the example of heating up a certain part of a system?

If I heat up a part of my room using a heater, the molecules there will gain greater mobility, and so they will transfer heat from one molecule to another. Since the set of molecular motions that would return all the energy to the starting region has a vanishingly small probability, that effectively never happens.

Is that right?
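That picture can be sketched with a crude simulation (my own toy numbers: 20 cells of air, with the extra heat starting in the first 4; collisions are mimicked by letting neighbouring cells share energy equally):

```python
import random

random.seed(1)
cells = [10.0] * 4 + [0.0] * 16   # hot corner, cold room

# Each step, a random pair of neighbouring cells equalises its energy,
# a stand-in for heat transfer by molecular collisions.
for _ in range(5000):
    i = random.randrange(len(cells) - 1)
    avg = (cells[i] + cells[i + 1]) / 2
    cells[i] = cells[i + 1] = avg

hot_corner = sum(cells[:4])
print(f"energy left in the corner: {hot_corner:.2f} of {sum(cells):.2f}")
# The corner ends up holding roughly its fair share (4/20) of the
# total energy; a sequence of exchanges that reconcentrates it all
# never shows up in practice.
```

The total energy is conserved at every step; only its distribution relaxes toward uniformity, which is the one-way behaviour the question describes.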

Offline pm133

  • Regular Member
  • ***
  • Posts: 47
  • Mole Snacks: +5/-0
Re: On the idea of Entropy.
« Reply #3 on: May 12, 2020, 01:30:27 PM »
The problem with the water model is that the molecules are not as independent as they are in the gas phase, which adds complexity: strong intermolecular interactions, eddy currents, etc., and the water definitely interacts with the walls of the container. Still, there's nothing inherently wrong with it if you consider that a liquid also fills the volume of the container it finds itself in, just like a gas. Remember, though, that water will pool rather than smear out on a surface because of interactions with the surface (wettability explains this), so it's not a perfect analogy; a gas wouldn't do that sort of localised thing.

I prefer to use your water model for understanding things like vector calculus - div, curl, grad etc.

For me, you just want to use the simplest model and that is of an ideal gas in a cube which suddenly has the lid removed.

I think your last paragraph is a good way of thinking about it. I've essentially said a similar thing in different words, but I think you understand what's going on now.

Offline Corribus

  • Chemist
  • Sr. Member
  • *
  • Posts: 3551
  • Mole Snacks: +546/-23
  • Gender: Male
  • A lover of spectroscopy and chocolate.
Re: On the idea of Entropy.
« Reply #4 on: May 12, 2020, 01:40:00 PM »
If I heat up a part of my room using a heater, the molecules there will gain greater mobility, and so they will transfer heat from one molecule to another. Since the set of molecular motions that would return all the energy to the starting region has a vanishingly small probability, that effectively never happens.
The probability is low that you would observe the system in such a state, unless energy is used to make it favorable.

Imagine you are a shepherd with 100 sheep and a large square enclosure. In the morning all the sheep are in the same place, but as the day goes on, and assuming sheep motion is random (foraging possibilities are evenly distributed), they will drift apart, such that the average distance between sheep is the maximum value allowed by the system. In this state of maximum entropy, the probability of finding a sheep in any given area element is identical. Even though the sheep continue to move, and may indeed cluster by accident of random motion, the average probability of finding a sheep in any spot at any time is constant.

You can change that, of course, by using energy to make localization more favorable. If I throw a bag of tasty carrots into one corner of the field, chances are they'll come running, because I've introduced a force that draws them, such that motion toward the carrots is more favorable than motion away. The probability density changes as a result. This doesn't mean entropy is violated, of course; the force field of the system has just changed.
What men are poets who can speak of Jupiter if he were like a man, but if he is an immense spinning sphere of methane and ammonia must be silent?  - Richard P. Feynman
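The sheep picture can be put in code as a toy random walk (the field size, step size and step count here are arbitrary choices of mine):

```python
import math
import random

random.seed(2)
SIZE, N = 100.0, 100
sheep = [[SIZE / 2, SIZE / 2] for _ in range(N)]  # morning: all together

def mean_spread(flock):
    """Average distance of each sheep from the flock's centroid."""
    cx = sum(s[0] for s in flock) / len(flock)
    cy = sum(s[1] for s in flock) / len(flock)
    return sum(math.hypot(s[0] - cx, s[1] - cy) for s in flock) / len(flock)

start = mean_spread(sheep)
for _ in range(2000):                 # a day of random grazing steps
    for s in sheep:
        for k in (0, 1):              # x and y, clamped at the fences
            s[k] = min(SIZE, max(0.0, s[k] + random.uniform(-1, 1)))

print(f"mean spread: {start:.1f} at dawn -> {mean_spread(sheep):.1f} at dusk")
```

With unbiased steps the spread only grows until the fences limit it; adding a drift term toward one corner would re-localise the flock, which is the carrots (energy input) case.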

Offline Enthalpy

  • Chemist
  • Sr. Member
  • *
  • Posts: 4036
  • Mole Snacks: +304/-59
Re: On the idea of Entropy.
« Reply #5 on: May 13, 2020, 06:46:16 AM »
That's how I view it.
So do I. Landau & Lifshitz too, if I read them properly.

The next refinement step is to consider not only the degeneracy of the states, but also their proximity, in terms of the observed quantity, for instance the density, the temperature...

All states being equally probable, and measurements being allowed some tolerance, the most likely measured outcome is the one that packs the most states within that tolerance.

Side note: entropy drives the equilibrium when only heat is exchanged. If reactions can happen too, G and µ tell what the equilibrium is.

Other side note: when speaking about states, we're dealing with the microscopic entropy, which is not dQ/T. The macroscopic S is good enough when building engines, where essentially heat and work move. The microscopic entropy differs: for instance, it is neither extensive nor intensive, as its value per mole increases with the amounts. The microscopic entropy reflects, for instance, that mixing deuterium and protium is irreversible even though no dQ is exchanged.
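The protium/deuterium point can be checked with the standard ideal entropy of mixing, ΔS = -nR Σ xᵢ ln xᵢ (a textbook formula; the 1 mol + 1 mol amounts are my example, not from the post):

```python
from math import log

R = 8.314                      # molar gas constant, J/(mol K)
n, x_H, x_D = 2.0, 0.5, 0.5    # 1 mol protium + 1 mol deuterium
dS = -n * R * (x_H * log(x_H) + x_D * log(x_D))
print(f"entropy of mixing: {dS:.2f} J/K")
# About 11.5 J/K appears on mixing, yet no heat dQ crosses any
# boundary: this is the counting-of-states (microscopic) entropy.
```

The entropy rise comes purely from the increased number of distinguishable arrangements, not from any heat flow, which is exactly the irreversibility mentioned above.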

Offline Ajathashatru

  • New Member
  • **
  • Posts: 5
  • Mole Snacks: +0/-0
Re: On the idea of Entropy.
« Reply #6 on: May 14, 2020, 10:21:52 AM »
That's how I view it.
So do I. Landau & Lifshitz too, if I read them properly.

The next refinement step is to consider not only the degeneracy of the states, but also their proximity, in terms of the observed quantity, for instance the density, the temperature...

All states being equally probable, and measurements being allowed some tolerance, the most likely measured outcome is the one that packs the most states within that tolerance.

Side note: entropy drives the equilibrium when only heat is exchanged. If reactions can happen too, G and µ tell what the equilibrium is.

Other side note: when speaking about states, we're dealing with the microscopic entropy, which is not dQ/T. The macroscopic S is good enough when building engines, where essentially heat and work move. The microscopic entropy differs: for instance, it is neither extensive nor intensive, as its value per mole increases with the amounts. The microscopic entropy reflects, for instance, that mixing deuterium and protium is irreversible even though no dQ is exchanged.

I'm not sure I understand what you mean. Can you explain what you just said about the proximity and tolerance?

Offline Enthalpy

  • Chemist
  • Sr. Member
  • *
  • Posts: 4036
  • Mole Snacks: +304/-59
Re: On the idea of Entropy.
« Reply #7 on: May 14, 2020, 01:48:59 PM »
Microscopic thermodynamics considers a finite number of states whose T, U, H, S... take discrete values.

However, the number of possible states is so huge (on the order of N!, where N is Avogadro's number) that the values can't be separated in practice. In addition, there is some physical uncertainty.

So when counting how many states give some T, H and so on, one must allow some tolerance ΔT... around the value. The number of possible states then depends not only on their degeneracy, but also on how dense they are on the T, H... scale.
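Here is that counting in miniature (my own toy system, not Enthalpy's: N two-state "molecules", with the macrostate labelled by how many are excited):

```python
from math import comb

N, d = 100, 2   # 100 two-state molecules, tolerance of +/- 2 levels

def states_within(target):
    """Microstates whose excitation count lies within d of target."""
    lo, hi = max(0, target - d), min(N, target + d)
    return sum(comb(N, k) for k in range(lo, hi + 1))

print(states_within(50))   # a window around the even split
print(states_within(10))   # an equally wide window out in the tail
# Both windows have the same width on the energy scale, but the one
# near N/2 packs astronomically more states, so a tolerant
# measurement almost always reports the even-split value.
```

This is the "density of states within the tolerance" idea: the degeneracy of each exact value and how thickly the values sit on the scale both enter the count.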

Offline Ajathashatru

  • New Member
  • **
  • Posts: 5
  • Mole Snacks: +0/-0
Re: On the idea of Entropy.
« Reply #8 on: May 17, 2020, 06:27:46 AM »
Microscopic thermodynamics considers a finite number of states whose T, U, H, S... take discrete values.

However, the number of possible states is so huge (on the order of N!, where N is Avogadro's number) that the values can't be separated in practice. In addition, there is some physical uncertainty.

So when counting how many states give some T, H and so on, one must allow some tolerance ΔT... around the value. The number of possible states then depends not only on their degeneracy, but also on how dense they are on the T, H... scale.

Well then, technically you're giving a tolerance to the very definition of degeneracy itself.

Instead of saying that degeneracy implies the same value of energy, we may as well now say that degenerate states fall within a short range of energies?

Offline Enthalpy

  • Chemist
  • Sr. Member
  • *
  • Posts: 4036
  • Mole Snacks: +304/-59
Re: On the idea of Entropy.
« Reply #9 on: May 18, 2020, 05:00:06 AM »
If you wish.
