Talk:Entropy/Disorder


2004

Hm, no mention of symmetry here. It struck me tonight that what's meant by order in entropy discussions is broken symmetry, and by disorder, symmetry. So entropy should be measuring some sense of the amount of symmetry. A little Googling indicates I'm on the right track, but my IB Chem in high school totally ruined me for understanding entropy =p so I'm just gonna try to prompt someone not buffaloed by the shebang into checking out entropy-symmetry stuff. I'm fairly sure there's something important there.

The whole "disorder" thing is a bit misleading. What people typically mean by an "ordered" state is that the positions and identities of the parts are well known. If a deck of cards is "ordered", then you know exactly where each card is, and therefore the deck is in exactly one state. If you shuffle the deck, or make it "disordered", the position of the cards is unknown, and therefore there are more possible states in which the deck could be. The possible number of states (or the log thereof) is what entropy measures. In your question, "symmetry" by definition means that we know something about the state, which reduces its entropy. Jellyvista 07:42, Apr 12, 2004 (UTC)

As for entropy as disorder

Can I raise the issue of our definition again? I wonder if it's best to compare entropy to disorder this early in the article. To be sure, we're looking for a kind of basic, intuitive definition that everyone will understand. But consider that "disorder" is a sort of vague (and often ambiguous) concept in its own right. It's often ambiguous because, for example, some might say that a cup of water with a bunch of crushed ice in it is more disordered than a cup of water with no ice; but, of course, the former system has a lower entropy than the latter.

Since entropy is more or less the multiplicity of a system (but toned down in hugeness by the natural logarithm and given units by Boltzmann's constant), what about saying "Entropy of a system in the context of thermodynamics is a measure of the uniqueness of its state in terms of its energy." And then elaborating... "In particular, if the system is a gas with energy E then the entropy of the system is proportional to the logarithm of the number of states accessible to the gas, all of them having energy E."

This is the way I have it written as of this edit and I believe it is conceptually "graspable" enough without sacrificing precision.
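
As a rough numerical check of the ice-water example above (only a sketch, using the standard enthalpy of fusion of water, about 6.01 kJ/mol, and treating melting at 0 °C as reversible):

# Rough check that the cup with crushed ice has the lower entropy: melting ice
# at 0 degrees C is (nearly) reversible, so dS = dQ_rev / T for the melting step.
enthalpy_of_fusion = 6010.0   # J per mole of ice, approximate standard value
T_melt = 273.15               # kelvin

dS_per_mole_melted = enthalpy_of_fusion / T_melt
print("entropy gained per mole of ice melted: %.1f J/(mol*K)" % dS_per_mole_melted)
# about 22 J/(mol*K), and it is positive: the all-liquid cup has the higher
# entropy, even though many people would call the ice-plus-water cup the
# "more disordered" looking one.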

I agree
I think we should consider getting rid of the term "disorder" in this article because the vast majority of the public at large seems to think that entropy is the same as disorganization, and textbooks are perpetuating this fallacy. What does everyone think about greatly simplifying the introduction by moving most of the heavy science lingo to a section further down the article and then including a more prominent section (early in the article) about how "disorder" in this context does not mean "disorganization"? I started this process with the new section at the beginning...perhaps we can add more and more examples of how disorder is not disorganization, and still keep the language as basic as possible. Pkeck 18:30, 14 November 2005 (UTC)
I wonder
Hello Pkeck - please read the recent section I added at the end entitled "Entropy and the term Disorder". I am a believer in having a simple introduction, perhaps introducing the names of the more sophisticated concepts. I also believe there should be a distinct line drawn between the thermodynamic concept of entropy which makes no use of the macro-microstate concepts and the statistical mechanical explanation of the thermodynamic concept. Also, I think the idea that entropy is not a measure of disorganization needs some discussion. I mean, it does correspond to disorganization in the popular sense. To use the bedroom analogy you have introduced, there are many ways a bedroom can be a mess, but only a few ways it can be "organized". The fact that there are many really subtle points about the definition of what is order and what is disorder doesn't change this. PAR 19:28, 14 November 2005 (UTC)
More thoughts
Yeah, discussion of this topic is so tricky with trying to use the right words to get the right idea across :). Here's another analogy I was thinking of: If I have a blanket on my bed, and I mess it all up and scrunch it up in some places, you could say that the blanket is disorganized. It isn't straight on the bed, it isn't lying flat, and it isn't evenly distributed over the surface of the bed. Some places of the bed are high in blanket content, and some places are low in blanket content (and some places have no blanket at all). It's all unevenly distributed.

That's kind of like how the universe is now. Some places are high in energy and some are low. The energy isn't distributed evenly over the universe.

Back to the blanket...if I take the blanket and flatten it out so that it's perfectly smooth and straight, and evenly distributed over the surface of the bed, then you could say that the blanket is now more organized. It's straight and square and flat. It's organized.

This is what is happening in the universe. The distribution of energy is heading from being piled up in some places and low in others, to being evenly distributed everywhere. Its entropy is increasing. Entropy is undoubtedly increasing in the universe. But depending on how you define "organization," you could say that the organization is increasing (if you view an even distribution as "organized") or you could say that the organization is decreasing (if you view a lumpy distribution as "organized").

So I guess I don't agree that there are only a few ways a bedroom (or the universe at large) can be labeled organized, because we all have different definitions of what "organized" means. And an "organized" universe to one person is a "disorganized" universe to another depending on what they think each term means. A lumpy energy distribution is the polar opposite of a smooth energy distribution, and you could call either one "organized" or either one "disorganized." So if one universe is the polar opposite of the other, and we can assign one adjective to it -- or that adjective's polar opposite -- I'd say that the terms are in need of revision.
Entropy, however, is something that we can all agree on.

Here's a paper from the journal of chem ed that talks about this concept: http://jchemed.chem.wisc.edu/Journal/Issues/2002/Feb/abs187.html

And a page that copied the text: http://www.entropysite.com/cracked_crutch.html

I haven't closely examined the second site, so it is possible that they made changes to the original paper that was in the journal. But there it is, anyhow. Pkeck 00:46, 15 November 2005 (UTC)

Quick Follow Up

The first example in the paper I linked to above is a great example of what's confusing about these terms. They mention a jar half full of water and half full of oil. At one end of the organized/disorganized spectrum, you have the oil completely separated from the water, and at the other end of the spectrum, you have the oil completely mixed with the water. Which one is more organized? Completely separate, or completely mixed? Where, in the gray area between completely mixed and completely separate, is the system closer to being organized or disorganized? Pkeck 00:55, 15 November 2005 (UTC)


Ok - I read the second reference and it's good. I agree that the word "disorder" can be misleading, but I don't think it should be discarded, just qualified. Should we get rid of the word "work" in physics? It's way more misleading than "disorder" is. Entropy IS disorder, it's just that you have to define what "order" is.

The rest of this is trying to clear up the idea of what is ordered, what is disordered. Please don't just skip through it and ignore it if I'm not being clear. Tell me what is unclear about it. Also, I'm telling you my understanding, which is never perfect.

  • Disorganized is what you define it to be. If you don't define it, then you can't wonder about it. The oil and water is neither organized nor disorganized until you define organization. Same with the bedroom. Same with a thermo/statmech system.
  • You have to define macrostates. They are nothing until you define them. Same for microstates. You don't ask what is the macrostate, you tell.
  • Once you define your micro and macro states, the entropy of a macrostate is k times the log of the number of microstates that present themselves as that macrostate. More microstates, more entropy. (A short numerical sketch follows this list.)
  • THE MOST IMPORTANT STATEMENT - thermo/statmech doesn't care how you define these. If you redefine your macrostate, thermodynamics will not become invalid. It will be perfectly consistent. If you redefine your microstates, again thermo/statmech will not become invalid. Just don't suddenly try to call two states that you've defined as the same macrostate different.
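
Here is the short numerical sketch referred to in the list above - a toy coin-flip model (an invented example, not taken from any of the references) in which the microstates are full heads/tails sequences and the chosen macrostate is just the total number of heads:

import math

k = 1.380649e-23   # Boltzmann's constant, J/K
N = 100            # coins; each full heads/tails sequence is one microstate

# Macrostate chosen here: "n of the N coins show heads".  The number of
# microstates grouped into that macrostate is the binomial coefficient C(N, n).
for n in (0, 10, 50):
    omega = math.comb(N, n)
    S = k * math.log(omega)
    print("macrostate n = %2d heads: Omega = %.3e, S = %.3e J/K" % (n, omega, S))

# "All heads" has Omega = 1 and S = 0; the 50/50 macrostate has by far the most
# microstates and so the largest entropy.  Regrouping the same sequences under
# a different macrostate definition changes the numbers but breaks nothing,
# which is the point of the statement above.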

Here is a really good example (mixing paradox) - I have a box with a partition. Both sides have equal amounts of gas, same volume, pressure, temperature, everything, except one side is a gas of type A molecules, the other side is type B. Each side has entropy S, for a total of 2S. I remove the partition. After a while they mix, and the total entropy is now larger than 2S: each gas has expanded into twice its original volume, so the total goes up by the entropy of mixing (2Nk ln 2 for N molecules of each gas). The extra entropy came from the mixing - there are more ways to have a mixed gas than there are to have A on one side and B on the other. Now I do the experiment again, but with gas A on both sides. I remove the partition, wait a while, and the total entropy is still 2S - no change.
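
For anyone who wants the numbers behind the mixing example, here is a sketch (assuming, purely for illustration, one mole of ideal gas on each side at 300 K; it also prints the minimum unmixing work T·ΔS that comes up just below):

import math

R = 8.314      # gas constant, J/(mol*K)
n = 1.0        # moles of gas on EACH side (assumed just for illustration)
T = 300.0      # kelvin, the same on both sides

# Different gases (A | B): after the partition goes, each gas expands into
# twice its original volume, so each side gains n*R*ln(2) of entropy.
dS_mix = 2 * n * R * math.log(2)
print("entropy of mixing:      %.2f J/K" % dS_mix)        # about 11.5 J/K
print("minimum work to un-mix: %.0f J" % (T * dS_mix))    # T * dS, about 3460 J

# Same gas on both sides: nothing measurable changes when the partition goes,
# so the entropy change is zero and no work is needed to "restore" the state.
print("entropy change with the same gas on both sides: 0 J/K")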

IMPORTANT POINT - in both cases the temperature times the change in entropy is the minimum amount of work you need to return the gas to its original state. When the gas is the same, how much work do I need to do to restore the original macrostate? NONE - I just drop the partition in, I'm done. With the mixed gases, trust me, I have to work to separate them again, and TΔS is the least amount of work I need to do.

Here is THE MOST IMPORTANT STATEMENT again - If for some reason I couldn't tell that I had two different gases, I would say the entropy was 2S after the partition was removed. When asked to restore the original state of the gas, I would drop the partition, do no work, and there would be no problem, both sides would be the same to me - no work needed. In other words, by ignoring the difference between the two gases I have redefined the macrostate, and thermodynamics offers no objection, no internal inconsistency, no problem. PAR 03:58, 15 November 2005 (UTC)


Great points. I'll have to read this again a couple of times to make sure I'm clear on everything, as my actual "understanding" of entropy is minimal at best. Here's the part I'm confused about right off the bat...are you saying that if, to your knowledge, you have two different gases separated in containers, and then you let them mix, the entropy goes up by amount x; but if, to your knowledge, you have two separate containers full of the same gases, and they mix, the entropy goes up by amount y? Is the entropy of the system dependent on whether you're observing it accurately or not?

That's about the best question I can imagine on this problem. If I were to answer it carefully, I would say no, what I am saying is that there will be neither an experimental nor a theoretical problem until you become able to detect the difference between the gases. Any problem that you do run into will, by definition, imply that a method of detecting the difference has been found. If I were less careful, I would say that entropy and the thermo equations involving it are a statistical technique of organizing our knowledge rather than being a completely objective "out there" kind of thing. I believe the quantum wave function also qualifies for the above statement. But if you tell anyone I said that, I'll deny it. Check out Jaynes, E.T. (1996). "The Gibbs Paradox" (PDF). Retrieved November 8, 2005.

How does your example relate to points 2 and 3 in the paper I linked to? Are those examples at odds with yours, or do they support one another, and why?

No. 2 is just like the "different gas" scenario, except the other gas is a vacuum. The point he made here was that a disorderly mob is not more disorderly when it occupies a larger volume, therefore "disorder" should not be applied to entropy. I could make the same case for any technical term that has colloquial usage. No. 3 is just like the "same gas" example, no difference.

I'm certainly not trying to question the validity of the concept of entropy, I'm trying to figure out what language will best describe it. I did a quick search on yahoo and google about the definition of disorder, and none of them talk about entropy...and then a quick search on entropy, and most of them talk about disorder without any description of what that means.

I noticed that after the first bullet point in your post, the words "order/organized" and "disorder/disorganized" weren't used again. What do these words contribute to the definition and description of entropy that other words can't do more clearly? Pkeck 04:58, 15 November 2005 (UTC)

The point I was trying to make was that if you define all these concepts, then you can define disorder to be a measure of how many microstates make up a macrostate. After reading the article you mentioned, I see that that is kind of forcing the situation. So I want to back off and just say that the concept of disorder has some utility in describing entropy, but it can be carried too far. I mean, it's intuitively true that, using the colloquial definition of disorder, there are many more ways to have a disordered bedroom than an ordered one. If you start picking apart the analogy, it will chip and break. At that point, the real deal should be introduced. PAR 06:11, 15 November 2005 (UTC)
Also, I agree that the term "work" shouldn't be abandoned. When I say, "The work a system is able to do," most people would understand that I don't mean, "The ability for that system to clock in and earn a paycheck."
Yes - that's what I mean.
However, this is exactly the gross misconception that people have when they hear "disorder." They don't think, "occupation of a greater number of microstates," they think, "Socks flinging themselves out of drawers and books jumping off of shelves." The misconception is much more severe with "disorder"; it really seems to be in a class all its own. Pkeck 05:11, 15 November 2005 (UTC)
How about "that concept works" yet its working produced no energy or "what works for me may not work for you" which denies the conservation of energy. No, I think the word "work" is extremely misleading :) PAR 06:11, 15 November 2005 (UTC)[reply]

Holy mackerel, I think I'm gonna have to read the article you sent about ten times. This is probably going to be super annoying since it's annoying when people skim through stuff....but I skimmed through the paper and searched for the words "order" and "disorder," and they didn't need to use those terms anywhere to get the point across. In fact, the only mention they make is to say that these terms still clutter up textbooks.

I'll agree with you that we could clearly define disorder and say that, "Disorder, by definition, is a measure of how many microstates make up a macrostate." The thing I want to get away from is saying that "Disorder is a measure of how many microstates make up a macrostate; in other words, a dirty room has high entropy." I would say that there are just as many ways to order a room as there are to disorder it...for every single example of a disordered room, there is another example of an ordered one. The analogy seems more detrimental than helpful; and the language we use in all of these analogies needs work as well (I use the term "work" loosely here). There has to be another analogy we could use that's just as approachable and understandable but much less misleading. Like something about people occupying open seats in a theater, and spreading out more if new seats become available, or something? Any thoughts? 67.108.70.18 19:45, 15 November 2005 (UTC)

ok, I agree, the term "disorder" should not be used as if it had a precise technical definition. My point is that we should not avoid using it in an informal way to introduce the concept of entropy. Eventually its limitations will be evident, as with any other informal analogy. For the bedroom example, I disagree. For every object in the ordered bedroom there is one or a few places for it. For every place where it is ordered, there are a huge number of places where it could be in a disordered state. There are a few ways of hanging a shirt in a closet, but many ways it can be thrown randomly on the floor.
I have removed the first "way in which it is misleading" because that part was definitely wrong: no mention is ever made of how the bedroom was messed up, so rejecting the analogy for a point that it never made is wrong. PAR 22:38, 15 November 2005 (UTC)

Analogies for entropy

I agree that my analysis of the bedroom analogy is flawed. I think we should abandon it altogether; it's just a mess. There has to be a better analogy that doesn't involve this debate of how many ordered states there are versus how many disordered states there are. We should pick something that deals with the number of available states and the occupancy of these states. I think that something like people occupying seats in a theater, or something like that, is much more clear. Then we could say that there are a limited number of seats in this theater. Everyone wants a seat, so each spot is slowly filled up. If we continue to add more people after all of the seats are taken, the pressure in the theater goes up...and if we suddenly add a whole new section of seats, the people will try and relax and occupy those new microst---er, seats.

Or we could use a bench, and say that it has a barrier at one end, and we keep cramming more and more people on the bench. Then if we remove the barrier and give people more room, they will scoot down to occupy the new space.

Or maybe a better analogy would be urinals in a men's restroom; those tend to be occupied in the most diffuse way...and if any more urinals were to suddenly appear, you can bet that everyone in the room would spread out to occupy a wider range of spots. :)

In the interest of clearly pointing out the weaknesses of these analogies, perhaps we can include a section on what's wrong with them. We should also include something on common misconceptions about entropy and the term disorder. I think we should include the whole "spontaneously getting messy thing" for sure. And maybe we could include something about the idea of a "closed system" and how the earth is not a closed system...I still see that second one pop up in discussions of evolution. What do you think?

In any event, we should use an easy analogy that's mathless and as devoid of confusing aspects as possible. Pkeck 02:16, 16 November 2005 (UTC)

Ok - here's an idea - it may be a bit too complicated, but maybe not. You take a deck of cards and order them, 2345..A of clubs, then hearts, then diamonds, then spades. That's zero entropy, completely ordered, BY DEFINITION. The microstates are the exact order that the cards are in. There are 52 factorial = 52! ≈ 8×10^67 different microstates. (If we had 3 cards there would be 3! = 6 and they would be ABC, ACB, BAC, BCA, CAB, and CBA.) Now I take two cards at random and have them change places. Now I have a "one-flip" microstate; it's one exchange away from the standard state. There are many ways to create a one-flip microstate, 1326 ways to be exact. The set of one-flip microstates DEFINES the "one-flip" MACROstate, and its entropy is log(1326) = 7.19. Now we go to 2-flip MACROstates. There are more of them - I don't know the number, but let's say it's 10,000. The entropy of the 2-flip MACROstate is log(10000) = 9.21. It's getting bigger. The maximum number of flips you can have is 51 before you unavoidably start repeating yourself (any ordering of 52 cards can be reached in at most 51 exchanges). I also think that the number of 51-flip MACROstates is relatively small. Just look at the case for three cards. ABC is 0-flip, BAC, CBA, ACB are 1-flip, and BCA and CAB are 2-flip, and that's it. The numbers of microstates are 1-3-2; it goes up, then down. We could plot the entropy versus the number of flips for the deck of cards and the plot would rise to a big peak and then fall off. Maximum entropy. If you take the deck of cards and shuffle it thoroughly, it's a good bet that it's going to wind up at an entropy pretty near that maximum. Check out http://www.av8n.com/physics/thermo-laws.pdf for more on this idea. PAR 03:03, 16 November 2005 (UTC)
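
The flip-counting idea can be checked by brute force for a small deck (52! is out of reach, so this sketch uses 5 cards); it relies on the fact that the minimum number of pairwise exchanges needed to reach an ordering equals the number of cards minus the number of cycles in the permutation:

import math
from itertools import permutations

N = 5   # a small "deck" so all N! orderings can be enumerated

def min_flips(perm):
    # Minimum number of pairwise exchanges turning (0, 1, ..., N-1) into perm;
    # it equals N minus the number of cycles in the permutation.
    seen, cycles = set(), 0
    for start in range(N):
        if start in seen:
            continue
        cycles += 1
        i = start
        while i not in seen:
            seen.add(i)
            i = perm[i]
    return N - cycles

counts = {}
for p in permutations(range(N)):
    f = min_flips(p)
    counts[f] = counts.get(f, 0) + 1

for flips in sorted(counts):
    print("%d-flip macrostate: %3d microstates, entropy ln(Omega) = %.2f"
          % (flips, counts[flips], math.log(counts[flips])))
# For 5 cards the counts come out 1, 10, 35, 50, 24: the number of microstates
# (and so the entropy) climbs to a peak and then drops at the largest flip
# count, just as the 1-3-2 pattern does for three cards.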
If you had to define entropy in the absolute least number of words, using words that didn't require any (or at least very very few) qualifications, how would you do it? I'm a fan of what you wrote before, "more microstates, more entropy," so maybe that could be turned into,
  1. "Entropy is a measure of the number of microstates; more microstates = more entropy, and vice versa."
  2. I also like the definition on one of the pages we talked about, "Entropy is the index of the dispersal of energy within a system and between the system and its surroundings."
  3. This is another good one, "Entropy measures the dispersal of energy among molecules in microstates. An entropy increase in a system involves energy dispersal among more microstates in the system's final state than in its initial state." Pkeck 05:08, 16 November 2005 (UTC)

I don't like the last two. Take the case of two gases A and B, one on one side, one on the other. Both at the same temperature and pressure, etc. If you remove the partition, the entropy increases, but there has been no redistribution of energy! Entropy doesn't measure the amount of dispersal of energy; it measures how many ways it can be dispersed. When you remove the partition, and the gases mix, there's no redistribution of energy, but there is an increase in the number of ways it could have that energy. There is an increase in the number of microstates which have that energy.

Only for reversible processes are the last two statements true. The above example is definitely irreversible. I think the playing card analogy is really good, it goes a really long way before it breaks down.

As far as the definition of entropy, I don't know. In my mind there are two definitions, the statistical (k times the log of the number of microstates) and the thermodynamic (the integral of the heat transferred divided by the temperature). I think both definitions should be given, then the link made. If I had to define entropy with the fewest words, yes, it would be the log of the number of microstates comprising a macrostate. But the implications of the words microstate and macrostate are HUGE. PAR 05:32, 16 November 2005 (UTC)
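
A small sketch of the thermodynamic definition (the integral of dQ/T) in action, with made-up numbers - reversibly heating a kilogram of water from 20 °C to 80 °C:

import math

# The thermodynamic definition in action: reversibly heating water, dQ = m*c*dT,
# so dS = dQ/T integrates to m*c*ln(T2/T1).  (Numbers made up for illustration.)
m = 1.0         # kg of water
c = 4186.0      # J/(kg*K), approximate specific heat of liquid water
T1, T2 = 293.15, 353.15   # heat from 20 C to 80 C

dS_closed_form = m * c * math.log(T2 / T1)

# The same thing as a crude numerical integral of dQ/T over many small steps:
steps = 100000
dT = (T2 - T1) / steps
dS_numeric = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print("Delta S, closed form: %.2f J/K" % dS_closed_form)
print("Delta S, integrated:  %.2f J/K" % dS_numeric)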

Are you saying that entropy is a measure of how many microstates are available, and not a measure of whether or not those microstates are occupied?
Yes, absolutely. At any given moment, they are all unoccupied except one.
If so, then for statistical entropy, how about, "Entropy is an index of the number of available microstates in a macrostate. An increase in entropy corresponds to an increase in the number of available microstates, and a decrease in entropy corresponds to a decrease in the number of microstates."
sounds good
If you think we should include something about those microstates being occupied, how about, "Entropy is an index of the number of available microstates and how evenly they are occupied. If there are many microstates and they are evenly occupied, that corresponds to high entropy." I like this second one more, because if we had a large container of a gas, and all of the molecules of gas were piled together in the middle, then there are many microstates, but few of them are occupied...so I think we should include something about the microstates as well as the dispersal of energy among those microstates.
I think you are not thinking of microstates right. A microstate is a description of how a bunch of energy levels are populated. If I throw 10 balls into 3 boxes, that's a microstate. If I take them out again, and throw them in again, that's another microstate. Only one microstate is occupied at a time.
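
A tiny sketch of the balls-in-boxes picture (numbers picked arbitrarily): one throw produces one microstate, and the occupancy counts are the macrostate whose entropy counts how many throws would have looked the same:

import math
import random

balls, boxes = 10, 3

# One "throw" = one microstate: a complete record of which box each ball landed in.
microstate = [random.randrange(boxes) for _ in range(balls)]

# The macrostate is just the occupancy of each box.
occupancy = [microstate.count(b) for b in range(boxes)]

# Number of different throws that would produce this same occupancy
# (the multinomial coefficient balls! / (n1! * n2! * n3!)):
omega = math.factorial(balls)
for count in occupancy:
    omega //= math.factorial(count)

print("microstate (box of each ball):   ", microstate)
print("macrostate (occupancies):        ", occupancy)
print("microstates with this macrostate:", omega, " ln(Omega) = %.2f" % math.log(omega))
# Only this one microstate is actually realized at the moment; the entropy
# counts how many throws would have looked the same at the macrostate level.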
And for the thermodynamic, how about, "Entropy is a measure of the amount of energy that flows into a system per unit temperature." Pkeck 18:33, 16 November 2005 (UTC)
Again, not true except for reversible processes. I'll write this again (with a minor modification :) PLEASE READ IT, OK?
Take the case of two gases A and B, one on one side, one on the other. Both at the same temperature and pressure, etc. If you remove the partition, the entropy increases, but there has been no redistribution of energy! Entropy doesn't measure the amount of dispersal of energy; it measures how many ways it can be dispersed. When you remove the partition, and the gases mix, there's no redistribution of energy, but there is an increase in the number of ways it could have that energy. There is an increase in the number of microstates which would have that energy, if they were the microstate the system was in.
No reason to get upset, I've read everything you've written. I was simply restating what you said. You said:
..."the thermodynamic (the integral of the heat transferred divided by the temperature). "
I said:
..."the thermodynamic, how about, "Entropy is a measure of the amount of energy that flows into a system per unit temperature.""
How should that be rephrased? Should it include a qualifier for the difference between irreversible and reversible situations? Pkeck 04:22, 17 November 2005 (UTC)

I'm not upset, I just drank too much coffee. Anyway, if I said what you said I said without qualifying it by saying "for reversible processes", then I'm at fault. We should always try to qualify it that way. The statistical definition "more microstates, more entropy" is pretty much infallible as long as the system is large, like more than a few hundred particles. The thermodynamic definition - it's clear to me for reversible processes, and it's clear to me for that one irreversible mixing problem, but I will have to think about a general thermodynamic definition.

Oh, and the balls into boxes thing...that's a perfect analogy. Simple, no math, everyone can relate. That should definitely go into the article. More boxes, more entropy...correct? Pkeck 04:28, 17 November 2005 (UTC)

Yes, for the same number of particles - but there are more ways to put ten balls in two boxes than there are to put one ball in ten boxes. Each "way" is a microstate. Ten balls in two boxes has more entropy than one ball in ten boxes. PAR 04:58, 17 November 2005 (UTC)
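
Counting the two cases in the comment above directly (treating the balls as distinguishable, which is how the comparison reads):

import math

# 10 distinguishable balls, 2 boxes: each ball independently picks a box.
ways_ten_balls_two_boxes = 2 ** 10    # 1024 microstates

# 1 ball, 10 boxes: the single ball picks one of the ten boxes.
ways_one_ball_ten_boxes = 10 ** 1     # 10 microstates

print("10 balls in 2 boxes:", ways_ten_balls_two_boxes,
      " ln =", round(math.log(ways_ten_balls_two_boxes), 2))
print("1 ball in 10 boxes: ", ways_one_ball_ten_boxes,
      " ln =", round(math.log(ways_one_ball_ten_boxes), 2))
# Far more microstates in the ten-ball case, hence more entropy, as stated above.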

May I join your discussion? To me entropy is incomprehensible without reference to energy. Many microstates have the same energy U. So, the total set of microstates is classified according to their energy. Some classes are big, some are small. Adding a little energy dU to a system multiplies the number of accessible microstates by the factor e^dS, where dS is the increase of entropy. This is the definition. The temperature is dU/dS.

In the previous example of shuffling cards, take for simplicity 5 cards: 12345. They can be shuffled in 120 ways (or permutations): 12345, 12354, 12435, 12543, ... , 54321. The 'energy' of 12345 is U = 1·1+2·2+3·3+4·4+5·5 = 1+4+9+16+25 = 55. The 'energy' of 54321 is U = 5·1+4·2+3·3+2·4+1·5 = 5+8+9+8+5 = 35. U=35 is the minimum energy, so 54321 is the ground state. No other state has this energy U=35. The entropy of the ground state is S=log(1)=0. The state 54312 has energy U = 5·1+4·2+3·3+1·4+2·5 = 5+8+9+4+10 = 36. This is the energy of the first excited state. There are 4 states having this energy U=36, namely 45321, 53421, 54231, 54312. The energy increment is dU=36-35=1 and the entropy increment is dS=log(4)-log(1)=log(4). The temperature is dU/dS=1/log(4). The energy U=45 is populated with the maximum number of states. The entropy is maximum and the temperature is infinite. If you shuffle five cards and compute the energy of the permutation then the result is very likely to be close to 45. Energies above 45 give decreasing entropies and negative temperatures. When you have computed energies for the 120 states, and entropies and temperatures for the energies from 35 to 55, then you know what entropy is. Bo Jacoby 12:53, 17 November 2005 (UTC)
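
Bo Jacoby's five-card example can be run directly; here is a brute-force sketch using his 'energy' U = Σ position·card and natural logarithms (k = 1):

import math
from itertools import permutations

cards = (1, 2, 3, 4, 5)

def energy(order):
    # "Energy" as defined above: sum of (position) * (card value), positions 1..5.
    return sum((pos + 1) * card for pos, card in enumerate(order))

# Count how many of the 120 orderings land at each energy.
degeneracy = {}
for order in permutations(cards):
    U = energy(order)
    degeneracy[U] = degeneracy.get(U, 0) + 1

for U in sorted(degeneracy):
    S = math.log(degeneracy[U])   # entropy of this energy class, with k = 1
    print("U = %2d: %2d states, S = %.2f" % (U, degeneracy[U], S))
# U = 35 has a single state (S = 0), U = 36 has 4 states, the counts peak
# around U = 45, and above 45 the entropy falls again - which is where the
# talk of dS/dU turning negative comes from.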

I agree that in the terminology we have been using, a particular macrostate is one with a particular total energy, and there are many microstates which have that total energy. But dU is not equal to TdS; it is smaller than or equal to TdS, with equality holding only for the equilibrium or maximum entropy state. That's why the calculation of temperature in the above example is not right. You cannot have negative temperature. Temperature is only defined for equilibrium states, and for two equilibrium states, if dU is positive, then dS will certainly be positive, and so T will be positive. I can have two different gases at the same temperature and pressure, separated by a wall. When I remove the wall and they mix, the entropy increases, yet the internal energy remains exactly the same. The temperature is certainly not dU/dS = zero. PAR 15:26, 17 November 2005 (UTC)
I remember reading somewhere that in some systems (like lasers), you can isolate certain modes of excitation and in doing so, you can set up a system where, when you add energy, you get a decrease in entropy, which corresponds to a negative temperature. I'm apprehensive about even bringing it up, though, since it seems like a very specific case and we're still trying to iron out a good solid general definition of entropy. Pkeck 17:00, 17 November 2005 (UTC)
If you could find that reference, that would be good. I mean, the problem we are having right now is that the very specific case of a system in equilibrium is causing a mix-up in the idea of entropy, with ideas that dS=dU/T is a definition of entropy. Let's see what it says. PAR 20:02, 17 November 2005 (UTC)
Here's wikipedia's article: http://en.wikipedia.org/wiki/Negative_temperature . And if you google "negative temperature," you get other references.
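
For what it's worth, the standard toy model behind the negative-temperature pages just linked is a collection of two-level systems; here is a sketch with invented numbers (N = 100 spins, energy counted as the number of excited spins):

import math

N = 100   # two-level systems (spins, say): each is either in its ground state
          # or its excited state, and the total energy is the number excited.

def S(n_excited):
    # Omega(n) = C(N, n) ways to pick which spins are excited; S = ln(Omega), k = 1.
    return math.log(math.comb(N, n_excited))

for n in (10, 49, 51, 90):
    # Crude numerical temperature from 1/T = dS/dU, using a centered difference.
    dS_dU = (S(n + 1) - S(n - 1)) / 2.0
    T = 1.0 / dS_dU
    print("n = %2d excited: S = %6.2f, approximate T = %8.2f" % (n, S(n), T))
# Below half-filling, adding energy increases S and T is positive; above
# half-filling, adding energy decreases S and the same formula gives a
# negative T - the situation described in the laser comment above.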
Going back to analogies for entropy, maybe we could say something about a box that's full of ping pong balls. When you shake the box, the ping pong balls bounce around and want to fill up the entire volume of the container. Shaking would be the equivalent of heating. We could include a bit about shaking a box with a dividing wall, and bringing together two boxes and connecting them. It goes well with this definition of entropy (which I think is great), "More precisely, the entropy is a measure of the number of ways that a system might be arranged microscopically and yet give the same macroscopic appearance." Pkeck 20:31, 23 November 2005 (UTC)