Talk:Gibbs paradox
This is the talk page for discussing improvements to the Gibbs paradox article. This is not a forum for general discussion of the article's subject.
This article is rated C-class on Wikipedia's content assessment scale.
Suggestions
[edit]The phrase "Gibbs Paradox" has a specific historical meaning that deserves a short, standalone reference article based on a standard reference source. For an online version of such a reference, see this [1] from MIT's OpenCourseWare lectures on basic thermodynamics and statistical mechanics. While I appreciate LinShuKun's ideas and interest in this topic, I believe that his ideas and the ideas of Jaynes belong in a separate article; perhaps "Current Topics in Mixing Entropy" would be an appropriate title. Mixing these not-yet-widely-used ideas with such a well-established concept violates the principles of neutral point of view and no original research. This is a great example of why these principles exist: as the article currently stands, a non-expert reader cannot distinguish which ideas are the historical topic of Gibbs Paradox and which ideas are new. The new ideas deserve a separate article. jrf 21:18, 15 April 2007 (UTC)
- Agree with you: "Current Topics in Mixing Entropy" can be set up. Linshukun 09:33, 28 October 2007 (UTC) Trolou: moved this important "suggestion" paragraph to the top and updated the link. 05:27 UTC, 9 Feb 2010
I added some references that show the apparently growing acceptance (since the 1960s) that the paradox need not be solved through QM indistinguishability. One who knows these subjects (or me soon, hopefully) can add to the current "Calculating the Gibbs paradox" section (and brush it up, hopefully!). Also, I definitely didn't give the "mixing paradox" section the time it deserves. Thanks to you contributors. Buena suerte. (talk) 18:07, 11 February 2010 (UTC)) Sorry for not getting back sooner. Wish I could do more: focus less on Jaynes, add more of the good references, and revise the organization. From my research, I think the non-quantum explanation is now obvious, but it's seemingly not widely known to the community, so both are presented. (Trolou (talk) 19:07, 9 October 2010 (UTC))
- I believe that the 'Gibbs Paradox' article SHOULD NOT be merged with the 'Mixing Paradox' article as Gibbs Paradox can be used outside of a scientific context. To merge the two articles suggests that it must be confined to a scientific scenario.
- I am happy with both articles; however, I would suggest that the 'Gibbs Paradox' article be placed in a more understandable context.
- HarryPellegrini 13:01, 7 May 2007 (UTC)
Category - Quantum mechanics
[edit]Gibbs paradox is resolved through the introduction of Planck's constant, or at least the quantization of phase space, resulting in the Sackur-Tetrode equation. PAR 11:46, 13 May 2005 (UTC)
- If the question is whether the article should be also in the category "qm", then I completely agree with you. Please go ahead :-) -- mkrohn 11:49, 13 May 2005 (UTC)
Dimensions
[edit]The number of states, of which one takes the logarithm to get the entropy, has dimension (mass × length / time)^−1, since the dimension of the constrained phase space is length^3N × momentum^(3N−1) and not length^3N × momentum^3N. Shouldn't this be made dimensionless somehow? --V79 22:17, 15 October 2005 (UTC)
- Good point. Talking about the "area" of the cylinder wall is bad form; the wall has zero volume, in contradiction to the idea that you can't specify a volume smaller than h^3N. I fixed it in a hand-waving kind of way, but maybe it needs to be more fully explained. PAR 00:24, 16 October 2005 (UTC)
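For what it's worth, both complaints in this section are fixed at once by the Sackur-Tetrode equation: dividing phase-space volume by h^3N makes the logarithm's argument dimensionless, and the 1/N! factor makes the result extensive. A minimal Python sketch (the helium-like mass and roomish conditions are my own illustrative numbers, not from the article):

```python
import math

# Sackur-Tetrode entropy (returned as S/k) for a monatomic ideal gas.
# The h**2 in the log argument is what makes it dimensionless; the
# Stirling-approximated 1/N! is what makes S extensive.
def sackur_tetrode(N, V, U, m, h=6.62607015e-34):
    """Return S/k for N particles of mass m in volume V with energy U."""
    return N * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# Illustrative helium-like gas near room conditions.
m = 6.6e-27                        # particle mass, kg
N = 6.022e23
V = 0.0224                         # m^3
U = 1.5 * N * 1.380649e-23 * 298   # U = (3/2) N k T, J

S1 = sackur_tetrode(N, V, U, m)
S2 = sackur_tetrode(2 * N, 2 * V, 2 * U, m)  # double the whole system
print(S2 / S1)  # extensivity: doubling N, V, U doubles S, so the ratio is 2.0
```

Without the N! (i.e. dropping the 1/N inside the log and the +2.5 bookkeeping), the same doubling gives a ratio larger than 2, which is exactly the Gibbs paradox in statistical-mechanical form.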
No paradox, and no need to introduce indistinguishability of particles
[edit]Actually the "resolution" of the Gibbs "paradox" as described here, even though standard, is not correct. If you look at the thermodynamical definition of the entropy you will see that 1) it is defined up to an arbitrary function of the number of particles, and 2) there are no a priori reasons to assume it should be extensive (and it is actually not extensive in systems with long-range forces, e.g. gravitational systems). I don't want to go into too much detail here, since all of this is perfectly explained in a beautiful paper by Jaynes, available at this address:
http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf
Please have a look. One should at least mention this and cite this paper. There is no paradox, and no need to introduce indistinguishability of particles (for this particular problem).
- Well, I started out thinking this was another crackpot paper written by someone who went off their medication, but it's not. This is an EXCELLENT article. I always had a sneaking suspicion that I didn't really "get" entropy, and now I feel better, because I realize that nobody does, except maybe Gibbs and Pauli. I also thought this might be a dense, impenetrable article, but the whole point of the article is that this guy Jaynes is trying to make sense of some of the impenetrable writings of Gibbs. That means he is very dedicated to clarity, and for anyone who took college thermodynamics and statistical mechanics, and more or less got it, this is a pretty clear article. Which is not to say I understand it on first reading; it's going to take a number of iterations. Thanks for that reference, and we should be able to incorporate these ideas into a number of wikipedia articles, since they are not so much "revolutionary" ideas, but rather a realization that certain (published!) insights of giants like Gibbs and Pauli have not made their way to the mainstream. PAR 18:57, 2 November 2005 (UTC)
- I am happy that you like it. Actually, I also recommend most of the other papers he has written, many of which deal with entropy (including his celebrated 1957 Phys. Rev. paper, in which he provides a foundation for statistical mechanics based on information theory - much, much better than the ergodic approach, in my opinion; see also his more recent contribution to this topic). All his papers can be found at the same place.
- All these papers are written in the same very clear way, and many professional statistical physicists (and probabilists) would profit from reading them. --YVelenik 08:51, 3 November 2005 (UTC)
- Well, I think it is really a pity that these clearly wrong claims (that 1) there is a paradox, and 2) it is resolved by postulating indistinguishability of particles) not only remain here more than 5 months after my first comment, but that the article actually is getting worse in this respect. I don't want to spend time giving convincing arguments for my claim, since this is done, and very well, in the paper I cited above... I would like to think that people interested enough in these topics would be curious enough to read it (or at least browse through it)!
--YVelenik 16:47, 21 April 2006 (UTC)
- I have not looked at the article for a while, but now that I do, I agree. Let's fix it. I do, however, think we need to respect the conventional paradox, and explain its origin, before introducing Jaynes's analysis. Note that I entered Jaynes's explanation of Gibbs's explanation of the mixing paradox in that page. PAR 17:13, 21 April 2006 (UTC)
- My opinion: Jaynes seems to be a respected reference on statistical mechanics. I just read Jaynes article once entirely, and it seems fantastic. However, I don't know that it is yet accepted (c.f. MIT OpenCourseWare link above), and should probably not be presented as so. Trolou 05:27 UTC, 9 Feb 2010
- I don't know whether I get entropy, or ever did. HOWEVER, you see there is a catch. Entropy cannot really be subjective. Suppose the particles were different, but we could not tell. Then (unless you subscribe to subjectivism and Her Grace fooling us) the difference in mixing entropy would STILL show up in the fact that we could use it as a dumping ground for entropy from elsewhere, i.e. decrease S from the environment. There are two objections I see: a) this would be an experiment "to tell", and b) it wouldn't be easy to do, since the mixing tends to generate all its additional entropy itself. --88.74.147.213 (talk) 20:12, 28 December 2014 (UTC)
Disputed
[edit]The density of states is wrong by a small, but significant factor. Anyone have an authoritative reference?
- I think "Statistical Mechanics" by Huang contains the derivation, and I will include it as a reference, but I have to make sure it agrees. How would you write the density of states? PAR 01:54, 19 January 2006 (UTC)
- The correct answer is, I think,
g(E) = [V^N / (N! h^(dN))] · (2πm)^(dN/2) · E^(dN/2 − 1) / Γ(dN/2),
where d is the spatial dimension.
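For what it's worth, the standard ideal-gas density of states g(E) ∝ E^(dN/2 − 1) (the result derived in Huang) can be sanity-checked numerically. A small Python sketch, with V = m = h = 1 as my own toy normalization, comparing the analytic g(E) against a numerical derivative of the phase-space volume Φ(E):

```python
import math

# log of the phase-space volume Phi(E) for N ideal-gas particles in d
# dimensions, with V = m = h = 1 so only the E-dependence matters:
#   Phi(E) = (2 pi m E)^(dN/2) / (Gamma(dN/2 + 1) N!)
def log_phi(E, N, d):
    s = d * N / 2
    return s * math.log(2 * math.pi * E) - math.lgamma(s + 1) - math.lgamma(N + 1)

# log of the density of states g(E) = dPhi/dE:
#   g(E) = (2 pi m)^(dN/2) E^(dN/2 - 1) / (Gamma(dN/2) N!)
def log_g(E, N, d):
    s = d * N / 2
    return s * math.log(2 * math.pi) + (s - 1) * math.log(E) - math.lgamma(s) - math.lgamma(N + 1)

# Central finite difference confirms g(E) really is the derivative of Phi(E).
N, d, E, step = 5, 3, 2.0, 1e-6
numeric = (math.exp(log_phi(E + step, N, d)) - math.exp(log_phi(E - step, N, d))) / (2 * step)
analytic = math.exp(log_g(E, N, d))
print(abs(numeric / analytic - 1.0) < 1e-5)  # True
```

Working in logs (via `math.lgamma`) keeps the check usable even for particle numbers where Γ(dN/2) would overflow a float.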
too technical tag
[edit]i removed the tag. since there's a "context" tag on the article itself, it seems redundant. if you want, go ahead and reinsert the tag, but please leave some specific suggestions about what you think the article needs. thanks. Lunch 04:43, 24 September 2006 (UTC)
RFC: Review of new material requested
[edit]There has been an enormous amount of material inserted into this article, primarily by one author, which includes reference to the author's own work. I would like people skilled in the field to review and verify that everything is OK.Kww 01:59, 2 November 2007 (UTC)
- This does not look OK to me. It begins with messy definitions and proceeds to voodoo similar to voodoo that's often used to equate information theory with real particles. Nearly every sane physicist, engineer, thermodynamicist, and child is perfectly comfortable with discontinuities in the physical world when undergoing the mental transition from microscopic to macroscopic thinking. Here's ice. Here's liquid water. It takes an information scientist to imagine a nice linear range of values between the two that can be plotted on the x-axis of a chart to make it seem like something paradoxical is taking place.
- Ben-Naim's paper looks OK to me. It says "for ideal gases, the mixing, in itself, does not play any role in determining the value of the so-called entropy of mixing. Once we recognize that it is the expansion, not the mixing, which causes a change in the entropy, then the puzzling fact that the change in entropy is independent of the kind of molecules evaporates." In other words, with a barrier being removed in gas A, you begin and end with two volumes of A. With a barrier removed between two separated gases, you begin with one volume of A and one volume of B and you end up with two volumes of diluted A and two volumes of diluted B. The word "expand" or "expansion" doesn't appear in the present version of the article anywhere, and the word "volume" doesn't appear in the classical thermo section. Instead, it says, "The fact that the different gases have mixed does not yield a detectable change in the state of the gas." Wrong, wrong, wrong. It does yield a detectable change (increase in volume, decrease in concentration) in the state of two gases, so it yields a detectable change in the state of the gas as a whole. Ben-Naim goes on to write: "The fact that entropy changes discontinuously as one changes the indistinguishability continuously, is viewed as a paradox. However, there is no paradox here, and there was no allusion to any paradox in Gibbs writings. There are many examples that a discontinuous transition follows a continuous change of a parameter. For instance the fact that the density of water changes discontinuously when the temperature changes continuously, say between 90C to 110C is not viewed as a paradox. Furthermore, the presumed continuous change in the extent of indistinguishability of the particles is now recognized as, in principle, invalid. Particles are either distinguishable or indistinguishable-there are no intermediate values of indistinguishability." Flying Jazz (talk) 20:59, 22 November 2007 (UTC)
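Ben-Naim's expansion picture quoted above is easy to sketch numerically. A minimal Python illustration (the one-mole-per-chamber, equal-volume setup is my own choice for concreteness):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def expansion_entropy(n, V_initial, V_final):
    """Entropy change of n moles of ideal gas expanding isothermally."""
    return n * R * math.log(V_final / V_initial)

# Two chambers of volume V each, barrier removed.
V = 1.0

# Case 1: different gases A and B, 1 mol each. The "entropy of mixing" is
# just each gas expanding from V into 2V -- independent of what A and B are.
dS_mixing = expansion_entropy(1, V, 2 * V) + expansion_entropy(1, V, 2 * V)
print(round(dS_mixing, 3))  # 2 R ln 2 = 11.526 J/K

# Case 2: the same gas on both sides. Thermodynamically nothing expands:
# the final macrostate (2 mol in 2V at the same T, p) is unchanged, so
# dS = 0 -- NOT expansion_entropy(2, V, 2*V). Treating case 2 like case 1
# is exactly the bookkeeping error behind the "paradox".
dS_same = 0.0
print(dS_same)
```

The point of the sketch is Ben-Naim's: the entropy change is entirely an expansion effect, which is why it is independent of the kinds of molecules.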
- Ignoring the first paragraph, I agree with everything written above by User:Flying Jazz. In addition, I also object to the statement by Lin:
Unfortunately it is well-known that none of the typical mixing processes have a detectable amount of heat and work transferred, even though a large amount of heat, up to the value calculated as TΔS (where T is temperature and S is thermodynamic entropy), should have been measured and a large amount of work up to the amount calculated as ΔG (where G is the Gibbs free energy) should have been observed.[9]
- Lin completely ignores the fact that the entropy of mixing is offset by a change in the chemical potential(s) of the gas(es), yielding no change in the total energy. There is no PV work that "should have been observed". The reference [9] is not a reference at all but a "do not try this at home" warning. Lin has published papers on this subject in what seem to be respectable journals (e.g. Wiley Interscience), which is why I hesitate to declare this stuff OR. It seems to me that these papers should be read before dumping these edits, which I have not yet done. I'm really puzzled as to how this stuff got into a peer reviewed journal.
- Articles by Lin:
- Gibbs paradox of entropy of mixing: Experimental facts, its rejection, and the theoretical consequences Electronic Journal of Theoretical Chemistry ,1996,1, 135-150
- Understanding structural stability and process spontaneity based on the rejection of the Gibbs paradox of entropy of mixing. Journal of Molecular Structure (Theorochem), 1997, 398, 145-153
- Correlation of Entropy with Similarity and Symmetry Journal of Chemical Information and Computer Sciences, 1996, 36, 367-376
- My first paragraph above may have been too strong. Ben-Naim applies information theory to thermodynamics, so I shouldn't paint the entire discipline with such a broad brush. I agree with what PAR wrote. Simply mixing two ideal gases does not involve heat transfer and does not involve mechanical work. Alberty's IUPAC paper calls this type of work "chemical:no reactions" in table 1 of section 1.3, so it could be described as work being entirely spent on an entropy increase if system boundaries are drawn internal to the apparatus. I think if the barrier were replaced with a semi-permeable membrane, it would act like a piston, so PV work could be obtained from the same starting conditions. Obtaining useful work from isothermal isobaric concentration differences of distinguishable but very similar molecules is a huge part of how living systems generate useful work. See Template:Membrane transport. It probably takes a larger, more complicated enzyme in most cases to distinguish between two similar molecules than it does to distinguish between two dissimilar molecules, so I could imagine charts with "similarity" on the x-axis and something on the y-axis like "enzyme kilodaltons", but it wouldn't be "entropy of mixing." I agree with PAR that this is not original research in Wikipedia because it has been published elsewhere, but I would call this a very POV article as it's currently written, and possibly fringe, because it lends undue weight to ideas that relate to other fields but would not be publishable in those fields. The best solution in my view would be to lend more relative weight in the article to people like Ben-Naim (hopefully there are many others like him) who make strong and sound criticisms of the concept and what Lin has done with it. Flying Jazz (talk) 18:01, 24 November 2007 (UTC)
WP:COI. Wikipedia editors must not use WP to advertise their own work, especially when it flies in the face of the general understanding of the topic.
Revert the lot. Jheald (talk) 17:10, 26 November 2007 (UTC)
- Amen to that. dbw (talk) 01:19, 23 September 2009 (UTC)
RFC: Can a physicist check this article?
[edit]This article makes no sense. Unfortunately, I have to say I think the contributions by Linshukun are entirely to blame. You can see Lin's very first edit here, a paragraph of raw text jammed at the top of what used to be a pretty concise article: http://en.wikipedia.org/w/index.php?title=Gibbs_paradox&diff=prev&oldid=109172335 As you can see, it consists of simply adding her opinion to the article. This has continued in progressively more wikified forms for 2 years. Finally the article is unrecognizable. All the math is gone, as is the encyclopedic tone. The article is not even logically consistent in its current state.
There is no statement of what Gibbs' paradox is. Perhaps it is the discontinuity in "figure A"... which Lin's "figure C" simply omits.
It is true that Lin appears to have written a paper on this topic. However, this violates the spirit if not the letter of WP:OR. 76.208.71.19 (talk) 03:40, 23 September 2009 (UTC)
This article might just need 2 years of reverts. I'm pretty sure Linshukun is not a physicist (though admittedly neither am I) and quite frankly, this is the sort of crazy stuff that used to get written and energetically defended by amateurs on physics usenet lists. dbw (talk) 01:56, 23 September 2009 (UTC)
For what it's worth, I've always thought this article stood a fair chance of being a hoax. The author has a fairly close relationship with Libb Thims, aka banned editor Sadi Carnot (see WP:Requests for arbitration/Sadi Carnot), as browsing these contributions shows.—Kww(talk) 03:56, 23 September 2009 (UTC)
- The article itself looks genuine enough. But Linshukun's extensive additions to it appear to be a joke. At best, they are original research and should be removed for that reason. Maproom (talk) 23:10, 5 October 2009 (UTC)
- Just for clarity, could somebody list all the disputed parts - or tag them as {{dubious}}? I realize this may not be easy due to the amount of material in question. --catslash (talk) 00:11, 6 October 2009 (UTC)
- For reasons I have stated above (RFC: Review of new material requested), I agree that Linshukun's contributions are completely wrong, but since they are published, I was reluctant to remove them. The contentious parts, in my view, are the inclusion of any material from the Linshukun references. I agree that the mathematics of the statistical mechanical "explanation" should be restored. The "Classical explanation" is fine. The "quantum explanation" of Von Neumann is not familiar to me, but if it is indeed an accurate description of an opinion of Von Neumann, it cannot be ignored. PAR (talk) 10:43, 6 October 2009 (UTC)
- Please note some of Linshukun's articles seem to be published through http://www.mdpi.org/ which he evidently co-founded and for which he served as editor-in-chief. http://www.mdpi.org/lin/modi601001.pdf Note the "see also" link is maintained by him, and led to some major confusion for me as the reader. Thanks. (Trolou (talk) 18:07, 11 February 2010 (UTC))
Reverting to last known good version from 18 Feb 2007
[edit]This is a bit drastic. Would it be possible to combine the 2007 version with some of the material from the latest version? If not, then the article needs some simple explanation for those (like myself) with little grasp of thermodynamics. If this is not possible then it needs at least the {{context}} tag restoring. --catslash (talk) 01:13, 1 November 2009 (UTC)
- I am not capable of accurately sorting the true tidbits from the false information that was added. I posted on WP:FTN that I had done this. Hopefully someone that can accurately restore portions of the later material will do so.—Kww(talk) 01:23, 1 November 2009 (UTC)
- I think that this is a rather poor move. Just getting rid of all pieces related to Linshukun's contributions would be much better, and not that harder to do! The 2007 version that is currently used is not good. It contains inaccuracies, and does not discuss the only "solution" I find correct, which is essentially due to Gibbs and explained in Jaynes' beautiful paper on this "paradox". Terrible move... —Preceding unsigned comment added by 77.56.122.123 (talk • contribs)
- If you have the skill to do so, feel free to proceed.—Kww(talk) 17:03, 1 November 2009 (UTC)
I worked on this article today. It looks better. The references can be added to the text more precisely and listed at the end in a better way. The section on von Neumann's "Quantum Mechanical resolution of Gibbs paradox" must be expanded. Linshukun 09:33, 28 October 2007 (UTC)
I deleted "...known as mixing paradox". I have never heard the term "mixing paradox" among people publishing on the Gibbs paradox. Let us continue to say "Gibbs paradox" to remember Gibbs, one of the greatest American scientists. Linshukun 20:56, 27 October 2007 (UTC)
The section "Calculating the Gibbs paradox" should be deleted. It has been merged into the other section. Linshukun 20:43, 27 October 2007 (UTC)
for what it's worth, a summary of what Gibbs said
[edit]In two different works, published thirty years apart, Gibbs remarked two different ways on how the notion of distinguishability affects entropy. I have been reading Gibbs lately and studying how he saw the issue. For the benefit of future edits on this article, I've included below direct links to Gibbs' remarks, and some interpretation of mine since Gibbs' prose can be painful to read at times.
In neither case did he find there to be an actual problem, since the correct answer fell naturally out of his mathematical formalisms. However it's evident that he saw these things as possibly tricky points for the reader, because in each case he did devote time to carefully talking about the issues.
-
On the Equilibrium of Heterogeneous Substances (1876):
Internet archive copy of Part I; see the section "Considerations relating to the Increase of Entropy due to the Mixture of Gases by Diffusion" (starting at page 125 of the scanned file, which is page 227 of the manuscript). The key points:
- Gibbs begins by deriving (from thermodynamics) that the entropy increase by mixing two gaseous masses of different kinds, of equal volume and at constant temperature and pressure, will be ΔS = N k ln(V / (V/2)) = N k ln 2, where V is the final volume (each gas expanding from V/2 into V) and N the total number of particles. When mixing two masses of the same kind, there is no entropy increase.
- He emphasizes the striking property that the mixing entropy does not depend on which kinds of gases are involved, whether they are very different or very similar. They only need to be of distinguishable types.
- Why entropy should increase in one case and not the other:
- "When we say that when two different gases mix by diffusion, as we have supposed, the energy of the whole remains constant, and the entropy receives a certain increase, we mean that the gases could be separated and brought to the same volume and temperature which they had at first by means of certain changes in external bodies, for example, by the passage of a certain amount of heat from a warmer to a colder body. But when we say that when two gas-masses of the same kind are mixed under similar circumstances there is no change of energy or entropy, we do not mean that the gases which have been mixed can be separated without change to external bodies. On the contrary, the separation of the gases is entirely impossible. We call the energy and entropy of the gas-masses when mixed the same as when they were unmixed, because we do not recognize any difference in the substance of the two masses. So when gases of different kinds are mixed, if we ask what changes in external bodies are necessary to bring the system to its original state, we do not mean a state in which each particle shall occupy more or less exactly the same position as at some previous epoch, but only a state which shall be undistinguishable from the previous one in its sensible properties. It is to states of systems thus incompletely defined that the problems of thermodynamics relate."
- Gibbs then goes on to consider the hypothetical situation of having two gases that are distinct (i.e. that can later be separated) but are so similar that the dynamics of the mixing are exactly the same as the dynamics of mixing two same gases. On the atomic scale these processes would appear identical, and yet the entropy would increase in one case and not the other. Gibbs says "In such respects, entropy stands strongly contrasted with energy", meaning that we cannot identify entropy with any kind of microscopic dynamics. Gibbs notes that the apparently irreversible dynamics of diffusion are not special to the distinct-gas case; they take place just as well in the mixing of same gases, yet there is no entropy increase. To Gibbs, entropy has nothing to do with the motions of atoms but rather with what we (as human observers) are able to distinguish in thermodynamic states.
-
Elementary Principles in Statistical Mechanics (1902): Gibbs' 1902 book addresses what is often called "correct Boltzmann counting". This counting is often explained as necessary to obtain extensive thermodynamics, however Gibbs introduces it in a different way (as a tool to avoid overcounting in phase space). The discussion takes place at the beginning and end of Chapter XV (transcription on Wikisource).
- Gibbs says, if an ensemble is meant to represent an actual collection of systems, then the particles in some sense must be indistinguishable because each particle needs to be copied many times. On the other hand, when the ensemble is a probability distribution used to represent the possible states of one system, we might truly speak about unique particles. At this point Gibbs notes there is nothing formally wrong with having distinguishable particles or indistinguishable particles: "The question [whether particles are indistinguishable] is one to be decided in accordance with the requirements of practical convenience in the discussion of the problems with which we are engaged."
- Classical mechanics can be done equally well with distinguishable or indistinguishable particles. When considering indistinguishable particles, Gibbs notes that for integrating over the state space, the easiest way to do it is to pretend the particles are distinguishable, then divide the phase space integral by the appropriate overcounting factor. Gibbs is not mystified by why he should divide by N!; it is simply a mathematical convenience in order to perform calculations with indistinguishable particles. If one wanted to do it the hard way, the same results could be had by integrating over phase space including each distinguishable state once and only once. "For the analytical description of a specific phase is more simple than that of a generic phase. And it is a more simple matter to make a multiple integral extend over all possible specific phases than to make one extend without repetition over all possible generic phases."
- Gibbs then goes on to make tons of calculations for the grand canonical ensemble, using the framework of indistinguishable particles. Along the way he mentions why he is focussing on indistinguishable particles: "The interest of the ensemble which has been described lies in the fact that it may be in statistical equilibrium, both in respect to exchange of energy and exchange of particles, with other grand ensembles canonically distributed and having the same values of Θ and of the coefficients μ1, μ2, etc., when the circumstances are such that exchange of energy and of particles are possible, and when equilibrium would not subsist, were it not for equal values of these constants in the two ensembles." It's worth noting that his demonstration of equilibrium would not work out using the framework of distinguishable particles: he demonstrates this in equations (514) and (515).
- At the end of the chapter he works his way towards the measure of entropy, and finally remarks on the entropy difference (in the canonical ensemble) between considering particles distinguishable or indistinguishable. At the very end of this work (and apparently the very last paragraph he published before death), Gibbs mentions a sort of mixing paradox again, but this time it is regarding extensivity:
- "For the principle that the entropy of any body has an arbitrary additive constant is subject to limitation, when different quantities of the same substance are concerned. In this case, the constant being determined for one quantity of a substance, is thereby determined for all quantities of the same substance.
To fix our ideas, let us suppose that we have two identical fluid masses in contiguous chambers. The entropy of the whole is equal to the sum of the entropies of the parts, and double that of one part. Suppose a valve is now opened, making a communication between the chambers. We do not regard this as making any change in the entropy, although the masses of gas or liquid diffuse into one another, and although the same process of diffusion would increase the entropy, if the masses of fluid were different. It is evident, therefore, that it is equilibrium with respect to generic phases, and not with respect to specific, with which we have to do in the evaluation of entropy, and therefore, that we must use the average of H or of ηgen, and not that of η, as the equivalent of entropy, except in the thermodynamics of bodies in which the number of molecules of the various kinds is constant."
The 1876 work was entirely concerned with thermodynamics. The Gibbs paradox of 1876 is therefore a purely thermodynamic matter. He does allude to molecular dynamics and probabilities, but that is not essential to the argument. Whether entropy increases or not is, in the end, pragmatically determined by whether we need to perform work to return the systems to their initial thermodynamic states. Whether the individual particles themselves are distinguishable is not relevant to his argument here: there is no thought of returning each particle to more or less its original position (returning the systems to their initial microscopic state), as this would anyway be impossible. Rather, thermodynamics only requires returning the systems to their initial thermodynamic state. For what it's worth, I think the Jaynes paper does an okay job explaining and extending Gibbs' arguments here.
To sum up Gibbs' remarks in his 1902 work: Gibbs starts out by noting that classical mechanics can be done equally well with distinguishable or indistinguishable particles. Both options can be considered in classical statistical mechanics, and it is really down to the details of the system we are studying which one we choose. Gibbs then motivates why we must use indistinguishable particles for the study of statistical thermodynamics as follows: First, if they were distinguishable, we could not have such an easily defined grand canonical ensemble that is in equilibrium with respect to particle exchange. Secondly, if they were distinguishable, then in the canonical ensemble the average logarithm of probability would not give us a quantity behaving like entropy when the number of particles varies. The Jaynes paper says that Gibbs' 1902 book was "the work of an old man in rapidly failing health" and that Gibbs would have written more on this paradox if he had been in better health. Perhaps true, but I suspect Gibbs didn't want to distract from the rest of the discussion in the chapter.
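The specific-vs-generic counting described above can be illustrated with a toy discrete "phase space" (my own construction, not Gibbs's continuous case): when no two particles occupy the same state, the specific-phase count is exactly N! times the generic-phase count, which is Gibbs's overcounting factor.

```python
from itertools import combinations, permutations

# Toy model: N distinguishable particle labels over M discrete cells,
# with at most one particle per cell (no coincident-phase subtleties).
N, M = 3, 6

# "Specific phases": ordered assignments -- particle labels matter.
specific = len(list(permutations(range(M), N)))  # M! / (M - N)!

# "Generic phases": unordered occupation patterns -- labels ignored.
generic = len(list(combinations(range(M), N)))   # M choose N

factorial_N = 1
for i in range(2, N + 1):
    factorial_N *= i

print(specific, generic, specific // generic)  # 120 20 6, i.e. specific = generic * N!
```

Integrating over specific phases and dividing by N! is thus the same as counting each generic phase once, which is the "hard way" Gibbs mentions avoiding.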
Are there actually two Gibbs paradoxes? Should they be called "paradoxes"? I would say yes and yes. Gibbs addressed both the mixing-of-very-similar-gases paradox in thermodynamics, and the mixing-of-identical-gases extensivity paradox in statistical mechanics. These are two different but closely related things, and there very well may be two Gibbs paradoxes, as discussed in the article. Gibbs did not see either as an actual problem, and he never called them paradoxes, yet he did remark on them as possible places where one could make a logical error. In some sense then they really are paradoxes, along the lines of the two-envelope paradox and the Monty Hall paradox: things that can seem on the surface mysterious or counterintuitive yet which have deep down a well-defined, consistent answer. Nanite (talk) 13:09, 16 February 2014 (UTC)
- I like your response to Jaynes' suggestion: that Gibbs' casual treatment of the paradox late in life might have been due to a desire to keep to the point, and not due to his poor health. Who knows? (And perhaps, one should ask, who cares?) I also liked your concluding phrase referring to the paradox as (one of those) "things that can seem on the surface mysterious or counterintuitive yet which have deep down a well-defined, consistent answer."
- I hate to see your thoughtful comments be ignored, only to disappear into archive-space. Nor am I inclined to condense your ideas into prose suitable for this WP article (I simplify equations, not words...). So, I have decided to move your comments into the Wikiversity article on this subject. In keeping with the Wiki principle that editors don't put their names into articles, I will reference this talk page and keep both our usernames out of it. I'm sure that would be inappropriate in Wikipedia, but Wikiversity is sort of the Wild West of Wikipedia. Anything goes there... (well, almost anything). --guyvan52 (talk) 08:36, 17 February 2014 (UTC)
- Here is a link to that Wikiversity 'learning resource': Statistical thermodynamics (Wikiversity) --guyvan52 (talk) 09:06, 17 February 2014 (UTC)
- BTW, by copying what you wrote into Wikiversity, I did NOT wish to imply that it should stay out of the Wikipedia (WP) article. The discussion in the current WP article (Gibbs paradox) needs improvement. For example, it seems to imply that the paradox has no solution within the context of classical physics, or that Gibbs was somehow "stymied" by it. If you want to contribute to this article, be bold, certainly. But also consider discussing your plans here so that we can figure out how to consolidate the two themes (mathematical and philosophical) into one article. Wikipedia articles tend to get disorganized if editors fail to communicate with each other. --guyvan52 (talk) 12:37, 17 February 2014 (UTC)
- Thanks, I'm honoured you put my notes up somewhere! I really want to improve this article, but it seems to be a very long running controversial topic and so it's tricky to write a good article. Moreover I have my own ideas about the 1902 paradox: I agree with Gibbs' solution of generic phases and I think it is brilliant, however I think it is also possible to obtain extensive entropies with distinguishable particles (though not with naive specific phases). In other words I think there is a second resolution that is compatible. So I am being extra careful about editing this article, because I don't want to inject my original research here. On the other hand, given the thousands of articles published over the last century, I doubt that anything I come up with will actually be "original" :).
- Along these lines I just found this recent article, Demonstration and resolution of the Gibbs paradox of the first kind by Hjalmar Peters. His conclusion of extensivity from distinguishability is close to what I have in mind, though I hope to find something mathematically simpler. He also supports my argument (in my notes above) that there are two Gibbs paradoxes, referring to the 1902 paradox as GP1 and the 1876 paradox as GP2. Nanite (talk) 20:26, 17 February 2014 (UTC)
- Thank you. Perhaps there could be an article on the history of the Gibbs paradox, but perhaps that would be controversial as well. I agree that there are two distinct paradoxes, and yes, they appear to be more or less resolved. My only complaint is that the resolutions are pitched at a level of detail that isn't entirely comfortable to me: either they are done by word-slinging, or with eye-wateringly tedious undergrad-level algebra. Something in between these two extremes would be nice; something compact, abstract and accurate all at the same time. 67.198.37.16 (talk) 01:02, 20 May 2024 (UTC)
Ergodic hypothesis, dimension jump
(The ergodic hypothesis underlies the ability of a physical system to reach thermal equilibrium, but this may not always hold for computer simulations (see the Fermi–Pasta–Ulam problem) or in certain real-world systems such as non-thermal plasmas.)
I found these remarks "opinionated" but at the same time really interesting, so I'm in favor of leaving them in. I rewrote them slightly and hope I haven't distorted their meaning. In plasmas, I don't think we really know that the ergodic hypothesis is violated (it is a very pure-math statement, and often a theorem); rather, we infer it indirectly because something has to break down. The more direct observation is that non-thermal plasmas are not in thermal equilibrium, but have two temperatures for two different species. (Though how is this any more surprising than having two temperatures in neighboring containers in a lab? In both cases the equilibration is merely slow compared to what we're studying.)
Liouville's theorem assumes a fixed number of dimensions that the system 'explores'.
Is it specifically Liouville's theorem that's responsible for being so "rigid" about the dimension? Isn't choosing a number of dimensions a feature of classical phase spaces in general?
In calculations of entropy, the number of dimensions is proportional to the number of particles in the system, which forces phase space to change dimensionality abruptly when particles are added or removed. This may explain the difficulty of constructing a clear and simple derivation of the dependence of entropy on the number of particles.
These sentences are really interesting, but they have the ring of late-night philosophizing. Does a discussion of this tension appear in the literature somewhere? Perhaps in Swendsen's paper?
Adding a bit to the philosophizing: in the derivations of entropy I've seen (for example, in Kittel and Kroemer), the phase spaces are never compared directly; instead, certain summary numbers (like the "number of states" compatible with a macrostate) are derived, and it is these numbers that are passed to limits as the number of particles increases. Of course, one has to normalize by a standard "small volume" in phase space. K&K use ħ in an ad hoc way, IIRC, but one could just as well decide to divide by a small classical volume. Comparability of volumes between phase spaces of different dimensions is assumed, without introducing any extra factors. Adding the right factor (justified by interchangeability or some other way) solves the problem. I guess this is the point being made in the section Alternative ways to make classical entropy extensive.
178.38.99.51 (talk) 19:45, 16 March 2015 (UTC)
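A toy calculation may make the point about normalizing factors concrete. This is only a sketch under simplifying assumptions: it keeps just the configurational V^N factor of the state count (the momentum and h^3N factors contribute terms that are already extensive, so they are dropped), and uses Stirling's approximation for ln N!:

```python
import math

# Classical counting for an ideal gas gives a "number of states" W ~ V^N
# (configurational part only).  S = ln W is then NOT extensive: doubling
# both N and V does not double S.  Dividing W by N! (the Gibbs correction)
# restores extensivity.  A schematic sketch, not a full derivation.

def s_naive(n, v):
    """S/k = ln(V^N) = N ln V  (distinguishable-particle counting)."""
    return n * math.log(v)

def s_corrected(n, v):
    """S/k = ln(V^N / N!) ~ N ln(V/N) + N  (Stirling approximation)."""
    return n * math.log(v / n) + n

n, v = 1000, 50.0
# Naive entropy is super-extensive: S(2N, 2V) > 2 S(N, V).
assert s_naive(2 * n, 2 * v) > 2 * s_naive(n, v)
# Corrected entropy is extensive: S(2N, 2V) = 2 S(N, V).
assert math.isclose(s_corrected(2 * n, 2 * v), 2 * s_corrected(n, v))
```

The second assertion holds identically, since N ln(V/N) + N is homogeneous of degree one in (N, V); that is exactly the extensivity the N! factor buys.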
- At the end of Liouville's theorem (Hamiltonian) we find these two lonely sentences:
- The Liouville equation is integral to the proof of the fluctuation theorem from which the second law of thermodynamics can be derived. It is also the key component of the derivation of Green–Kubo relations for linear transport coefficients such as shear viscosity, thermal conductivity or electrical conductivity.
- If these were to be expanded upon, perhaps a response to your comments would be possible. However, nothing currently in WP builds the needed bridges. 67.198.37.16 (talk) 01:07, 20 May 2024 (UTC)
Clarified the structure of section now entitled "Non-extensive entropy of two ideal gases and how to fix it"
I did a series of edits on this section, putting in new section and subsection titles and a fuller paragraph at the beginning explaining the plan of the section. Here is the current outline of this section.
4 Non-extensive entropy of two ideal gases and how to fix it
4.1 Setup
4.2 Gibbs' paradox in a one-dimensional gas
4.3 Two standard ways to make the classical entropy extensive
4.4 Swendsen's particle-exchange approach
4.5 Visualizing the particle-exchange approach in three dimensions
Together with the fuller paragraph at the beginning, for me this makes the section much easier to follow because it tells you at the beginning where you're going. Here are two remaining difficulties with the structure:
(A) The derivation in subsections 4.1 and 4.2 of the non-extensive entropy for a 1-dimensional gas largely repeats the calculations that are already done in section 2, Calculating the entropy of ideal gas, and making it extensive.
In defense of the redundancy (which is not due to me!): it's nice to read the same thing said two different ways and there are a number of interesting remarks included that aren't found in section 2.
(B) Even if 4.1 and 4.2 are left alone, perhaps the section would be improved if 4.3 were removed and absorbed into the earlier part of the article. One of the "fixes" in section 4.3 was already treated earlier, and the other is simple enough to state that it could perhaps be moved out of the way.
89.217.15.87 (talk) 00:11, 17 March 2015 (UTC)
Complaints about the Swendsen approach
♦ Strangely, the net result of the Swendsen approach appears to be the equation
followed by the remark This can be interpreted as the sum of the entropy of system-A and system-B, both extensive. And there is a term, , that is not extensive.
My question is this: since it has a term that is not extensive, then how does it remotely fix the problem of the non-extensive entropy? This is totally puzzling.
Even more puzzling is the next sentence (at the start of the following section): The correct (extensive) formulas for systems A and B were obtained because we included all the possible ways that the two systems could exchange particles.
So now suddenly the above-cited entropy formula is correct?
♦ The whole Swendsen approach is a mystery to me. Somehow, the most central part of the explanation is missing.
(a) What are the two systems? Are they supposed to be two volumes that are adjacent to each other, with a barrier between, like in the Gibbs paradox? If so, why do they have two different sizes, and appear to overlap?
(b) How is the exchange actually implemented at the formal level?
I understand that if you select a subset containing n_A indices from {1,...,N}, then you are somehow saying that the first set of particles is in the first "box" (actually an interval) and the complementary set is in the second "box". And that if you consider this for various values of n_A, then you are countenancing the possibility that particles move between boxes.
But how is this implemented formally? Shouldn't there be a sum or product somewhere where n_A varies? What concrete part of the formalism gives us the right to say that the number of particles actually does vary, and that we are accounting for them?
(c) An off-putting remark a bit later on is This counting is not justified on physical grounds, but on the need to integrate over phase space.
This seems to echo the fact that the setup was never really explained, and that the article simply would rather not "go there".
♦ Miscellaneous questions:
(a) What are the question marks in the integral expression for ? I understand that they stand for all subsets of {1,...,N}, but unless these variables are spelled out as actual symbols, it's hard to tell what is being done with them, such as summing or taking products over them.
(b) What is in this expression? An ordered pair?
♦ Upshot:
I'm not going to say that the Swendsen approach is nonsense, because it has some right elements. But it's not possible to understand the presentation in this article and there are a number of gaps. Given the level of mathematical effort that has been expended in these sections of the article, there ought to be at least an adequate explanation of the conceptual ansatz.
89.217.15.87 (talk) 00:11, 17 March 2015 (UTC)
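Regarding question (b) above, one plausible formal reading (my guess at the intent, not necessarily Swendsen's actual notation) is that "including all the ways the two systems can exchange particles" means summing over n_A with a binomial weight: the number of ways to assign n_A of the N labelled particles to box A, times the configurational volume V_A^n_A · V_B^(N−n_A). The binomial theorem then collapses the sum to (V_A + V_B)^N, the configuration integral of the combined system:

```python
import math

# Sketch of one reading of the particle-exchange sum: sum over the number
# n_A of particles in box A, weighting each split by the number of ways to
# choose which labelled particles go there.  The binomial theorem gives
# sum_{n_A} C(N, n_A) V_A^n_A V_B^(N - n_A) = (V_A + V_B)^N.

def exchange_sum(n_total, v_a, v_b):
    return sum(math.comb(n_total, n_a) * v_a**n_a * v_b**(n_total - n_a)
               for n_a in range(n_total + 1))

N, V_A, V_B = 20, 3.0, 5.0
assert math.isclose(exchange_sum(N, V_A, V_B), (V_A + V_B)**N)
```

If this reading is right, the "sum where n_A varies" the questioner asks for is exactly this binomial sum; the largest term (n_A ≈ N·V_A/(V_A+V_B)) dominates in the thermodynamic limit, which is why one can also work with a single "most probable" split.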
- In the entropy of mixing, one considers the entropy before and after mixing, i.e. the change. In that setting, the N log N term is present both before and after, so it cancels: it does not contribute to the change. I don't know how to reconcile that with the large raft of text in this article. 67.198.37.16 (talk) 01:15, 20 May 2024 (UTC)
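The cancellation described in the preceding comment can be checked numerically. A sketch using the Stirling-corrected ideal-gas entropy S/k = N ln(V/N) + cN, where the constant c collects the momentum and h³ factors and plays no role in mixing:

```python
import math

# Mixing two equal samples at the same temperature and density.
# Same gas: removing the partition turns two (N, V) systems into one
# (2N, 2V) system -> Delta S = 0.  Different gases: each species expands
# from V to 2V -> Delta S = 2 N k ln 2.  The N ln N piece appears both
# before and after, so it drops out of the change either way.

def s(n, v, c=2.5):
    """S/k = N ln(V/N) + c N  (Stirling-corrected ideal gas)."""
    return n * math.log(v / n) + c * n

n, v = 1.0e3, 1.0
# Same gas: no entropy change on mixing.
delta_same = s(2 * n, 2 * v) - 2 * s(n, v)
assert math.isclose(delta_same, 0.0, abs_tol=1e-6)
# Different gases: the familiar entropy of mixing, 2 N ln 2 (in units of k).
delta_diff = 2 * (s(n, 2 * v) - s(n, v))
assert math.isclose(delta_diff, 2 * n * math.log(2))
```

Note this sketch already uses the N!-corrected entropy; with the uncorrected S/k = N ln V, the "same gas" difference would come out as 2N ln 2 as well, which is the 1902 form of the paradox.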
paradox is nonsense
since it is impossible to get this situation in reality (in other words, entropy will be exactly the same in any volume or number of identical particles)... This statement is a scientific disgrace: "There is a certain amount of entropy S associated with each container of gas, and this depends on the volume of each container." — Preceding unsigned comment added by 134.7.190.150 (talk) 04:05, 29 March 2015 (UTC)
- I have no idea what this comment is talking about because entropy always increases with volume and/or number of particles if the temperature is constant. Ywaz (talk) 11:17, 25 January 2016 (UTC)
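For what it's worth, the claim in the reply above is easy to check against the Sackur-Tetrode formula in the classical (non-degenerate) regime. The mass and state values below are argon-like numbers chosen purely for illustration:

```python
import math

# Sackur-Tetrode entropy for a monatomic ideal gas.  At fixed temperature,
# S increases with volume, and also with particle number as long as the gas
# stays far from quantum degeneracy (v per particle >> thermal wavelength^3).

K_B = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J s

def sackur_tetrode(n, v, t, m=6.63e-26):
    """S in J/K for n particles of mass m (kg) in volume v (m^3) at temperature t (K)."""
    lam = H / math.sqrt(2 * math.pi * m * K_B * t)  # thermal de Broglie wavelength
    return n * K_B * (math.log(v / (n * lam**3)) + 2.5)

n, v, t = 1.0e22, 1.0e-3, 300.0
assert sackur_tetrode(n, 2 * v, t) > sackur_tetrode(n, v, t)  # grows with V
assert sackur_tetrode(2 * n, v, t) > sackur_tetrode(n, v, t)  # grows with N (classical regime)
```

(Strictly, dS/dN at fixed V and T is positive only while ln(v/λ³) > −3/2, i.e. in the classical regime where Sackur-Tetrode is valid anyway.)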
Assessment comment
The comment(s) below were originally left at Talk:Gibbs paradox/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.
Article needs editorial reorganization to properly distinguish the established idea associated with the article title "Gibbs Paradox" from new ideas added by various editors. Suggest a "current topics" article linked from this article. jrf 21:32, 15 April 2007 (UTC) |
Last edited at 21:32, 15 April 2007 (UTC). Substituted at 16:15, 29 April 2016 (UTC)