
Gibbs paradox

Article snapshot taken from Wikipedia, under the Creative Commons Attribution-ShareAlike license.

This is an old revision of this page, as edited by Linshukun (talk | contribs) at 19:25, 29 October 2007 (An information theory resolution of Gibbs paradox). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Josiah Willard Gibbs
(1839 – 1903)

In thermodynamics, the Gibbs paradox (also Gibbs' paradox or Gibbs's paradox) involves the discontinuous nature of the entropy of mixing. It was first considered by Josiah Willard Gibbs in his paper On the Equilibrium of Heterogeneous Substances. Suppose we have a box divided in half by a removable partition. On one side of the box is an ideal gas A, and on the other side is an ideal gas B at the same temperature and pressure. When the partition is removed, the two gases mix, and the entropy of the system increases because there is a larger degree of uncertainty in the position of the particles. It can be shown that the entropy of mixing multiplied by the temperature equals the amount of work one must do in order to restore the original conditions: gas A on one side, gas B on the other. If the gases are the same, no work is needed; but with the tiniest difference between the two, the work needed jumps to a large value, and it is the same value as when the difference between the two gases is great. This discontinuity is the paradox.
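The jump can be made concrete with a short numerical sketch (ours, not part of the original article): for two distinguishable ideal gases, the molar entropy of mixing is ΔS = −R(x_A ln x_A + x_B ln x_B), which for equal amounts gives R ln 2 no matter how slight the difference between A and B, while for identical gases the entropy change is zero.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def entropy_of_mixing(x_a):
    """Molar entropy of mixing for two distinguishable ideal gases
    with mole fractions x_a and 1 - x_a."""
    x_b = 1.0 - x_a
    return -R * sum(x * math.log(x) for x in (x_a, x_b) if x > 0)

# Equal amounts of two distinguishable gases: R ln 2 per mole,
# regardless of how small the difference between them is.
print(entropy_of_mixing(0.5))  # ~5.76 J/(mol K)
# For identical gases the correct value is 0: the formula simply
# stops applying, which is the discontinuity.
```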

Similarity and entropy of mixing

Whenever the Gibbs paradox is discussed, the correlation of the entropy of mixing with the similarity of the components is controversial, and three very different opinions about the entropy value and the similarity have been put forward (Figures a, b and c). Similarity may change continuously: similarity Z = 0 if the components are distinguishable, and Z = 1 if the parts are indistinguishable. In the Gibbs paradox, however, the entropy of mixing does not change continuously.

There are many claimed resolutions, and all of them fall into one of these three kinds of entropy of mixing–similarity relation (Figures a, b and c).

Generally speaking, a so-called resolution that leads to the entropy of mixing–similarity relation shown in Figure (a) cannot be accepted as a resolution, because the paradox is still there. This kind of resolution is better regarded as an explanation of the Gibbs paradox.

John von Neumann provided a real resolution of the Gibbs paradox by removing the discontinuity of the entropy of mixing: it decreases continuously as the property similarity of the individual components increases (see Figure b). More recently, Shu-Kun Lin provided another relation (see Figure c). Both are explained in detail in a following section.

Entropy discontinuity

Classical explanation in thermodynamics

Gibbs himself proposed a solution to the problem which many scientists take as Gibbs's own resolution of the Gibbs paradox. The crux of his resolution is that if one develops a classical theory based on the idea that the two different types of gas are indistinguishable, and one never carries out any measurement which reveals the difference, then the theory will have no internal inconsistencies. In other words, if we have two gases A and B and we have not yet discovered that they are different, then assuming they are the same will cause us no theoretical problems. If an experiment with these gases ever yields incorrect results, we will certainly have discovered a method of detecting their difference.

This insight suggests that the ideas of thermodynamic state and entropy are somewhat subjective. The increase in entropy as a result of mixing, multiplied by the temperature, equals the minimum amount of work we must do to restore the gases to their original separated state. Suppose that the two different gases are separated by a partition, but that we cannot detect the difference between them. We remove the partition. How much work does it take to restore the original thermodynamic state? None: simply reinsert the partition. The fact that the different gases have mixed does not yield a detectable change in the state of the gas, if by state we mean a unique set of values for all parameters that we have available to distinguish states. The moment we become able to distinguish the difference, the amount of work necessary to recover the original macroscopic configuration becomes non-zero, and this amount of work does not depend on the magnitude of the difference.

Unfortunately the paradox is still there. This should be taken as Gibbs's explanation of the Gibbs paradox.

Explanation in statistical mechanics and quantum mechanics: N! and entropy extensivity

A large number of scientists believe that this paradox is resolved in statistical mechanics or in quantum mechanics by realizing that if the two gases are composed of indistinguishable particles, they obey different statistics than if they are distinguishable. Since the distinction between the particles is discontinuous, so is the entropy of mixing. The resulting equation for the entropy of a classical ideal gas is extensive, and is known as the Sackur-Tetrode equation.

The state of an ideal gas of energy U, volume V and with N particles, each particle having mass m, is represented by specifying the momentum vector p and the position vector x for each particle. This can be thought of as specifying a point in a 6N-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles. The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy:

U = \frac{1}{2m}\sum_{i=1}^{N}\sum_{j=1}^{3} p_{ij}^{2}

and be contained inside the volume V (say V is a box of side X, so that X³ = V):

0 \leq x_{ij} \leq X

for i=1..N and j=1..3.

The first constraint defines the surface of a 3N-dimensional hypersphere of radius \sqrt{2mU}, and the second a 3N-dimensional hypercube of volume V^N. These combine to form a 6N-dimensional hypercylinder. Just as the area of the wall of a cylinder is the circumference of the base times the height, so the area φ of the wall of this hypercylinder is:

\phi(U,V,N) = V^{N}\left(\frac{2\pi^{3N/2}\,(2mU)^{(3N-1)/2}}{\Gamma(3N/2)}\right)
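As a sanity check on this formula (our own sketch, not part of the article), ln φ can be evaluated with the log-gamma function to avoid overflow at large N; the function and variable names are ours, in units where m = 1:

```python
import math

def log_phi(U, V, N, m=1.0):
    """ln(phi) for phi(U,V,N) = V^N * 2 pi^(3N/2) (2mU)^((3N-1)/2) / Gamma(3N/2).

    Using logs keeps the result finite even for very large N, where
    phi itself would overflow any floating-point type.
    """
    return (N * math.log(V)
            + math.log(2.0)
            + 1.5 * N * math.log(math.pi)
            + 0.5 * (3 * N - 1) * math.log(2.0 * m * U)
            - math.lgamma(1.5 * N))

# Small-N check against the closed form: for N = 2, U = V = m = 1,
# phi = 2 pi^3 2^(5/2) / Gamma(3) = pi^3 2^(5/2).
print(log_phi(1.0, 1.0, 2))
```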

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. Another way of stating Heisenberg's uncertainty principle is to say that we cannot specify a volume in phase space smaller than h for each conjugate position–momentum pair, where h is Planck's constant. The above "area" must really be a shell of thickness equal to the uncertainty in momentum Δp, so we write the entropy as:

S = k\,\ln\left(\phi\,\Delta p / h^{3N}\right)

where the constant of proportionality is k, Boltzmann's constant.

We may take the box length X as the uncertainty in position, and from Heisenberg's uncertainty principle, X\,\Delta p = \hbar/2. Solving for \Delta p, using Stirling's approximation for the Gamma function, and keeping only terms of order N, the entropy becomes:

S = kN\log\left[V\left(\frac{U}{N}\right)^{3/2}\right] + \frac{3}{2}kN\left(1 + \log\frac{4\pi m}{3h^{2}}\right)

This quantity is not extensive as can be seen by considering two identical volumes with the same particle number and the same energy. Suppose the two volumes are separated by a barrier in the beginning. Removing or reinserting the wall is reversible, but the entropy difference after removing the barrier is

\delta S = k\left[2N\log(2V) - N\log V - N\log V\right] = 2kN\log 2 > 0

which is in contradiction to thermodynamics. This is the Gibbs paradox. It was resolved by J.W. Gibbs himself, by postulating that the gas particles are in fact indistinguishable. This means that all states that differ only by a permutation of particles should be considered the same state. For example, if we have a 2-particle gas and we specify AB as a state of the gas where the first particle (A) has momentum p1 and the second particle (B) has momentum p2, then this state as well as the BA state, where the B particle has momentum p1 and the A particle has momentum p2, should be counted as the same state. For an N-particle gas, there are N! states which are identical in this sense, and so to calculate the volume of phase space occupied by the gas we must divide the expression for φ above by N!. This gives for the entropy:

S = kN\log\left[\left(\frac{V}{N}\right)\left(\frac{U}{N}\right)^{3/2}\right] + \frac{3}{2}kN\left(\frac{5}{3} + \log\frac{4\pi m}{3h^{2}}\right)

which can easily be shown to be extensive. This is the Sackur–Tetrode equation. With this equation, the entropy does not change when two samples of the same gas are mixed.

Once again, strictly speaking, this cannot be taken as a resolution because the paradoxical discontinuity of entropy still exists.
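The two formulas can be compared numerically. The following minimal sketch (ours, with k = 1 and the additive constant-per-particle terms dropped, since they cancel in each comparison) reproduces both the spurious 2kN log 2 and its disappearance after the N! correction:

```python
import math

def S_uncorrected(N, V, U):
    # S/k = N ln[V (U/N)^(3/2)]; constant * N terms omitted (they cancel below)
    return N * math.log(V * (U / N) ** 1.5)

def S_corrected(N, V, U):
    # Sackur-Tetrode form: S/k = N ln[(V/N) (U/N)^(3/2)]; constants omitted
    return N * math.log((V / N) * (U / N) ** 1.5)

N, V, U = 100.0, 1.0, 1.0
# Without the N! correction, joining two identical boxes appears to
# create entropy 2N ln 2:
gap = S_uncorrected(2 * N, 2 * V, 2 * U) - 2 * S_uncorrected(N, V, U)
print(gap)  # ~138.63, i.e. 2 * 100 * ln 2
# With the correction, doubling N, V and U exactly doubles S:
print(S_corrected(2 * N, 2 * V, 2 * U) - 2 * S_corrected(N, V, U))  # ~0
```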

Entropy continuity

John von Neumann
(1903 – 1957)

Whereas many scientists feel comfortable with the entropy discontinuity shown in Figure (a) and are satisfied with the classical or quantum mechanical explanations in thermodynamics or statistical mechanics, others hold that the Gibbs paradox is a real paradox which should be resolved by showing entropy continuity.

A quantum mechanics resolution of Gibbs paradox

Not many scientists have set out to prove that the entropy of mixing is actually continuous. In his book Mathematical Foundations of Quantum Mechanics, John von Neumann provided, for the first time, a resolution of the Gibbs paradox by removing the discontinuity of the entropy of mixing: it decreases continuously as the property similarity of the individual components increases (see Figure b). A few scientists agree with this resolution; others are still not convinced.

An information theory resolution of Gibbs paradox

Calorimetric measurement could be used to determine the entropy of mixing and resolve this paradox. Unfortunately, it is well known that mixing ideal gases produces no detectable heat (δQ ≤ TdS) and no detectable amount of work (w = ΔG, where ΔG is the Gibbs free energy change). This may suggest that the entropy of mixing has nothing to do with energy (heat TΔS or work ΔG), and that an ideal-gas mixing process may be a process of information loss which can be pertinently discussed only in the realm of information theory.

In a new approach to resolving the Gibbs paradox, another similarity–entropy of mixing relation has been set up by Shu-Kun Lin; it is depicted in Figure c. Instead of the word "mixing", the word "merging" can be used for the process of combining several parts of a substance originally held in several containers; it is then always a merging process, whether the substances are very different, very similar, or the same. The entropy of mixing calculation would predict that merging different (distinguishable) substances is more spontaneous than merging the same (indistinguishable) substances. However, this contradicts the observed facts of the physical world, where the merging of the same (indistinguishable) substances is the most spontaneous; immediate examples are the spontaneous merging of oil droplets in water and spontaneous crystallization, where indistinguishable unit lattice cells assemble together.

This new entropy–similarity relation has been set up based on information theory. The major conclusion is that, at least in the solid state, the entropy of mixing (or better, entropy of merging) is negative for distinguishable solids: mixing different substances decreases the (information theory) entropy, while merging indistinguishable molecules (from a large number of containers) to form one phase of pure substance gives the maximal increase in (information theory) entropy. (By "(information theory) entropy" we mean a dimensionless logarithmic function S = ln w; it is not a function of temperature T and is not necessarily related to energy.)
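The dimensionless entropy S = ln w can be illustrated with a small counting sketch (our construction, not taken from the cited work): here w is taken to be the number of distinct arrangements of N particles divided into labeled groups of given sizes, so a single indistinguishable group gives w = 1 and S = 0, while distinguishable groups give S > 0.

```python
import math

def ln_w(group_sizes):
    """S = ln w, with w = multinomial(N; n1, ..., nk), the number of
    distinct arrangements of N particles into labeled groups.
    Computed via log-gamma: ln N! - sum ln n_i!."""
    N = sum(group_sizes)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in group_sizes)

# One indistinguishable group of 100 particles: w = 1, so S = 0.
print(ln_w([100]))        # 0.0
# Two distinguishable groups of 50: S = ln C(100, 50), about 66.8.
print(ln_w([50, 50]))
```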

References and Notes

  1. Gibbs, J. Willard (1876). "On the Equilibrium of Heterogeneous Substances". Transactions of the Connecticut Academy, III, pp. 108–248, Oct. 1875 – May 1876, and pp. 343–524, May 1877 – July 1878.
  2. Gibbs, J. Willard (1993). The Scientific Papers of J. Willard Gibbs, Volume One: Thermodynamics. Ox Bow Press. ISBN 0-918024-77-3.
  3. Jaynes, E. T. (1996). "The Gibbs Paradox" (PDF).
  4. von Neumann, John (1932). Mathematical Foundations of Quantum Mechanics. Translated by R. T. Beyer. Princeton University Press; 1996 edition: ISBN 0-691-02893-1.
  5. Recent papers listed at Gibbs paradox and its resolutions.
