Entropic thermodynamic potential analogous to the free energy
A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, free entropy means something quite different: it is a generalization of entropy defined in the subject of free probability.
A free entropy is generated by a Legendre transformation of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
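As a concrete illustration of the statistical-mechanical statement above, the following sketch (an illustrative Python example, not taken from the cited sources; it assumes a two-level system with energy gap $\varepsilon$ and units in which $k_\mathrm{B} = 1$) computes $\ln Z$ for the canonical ensemble and checks numerically that it equals both $S - U/T$ and $-A/T$, i.e. the Massieu potential tabulated below:

```python
import math

# Illustrative two-level system: energies 0 and eps, units with k_B = 1 (assumption).
eps = 1.3   # energy gap (arbitrary units)
T = 0.7     # temperature (same units)

beta = 1.0 / T
Z = 1.0 + math.exp(-beta * eps)            # canonical partition function
p = [1.0 / Z, math.exp(-beta * eps) / Z]   # Boltzmann probabilities

U = p[1] * eps                             # mean (internal) energy
S = -sum(pi * math.log(pi) for pi in p)    # Gibbs entropy
A = U - T * S                              # Helmholtz free energy

Phi = math.log(Z)                          # free entropy as log of the partition function
print(Phi, S - U / T, -A / T)              # the three values agree numerically
```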
Examples
See also: List of thermodynamic properties
The most common examples are:
Entropy: $dS = \frac{1}{T}\,dU + \frac{P}{T}\,dV - \sum_{i=1}^{s}\frac{\mu_i}{T}\,dN_i$, with natural variables $U, V, \{N_i\}$.

Massieu potential / Helmholtz free entropy: $\Phi = S - \frac{U}{T} = -\frac{A}{T}$, with natural variables $\frac{1}{T}, V, \{N_i\}$.

Planck potential / Gibbs free entropy: $\Xi = \Phi - \frac{P}{T}V = -\frac{G}{T}$, with natural variables $\frac{1}{T}, \frac{P}{T}, \{N_i\}$.
where $S$ is the entropy, $U$ the internal energy, $T$ the temperature, $P$ the pressure, $V$ the volume, $N_i$ the number of particles of species $i$, $\mu_i$ the chemical potential of species $i$, $A$ the Helmholtz free energy, and $G$ the Gibbs free energy.
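The identifications $\Phi = -A/T$ and $\Xi = -G/T$ follow directly from the definitions of the Helmholtz free energy, $A = U - TS$, and the Gibbs free energy, $G = A + PV$:

$$\Phi = S - \frac{U}{T} = -\frac{U - TS}{T} = -\frac{A}{T}, \qquad \Xi = \Phi - \frac{PV}{T} = -\frac{A + PV}{T} = -\frac{G}{T}.$$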
Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous. In particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is $\psi$, used by both Planck and Schrödinger. (Note that Gibbs used $\psi$ to denote the free energy.) Free entropies were invented by the French engineer François Massieu in 1869, and actually predate Gibbs's free energy (1875).
Dependence of the potentials on the natural variables
Entropy
$$S = S(U, V, \{N_i\})$$
By the definition of a total differential,
$$dS = \frac{\partial S}{\partial U}\,dU + \frac{\partial S}{\partial V}\,dV + \sum_{i=1}^{s}\frac{\partial S}{\partial N_i}\,dN_i.$$
From the equations of state,
$$dS = \frac{1}{T}\,dU + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i.$$
The differentials in the above equation are all of extensive variables, so they may be integrated to yield

$$S = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^{s}\left(-\frac{\mu_i N_i}{T}\right) + \text{constant}.$$
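As a numerical sanity check of this integrated form, the sketch below (an illustrative Python example; it assumes a single-component monatomic ideal gas described by the Sackur–Tetrode entropy, units with $k_\mathrm{B} = 1$ and $h = 1$, and the integration constant set to zero) compares $S$ with $U/T + PV/T - \mu N/T$:

```python
import math

# Illustrative monatomic ideal gas in units with k_B = 1 and h = 1 (assumptions),
# with the integration constant taken as zero (Sackur-Tetrode form).
N = 1.0e3   # number of particles
V = 5.0e2   # volume (arbitrary units)
U = 2.0e3   # internal energy (arbitrary units)
m = 1.0     # particle mass

T = 2.0 * U / (3.0 * N)                        # U = (3/2) N T
P = N * T / V                                  # ideal gas law: P V = N T
lam = 1.0 / math.sqrt(2.0 * math.pi * m * T)   # thermal de Broglie wavelength (h = 1)
mu = T * math.log((N / V) * lam**3)            # chemical potential

# Sackur-Tetrode entropy
S = N * (math.log((V / N) * (4.0 * math.pi * m * U / (3.0 * N))**1.5) + 2.5)

# Euler-integrated relation from the text (single species, constant = 0)
S_euler = U / T + P * V / T - mu * N / T
print(S, S_euler)   # the two values agree
```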
Massieu potential / Helmholtz free entropy
$$\Phi = S - \frac{U}{T}$$

$$\Phi = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^{s}\left(-\frac{\mu_i N_i}{T}\right) - \frac{U}{T}$$

$$\Phi = \frac{PV}{T} + \sum_{i=1}^{s}\left(-\frac{\mu_i N_i}{T}\right)$$
Starting over at the definition of $\Phi$ and taking the total differential, we have via a Legendre transform (and the product rule)
$$d\Phi = dS - \frac{1}{T}\,dU - U\,d\frac{1}{T},$$

$$d\Phi = \frac{1}{T}\,dU + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i - \frac{1}{T}\,dU - U\,d\frac{1}{T},$$

$$d\Phi = -U\,d\frac{1}{T} + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i.$$
The above differentials are not all of extensive variables, so the equation may not be directly integrated. From $d\Phi$ we see that

$$\Phi = \Phi\!\left(\frac{1}{T}, V, \{N_i\}\right).$$
If reciprocal variables are not desired,
$$d\Phi = dS - \frac{T\,dU - U\,dT}{T^2},$$

$$d\Phi = dS - \frac{1}{T}\,dU + \frac{U}{T^2}\,dT,$$

$$d\Phi = \frac{1}{T}\,dU + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i - \frac{1}{T}\,dU + \frac{U}{T^2}\,dT,$$

$$d\Phi = \frac{U}{T^2}\,dT + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i,$$

$$\Phi = \Phi(T, V, \{N_i\}).$$
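The differential $d\Phi$ above implies $\left(\partial\Phi/\partial(1/T)\right)_{V,\{N_i\}} = -U$. The sketch below (an illustrative Python example; it assumes a single quantum harmonic oscillator with $\hbar\omega = 1$ and $k_\mathrm{B} = 1$, and identifies $\Phi$ with $\ln Z$ in line with the statistical-mechanical reading of free entropies) checks this with a central finite difference in $\beta = 1/T$:

```python
import math

# Illustrative quantum harmonic oscillator, units with k_B = 1 and hbar*omega = 1 (assumptions).
# Phi is identified with ln Z and viewed as a function of beta = 1/T alone
# (no volume or particle-number dependence for this toy system).

def massieu(beta):
    """Massieu potential Phi(1/T) = ln Z for the oscillator."""
    return -math.log(2.0 * math.sinh(beta / 2.0))

def energy(beta):
    """Mean energy U of the oscillator."""
    return 0.5 + 1.0 / math.expm1(beta)

beta = 0.8
h = 1e-6
dPhi_dbeta = (massieu(beta + h) - massieu(beta - h)) / (2.0 * h)   # central difference
print(dPhi_dbeta, -energy(beta))   # dPhi/d(1/T) = -U, as in the differential above
```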
Planck potential / Gibbs free entropy
$$\Xi = \Phi - \frac{PV}{T}$$

$$\Xi = \frac{PV}{T} + \sum_{i=1}^{s}\left(-\frac{\mu_i N_i}{T}\right) - \frac{PV}{T}$$

$$\Xi = \sum_{i=1}^{s}\left(-\frac{\mu_i N_i}{T}\right)$$
Starting over at the definition of $\Xi$ and taking the total differential, we have via a Legendre transform (and the product rule)
$$d\Xi = d\Phi - \frac{P}{T}\,dV - V\,d\frac{P}{T},$$

$$d\Xi = -U\,d\frac{1}{T} + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i - \frac{P}{T}\,dV - V\,d\frac{P}{T},$$

$$d\Xi = -U\,d\frac{1}{T} - V\,d\frac{P}{T} + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i.$$
The above differentials are not all of extensive variables, so the equation may not be directly integrated. From $d\Xi$ we see that

$$\Xi = \Xi\!\left(\frac{1}{T}, \frac{P}{T}, \{N_i\}\right).$$
If reciprocal variables are not desired,
$$d\Xi = d\Phi - \frac{T(P\,dV + V\,dP) - PV\,dT}{T^2},$$

$$d\Xi = d\Phi - \frac{P}{T}\,dV - \frac{V}{T}\,dP + \frac{PV}{T^2}\,dT,$$

$$d\Xi = \frac{U}{T^2}\,dT + \frac{P}{T}\,dV + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i - \frac{P}{T}\,dV - \frac{V}{T}\,dP + \frac{PV}{T^2}\,dT,$$

$$d\Xi = \frac{U + PV}{T^2}\,dT - \frac{V}{T}\,dP + \sum_{i=1}^{s}\left(-\frac{\mu_i}{T}\right)dN_i,$$

$$\Xi = \Xi(T, P, \{N_i\}).$$
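Similarly, the differential $d\Xi$ above implies $\left(\partial\Xi/\partial(P/T)\right)_{1/T,\{N_i\}} = -V$. The sketch below (an illustrative Python example; it assumes a single-species monatomic ideal gas in units with $k_\mathrm{B} = 1$ and $h = 1$, for which $\Xi = -\mu N/T = -N\ln(n\lambda^3)$ with $n = P/T$) checks this with a central finite difference:

```python
import math

# Illustrative single-species monatomic ideal gas, units with k_B = 1 and h = 1 (assumptions).
# Xi = -mu*N/T written in its natural variables (1/T, P/T, N), as derived above.

def planck_potential(beta, p_over_t, N, m=1.0):
    """Planck potential Xi = -N * ln(n * lambda^3), with n = P/T for the ideal gas."""
    lam3 = (beta / (2.0 * math.pi * m)) ** 1.5   # thermal wavelength cubed (h = 1)
    return -N * math.log(p_over_t * lam3)

beta, N = 0.5, 1.0e3
T = 1.0 / beta
P = 2.0
V = N * T / P                      # ideal gas law: P V = N T
g = P / T                          # the natural variable P/T
h = 1e-6
dXi_dg = (planck_potential(beta, g + h, N) - planck_potential(beta, g - h, N)) / (2.0 * h)
print(dXi_dg, -V)                  # dXi/d(P/T) = -V at fixed 1/T and N
```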
References
Antoni Planes; Eduard Vives (2000-10-24). "Entropic variables and Massieu–Planck functions". Entropic Formulation of Statistical Mechanics. Universitat de Barcelona. Archived from the original on 2008-10-11. Retrieved 2007-09-18.
T. Wada; A. M. Scarfone (December 2004). "Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy". Physics Letters A. 335 (5–6): 351–362. arXiv:cond-mat/0410527. Bibcode:2005PhLA..335..351W. doi:10.1016/j.physleta.2004.12.054. S2CID 17101164.
The Collected Papers of Peter J. W. Debye. New York: Interscience Publishers, Inc. 1954.
Bibliography
Massieu, M. F. (1869). Compt. Rend. 69 (858): 1057.