Free entropy
A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu–Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. The Onsager reciprocal relations, in particular, are developed in terms of entropic potentials. In mathematics, "free entropy" means something quite different: a generalization of entropy defined in the subject of free probability.

A free entropy is generated by a Legendre transformation of the entropy. The different potentials correspond to different constraints to which the system may be subjected.
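As an illustration of the connection to partition functions, stated here for the canonical ensemble (a standard result): the Helmholtz free energy satisfies A = -k_\mathrm{B} T \ln Z, so the corresponding free entropy is

```latex
\Phi \;=\; S - \frac{U}{T} \;=\; -\frac{A}{T} \;=\; k_\mathrm{B} \ln Z,
\qquad Z = \sum_j e^{-E_j / k_\mathrm{B} T} .
```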


Examples

The most common examples are:

:Entropy: S, with natural variables U, V, \{N_i\}
:Massieu potential (Helmholtz free entropy): \Phi = S - \frac{U}{T} = -\frac{A}{T}, with natural variables \frac{1}{T}, V, \{N_i\}
:Planck potential (Gibbs free entropy): \Xi = \Phi - \frac{PV}{T} = -\frac{G}{T}, with natural variables \frac{1}{T}, \frac{P}{T}, \{N_i\}

where

::S is entropy
::\Phi is the Massieu potential
::\Xi is the Planck potential
::U is internal energy
::T is temperature
::P is pressure
::V is volume
::A is the Helmholtz free energy
::G is the Gibbs free energy
::N_i is the number of particles (or number of moles) of the ''i''-th chemical component
::\mu_i is the chemical potential of the ''i''-th chemical component
::s is the total number of components

Note that the use of the terms "Massieu" and "Planck" for explicit Massieu–Planck potentials is somewhat obscure and ambiguous; in particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is \psi, used by both Planck and Schrödinger. (Note that Gibbs used \psi to denote the free energy.) Free entropies were invented by the French engineer François Massieu in 1869, and actually predate Gibbs's free energy (1875).


Dependence of the potentials on the natural variables


Entropy

:S = S(U, V, \{N_i\})

By the definition of a total differential,

:d S = \frac{\partial S}{\partial U} \, d U + \frac{\partial S}{\partial V} \, d V + \sum_{i=1}^s \frac{\partial S}{\partial N_i} \, d N_i .

From the equations of state,

:d S = \frac{1}{T} \, dU + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i .

The differentials in the above equation are all of extensive variables, so they may be integrated to yield

:S = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right) .
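The integrated (Euler) relation above can be checked numerically for a concrete model. The sketch below assumes a monatomic ideal gas described by the Sackur–Tetrode entropy, with helium-4 and a state of 1 mol at roughly 300 K chosen arbitrarily as the example; it obtains 1/T, P/T, and \mu/T as finite-difference partial derivatives of S and confirms that U/T + PV/T - \mu N/T reproduces S:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
m = 6.6464731e-27   # mass of a helium-4 atom, kg (example gas)

def S(U, V, N):
    # Sackur-Tetrode entropy of a monatomic ideal gas
    return N * k * (math.log(V / N * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# Example state: 1 mol of helium near room conditions
N = 6.02214076e23
U = 1.5 * N * k * 300.0   # U = (3/2) N k T at T = 300 K
V = 0.0224                # m^3

def d(f, x, eps=1e-6):
    # central finite difference with a relative step
    return (f(x * (1 + eps)) - f(x * (1 - eps))) / (2 * x * eps)

invT    = d(lambda u: S(u, V, N), U)   # dS/dU = 1/T
PoverT  = d(lambda v: S(U, v, N), V)   # dS/dV = P/T
muoverT = -d(lambda n: S(U, V, n), N)  # dS/dN = -mu/T

# Euler relation: S = U/T + PV/T - mu N / T
euler = U * invT + V * PoverT - N * muoverT
print(S(U, V, N), euler)  # the two values agree to numerical precision
```

The check also recovers the equations of state: invT comes out as 1/(300 K) and V * PoverT as N k, i.e. PV = NkT.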


Massieu potential / Helmholtz free entropy

:\Phi = S - \frac{U}{T}
:\Phi = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right) - \frac{U}{T}
:\Phi = \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right)

Starting over at the definition of \Phi and taking the total differential, we have via a Legendre transform (and the chain rule)

:d \Phi = d S - \frac{1}{T} \, dU - U \, d\frac{1}{T} ,
:d \Phi = \frac{1}{T} \, dU + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i - \frac{1}{T} \, dU - U \, d\frac{1}{T} ,
:d \Phi = -U \, d\frac{1}{T} + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i .

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d \Phi we see that

:\Phi = \Phi\left(\frac{1}{T}, V, \{N_i\}\right) .

If reciprocal variables are not desired,

:d \Phi = d S - d\frac{U}{T} ,
:d \Phi = d S - \frac{1}{T} \, d U + \frac{U}{T^2} \, d T ,
:d \Phi = \frac{1}{T} \, dU + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i - \frac{1}{T} \, d U + \frac{U}{T^2} \, d T ,
:d \Phi = \frac{U}{T^2} \, d T + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i ,
:\Phi = \Phi(T, V, \{N_i\}) .
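The Massieu potential connects to the Helmholtz free energy A = U - TS in one line (a standard identity):

```latex
\Phi = S - \frac{U}{T} = -\frac{U - TS}{T} = -\frac{A}{T} .
```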


Planck potential / Gibbs free entropy

:\Xi = \Phi - \frac{PV}{T}
:\Xi = \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right) - \frac{PV}{T}
:\Xi = \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right)

Starting over at the definition of \Xi and taking the total differential, we have via a Legendre transform (and the chain rule)

:d \Xi = d \Phi - \frac{P}{T} \, d V - V \, d\frac{P}{T} ,
:d \Xi = -U \, d\frac{1}{T} + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i - \frac{P}{T} \, d V - V \, d\frac{P}{T} ,
:d \Xi = -U \, d\frac{1}{T} - V \, d\frac{P}{T} + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i .

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d \Xi we see that

:\Xi = \Xi\left(\frac{1}{T}, \frac{P}{T}, \{N_i\}\right) .

If reciprocal variables are not desired,

:d \Xi = d \Phi - d\frac{PV}{T} ,
:d \Xi = d \Phi - \frac{P}{T} \, d V - \frac{V}{T} \, d P + \frac{PV}{T^2} \, d T ,
:d \Xi = \frac{U}{T^2} \, d T + \frac{P}{T} \, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i - \frac{P}{T} \, d V - \frac{V}{T} \, d P + \frac{PV}{T^2} \, d T ,
:d \Xi = \frac{U + PV}{T^2} \, d T - \frac{V}{T} \, d P + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) d N_i ,
:\Xi = \Xi(T, P, \{N_i\}) .
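Similarly, the Planck potential reduces to the Gibbs free energy G = U + PV - TS over temperature (a standard identity):

```latex
\Xi = S - \frac{U}{T} - \frac{PV}{T} = -\frac{U + PV - TS}{T} = -\frac{G}{T} .
```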

