Entropy of mixing

In thermodynamics, the entropy of mixing is the increase in the total entropy when several initially separate systems of different composition, each in a thermodynamic state of internal equilibrium, are mixed without chemical reaction by the thermodynamic operation of removal of impermeable partition(s) between them, followed by a time for establishment of a new thermodynamic state of internal equilibrium in the new unpartitioned closed system.

In general, the mixing may be constrained to occur under various prescribed conditions. In the customarily prescribed conditions, the materials are each initially at a common temperature and pressure, and the new system may change its volume, while being maintained at that same constant temperature, pressure, and chemical component masses. The volume available for each material to explore is increased, from that of its initially separate compartment, to the total common final volume. The final volume need not be the sum of the initially separate volumes, so that work can be done on or by the new closed system during the process of mixing, as well as heat being transferred to or from the surroundings, because of the maintenance of constant pressure and temperature.

The internal energy of the new closed system is equal to the sum of the internal energies of the initially separate systems. The reference values for the internal energies should be specified in a way that is constrained to make this so, maintaining also that the internal energies are respectively proportional to the masses of the systems. For concision in this article, the term 'ideal material' is used to refer to either an ideal gas (mixture) or an ideal solution.

In the special case of mixing ideal materials, the final common volume is in fact the sum of the initial separate compartment volumes. There is no heat transfer and no work is done. The entropy of mixing is entirely accounted for by the diffusive expansion of each material into a final volume not initially accessible to it. In the general case of mixing non-ideal materials, however, the total final common volume may differ from the sum of the separate initial volumes, and there may occur transfer of work or heat, to or from the surroundings; there may also be a departure of the entropy of mixing from that of the corresponding ideal case. That departure is the main reason for interest in entropy of mixing. These energy and entropy variables and their temperature dependences provide valuable information about the properties of the materials.

On a molecular level, the entropy of mixing is of interest because it is a macroscopic variable that provides information about constitutive molecular properties. In ideal materials, intermolecular forces are the same between every pair of molecular kinds, so that a molecule feels no difference between other molecules of its own kind and those of the other kind. In non-ideal materials, there may be differences of intermolecular forces or specific molecular effects between different species, even though they are chemically non-reacting. The entropy of mixing provides information about such constitutive differences of intermolecular forces or specific molecular effects in the materials. The statistical concept of randomness is used for the statistical mechanical explanation of the entropy of mixing. Mixing of ideal materials is regarded as random at a molecular level, and, correspondingly, mixing of non-ideal materials may be non-random.


Mixing of ideal species at constant temperature and pressure

In ideal species, intermolecular forces are the same between every pair of molecular kinds, so that a molecule "feels" no difference between itself and its molecular neighbors. This is the reference case for examining corresponding mixing of non-ideal species.

For example, two ideal gases, at the same temperature and pressure, are initially separated by a dividing partition. Upon removal of the dividing partition, they expand into a final common volume (the sum of the two initial volumes), and the entropy of mixing \Delta S_\text{mix} is given by

:\Delta S_\text{mix} = -nR(x_1\ln x_1 + x_2\ln x_2)

where R is the gas constant, n the total number of moles, and x_i the mole fraction of component i, which initially occupies volume V_i = x_i V. After the removal of the partition, the n_i = n x_i moles of component i may explore the combined volume V, which causes an entropy increase equal to n x_i R \ln(V/V_i) = -n R x_i \ln x_i for each component gas. In this case, the increase in entropy is entirely due to the irreversible processes of expansion of the two gases, and involves no heat or work flow between the system and its surroundings.
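As a quick numerical check of this formula (a sketch, not part of the original treatment; the function name, mole numbers, and the rounded gas constant are illustrative assumptions), the snippet below evaluates the per-component contributions n x_i R \ln(V/V_i) and confirms that their sum equals -nR\sum_i x_i\ln x_i for an equimolar binary mixture.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Ideal entropy of mixing, -n R sum(x_i ln x_i), for a list of mole numbers."""
    n = sum(moles)
    return -n * R * sum((ni / n) * math.log(ni / n) for ni in moles if ni > 0)

# Equimolar binary mixture: 1 mol of each gas at the same T and p.
moles = [1.0, 1.0]
n = sum(moles)

# Per-component view: each gas expands from V_i = x_i V into the full volume V.
per_component = [ni * R * math.log(n / ni) for ni in moles]   # n_i R ln(V / V_i)
print(sum(per_component))          # ~11.5 J/K
print(entropy_of_mixing(moles))    # same value, 2 R ln 2
```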


Gibbs free energy of mixing

The Gibbs free energy change \Delta G_\text{mix} = \Delta H_\text{mix} - T\Delta S_\text{mix} determines whether mixing at constant (absolute) temperature T and pressure p is a spontaneous process. This quantity combines two physical effects: the enthalpy of mixing, which is a measure of the energy change, and the entropy of mixing considered here.

For an ideal gas mixture or an ideal solution, there is no enthalpy of mixing (\Delta H_\text{mix} = 0), so that the Gibbs free energy of mixing is given by the entropy term only:

:\Delta G_\text{mix} = -T\Delta S_\text{mix}

For an ideal solution, the Gibbs free energy of mixing is always negative, meaning that mixing of ideal solutions is always spontaneous. The lowest value is reached when the mole fraction of each component is 0.5 for a mixture of two components, or 1/r for a mixture of r components.
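As a small illustration of the last statement (a sketch with assumed temperature and mole numbers, not from the source), the snippet below evaluates \Delta G_\text{mix} = nRT[x\ln x + (1-x)\ln(1-x)] for an ideal binary solution over the composition range and locates its minimum at x = 0.5.

```python
import math

R = 8.314   # gas constant, J/(mol K)
T = 298.15  # assumed temperature, K
n = 1.0     # assumed total moles

def delta_g_mix(x):
    """Ideal binary Gibbs free energy of mixing, n R T (x ln x + (1-x) ln(1-x))."""
    return n * R * T * (x * math.log(x) + (1 - x) * math.log(1 - x))

# Scan compositions; the minimum (most negative value) falls at x = 0.5.
xs = [i / 1000 for i in range(1, 1000)]
x_min = min(xs, key=delta_g_mix)
print(x_min, delta_g_mix(x_min))   # ~0.5, about -n R T ln 2 = -1718 J
```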


Solutions and temperature dependence of miscibility


Ideal and regular solutions

The above equation for the entropy of mixing of ideal gases is valid also for certain liquid (or solid) solutions: those formed by completely random mixing so that the components move independently in the total volume. Such random mixing of solutions occurs if the interaction energies between unlike molecules are similar to the average interaction energies between like molecules.Atkins, P.W., de Paula, J. (2006). ''Atkins' Physical Chemistry'', eighth edition, W.H. Freeman, New York.K. Denbigh, ''The Principles of Chemical Equilibrium'' (3rd ed., Cambridge University Press 1971) p. 432. The value of the entropy corresponds exactly to random mixing for ideal solutions and for regular solutions, and approximately so for many real solutions.

For binary mixtures the entropy of random mixing can be considered as a function of the mole fraction x of one component:

:\Delta S_\text{mix} = -nR(x_1\ln x_1 + x_2\ln x_2) = -nR\left[x\ln x + (1-x)\ln(1-x)\right]

For all possible mixtures, 0 < x < 1, so that \ln x and \ln(1-x) are both negative, and the entropy of mixing \Delta S_\text{mix} is positive and favors mixing of the pure components.

Also, the curvature of \Delta S_\text{mix} as a function of x is given by the second derivative

:\left(\frac{\partial^2 \Delta S_\text{mix}}{\partial x^2}\right)_{T,p} = -nR\left(\frac{1}{x} + \frac{1}{1-x}\right)

This curvature is negative for all possible mixtures (0 < x < 1), so that mixing two solutions to form a solution of intermediate composition also increases the entropy of the system. Random mixing therefore always favors miscibility and opposes phase separation.

For ideal solutions, the enthalpy of mixing is zero, so that the components are miscible in all proportions. For regular solutions, a positive enthalpy of mixing may cause incomplete miscibility (phase separation for some compositions) at temperatures below the upper critical solution temperature (UCST). This is the minimum temperature at which the -T\Delta S_\text{mix} term in the Gibbs energy of mixing is sufficient to produce miscibility in all proportions.
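To make the UCST statement concrete, a common sketch (an assumption-laden model illustration, not taken from the source) uses the regular-solution form \Delta G_\text{mix}/n = RT[x\ln x + (1-x)\ln(1-x)] + Wx(1-x) with an interchange energy W; setting the curvature at x = 0.5 to zero gives T_\text{UCST} = W/(2R). The snippet below checks this numerically for an assumed W.

```python
import math

R = 8.314   # gas constant, J/(mol K)
W = 5000.0  # assumed interchange energy, J/mol (hypothetical value)

def d2g_dx2(x, T):
    """Curvature of the molar regular-solution Delta G_mix with respect to x:
    d^2/dx^2 [ R T (x ln x + (1-x) ln(1-x)) + W x (1-x) ]."""
    return R * T * (1.0 / x + 1.0 / (1.0 - x)) - 2.0 * W

# At x = 0.5 the curvature changes sign at T = W / (2 R): the UCST of this model.
T_ucst = W / (2.0 * R)
print(T_ucst)                          # ~300.7 K for the assumed W
print(d2g_dx2(0.5, T_ucst - 1.0))      # negative: unstable, phase separation
print(d2g_dx2(0.5, T_ucst + 1.0))      # positive: stable, miscible
```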


Systems with a lower critical solution temperature

Nonrandom mixing with a lower entropy of mixing can occur when the attractive interactions between unlike molecules are significantly stronger (or weaker) than the mean interactions between like molecules. For some systems this can lead to a lower critical solution temperature (LCST), or lower limiting temperature for phase separation.

For example, triethylamine and water are miscible in all proportions below 19 °C, but above this critical temperature, solutions of certain compositions separate into two phases at equilibrium with each other. This means that \Delta G_\text{mix} is negative for mixing of the two phases below 19 °C and positive above this temperature. Therefore,

:\Delta S_\text{mix} = -\left(\frac{\partial \Delta G_\text{mix}}{\partial T}\right)_P

is negative for mixing of these two equilibrium phases. This is due to the formation of attractive hydrogen bonds between the two components that prevent random mixing. Triethylamine molecules cannot form hydrogen bonds with each other but only with water molecules, so in solution they remain associated with water molecules, with loss of entropy. The mixing that occurs below 19 °C is due not to entropy but to the enthalpy of formation of the hydrogen bonds.

Lower critical solution temperatures also occur in many polymer-solvent mixtures.Cowie, J.M.G., ''Polymers: Chemistry and Physics of Modern Materials'' (2nd edn, Blackie 1991) pp. 174-176. For polar systems such as polyacrylic acid in 1,4-dioxane, this is often due to the formation of hydrogen bonds between polymer and solvent. For nonpolar systems such as polystyrene in cyclohexane, phase separation has been observed in sealed tubes (at high pressure) at temperatures approaching the liquid-vapor critical point of the solvent. At such temperatures the solvent expands much more rapidly than the polymer, whose segments are covalently linked. Mixing therefore requires contraction of the solvent for compatibility with the polymer, resulting in a loss of entropy.


Statistical thermodynamical explanation of the entropy of mixing of ideal gases

Since thermodynamic entropy can be related to statistical mechanics or to information theory, it is possible to calculate the entropy of mixing using these two approaches. Here we consider the simple case of mixing ideal gases.


Proof from statistical mechanics

Assume that the molecules of two different substances are approximately the same size, and regard space as subdivided into a square lattice whose cells are the size of the molecules. (In fact, any lattice would do, including close packing.) This is a crystal-like conceptual model to identify the molecular centers of mass. If the two phases are liquids, there is no spatial uncertainty in each one individually. (This is, of course, an approximation. Liquids have a "free volume"; this is why they are usually less dense than solids.) Everywhere we look in component 1, there is a molecule present, and likewise for component 2. After the two different substances are intermingled (assuming they are miscible), the liquid is still dense with molecules, but now there is uncertainty about what kind of molecule is in which location. Of course, any idea of identifying molecules in given locations is a thought experiment, not something one could do, but the calculation of the uncertainty is well-defined.

We can use Boltzmann's equation for the entropy change as applied to the mixing process

:\Delta S_\text{mix} = k_\text{B} \ln\Omega

where k_\text{B} is the Boltzmann constant. We then calculate the number of ways \Omega of arranging N_1 molecules of component 1 and N_2 molecules of component 2 on a lattice, where

:N = N_1 + N_2

is the total number of molecules, and therefore the number of lattice sites. Calculating the number of permutations of N objects, correcting for the fact that N_1 of them are ''identical'' to one another, and likewise for N_2,

:\Omega = \frac{N!}{N_1!\,N_2!}

After applying Stirling's approximation for the factorial of a large integer m,

:\ln m! = \sum_{k=1}^{m} \ln k \approx \int_{1}^{m} \ln k \, dk = m\ln m - m + 1 \approx m\ln m - m,

the result is

:\Delta S_\text{mix} = -k_\text{B}\left[N_1\ln(N_1/N) + N_2\ln(N_2/N)\right] = -k_\text{B} N \left[x_1\ln x_1 + x_2\ln x_2\right]

where we have introduced the mole fractions, which are also the probabilities of finding any particular component in a given lattice site:

:x_1 = N_1/N = p_1 \quad\text{and}\quad x_2 = N_2/N = p_2

Since the Boltzmann constant k_\text{B} = R / N_\text{A}, where N_\text{A} is the Avogadro constant, and the number of molecules N = n N_\text{A}, we recover the thermodynamic expression for the mixing of two ideal gases,

:\Delta S_\text{mix} = -nR\left[x_1\ln x_1 + x_2\ln x_2\right]

This expression can be generalized to a mixture of r components, with amounts N_i, i = 1, 2, 3, \ldots, r:

:\Delta S_\text{mix} = -k_\text{B}\sum_{i=1}^r N_i\ln(N_i/N) = -N k_\text{B}\sum_{i=1}^r x_i\ln x_i = -n R\sum_{i=1}^r x_i\ln x_i
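As a rough numerical check of the counting argument (a sketch; the particle numbers are arbitrary assumptions), the snippet below evaluates the exact \ln\Omega = \ln N! - \ln N_1! - \ln N_2! with the log-gamma function and compares it with the Stirling-approximation result -N[x_1\ln x_1 + x_2\ln x_2]; the two agree to within a fraction of a percent already for a few thousand particles.

```python
import math

def ln_omega_exact(n1, n2):
    """Exact ln of the number of distinct lattice arrangements, ln[N!/(N1! N2!)]."""
    n = n1 + n2
    return math.lgamma(n + 1) - math.lgamma(n1 + 1) - math.lgamma(n2 + 1)

def ln_omega_stirling(n1, n2):
    """Stirling-approximation result, -N (x1 ln x1 + x2 ln x2)."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -n * (x1 * math.log(x1) + x2 * math.log(x2))

n1, n2 = 3000, 1000   # arbitrary particle numbers for illustration
exact = ln_omega_exact(n1, n2)
approx = ln_omega_stirling(n1, n2)
print(exact, approx, abs(exact - approx) / exact)   # relative error well below 1%
```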


Relationship to information theory

The entropy of mixing is also proportional to the Shannon entropy, or compositional uncertainty of information theory, which is defined without requiring Stirling's approximation. Claude Shannon introduced this expression for use in information theory, but similar formulas can be found as far back as the work of Ludwig Boltzmann and J. Willard Gibbs. The Shannon uncertainty is not the same as the Heisenberg uncertainty principle in quantum mechanics, which is based on variance.

The Shannon entropy is defined as

:H = -\sum_{i=1}^r p_i \ln p_i

where p_i is the probability that an information source will produce the ith symbol from an r-symbol alphabet, independently of previous symbols (thus i runs from 1 to r). H is then a measure of the expected amount of information (-\ln p_i) missing before the symbol is known or measured, or, alternatively, the expected amount of information supplied when the symbol becomes known. The set of messages of length N symbols from the source will then have an entropy of NH.

The thermodynamic entropy is only due to positional uncertainty, so we may take the "alphabet" to be any of the r different species in the gas, and, at equilibrium, the probability that a given particle is of type i is simply the mole fraction x_i for that particle. Since we are dealing with ideal gases, the identity of nearby particles is irrelevant. Multiplying by the number of particles N yields the change in entropy of the entire system from the unmixed case, in which all of the p_i were either 1 or 0. We again obtain the entropy of mixing on multiplying by the Boltzmann constant k_\text{B}:

:\Delta S_\text{mix} = -N k_\text{B}\sum_{i=1}^r x_i\ln x_i

So thermodynamic entropy with r chemical species and a total of N particles has a parallel to an information source that has r distinct symbols with messages that are N symbols long.


Application to gases

In gases there is much more spatial uncertainty because most of their volume is merely empty space. We can regard the mixing process as allowing the contents of the two originally separate containers to expand into the combined volume of the two conjoined containers. The two lattices that allow us to conceptually localize molecular centers of mass also join. The total number of empty cells is the sum of the numbers of empty cells in the two components prior to mixing. Consequently, that part of the spatial uncertainty concerning whether ''any'' molecule is present in a lattice cell is the sum of the initial values, and does not increase upon "mixing".

Almost everywhere we look, we find empty lattice cells. Nevertheless, we do find molecules in a few occupied cells. When there is real mixing, for each of those few occupied cells there is a contingent uncertainty about which kind of molecule it is. When there is no real mixing, because the two substances are identical, there is no uncertainty about which kind of molecule it is. Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the ''increase'' in the entropy, or spatial uncertainty, has exactly the same form as obtained previously. Obviously the subset of occupied cells is not the same at different times, but only when there is real mixing and an occupied cell is found do we ask which kind of molecule is there.

See also: Gibbs paradox, in which it would seem that "mixing" two samples of the ''same'' gas would produce entropy.


Application to solutions

If the solute is a crystalline solid, the argument is much the same. A crystal has no spatial uncertainty at all, except for crystallographic defects, and a (perfect) crystal allows us to localize the molecules using the crystal symmetry group. The fact that volumes do not add when dissolving a solid in a liquid is not important for condensed phases. If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.

The Flory–Huggins solution theory provides the entropy of mixing for polymer solutions, in which the macromolecules are huge compared to the solvent molecules. In this case, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.

Note that solids in contact with each other also slowly interdiffuse, and solid mixtures of two or more components may be made at will (alloys, semiconductors, etc.). Again, the same equations for the entropy of mixing apply, but only for homogeneous, uniform phases.
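For reference, the Flory–Huggins combinatorial entropy of mixing replaces mole fractions by volume fractions, \Delta S_\text{mix} = -R(n_1\ln\phi_1 + n_2\ln\phi_2). The sketch below (mole numbers and chain length are assumed illustrative values, and the helper names are hypothetical) evaluates it for a dilute polymer solution and compares it with the ideal mole-fraction expression.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def flory_huggins_entropy(n_solvent, n_polymer, chain_length):
    """Flory-Huggins entropy of mixing, -R (n1 ln phi1 + n2 ln phi2),
    with volume fractions based on the lattice sites each species occupies."""
    sites_solvent = n_solvent                  # one site per solvent molecule
    sites_polymer = n_polymer * chain_length   # one site per monomer unit
    total = sites_solvent + sites_polymer
    phi1 = sites_solvent / total
    phi2 = sites_polymer / total
    return -R * (n_solvent * math.log(phi1) + n_polymer * math.log(phi2))

def ideal_entropy(n1, n2):
    """Ideal (mole-fraction) entropy of mixing for comparison."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

# 1 mol solvent + 0.001 mol polymer of 1000 segments each (assumed values):
print(flory_huggins_entropy(1.0, 0.001, 1000))  # ~5.8 J/K (volume-fraction based)
print(ideal_entropy(1.0, 0.001))                # ~0.07 J/K (mole-fraction based)
```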


Mixing under other constraints


Mixing with and without change of available volume

In the established customary usage, expressed in the lead section of this article, the entropy of mixing comes from two mechanisms: the intermingling and possible interactions of the distinct molecular species, and the change in the volume available for each molecular species, or the change in concentration of each molecular species. For ideal gases, the entropy of mixing at prescribed common temperature and pressure has nothing to do with mixing in the sense of intermingling and interactions of molecular species, but is only to do with expansion into the common volume.Bailyn, M. (1994). ''A Survey of Thermodynamics'', American Institute of Physics, New York. According to Fowler and Guggenheim (1939/1965),Fowler, R., Guggenheim, E.A. (1939/1965). ''Statistical Thermodynamics. A version of Statistical Mechanics for Students of Physics and Chemistry'', Cambridge University Press, Cambridge UK, pages 163-164. the conflating of the just-mentioned two mechanisms for the entropy of mixing is well established in customary terminology, but can be confusing unless it is borne in mind that the independent variables are the common initial and final temperature and total pressure; if the respective partial pressures or the total volume are chosen as independent variables instead of the total pressure, the description is different.


Mixing with each gas kept at constant partial volume, with changing total volume

In contrast to the established customary usage, "mixing" might be conducted reversibly at constant volume for each of two fixed masses of gases of equal volume, being mixed by gradually merging their initially separate volumes by use of two ideal semipermeable membranes, each permeable only to one of the respective gases, so that the respective volumes available to each gas remain constant during the merge. Either one of the common temperature or the common pressure is chosen to be independently controlled by the experimenter, the other being allowed to vary so as to maintain constant volume for each mass of gas. In this kind of "mixing", the final common volume is equal to each of the respective separate initial volumes, and each gas finally occupies the same volume as it did initially.Planck, M. (1897/1903). ''Treatise on Thermodynamics'', translated with the author's sanction by Alexander Ogg, Longmans, Green and Co., London, Sections 235-236. Partington, J.R. (1949). ''An Advanced Treatise on Physical Chemistry'', Volume 1, ''Fundamental Principles. The Properties of Gases'', Longmans, Green, and Co., London.Callen, H.B. (1960/1985). ''Thermodynamics and an Introduction to Thermostatistics'', second edition, Wiley, New York, , pages 69-70.Iribarne, J.V., Godson, W.L. (1973/1981), ''Atmospheric Thermodynamics'', second edition, D. Reidel, Kluwer Academic Publishers, Dordrecht, , pages 48-49. This constant volume kind of "mixing", in the special case of perfect gases, is referred to in what is sometimes called Gibbs' theorem. It states that the entropy of such "mixing" of perfect gases is zero.
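A short worked comparison of the two protocols may help (a sketch; mole numbers and volumes are assumed): in the customary constant-pressure mixing each gas expands from its own compartment into the doubled total volume, giving \Delta S = \sum_i n_i R\ln(V_\text{final}/V_i) = 2nR\ln 2 for two equal amounts, whereas in the constant-partial-volume "mixing" described here each gas keeps its original volume, so the same formula gives zero, in line with Gibbs' theorem.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def entropy_change(moles_and_volumes, v_final):
    """Sum of n_i R ln(V_final / V_i) for ideal gases at constant temperature."""
    return sum(n * R * math.log(v_final / v) for n, v in moles_and_volumes)

# Two gases, 1 mol each, initially in 1 L compartments (assumed numbers).
gases = [(1.0, 1.0), (1.0, 1.0)]

# Customary mixing: partition removed, each gas explores the 2 L total volume.
print(entropy_change(gases, 2.0))   # 2 R ln 2, ~11.5 J/K

# Constant-partial-volume "mixing" (Gibbs' theorem): each gas still occupies 1 L.
print(entropy_change(gases, 1.0))   # 0.0
```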


Mixing at constant total volume and changing partial volumes, with mechanically controlled varying pressure, and constant temperature

An experimental demonstration may be considered. The two distinct gases, in a cylinder of constant total volume, are at first separated by two contiguous pistons made respectively of two suitably specific ideal semipermeable membranes. Ideally slowly and fictively reversibly, at constant temperature, the gases are allowed to mix in the volume between the separating membranes, forcing them apart, thereby supplying work to an external system. The energy for the work comes from the heat reservoir that keeps the temperature constant. Then, by externally forcing ideally slowly the separating membranes together, back to contiguity, work is done on the mixed gases, fictively reversibly separating them again, so that heat is returned to the heat reservoir at constant temperature. Because the mixing and separation are ideally slow and fictively reversible, the work supplied by the gases as they mix is equal to the work done in separating them again. Passing from fictive reversibility to physical reality, some amount of additional work, that remains external to the gases and the heat reservoir, must be provided from an external source for this cycle, as required by the second law of thermodynamics, because this cycle has only one heat reservoir at constant temperature, and the external provision of work cannot be completely efficient.


Gibbs' paradox: "mixing" of identical species versus mixing of closely similar but non-identical species

For entropy of mixing to exist, the putatively mixing molecular species must be chemically or physically detectably distinct. Thus arises the so-called ''Gibbs paradox'', as follows. If molecular species are identical, there is no entropy change on mixing them, because, defined in thermodynamic terms, there is no mass transfer, and thus no thermodynamically recognized process of mixing. Yet the slightest detectable difference in constitutive properties between the two species yields a thermodynamically recognized process of transfer with mixing, and a possibly considerable entropy change, namely the entropy of mixing.

The "paradox" arises because any detectable constitutive distinction, no matter how slight, can lead to a considerably large change in the amount of entropy as a result of mixing. Though a continuous change in the properties of the materials that are mixed might make the degree of constitutive difference tend continuously to zero, the entropy change would nonetheless vanish discontinuously when the difference reached zero.

From a general physical viewpoint, this discontinuity is paradoxical. But from a specifically thermodynamic viewpoint, it is not paradoxical, because in that discipline the degree of constitutive difference is not questioned; it is either there or not there. Gibbs himself did not see it as paradoxical. Distinguishability of two materials is a constitutive, not a thermodynamic, difference, for the laws of thermodynamics are the same for every material, while their constitutive characteristics are diverse. Though one might imagine a continuous decrease of the constitutive difference between any two chemical substances, physically it cannot be continuously decreased till it actually vanishes.Larmor, J. (1929), ''Mathematical and Physical Papers'', volume 2, Cambridge University Press, Cambridge UK, p. 99.Partington (1949) cites Larmor (1929). It is hard to think of a smaller difference than that between ortho- and para-hydrogen, yet they differ by a finite amount. The hypothesis that the distinction might tend continuously to zero is unphysical. This is neither examined nor explained by thermodynamics. Differences of constitution are explained by quantum mechanics, which postulates discontinuity of physical processes.

For a detectable distinction, some means should be physically available. One theoretical means would be through an ideal semi-permeable membrane. It should allow passage, backwards and forwards, of one species, while passage of the other is prevented entirely. The entirety of prevention should include perfect efficacy over a practically infinite time, in view of the nature of thermodynamic equilibrium. Even the slightest departure from ideality, as assessed over a finite time, would extend to utter non-ideality, as assessed over a practically infinite time. Such quantum phenomena as tunneling ensure that nature does not allow such membrane ideality as would support the theoretically demanded continuous decrease, to zero, of detectable distinction. The decrease to zero detectable distinction must be discontinuous.

For ideal gases, the entropy of mixing does not depend on the degree of difference between the distinct molecular species, but only on the fact that they are distinct; for non-ideal gases, the entropy of mixing can depend on the degree of difference of the distinct molecular species. The suggested or putative "mixing" of identical molecular species is not, in thermodynamic terms, a mixing at all, because thermodynamics refers to states specified by state variables, and does not permit an imaginary labelling of particles. Only if the molecular species are different is there mixing in the thermodynamic sense.Kondepudi, D. (2008). ''Introduction to Modern Thermodynamics'', Wiley, Chichester, pages 197-199.


See also

* CALPHAD
* Enthalpy of mixing
* Gibbs energy



