Boltzmann's entropy formula

In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy, also written as S, of an ideal gas to the multiplicity (commonly denoted as W or Ω), the number of real microstates corresponding to the gas's macrostate:

Boltzmann's equation—carved on his gravestone.[1]
S = k_B ln W        (1)

where k_B is the Boltzmann constant (also written simply as k), equal to 1.380649 × 10⁻²³ J/K, and ln is the natural logarithm function (or log base e, as in the image above).

In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a certain kind of thermodynamic system can be arranged.
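In code, equation (1) is a one-liner. The sketch below (the function name and the multiplicities passed to it are illustrative, not from the source) evaluates S = k_B ln W for a given multiplicity:

```python
import math

# Boltzmann constant in J/K (exact value by the 2019 SI redefinition)
K_B = 1.380649e-23

def boltzmann_entropy(multiplicity: float) -> float:
    """Entropy S = k_B * ln(W) of a macrostate with W microstates."""
    return K_B * math.log(multiplicity)

# A macrostate realized by exactly one microstate has zero entropy:
print(boltzmann_entropy(1))   # 0.0

# Doubling the multiplicity adds k_B * ln 2, about 9.57e-24 J/K:
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

Note that entropy depends only on the logarithm of W, which is why doubling W adds a fixed increment rather than doubling S.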

History

Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900.[2][3] To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".[4]

A 'microstate' is a state specified in terms of the constituent particles of a body of matter or radiation that has been specified as a macrostate in terms of such variables as internal energy and pressure. A macrostate is experimentally observable, with at least a finite extent in spacetime. A microstate can be instantaneous, or can be a trajectory composed of a temporal progression of instantaneous microstates. In experimental practice, such trajectories are scarcely observable. The present account concerns instantaneous microstates.

The value of W was originally intended to be proportional to the Wahrscheinlichkeit (the German word for probability) of a macroscopic state for some probability distribution of possible microstates: the collection of (unobservable microscopic single-particle) "ways" in which the (observable macroscopic) thermodynamic state of a system can be realized by assigning different positions and momenta to the respective molecules.

There are many instantaneous microstates that apply to a given macrostate. Boltzmann considered collections of such microstates. For a given macrostate, he called the collection of all possible instantaneous microstates of a certain kind by the name monode, for which Gibbs' term ensemble is used nowadays. For single-particle instantaneous microstates, Boltzmann called the collection an ergode. Subsequently, Gibbs called it a microcanonical ensemble, and this name is widely used today, perhaps partly because Bohr was more interested in the writings of Gibbs than of Boltzmann.[5]

Interpreted in this way, Boltzmann's formula is the most basic formula for the thermodynamic entropy. Boltzmann's paradigm was an ideal gas of N identical particles, of which N_i are in the i-th microscopic condition (range) of position and momentum. For this case, the probability of each microstate of the system is equal, so it was equivalent for Boltzmann to calculate the number of microstates associated with a macrostate. W was historically misinterpreted as literally meaning the number of microstates, and that is what it usually means today. W can be counted using the formula for permutations

W = N! / (N_1! N_2! ⋯ N_i! ⋯)        (2)

where i ranges over all possible molecular conditions and "!" denotes factorial. The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
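Equation (2) is easy to check for small systems by brute-force counting. A minimal sketch (the occupation numbers are made-up toy values, not from the source):

```python
import math

def multiplicity(occupancies):
    """W = N! / (N_1! * N_2! * ...) for occupation numbers N_i summing to N."""
    n = sum(occupancies)
    w = math.factorial(n)
    for n_i in occupancies:
        w //= math.factorial(n_i)  # identical particles in one condition are indistinguishable
    return w

# 4 particles split 2/2 between two molecular conditions: W = 4!/(2!*2!) = 6
print(multiplicity([2, 2]))   # 6

# All 4 particles in the same condition: only one arrangement
print(multiplicity([4, 0]))   # 1
```

The even split has the larger W, which foreshadows why equilibrium (maximum entropy) corresponds to the most evenly spread distribution compatible with the constraints.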

Introduction of the natural logarithm


In his 1877 paper, Boltzmann clarifies how molecular states are counted to determine the state distribution number, introducing the logarithm to simplify the equation.

Boltzmann writes: “The first task is to determine the permutation number, previously designated by 𝒫, for any state distribution. Denoting by J the sum of the permutations 𝒫 for all possible state distributions, the quotient 𝒫/J is the state distribution’s probability, henceforth denoted by W. We would first like to calculate the permutations 𝒫 for the state distribution characterized by w_0 molecules with kinetic energy 0, w_1 molecules with kinetic energy ϵ, etc. …

“The most likely state distribution will be for those w_0, w_1, … values for which 𝒫 is a maximum or, since the numerator is a constant, for which the denominator is a minimum. The values w_0, w_1 must simultaneously satisfy the two constraints (1) and (2). Since the denominator of 𝒫 is a product, it is easiest to determine the minimum of its logarithm, …”

Therefore, by making the denominator small, he maximizes the number of states. To simplify the product of the factorials, he takes its natural logarithm, turning the product into a sum of logarithms. This is the reason for the natural logarithm in Boltzmann’s entropy formula.[6]
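Boltzmann's device of replacing a product of factorials with a sum of their logarithms is also the standard numerical trick today: for realistic particle numbers W itself overflows any machine representation, but ln W stays modest. A sketch (the function name is illustrative; lgamma(n + 1) = ln n! is a standard identity):

```python
import math

def log_multiplicity(occupancies):
    """ln W = ln N! - sum_i ln N_i!, computed via lgamma(n + 1) = ln n!."""
    n = sum(occupancies)
    return math.lgamma(n + 1) - sum(math.lgamma(n_i + 1) for n_i in occupancies)

# 10^22 particles split evenly between two conditions: W itself is
# astronomically large, but ln W is an ordinary floating-point number,
# close to N * ln 2 as Stirling's approximation predicts (about 6.93e21).
half = 10**22 // 2
print(log_multiplicity([half, half]))
```

This is exactly why the entropy, proportional to ln W rather than W, is the quantity that behaves additively and manageably.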

Generalization


Boltzmann's formula applies to microstates of a system, each possible microstate of which is presumed to be equally probable.

But in thermodynamics, the universe is divided into a system of interest, plus its surroundings; then the entropy of Boltzmann's microscopically specified system can be identified with the system entropy in classical thermodynamics. The microstates of such a thermodynamic system are not equally probable; for example, high-energy microstates are less probable than low-energy microstates for a thermodynamic system kept at a fixed temperature by allowing contact with a heat bath. For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

S = −k_B Σ_i p_i ln p_i        (3)

This reduces to equation (1) if the probabilities p_i are all equal.
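The reduction is easy to verify numerically: with p_i = 1/W for each of W microstates, equation (3) gives −k_B · W · (1/W) ln(1/W) = k_B ln W. A sketch (the value of W and the skewed distribution are arbitrary examples):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    """S = -k_B * sum_i p_i ln p_i over the microstate probabilities p_i."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# With W equally likely microstates (p_i = 1/W), (3) equals k_B ln W, i.e. (1):
W = 1000
uniform = [1.0 / W] * W
print(math.isclose(gibbs_entropy(uniform), K_B * math.log(W)))  # True

# Any non-uniform distribution over the same W states has strictly lower entropy:
skewed = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))  # True
```

The second check illustrates why the equiprobable case is the extreme one: the uniform distribution maximizes (3) for a fixed number of accessible microstates.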

Boltzmann used a ρ ln ρ formula as early as 1866.[7] He interpreted ρ as a density in phase space (without mentioning probability), but since this satisfies the axiomatic definition of a probability measure we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work[8] and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3), and not vice versa: in every situation where equation (1) is valid, equation (3) is valid also, but not conversely.

Boltzmann entropy excludes statistical dependencies


The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle, i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles that move independently apart from instantaneous collisions, and is an approximation, possibly a poor one, for other systems.[9]

The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent. The probability distribution of the system as a whole then factorises into the product of N separate identical terms, one term for each particle; and when the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather than the 6N-dimensional phase space of the system as a whole), the Gibbs entropy

S = −k_B Σ_i p_i ln p_i        (4)

simplifies to the Boltzmann entropy.

This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.
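The factorisation argument can be illustrated with a toy distribution (the three single-particle probabilities and the particle number N below are hypothetical, chosen only to keep the state space small): when the particles are statistically independent, the Gibbs entropy of the joint distribution is exactly N times the single-particle entropy.

```python
import math
from itertools import product

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i ln p_i."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical single-particle distribution over 3 states:
p1 = [0.5, 0.3, 0.2]
N = 4

# Joint distribution of N independent identical particles: the probability
# of each system state is the product of its single-particle factors.
joint = [math.prod(combo) for combo in product(p1, repeat=N)]

# Independence makes the system entropy exactly N times the
# single-particle entropy:
print(math.isclose(gibbs_entropy(joint), N * gibbs_entropy(p1)))  # True
```

When interactions correlate the particles, the joint distribution no longer factorises, this equality fails, and one must work with the full ensemble of system states, as the next paragraph notes.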

For anything but the most dilute of real gases, the Boltzmann entropy leads to increasingly wrong predictions of entropies and physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must consider the ensemble of states of the system as a whole, called by Boltzmann a holode, rather than single-particle states.[10] Gibbs considered several such kinds of ensembles; relevant here is the canonical one.[9]


References

  1. ^ See: photo of Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.
  2. ^ Boltzmann equation. Eric Weisstein's World of Physics (states the year was 1872).
  3. ^ Perrot, Pierre (1998). A to Z of Thermodynamics. Oxford University Press. ISBN 0-19-856552-6. (States the year was 1875.)
  4. ^ Max Planck (1914). The Theory of Heat Radiation, equation 164, p. 119.
  5. ^ Cercignani, C. (1998). Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, Oxford UK, ISBN 9780198501541, p. 134, pp. 141–142.
  6. ^ Sharp, K.; Matschinsky, F. Translation of Ludwig Boltzmann's Paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium", Sitzungsberichte der Kaiserlichen Akademie der Wissenschaften. Mathematisch-Naturwissenschaftliche Classe. Abt. II, LXXVI 1877, pp. 373–435 (Wien. Ber. 1877, 76:373–435). Reprinted in Wiss. Abhandlungen, Vol. II, reprint 42, pp. 164–223, Barth, Leipzig, 1909. Entropy 2015, 17, 1971–2009. https://doi.org/10.3390/e17041971. This article incorporates text from this source, which is available under the CC BY 3.0 license.
  7. ^ Ludwig Boltzmann (1866). "Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie". Wiener Berichte. 53: 195–220.
  8. ^ Ludwig Boltzmann (1896). Vorlesungen über Gastheorie, vol. I. J.A. Barth, Leipzig; Ludwig Boltzmann (1898). Vorlesungen über Gastheorie, vol. II. J.A. Barth, Leipzig.
  9. ^ Jaynes, E. T. (1965). "Gibbs vs Boltzmann entropies". American Journal of Physics, 33, 391–398.
  10. ^ Cercignani, C. (1998). Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, Oxford UK, ISBN 9780198501541, p. 134.