### D.60 Checks on the expression for entropy

According to the microscopic definition, the differential of the entropy should be

$$
dS = d\left(-k_{\rm B}\sum_q P_q \ln P_q\right)
$$

where the sum is over all system energy eigenfunctions and $P_q$ is their probability. The differential can be simplified to

$$
dS = -k_{\rm B}\sum_q \left(\ln P_q\,dP_q + dP_q\right)
   = -k_{\rm B}\sum_q \ln P_q\,dP_q
$$

the latter equality since the sum of the probabilities is always one, so $\sum_q dP_q = 0$.
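As a quick numerical sanity check of this simplification, the entropy change produced by a small probability shift that sums to zero should match $-k_{\rm B}\sum_q \ln P_q\,dP_q$ to first order. A minimal sketch, with hypothetical probabilities and $k_{\rm B}$ set to one for illustration:

```python
import math

kB = 1.0  # Boltzmann constant in arbitrary units (assumption for illustration)

def entropy(P):
    """Microscopic entropy S = -kB * sum over q of P_q ln P_q."""
    return -kB * sum(p * math.log(p) for p in P)

# A hypothetical probability distribution over four eigenfunctions
P = [0.4, 0.3, 0.2, 0.1]
# A small perturbation whose components sum to zero, so probabilities stay normalized
dP = [1e-6, -2e-6, 0.5e-6, 0.5e-6]

P2 = [p + d for p, d in zip(P, dP)]
dS_exact = entropy(P2) - entropy(P)
# Predicted differential: dS = -kB * sum of ln(P_q) dP_q
dS_formula = -kB * sum(math.log(p) * d for p, d in zip(P, dP))

print(dS_exact, dS_formula)  # should agree to first order in dP
```

The `dP_q + ...` term that dropped out above is exactly what the zero-sum constraint on `dP` kills.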

This is to be compared with the macroscopic differential for the entropy. Since the macroscopic expression requires thermal equilibrium, $\ln P_q$ in the microscopic expression above can be equated to the canonical value $-E^{\rm S}_q/k_{\rm B}T - \ln Z$, where $E^{\rm S}_q$ is the energy of system eigenfunction $q$. That simplifies the microscopic differential of the entropy to

$$
dS = -k_{\rm B}\sum_q \left(-\frac{E^{\rm S}_q}{k_{\rm B}T} - \ln Z\right) dP_q
   = \frac{1}{T}\sum_q E^{\rm S}_q\,dP_q
\tag{D.38}
$$

the second equality since $\ln Z$ is a constant in the summation and $\sum_q dP_q = 0$.
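The same kind of finite-difference check works for (D.38): perturbing canonical probabilities should give an entropy change of $\frac{1}{T}\sum_q E^{\rm S}_q\,dP_q$ to first order. A sketch with hypothetical energy levels in arbitrary units:

```python
import math

kB, T = 1.0, 2.0                 # arbitrary units (assumptions for illustration)
E = [0.0, 1.0, 1.5, 3.0]         # hypothetical system energy eigenvalues

# Canonical equilibrium probabilities P_q = exp(-E_q/kB T) / Z
Z = sum(math.exp(-e / (kB * T)) for e in E)
P = [math.exp(-e / (kB * T)) / Z for e in E]

def entropy(P):
    return -kB * sum(p * math.log(p) for p in P)

# A small zero-sum perturbation away from equilibrium
dP = [1e-6, -1e-6, 2e-6, -2e-6]
P2 = [p + d for p, d in zip(P, dP)]

dS_exact = entropy(P2) - entropy(P)
dS_D38 = sum(e * d for e, d in zip(E, dP)) / T   # (1/T) sum of E_q dP_q

print(dS_exact, dS_D38)  # should agree to first order in dP
```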

The macroscopic expression for the differential of entropy is given by (11.18),

$$
dS = \frac{\delta Q}{T}
$$

Substituting in the differential first law (11.11),

$$
dS = \frac{dE}{T} + \frac{\delta W}{T}
$$

and plugging into that the definitions of $E$ and $\delta W$,

$$
dS = \frac{1}{T}\,d\left(\sum_q P_q E^{\rm S}_q\right) - \frac{1}{T}\sum_q P_q\,dE^{\rm S}_q
$$

and differentiating out the product in the first term, one part drops out against the second term, and what is left is the differential for $S$ according to the microscopic definition (D.38). So, the macroscopic and microscopic definitions agree on the entropy to within a constant. That means that they agree completely, because the macroscopic definition has no clue about the constant.
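The agreement can also be checked directly: for canonical probabilities, the microscopic entropy $-k_{\rm B}\sum_q P_q\ln P_q$ works out to $E/T + k_{\rm B}\ln Z$ exactly. A sketch with hypothetical energy levels in arbitrary units:

```python
import math

kB, T = 1.0, 2.0                 # arbitrary units (assumptions for illustration)
E_levels = [0.0, 1.0, 1.5, 3.0]  # hypothetical system energy eigenvalues

# Canonical probabilities and partition function
Z = sum(math.exp(-E / (kB * T)) for E in E_levels)
P = [math.exp(-E / (kB * T)) / Z for E in E_levels]

# Microscopic entropy
S_micro = -kB * sum(p * math.log(p) for p in P)

# Macroscopic form: expectation energy over T plus kB ln Z
E_mean = sum(p * E for p, E in zip(P, E_levels))
S_macro = E_mean / T + kB * math.log(Z)

print(S_micro, S_macro)  # identical up to rounding
```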

Now consider the case of a system with zero indeterminacy in energy. According to the fundamental assumption, all the eigenfunctions with the correct energy should have the same probability in thermal equilibrium. From the entropy's point of view, thermal equilibrium should be the stable, most messy state, having the maximum entropy. For the two views to agree, the maximum of the microscopic expression for the entropy should occur when all eigenfunctions of the given energy have the same probability. Restricting attention to only the energy eigenfunctions with the correct energy, the maximum entropy occurs when the derivatives of

$$
F = -k_{\rm B}\sum_q P_q \ln P_q - \epsilon\left(\sum_q P_q - 1\right)
$$

with respect to the $P_q$ are zero. Note that the constraint that the sum of the probabilities must be one has been added as a penalty term with a Lagrangian multiplier $\epsilon$, {D.48}. Taking derivatives produces

$$
-k_{\rm B}\ln P_q - k_{\rm B} - \epsilon = 0
$$

showing that, yes, all the $P_q$ have the same value at the maximum entropy. (Note that the minima in entropy, all $P_q$ zero except one, do not show up in the derivation; $P_q \ln P_q$ is zero when $P_q = 0$, but its derivative does not exist there. In fact, the infinite derivative can be used to verify that no maxima exist with any of the $P_q$ equal to zero, if you are worried about that.)
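That the uniform distribution is indeed the maximum can be spot-checked numerically: random probability distributions over the same eigenfunctions never exceed the uniform entropy $k_{\rm B}\ln N$. A sketch with $k_{\rm B}=1$ and a hypothetical $N=5$ eigenfunctions:

```python
import math
import random

def entropy(P):
    """Entropy with kB = 1; the p > 0 guard handles the p ln p -> 0 limit."""
    return -sum(p * math.log(p) for p in P if p > 0)

n = 5
S_uniform = entropy([1.0 / n] * n)   # equals ln(n)

random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    P = [x / total for x in w]       # a random normalized distribution
    assert entropy(P) <= S_uniform + 1e-12

print(S_uniform, math.log(n))
```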

If the energy is uncertain, and only the expectation energy is known, the penalized function becomes

$$
F = -k_{\rm B}\sum_q P_q \ln P_q - \epsilon_1\left(\sum_q P_q - 1\right) - \epsilon_2\left(\sum_q P_q E^{\rm S}_q - E\right)
$$

and the derivatives become

$$
-k_{\rm B}\ln P_q - k_{\rm B} - \epsilon_1 - \epsilon_2 E^{\rm S}_q = 0
$$

which can be solved to show that

$$
P_q = c_1 e^{-c_2 E^{\rm S}_q}
$$

with $c_1$ and $c_2$ constants. The requirement to conform with the given definition of temperature identifies $c_2$ as $1/k_{\rm B}T$, and the fact that the probabilities must sum to one identifies $c_1$ as $1/Z$.
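A small numerical check of this identification (hypothetical energy levels, arbitrary units): with $c_2 = 1/k_{\rm B}T$ and $c_1 = 1/Z$, the multiplier choices $\epsilon_2 = k_{\rm B}c_2$ and $\epsilon_1 = -k_{\rm B}\ln c_1 - k_{\rm B}$ make every derivative vanish simultaneously, confirming that the canonical probabilities solve the penalized maximization:

```python
import math

kB, T = 1.0, 2.0                  # arbitrary units (assumptions for illustration)
E = [0.0, 1.0, 1.5, 3.0]          # hypothetical system energy eigenvalues

c2 = 1.0 / (kB * T)
Z = sum(math.exp(-c2 * e) for e in E)
c1 = 1.0 / Z
P = [c1 * math.exp(-c2 * e) for e in E]   # canonical probabilities

# Multipliers that make every derivative zero at once
eps2 = kB * c2
eps1 = -kB * math.log(c1) - kB

# Each derivative of F with respect to P_q should vanish
derivs = [-kB * math.log(p) - kB - eps1 - eps2 * e for p, e in zip(P, E)]
print(max(abs(d) for d in derivs))  # essentially zero
```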

For two systems $A$ and $B$ in thermal contact, the probabilities of the combined system energy eigenfunctions are found as the products of the probabilities of those of the individual systems. The maximum of the combined entropy, constrained by the given total energy $E$, is then found by differentiating

$$
F = -k_{\rm B}\sum_{q_A}\sum_{q_B} P_{q_A}P_{q_B}\ln\!\left(P_{q_A}P_{q_B}\right)
  - \epsilon_{1A}\left(\sum_{q_A} P_{q_A} - 1\right)
  - \epsilon_{1B}\left(\sum_{q_B} P_{q_B} - 1\right)
  - \epsilon_2\left(\sum_{q_A}\sum_{q_B} P_{q_A}P_{q_B}\left(E^{\rm S}_{q_A} + E^{\rm S}_{q_B}\right) - E\right)
$$

$F$ can be simplified by taking apart the logarithm and noting that the probabilities $P_{q_A}$ and $P_{q_B}$ each sum to one, to give

$$
F = -k_{\rm B}\sum_{q_A} P_{q_A}\ln P_{q_A} - k_{\rm B}\sum_{q_B} P_{q_B}\ln P_{q_B}
  - \epsilon_{1A}\left(\sum_{q_A} P_{q_A} - 1\right)
  - \epsilon_{1B}\left(\sum_{q_B} P_{q_B} - 1\right)
  - \epsilon_2\left(\sum_{q_A} P_{q_A}E^{\rm S}_{q_A} + \sum_{q_B} P_{q_B}E^{\rm S}_{q_B} - E\right)
$$

Differentiation now produces

$$
-k_{\rm B}\ln P_{q_A} - k_{\rm B} - \epsilon_{1A} - \epsilon_2 E^{\rm S}_{q_A} = 0
\qquad
-k_{\rm B}\ln P_{q_B} - k_{\rm B} - \epsilon_{1B} - \epsilon_2 E^{\rm S}_{q_B} = 0
$$

which produces $P_{q_A} = c_{1A}e^{-c_2 E^{\rm S}_{q_A}}$ and $P_{q_B} = c_{1B}e^{-c_2 E^{\rm S}_{q_B}}$, and the common constant $c_2$ then implies that the two systems have the same temperature.
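The logarithm-splitting step above amounts to the additivity of entropy for product distributions, $S_{AB} = S_A + S_B$, which is easy to verify numerically. A sketch with hypothetical energy levels in arbitrary units and a common temperature:

```python
import math

kB, T = 1.0, 1.5                  # arbitrary units (assumptions for illustration)
EA = [0.0, 0.7, 2.0]              # hypothetical eigenvalues of system A
EB = [0.1, 1.1, 1.8, 2.5]         # hypothetical eigenvalues of system B

def entropy(P):
    return -kB * sum(p * math.log(p) for p in P)

def canonical(E):
    """Canonical probabilities at the common temperature T."""
    Z = sum(math.exp(-e / (kB * T)) for e in E)
    return [math.exp(-e / (kB * T)) / Z for e in E]

PA, PB = canonical(EA), canonical(EB)
# Combined-system probabilities are the products of the individual ones
PAB = [pa * pb for pa in PA for pb in PB]

S_sum = entropy(PA) + entropy(PB)
S_comb = entropy(PAB)
print(S_comb, S_sum)  # equal: entropy is additive for product distributions
```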