Müller I.: A History of Thermodynamics. The Doctrine of Energy and Entropy
Next, consider that you drive a car yourself for some long time randomly through the city and register the fraction of seconds that your speed is 50 km/h. The ergodic hypothesis implies that the two fractions are equal. Mathematicians have tried to prove the ergodic hypothesis and their efforts have led to a branch of set theory, the ergodic theory. That theory, however, offers little to the physicist.

87 I found this simple problem in the book by J.D. Fast: “Entropie. Die Bedeutung des Entropiebegriffs und seine Anwendung in Wissenschaft und Technik.” [Entropy. The meaning of the concept of entropy and its application in science and technology.] Philips’s Technische Bibliothek (1960). Also available in Dutch, English, French, and Spanish.
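The statement that the time fraction equals the ensemble fraction can be illustrated numerically. The following Python sketch is not from the book; it uses an invented three-speed Markov chain as a stand-in for the random drive and compares the fraction of time one long trajectory spends at 50 km/h with the corresponding stationary (ensemble) probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "city traffic" model: three speeds and an invented transition matrix.
speeds = [0, 30, 50]                       # km/h
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])            # P[i, j] = prob. of moving from speed i to speed j

# Time average: fraction of seconds a single long drive spends at 50 km/h.
state, hits, steps = 0, 0, 200_000
for _ in range(steps):
    state = rng.choice(3, p=P[state])
    hits += (speeds[state] == 50)
time_fraction = hits / steps

# Ensemble average: stationary probability of the 50 km/h state,
# obtained from the left eigenvector of P with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

print(time_fraction, pi[2])                # the two fractions agree (ergodicity)
```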
The energy of the hydrogen atom in its nth state is

E_n = \frac{2\pi^2 e^4 \mu}{(4\pi\varepsilon_0 h)^2}\left(1 - \frac{1}{n^2}\right) = 2.171\cdot 10^{-18}\,\mathrm{J}\left(1 - \frac{1}{n^2}\right)

according to Bohr’s model of atomic structure.88 In the jargon developed in statistical mechanics we place the atom in a heat bath of temperature T and form its partition function

P = \sum_{n=1}^{\infty} 2n^2 \exp\left(-\frac{E_n}{kT}\right).

Hence follow the entropy and the free energy of the atom

S = k\,\frac{\partial}{\partial T}\left(T\ln\left(\sum_{n=1}^{\infty} 2n^2 \exp\left(-\frac{E_n}{kT}\right)\right)\right)

and

F = -kT\ln\left(\sum_{n=1}^{\infty} 2n^2 \exp\left(-\frac{E_n}{kT}\right)\right).

For any normal earthly temperature only the first term with n = 1 contributes appreciably to the sum so that we have

S = k ln 2   and   F = –kT ln 2.

The 2 in these equations represents the two possibilities in which the electron may realize the energy zero: spin up and spin down.

88 The energy of the ground state is set equal to zero. e and µ are the electric charge and mass of the electron respectively; h = 6.625·10⁻³⁴ Js is the Planck constant, and ε₀ is the vacuum dielectricity.
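Not from the book, but the claim is easy to check numerically: a minimal Python sketch (the function names, the truncation at n_max and the chosen temperature are my own) that evaluates the truncated partition function at an earthly temperature and recovers S ≈ k ln 2 and F ≈ –kT ln 2.

```python
import numpy as np

k = 1.380649e-23           # Boltzmann constant, J/K
E_SCALE = 2.171e-18        # energy scale from Bohr's model, J (about 13.6 eV)

def bohr_energy(n):
    """Energy of the nth level, with the ground state set to zero."""
    return E_SCALE * (1.0 - 1.0 / n**2)

def partition_function(T, n_max=200):
    """P = sum_{n=1}^{n_max} 2 n^2 exp(-E_n / kT), truncated at n_max."""
    n = np.arange(1, n_max + 1)
    return np.sum(2 * n**2 * np.exp(-bohr_energy(n) / (k * T)))

def free_energy(T):
    return -k * T * np.log(partition_function(T))

def entropy(T, dT=1e-3):
    """S = -dF/dT, here approximated by a central difference."""
    return -(free_energy(T + dT) - free_energy(T - dT)) / (2 * dT)

T = 300.0                                  # an "earthly" temperature, K
print(entropy(T) / k)                      # ~ ln 2 = 0.693...
print(free_energy(T) / (k * T))            # ~ -ln 2
```

Only the n = 1 term survives at 300 K because E₂ is about 10 eV while kT is about 0.026 eV, so the sum is effectively 2.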
Writing this I am reminded of an exchange between two eminent thermodynamicists at a conference, which I attended as a young man. One, a Nobel prize winner – call him P – emphatically opposed the other one – let him be called T – for having applied statistical mechanics to a single atom. The discussion culminated in this dialogue:

P: Your application is not permissible and, if you had read my book carefully, you would know it.
T: I read your book more carefully than you wrote it, and …

The rest of the answer was lost in an outbreak of hilarity in the audience.

Other Extrapolations. Information
The interpretations of entropy as a number of realizations of a distribution and as a measure of order and disorder have led to extrapolations of the concept to fields other than gases. We have already discussed the case of rubber properties and we shall later discuss the power of the entropy of mixing in mixtures.
Both applications belong to mainstream thermodynamics.
However, there are also fairly esoteric extrapolations, popular among physicists affecting sensitivity for the unusual – and there are many of those.

Sometimes such arguments come along as challenges, like when it is pointed out that a great piece of literature – usually Hamlet, or Faust, no less (!) – is obviously highly ordered in comparison to a random distribution of its words, or letters. It should therefore have a small entropy, and so Shakespeare, or Goethe must have defeated the universal tendency of entropy to increase.
In this case the challenge is: How did the poets do that, and where is the inevitable overall increase of entropy to offset the decrease effected by the dramas? No serious answer is available!

And then, there is information theory, invented in 1948 by Claude Elwood Shannon (1916–2001). Shannon89 put a number on a message which somehow represents its informational value, cf. Insert 4.8. The expression for the calculation of the number can look – under certain circumstances – like Boltzmann’s entropy S = k ln W with W = N!/(∏_{x,c} N_{xc}!). And so Shannon called his number the entropy of the message. There is a story about this which is reported by Denbigh90:

When Shannon had invented his quantity and consulted von Neumann on what to call it, von Neumann replied: Call it entropy.
It is already in use under that name and besides, it will give you a great edge in debates because nobody knows what entropy is anyway.

No doubt Shannon and von Neumann thought that this was a funny joke, but it is not – it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear – as clear as he can be – and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others – students and teachers alike – by suggesting that entropy had anything to do with information.

89 C.E. Shannon: “A mathematical theory of communication.” Bell System Technical Journal 27 (1948), pp. 379–423, 623–657.
90 K. Denbigh: “How subjective is entropy?” In: “Maxwell’s Demon: Entropy, Information, Computing.” H.S. Leff, A.F. Rex (eds.), Princeton University Press (1990), pp. 109–115.

Shannon’s information

If a message consists of a single “sign” a which naturally occurs with the probability p(a), Shannon calls its information value – or simply information –

Inf = \log_2 \frac{1}{p(a)}\ \text{bit}.
The smaller the probability of the “sign” the more information we gain by receiving it. 1 bit is the unit of information; it stands for binary indissoluble information unit. The name stems from simple cases when the message a has the probability (1/2)^n so that it can be identified by n successive alternatives, binary decisions, with the probability 1/2 each.

An example occurs when we draw a card from a stack of 32 with {7, 8, 9, 10, knave, queen, king, ace}.
We may then give out messages about our card like this: “black” p = 1/2, or “spades” p = 1/4, or “spades unnumbered” p = 1/8, or “spades with queen or king” p = 1/16, or “queen of spades” p = 1/32. The corresponding informations come out as 1 bit for “black” through 5 bit for “queen of spades”. They are higher for the less probable “sign” and, when the “sign” is least probable, as the queen of spades is, the information is complete, i.e. the card is fully identified. The predilection for the dual logarithm is due to the fact that we want integers as information for this simple case – or Shannon did.

The logarithm itself is chosen so that information is additive, when a message consists of several (independent) signs a1, a2, …, an (say) with probability p(a1)·p(a2)·…·p(an).
In that case we have

Inf = \left(\sum_{i=1}^{n} \log_2 \frac{1}{p(a_i)}\right) \text{bit},

and if the sign a_i occurs N_i times in the message – with \sum_{i=1}^{n} N_i = N – we obviously obtain

Inf = \left(\sum_{i=1}^{n} N_i \log_2 \frac{1}{p(a_i)}\right) \text{bit}.

This is the expression called entropy by Shannon. If the probability p(a_i) of sign a_i is equal to the relative frequency N_i/N of its occurrence – as may perhaps happen in very long messages – we obtain

Inf = \left(-\sum_{i=1}^{n} N_i \log_2 \frac{N_i}{N}\right) \text{bit}

or

Inf = \left(\log_2 \frac{N!}{\prod_{i=1}^{n} N_i!}\right) \text{bit},

where the Stirling formula has been used for the last step. Thus the analogy to Boltzmann’s entropy is complete to within a multiplicative constant.

If we wish, we can now assign an entropy to the message which Shakespeare sent us when he wrote Hamlet: We look up the probability of each letter a_i of the English alphabet, count how often they occur in Hamlet and calculate Inf. People do that and we may suppose that they know why.

Insert 4.8
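A small Python illustration of the formulas in Insert 4.8 (not from the book; the sample message and all names are my own choices): it reproduces the card-drawing informations of 1 bit through 5 bit and checks that the frequency form of Inf and the Boltzmann-like form log₂(N!/∏ N_i!) nearly agree for a short message.

```python
import math
from collections import Counter

def information_bits(p):
    """Inf = log2(1/p) for a single sign of probability p."""
    return math.log2(1.0 / p)

# The card example: 32 cards, ever less probable messages about the drawn card.
for msg, p in [("black", 1/2), ("spades", 1/4), ("spades unnumbered", 1/8),
               ("spades with queen or king", 1/16), ("queen of spades", 1/32)]:
    print(f"{msg}: {information_bits(p):.0f} bit")

def inf_from_frequencies(message):
    """Inf = -sum_i N_i log2(N_i / N), using relative frequencies as probabilities."""
    counts = Counter(message)
    N = len(message)
    return -sum(Ni * math.log2(Ni / N) for Ni in counts.values())

def inf_from_multiplicity(message):
    """Inf = log2( N! / prod_i N_i! ), the Boltzmann-like form."""
    counts = Counter(message)
    N = len(message)
    ln_w = math.lgamma(N + 1) - sum(math.lgamma(Ni + 1) for Ni in counts.values())
    return ln_w / math.log(2)

text = "to be or not to be"
print(inf_from_frequencies(text), inf_from_multiplicity(text))   # close, but not equal
```

The two numbers differ slightly because the Stirling formula is only an approximation for small N_i.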
For level-headed physicists entropy – or order and disorder – is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature and heat and work. Lacking this, such an extrapolation is merely at the level of the following graffito, which is supposed to illustrate the progress of western culture to more and more disorder, i.e. higher entropy:

Hamlet: to be or not to be
Camus: to be is to do
Sartre: to do is to be
Sinatra: do be do be do be do

Ingenious as this joke may be, it provides no more than amusement.

5 Chemical Potentials

It is fairly seldom that we find resources in the form in which we need them, which is as pure substances or, at least, strongly enriched in the desired substance.
The best known example is water: While there is some sweet water available on the earth, salt water is predominant, and that cannot be drunk, nor can it be used in our machines for cooling (say), or washing. Similarly, natural gas and mineral oil must be refined before use, and ore must be smelted down in the smelting furnace. Smelting was, of course, known to the ancients – although it was not always done efficiently – and so was distillation of sea water, which provided both sweet water and pure salt in one step, the former after re-condensation.
Actually, in ancient times there was perhaps less scarcity of sweet water than today, but – just like today – there was a large demand for hard liquor that had to be distilled from wine, or from other fermented fruit or vegetable juices. The ancient distillers had done a good enough job since time immemorial, but still their processes of separation and enrichment were haphazard and not optimal, since the relevant thermodynamic laws were not known.

The same was largely true for chemical reactions, when two constituents combine to form a third one (say), or when the constituents of a compound have to be separated.
Sometimes heating is needed to stimulate the reaction and on other occasions the reaction occurs spontaneously or even explosively. The chemists – or alchemists – of early modern times knew a lot about this, but nothing systematic, because chemical thermodynamics – and chemical kinetics – did not yet exist.

Nowadays it is an idle question which is more important, the thermodynamics of energy conversion or chemical thermodynamics. Both are essential for the survival of an ever-growing humanity, and both mutually support each other, since power stations need fuel and refineries need power.
Certainly, however, chemical thermodynamics – the thermodynamics of mixtures, solutions and alloys – came late, and it emerged in bits and pieces throughout the last quarter of the 19th century, although Gibbs had formulated the comprehensive theory in one great memoir as early as 1876 through 1878.

Josiah Willard Gibbs (1839–1903)

Gibbs led a quiet, secluded life in the United States, which during the 19th century was as far from the beaten track as Russia.1 As a postdoctoral fellow Gibbs had had a six-year period of study in France and Germany, before he became a professor of mathematical physics at Yale University, where he stayed all his life.