
Entropy (energy dispersal)



The thermodynamic concept of entropy can be described qualitatively as a measure of energy dispersal (energy distribution) at a specific temperature. Changes in entropy can be quantitatively related to the distribution or the spreading out of the energy of a thermodynamic system divided by its temperature. Similar terms have been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels.
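In standard thermodynamic notation, this qualitative description corresponds to the Clausius relation; the form below is the conventional textbook definition, not a result specific to the energy-dispersal approach:

% Clausius definition: entropy change is the energy dispersed as heat, reversibly,
% divided by the absolute temperature at which it is transferred
\[
\Delta S = \int \frac{\delta q_{\mathrm{rev}}}{T},
\qquad \text{or, at constant temperature,} \qquad
\Delta S = \frac{q_{\mathrm{rev}}}{T}.
\]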

Such descriptions have tended to be used together with common terms such as disorder and chaos, which are ambiguous and whose everyday meanings are the opposite of the equilibrium state they are equated with in thermodynamics. As well as causing confusion, this can hamper understanding. A more recently developed method for introducing entropy, particularly in relation to chemistry and biology, avoids these ambiguous terms and instead describes entropy in the context of energy dispersal, with a focus on explaining where and how the energy spreads out, the greater dispersal of energy corresponding to the greater increase in entropy.

The concept of energy dispersal in thermodynamics, particularly with reference to entropy, goes back to the roots of the science, with William Thomson, 1st Baron Kelvin, referring to the "dissipation of mechanical energy" in 1852. However, early assumptions about molecules being initially in an ordered state and rapidly proceeding to "the disordered most probable state" led to the subject commonly being introduced in terms of entropy as order and disorder.[1] While this suggested an abstraction closely related to probability, the ambiguity of terms such as disorder and chaos often led to misunderstandings. Students were asked to grasp meanings directly contradicting their normal usage, with equilibrium being equated to "perfect internal disorder" and the mixing of milk in coffee, from apparent chaos to uniformity, being described as a transition from an ordered state into a disordered state.[2] Studies found that few students understood what these terms were intended to convey, and that even when microstates and energy levels were emphasised in a course, for most students these points were overwhelmed by simplistic notions of randomness or disorder. Many of those learning by practising calculations did not grasp the intrinsic meanings of the equations, and there was a need for qualitative explanations of thermodynamic relationships.[3][4]

To meet this need, terms such as "energy dispersal" and "spreading of energy" are used, with references to "disorder" and "chaos" carefully excluded except when explaining misconceptions. The new terms are always accompanied by explanations of where and how the energy is dispersing or spreading, to emphasise the underlying qualitative meaning.[5] Variations on this approach have been adopted by a significant number of chemistry textbooks, mostly in the United States. The British author Peter Atkins, who previously wrote of dispersal leading to a disordered state, revised the 8th edition of his widely used textbook Physical Chemistry to describe entropy in terms of dispersal of energy, discarding 'disorder' as a description.[6][7]


Historical development of the concept

Although it is difficult to pinpoint the exact origin of energy dispersal as a description of entropy change, the concept likely began with the ideas of William Thomson (Lord Kelvin), who in 1852 published his famous article "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy".[8] In this article, Thomson distinguishes between two types or "stores" of mechanical energy: statical and dynamical. He discusses how these two types of energy can change from one form to the other during a thermodynamic transformation, and how, when heat is created by any irreversible process (such as friction) or is diffused by conduction, there is a dissipation of mechanical energy and a full restoration of it to its primitive condition is impossible.[9][10]

In the mid-1950s, with the development of quantum theory, researchers began speaking about entropy changes in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels, such as by the reactants and products of a chemical reaction.[11]

In 1984, the Oxford physical chemist Peter Atkins, in his popular book The Second Law, written for non-scientists, presented a non-mathematical interpretation of what he called the “infinitely incomprehensible entropy”, describing the second law of thermodynamics as "energy tends to disperse", but he then wrote in terms of a collapse into chaos and the corruption of the quality of energy, terms which can be confusing and problematic. His analogies include an intelligent being called “Boltzmann’s Demon”, who relentlessly runs around re-organizing and dispersing energy, to show how Boltzmann’s “W” from his probability equation relates to the dispersal of energy transmitted via atomic vibrations and collisions. He states: “each atom carries kinetic energy, and the spreading of the atoms spreads the energy…the Boltzmann equation therefore captures the aspect of dispersal: the dispersal of the entities that are carrying the energy.”[12] While the book had limited impact, Atkins continued to describe entropy as energy dispersal leading to a disordered state in his influential Physical Chemistry, until discarding 'disorder' as a description in the 8th edition of 2006.

Stanley Sandler, in his 1989 Chemical and Engineering Thermodynamics, described how the quantity TS in a thermodynamic process can be interpreted as the amount of mechanical energy that has been converted into thermal energy through viscous dissipation, dispersion, and other system irreversibilities.[13] In another example, John Wrigglesworth in 1997 described spatial particle distributions as being represented by distributions of energy states. According to the second law of thermodynamics, isolated systems will tend to redistribute the energy of the system into a more probable arrangement, or a maximum-probability energy distribution, i.e. from being concentrated to being spread out. By the first law of thermodynamics the total energy does not change in this process; rather, the energy tends to disperse from a coherent to a more incoherent state.[14]
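The "TS as converted mechanical energy" reading corresponds to the standard lost-work (Gouy–Stodola) relation; the form given below uses generic notation and is not taken from Sandler's text:

% Lost work: mechanical energy degraded to thermal energy by irreversibilities,
% with T_0 the surroundings temperature and S_gen the entropy generated
\[
W_{\mathrm{lost}} = T_{0}\, S_{\mathrm{gen}}.
\]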

Other authors using variations on these terms included Cecie Starr and Andrew Scott.[15][16] In the 1999 book Statistical Thermodynamics, M. C. Gupta defined entropy as a function that measures how energy disperses when a system changes from one state to another.[17]

In 1996 the physicist Harvey S. Leff developed a theoretical foundation for the concept in a paper titled "Thermodynamic entropy: The spreading and sharing of energy".[18] Another physicist, Daniel F. Styer, published an article showing that "entropy as disorder" is inadequate.[19] In an article published in the Journal of Chemical Education in 2002, Professor Emeritus Frank L. Lambert contended that the confusing portrayal of entropy as "disorder" should be abandoned, and went on to develop detailed resources for chemistry instructors implementing the concept of entropy increase as measuring the spontaneous dispersal of energy: how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.[5][20] By early 2006 the majority of new editions of US first-year university texts had adopted this approach: see the list below.

Introductory approach to entropy

The description of entropy in terms of "mixedupness" or "disorder", together with the abstract nature of statistical mechanics, can lead to confusion and considerable difficulty for those beginning the subject.[5][21] An approach to instruction emphasising the qualitative simplicity of entropy has been published and made available online.[22]

In this approach, the second law of thermodynamics is introduced as "Energy spontaneously disperses from being localized to becoming spread out if it is not hindered from doing so", in the context of common experiences such as a rock falling, a hot frying pan cooling down, iron rusting, air leaving a punctured tire and ice melting in a warm room. Entropy is then depicted as a sophisticated kind of "before and after" yardstick: it measures how much energy is spread out, divided by the temperature, as a result of a process such as heating a system, or how widely spread out the energy is after something happens in comparison with its previous state, in a process such as gas expansion or the mixing of fluids (at a constant temperature). The equations are explored with reference to these common experiences, with emphasis that in chemistry the energy that entropy measures as dispersing is internal energy, which beginners can most clearly understand as “motional energy”, the translational, vibrational and rotational energy of molecules.
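A minimal quantitative sketch of these two "yardstick" cases, using standard textbook formulas (assuming a constant heat capacity C and ideal-gas behaviour) rather than anything specific to this teaching approach:

% Heating: energy dispersed into the system per unit temperature
% Isothermal expansion: the same motional energy spread over a larger volume
\[
\Delta S_{\mathrm{heating}} = \int_{T_i}^{T_f} \frac{C\,\mathrm{d}T}{T} = C \ln\frac{T_f}{T_i},
\qquad
\Delta S_{\mathrm{expansion}} = nR \ln\frac{V_f}{V_i}.
\]

The first expression quantifies how much energy is spread out per unit temperature when a system is heated; the second quantifies how much more widely the unchanged motional energy of an ideal gas is spread after an isothermal expansion.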

By giving concrete examples, this approach is effective in explaining entropy to those who have great difficulty grasping mathematical abstractions. The statistical interpretation is related to quantum mechanics in describing the way that energy is distributed (quantized) amongst molecules on specific energy levels, with all the energy of the macrostate always in only one microstate at any one instant. Entropy is described as measuring the energy dispersal of a system by the number of accessible microstates, the number of different arrangements of all its energy at the next instant. Thus, an increase in entropy means a greater number of microstates for the final state than for the initial state, and hence more choices for the arrangement of the system's total energy at any one instant. Here, the greater 'dispersal of the total energy of a system' means the existence of many more such possibilities.[23] Continuous movement and molecular collisions, visualised as being like bouncing balls blown by air as used in a lottery, can then lead on to showing the possibilities of many Boltzmann distributions and a continually changing "distribution of the instant", and so to the idea that when the system changes, dynamic molecules will have a greater number of accessible microstates. In this approach, all everyday spontaneous physical happenings and chemical reactions are depicted as involving some type of energy flow from being localized or concentrated to becoming spread out over a larger space, always to a state with a greater number of microstates.[22]
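The microstate counting described above corresponds to the standard Boltzmann relation; the form below is the conventional one rather than a notation particular to this approach:

% Boltzmann relation: entropy grows with the number of accessible microstates W
\[
S = k_{B} \ln W,
\qquad
\Delta S = k_{B} \ln\frac{W_{\mathrm{final}}}{W_{\mathrm{initial}}},
\]

so a greater number of accessible microstates for the final state directly gives a positive entropy change.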

This then provides a good basis for understanding the conventional approach, except in very complex cases where the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot.[22] Thus in situations such as the entropy of mixing, when the two or more different substances being mixed are at the same temperature and pressure, so that there is no net exchange of heat or work, the entropy increase is due to the literal spreading out of the motional energy of each substance in the larger combined final volume. Each component's energetic molecules become more separated from one another than in the pure state, where they were colliding only with identical adjacent molecules, leading to an increase in the number of accessible microstates.[24]
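For ideal gases or ideal solutions, this mixing case reduces to the familiar textbook expression, given here only as a quantitative companion to the qualitative description (x_i are the mole fractions of the components):

% Entropy of mixing at constant temperature and pressure (ideal behaviour assumed)
\[
\Delta S_{\mathrm{mix}} = -nR \sum_{i} x_{i} \ln x_{i},
\]

which is positive whenever more than one component is present, consistent with each component's motional energy becoming spread over a larger volume.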

In the field of chemistry education, a significant number of new editions of textbooks for school or college and university use have adopted this approach as a replacement for texts involving the analogy of "disorder". The accessibility of websites putting forward the ideas has made them a common resource for chemistry students and for the general public seeking a basic idea of thermodynamic entropy.

Criticisms

From the viewpoint that entropy is a mathematical abstraction with relevance across a wide range of disciplines, the criticism is made that this approach is too closely tied to the thermodynamics of chemistry.[citation needed]

It can be suggested that the term "dispersal" does not apply well to complex situations, and that a precise quantitative definition of the term is required.[citation needed] However, it may be noted that the term is presented as a qualitative description, giving an informed introduction to the standard mathematical methods used to determine the quantitative aspects.

References

  1. ^ Boltzmann, Ludwig, Lectures on Gas Theory (Part II, 1898), translated by Stephen G. Brush, University of California Press, 1964, p. 443.
  2. ^ Microsoft Encarta 2006. © 1993–2005 Microsoft Corporation.
  3. ^ Carson, E. M. and Watson, J. R. (Department of Educational and Professional Studies, King's College, London), "Undergraduate students' understandings of entropy and Gibbs free energy", University Chemistry Education – 2002 Papers, Royal Society of Chemistry.
  4. ^ Sozbilir, Mustafa, A Study of Undergraduates' Understandings of Key Chemical Ideas in Thermodynamics, PhD thesis, Department of Educational Studies, The University of York, 2001.
  5. ^ a b c Frank L. Lambert, "Disorder — A Cracked Crutch for Supporting Entropy Discussions", Journal of Chemical Education, 2002, 79, 187 (February); updated version available online.
  6. ^ Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X. 
  7. ^ Atkins, Peter; Julio De Paula (2006). Physical Chemistry, 8th edition. Oxford University Press. ISBN 0-19-870072-5.
  8. ^ Jensen, William. (2004). "Entropy and Constraint of Motion." Journal of Chemical Education (81) 693, May
  9. ^ Thomson, William (1852). "On a Universal Tendency in Nature to the Dissipation of Mechanical Energy." Proceedings of the Royal Society of Edinburgh, April 19.
  10. ^ Thomson, William (1874). "Kinetic Theory of the Dissipation of Energy", Nature, IX. pp 441-444 (April 9).
  11. ^ Denbigh, Kenneth (1981). The Principles of Chemical Equilibrium, 4th ed. Cambridge University Press. ISBN 0-521-28150-4.
  12. ^ Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X. 
  13. ^ Sandler, Stanley I. (1989). Chemical and Engineering Thermodynamics. John Wiley & Sons. ISBN 0-471-83050-X.
  14. ^ Wrigglesworth, John (1997). Energy and Life (Modules in Life Sciences). CRC. ISBN 0-7484-0433-3.  (see excerpt)
  15. ^ Starr, Cecie; Taggart, R. (1992). Biology – The Unity and Diversity of Life. Wadsworth Publishing Co. ISBN 0-534-16566-4.
  16. ^ Scott, Andrew (2001). 101 Key ideas in Chemistry. Teach Yourself Books. ISBN 0-07-139665-9. 
  17. ^ Gupta, M.C. (1999). Statistical Thermodynamics. New Age Publishers. ISBN 81-224-1066-9.  (see excerpt)
  18. ^ Leff, H. S. Am. J. Phys. 1996, 64, 1261-1271
  19. ^ Styer D. F., Am. J. Phys. 2000, 68, 1090-1096
  20. ^ Frank L. Lambert, A Student’s Approach to the Second Law and Entropy
  21. ^ Frank L. Lambert, The Second Law of Thermodynamics (6)
  22. ^ a b c Frank L. Lambert, Entropy Is Simple, Qualitatively
  23. ^ Frank L. Lambert, The Molecular Basis for Understanding Simple Entropy Change
  24. ^ Notes for a “Conversation About Entropy”: a brief discussion of both thermodynamic and "configurational" ("positional") entropy in chemistry.

Publications using the energy dispersal approach

Chemistry textbooks

  • Atkins, Peter; Julio De Paula (2006). Physical Chemistry, 8th edition. Oxford University Press, ISBN 0-19-870072-5; W. H. Freeman, ISBN 0-7167-8759-8
  • Physical Chemistry for the Life Sciences, First Edition; P. Atkins and J. de Paula; Oxford University Press, ISBN 0-19-928095-9, W. H. Freeman, 699 pages, ISBN 0-7167-8628-1
  • “Chemistry, The Molecular Science”, Second Edition; J. W. Moore, C. L. Stanitski, P. C. Jurs; Thomson Learning, 2005, 1248 pages, ISBN 0-534-42201-2
  • “Chemistry, The Molecular Nature of Matter and Change”, Fourth Edition; M. S. Silberberg; McGraw-Hill, 2006, 1183 pages, ISBN 0-07-255820-2
  • “Conceptual Chemistry”, Second Edition; J. Suchocki; Benjamin Cummings, 2004, 706 pages, ISBN 0-8053-3228-6
  • “Chemistry, Matter and Its Changes”, Fourth Edition; J. E. Brady and F. Senese; John Wiley, 2004, 1256 pages, ISBN 0-471-21517-1

  • “General Chemistry”, Eighth Edition; D. D. Ebbing and S. D. Gammon; Houghton-Mifflin, 2005, 1200 pages, ISBN 0-618-39941-0
  • “Chemistry: A General Chemistry Project of the American Chemical Society”, First Edition; J. Bell, et al.; W. H. Freeman, 2005, 820 pages, ISBN 0-7167-3126-6
  • “Chemistry: The Central Science”, Tenth Edition; T. L. Brown, H. E. LeMay, and B. E. Bursten; Prentice Hall, 2006, 1248 pages, ISBN 0-13-109686-9

  • Petrucci, Harwood and Herring, 9th edition of “General Chemistry”
  • Hill, Petrucci, McCreary and Perry, 4th edition of “General Chemistry”
  • Ebbing, Gammon and Ragsdale, 2nd edition of “Essentials of General Chemistry”
  • Moog, Spencer and Farrell, “Thermodynamics, A Guided Inquiry”
  • Kotz, Treichel and Weaver, 6th edition of “Chemistry and Chemical Reactivity”
  • Olmsted and Williams, 4th edition of “Chemistry”
 
This article is licensed under the GNU Free Documentation License. It uses material from the Wikipedia article "Entropy_(energy_dispersal)". A list of authors is available in Wikipedia.