Entropy Exploration
Introduction
Entropy measures the number of different ways a system can be arranged. It is both a statistical and a thermodynamic concept; it can represent a system’s disorder or how the system approaches thermodynamic equilibrium.
The Basics
A microstate is a description of the state of each molecule within a system, such as a solution. The number of microstates is represented by Ω, which depends on the volume (V), internal energy (U), and number of molecules (N), together with a constant g (itself a collection of constants grouped together). For a monatomic ideal gas in a given macrostate, the number of microstates is Ω(U, V, N) = g V^N U^(3N/2).
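Because g and the V^N factor cancel when comparing two states of the same gas at fixed V and N, the formula above is enough to compute an entropy change ΔS = k_B ln(Ω2/Ω1) = (3N/2) k_B ln(U2/U1). A minimal sketch in Python (the function name and the numerical values are illustrative, not from the original text):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_change(N, U1, U2):
    """Entropy change of a monatomic ideal gas whose internal energy
    goes from U1 to U2 at fixed V and N, using Omega = g * V**N * U**(3*N/2).
    The constant g and the V**N factor cancel in the ratio Omega2/Omega1."""
    return k_B * (3 * N / 2) * math.log(U2 / U1)

# Doubling U for one mole of gas (N ≈ 6.022e23 molecules);
# only the ratio U2/U1 matters, not the absolute energies.
N = 6.022e23
dS = entropy_change(N, 1.0, 2.0)
print(dS)  # ≈ (3N/2) k_B ln 2 ≈ 8.64 J/K
```

Note that the entropy change is positive: adding energy opens up more microstates.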
A macrostate is described by the external constraints that affect a certain system, such as pressure, volume, and temperature; in other words, it is the overall state of all the molecules. If you know the specific microstate, you also know the macrostate, but the opposite is not true.
Multiplicity is the number of microstates corresponding to a specific macrostate.
(See below for a different explanation)
Application of micro/macrostate with coin flipping
One way to think about the number of microstates and macrostates is with a coin-flipping experiment. If a coin is flipped three times, there are eight possible outcomes (HHH, HHT, HTH, HTT, THH, THT, TTH, TTT). These eight outcomes are the microstates; a specific microstate is one particular sequence of flips (e.g., HTH). The macrostate specifies only how many heads or tails there are. There are four possible macrostates in this example: 0, 1, 2, or 3 heads. Furthermore, knowing that a specific microstate is HHT tells us that the macrostate is 2 heads, but the opposite is not true: knowing that there are two heads does not tell you the specific order of the flips.
Multiplicity in this coin example is the number of microstates corresponding to a certain macrostate. If we consider the two-heads macrostate, there are three microstates that correspond to it (HHT, HTH, THH); thus the multiplicity is three. If the macrostate we were analyzing was three heads, then the multiplicity is one, because only HHH corresponds to that macrostate.
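These counts can be checked by brute-force enumeration. A short sketch in Python (the function and variable names are illustrative, not from the original text):

```python
from itertools import product
from collections import Counter

def multiplicities(n_flips):
    """Enumerate every microstate (sequence of H/T) for n_flips coins and
    count how many microstates correspond to each macrostate (number of heads)."""
    microstates = ["".join(p) for p in product("HT", repeat=n_flips)]
    macro = Counter(state.count("H") for state in microstates)
    return len(microstates), macro

total, macro = multiplicities(3)
print(total)     # 8 microstates in all
print(macro[2])  # multiplicity of the "two heads" macrostate: 3
print(macro[3])  # multiplicity of the "three heads" macrostate: 1
```

The multiplicities across all macrostates sum back to the total number of microstates, as they must.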
The equation for entropy, S = k_B ln Ω, shows the relationship between entropy and the number of ways that the atoms or molecules in a thermodynamic system can be arranged. Boltzmann's constant, k_B, provides the link between the microscopic atoms and molecules and the macroscopic bulk matter and thermodynamics. The logarithm turns the multiplicity -- a very large number -- into a manageable number.
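To see how the logarithm tames the multiplicity, consider 100 coin flips instead of three; the most probable macrostate (50 heads) already has a multiplicity of about 10^29. A minimal sketch (the 100-coin example is an illustration, not from the original text):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Multiplicity of the most probable macrostate (50 heads) for 100 coins:
omega = math.comb(100, 50)   # "100 choose 50", roughly 1e29
S = k_B * math.log(omega)    # S = k_B ln(Omega): the log tames the huge number

print(omega)
print(math.log(omega))  # ≈ 66.8 -- a manageable number
print(S)                # a tiny entropy in J/K, since k_B is so small
```

For a mole-sized system Ω is astronomically larger still, but ln Ω stays proportional to N and remains workable.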
The fundamental assumption of statistical mechanics states that, in an isolated system in thermal equilibrium, all accessible microstates are equally probable (Schroeder). While each microstate has equal probability, certain macrostates are more probable than others.
Second Law of Thermodynamics: The entropy of the universe never decreases, because the universe is always spontaneously moving toward thermodynamic equilibrium (where thermodynamic equilibrium means that there is also thermal, radiative, chemical, and mechanical equilibrium, i.e., no temperature, pressure, or chemical-potential differences exist). In a non-isolated system, the entropy can decrease locally. In the case of a refrigerator, the entropy of the refrigerator's interior decreases as heat is removed to create the cold reservoir, but the heat expelled into the room raises the entropy of the surroundings by at least as much.
Keeping the Second Law of Thermodynamics in mind, why are the following statements equivalent?
- Any large system in equilibrium will most likely be found in the macrostate with the greatest multiplicity (aside from fluctuations that are normally too small to measure). (The macrostate with the greatest multiplicity comprises the largest number of microstates, so it is the macrostate most likely to occur.)
- Multiplicity tends to increase. (As the system moves towards thermodynamic equilibrium, the multiplicity will increase as there are more possible microstates to make up the given macrostate of the system)
- The net spontaneous flow of energy is zero when a system is at, or very near, its most probable macrostate (the macrostate with the largest multiplicity). (At the most likely macrostate, the system will be in thermodynamic equilibrium, so this means that the flow of energy will essentially come to a stop because it is also in thermal, radiative, chemical and mechanical equilibrium).
- There is no device that can transfer heat from a colder to a warmer reservoir without net expenditure of work. (Entropy changes can be computed from ΔS = Q/T, where Q is the heat absorbed at temperature T. Removing heat Q from a cold reservoir decreases its entropy by Q/T_cold, while delivering the same heat to a hot reservoir increases its entropy by only Q/T_hot, which is smaller. Without work input, the total entropy of the universe would therefore decrease, violating the Second Law; work must be expended to make up the difference.)
- No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work. (When there are two reservoirs, one hot (T1) and one cold (T2), the maximum work done is the product of the Carnot efficiency (1 − T2/T1) and the heat absorbed at the hot reservoir (Q1). The Carnot efficiency applies to the Carnot cycle, the most efficient heat-engine cycle, which consists of two isothermal processes and two adiabatic processes. It expresses the fact that not all of the heat supplied can be used to do work, and it sets the limiting value for the fraction of heat that can be converted.)
- Entropy tends to increase. (As the system heads toward thermodynamic equilibrium, it will tend toward the macrostate with the highest multiplicity, which also would correspond to the highest entropy compared to the rest of the macrostates)
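The last two heat-engine statements can be made concrete with the bookkeeping ΔS = Q/T. A hedged sketch (the reservoir temperatures and the 300 J of heat are invented for illustration): moving heat from cold to hot with no work input would lower the entropy of the universe, and the Carnot efficiency 1 − T_cold/T_hot bounds how much absorbed heat can become work.

```python
def delta_S(Q, T):
    """Entropy change of a reservoir that absorbs heat Q (J, negative if
    heat is removed) at constant temperature T (K): dS = Q / T."""
    return Q / T

def carnot_efficiency(T_hot, T_cold):
    """Upper bound on the fraction of absorbed heat convertible to work."""
    return 1 - T_cold / T_hot

# Try to move Q = 300 J from a cold reservoir (250 K) to a hot one (300 K)
# with no work input: the same Q leaves one reservoir and enters the other.
Q, T_cold, T_hot = 300.0, 250.0, 300.0
total = delta_S(-Q, T_cold) + delta_S(Q, T_hot)
print(total)  # -0.2 J/K: the universe's entropy would decrease -- impossible

# Maximum fraction of heat absorbed at 600 K convertible to work
# when rejecting to a 300 K reservoir:
print(carnot_efficiency(600.0, 300.0))  # 0.5
```

The negative total is exactly why a refrigerator needs a compressor: the work input dumps extra heat into the hot side until the net entropy change is non-negative.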
Common Pitfalls: Entropy vs. Disorder
The concept of entropy first originated when researchers were trying to relate the work, energy, heat, and temperature of bodies. In the classical sense, entropy is simply the relationship between the temperature and heat content of a body (ΔS = Q/T) and has nothing to do with disorder. However, the idea of entropy and disorder can be related when thinking on a microscopic level. On the molecular level, entropy is a measure of the different modes of movement that can hold kinetic energy. It was Boltzmann who claimed that a system is more disorderly if there are more ways that the system could be arranged internally.
Concept Questions
1. How many microstates are there for flipping 4 coins? (Hint: look for a general pattern to find the number of microstates)
2. When would it be appropriate to associate entropy and disorder, and when would it not?
3. Explain multiplicity in terms of microstates and macrostates.
Sources
Fitzpatrick, Richard. "Macrostates and Microstates." Farside, 2 Feb 2006. Accessed 21 Oct 2013. <http://farside.ph.utexas.edu/teaching/sm1/lectures/node33.html>
Lambert, Frank. "What Is a Microstate?" Entropysite, Nov 2006. Accessed 21 Oct 2013. <http://entropysite.oxy.edu/microstate/>
Donaldson, Steve. "Entropy Is Not Disorder." Science 2.0, 4 Jan 2011. Accessed 21 Oct 2013. <http://www.science20.com/train_thought/blog/entropy_not_disorder-75081>
Schroeder, Daniel. Thermal Physics. York: Maple-Vail Book Manufacturing Group, 2000. Print.
Question Answers
1. There are 2^4 = 16 microstates; in general, flipping n coins gives 2^n microstates.
2. The correlation of entropy and disorder is only acceptable when speaking on microscopic levels. In a macroscopic, classical sense, entropy is defined as the relationship between heat and temperature, and has nothing to do with the order of a body.
3. The multiplicity of a macrostate is its number of microstates.