Entropy 2014

Page history last edited by Windy Zhu 9 years, 4 months ago

Introduction: disorder in real life

 

     The concepts of disorder and entropy are central themes when attempting to understand the thermodynamic processes that happen at a large scale. Disorder is easy to relate to on a macroscopic level, as it is just a description of the level of organization and randomness in an area. A room with clothes everywhere on the floor has more disorder than a room where the clothes have been folded and put into place. Entropy is not a description of “random” versus “organized” but rather a description of the number of possible states that can be occupied - the more possible configurations there are, the higher the entropy will be. When applied to the small scale, entropy explains why heat seems to flow from hot to cold areas, and why endothermic reactions can occur spontaneously. An understanding of entropy, then, is key to understanding thermodynamic chemistry.1


 

Microstates and Macrostates

 

     Ultimately, the point of thermodynamics is to explain the everyday world in terms of the molecular world. Given a physical system, we can measure different characteristics of that system (e.g. temperature), and try to explain properties like temperature in terms of the system’s molecules. The state of the system - a description of its temperature, pressure, volume, etc. - is the macrostate, and the state of the component molecules of that system is the microstate. For example, we can explain macrostates like temperature and pressure in terms of microstates like the kinetic energies of the system’s molecules. Interestingly, the macrostate can be determined from the microstate, but not vice versa, since the macrostate is only a statistical description of the system’s molecules.2

 

     Let’s look at temperature in order to get a better idea of what microstates are. We have a given amount of energy for the system that is distributed among the molecules of that system. Each molecule can have a specific kinetic energy, and the microstate describes the kinetic energy of each molecule in the system. The total energy of that system can be spread differently among the molecules, giving different kinetic energies for the molecules. Thus, many microstates are possible for a system.2
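To make this concrete, here is a small sketch (in Python, with a hypothetical molecule count and energy quantization chosen purely for illustration) that enumerates every microstate of a toy system: three molecules sharing three indivisible quanta of kinetic energy.

```python
from itertools import product

# A toy system: 3 molecules sharing 3 indivisible quanta of kinetic energy.
# A microstate assigns a specific energy to each specific molecule;
# many microstates share the same total energy.
N_MOLECULES = 3
TOTAL_QUANTA = 3

microstates = [
    state
    for state in product(range(TOTAL_QUANTA + 1), repeat=N_MOLECULES)
    if sum(state) == TOTAL_QUANTA
]

print(len(microstates))   # number of ways to spread 3 quanta over 3 molecules
print(microstates[:4])    # e.g. (0, 0, 3) means molecule 3 holds all the energy
```

Even for this tiny system there are 10 distinct microstates, all with the same total energy.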


 

Statistical Mechanics

 

     The Fundamental Assumption of Statistical Mechanics

How do we go from the kinetic energies of molecules to explaining why heat moves from hot objects to cold ones? Statistical mechanics bridges the gap between the molecular and the visible. The fundamental assumption of statistical mechanics is that, in the long term, a system has an equal chance of being in any microstate. You can understand it as follows: get a deck of cards and shuffle it. If you shuffle it well, you have an equal chance of getting any order of cards in the deck. The point is that with ~10²³ molecules, shuffling the speeds of the molecules through random molecular collisions will give a random, unpredictable distribution of molecular kinetic energies, given that the masses of the molecules are identical.2
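The shuffle analogy can be sketched directly. In the snippet below the energy values are arbitrary stand-ins, and random.shuffle plays the role of the random molecular collisions:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# A fixed total energy, randomly reassigned among molecules by "collisions".
# Shuffling only permutes who holds which energy: the total is conserved,
# and for a fair shuffle every ordering (microstate) is equally likely.
energies = [5, 3, 1, 1, 0]          # arbitrary example kinetic energies
total_before = sum(energies)

random.shuffle(energies)            # stand-in for random molecular collisions
assert sum(energies) == total_before
print(energies)
```

Which molecule ends up with which energy is unpredictable, but the conserved total and the set of energies are not.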

 

     This fundamental assumption only applies at the thermodynamic limit, that is, when there are a huge number of molecules in the system and they can interact in an infinite number of ways. 

 

     Though a system has an equal chance of being in any one microstate, not all outcomes are equally likely. Think about the following example: suppose you are handed two coins, each equally likely to be a penny or a quarter. You have a 50% chance of getting one penny and one quarter, but only a 25% chance each of getting two pennies or two quarters. Even though the one-penny-one-quarter outcome can arise with either coin being the penny, we only care about how much money we got in the end.


     Notice that we can discuss which specific coin is a penny and which is a quarter. The “microstate” specifies exactly that: which coin is which. The distribution* of quarters and pennies you received doesn’t distinguish between the microstate “quarter then penny” and the microstate “penny then quarter” - the distribution is just a count of how many quarters and pennies you have, without regard to which specific coin was which. Finally, the “macrostate” is the total amount of money you received.
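A short enumeration makes the coin bookkeeping explicit. The values below are the ones from the example, and each coin is assumed equally likely to be either type:

```python
from itertools import product
from collections import Counter

# Each of two coins is, independently, a penny (1 cent) or a quarter (25 cents).
# A microstate says which coin is which; the macrostate is the total value.
VALUES = {"penny": 1, "quarter": 25}

microstates = list(product(VALUES, repeat=2))   # 4 equally likely microstates
macrostates = Counter(VALUES[a] + VALUES[b] for a, b in microstates)

for total, count in sorted(macrostates.items()):
    print(f"{total} cents: probability {count}/{len(microstates)}")
```

The 26-cent macrostate is twice as likely as either of the others precisely because two microstates map onto it.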


Here is a visual of the coin experiment, except that heads and tails are used to represent the two types of coins in the experiment.

Figure taken from http://www.learner.org/courses/learningmath/data/session8/part_c/probability.html

 

 

Statistical Mechanics, Temperature, and Heat

     The analogy applies directly to temperature. We don’t care which molecule has which kinetic energy, but we do care about how many molecules have a specific kinetic energy (i.e. the distribution of molecular kinetic energies). The distribution* is represented by every microstate that differs only in which molecule holds which of the kinetic energies given in the distribution. Thus, if a specific distribution of molecular kinetic energies allows a huge number of such combinations, then it is represented by a huge number of microstates, since each microstate specifies both the distribution of kinetic energies and which molecule has each one. As a result, by the fundamental assumption, you are most likely to see the system with the distribution of kinetic energy that is represented by the most microstates.2

 

     So we can now explain how heat moves from hot to cold objects. Each molecule can only take on discrete kinetic energies (due to quantum mechanics). So if we put a hot object next to a cold object, we will predict the final state of both the hot object and cold object to be the state that is represented by the greatest number of microstates. By the fundamental assumption, you are taking the kinetic energies of the molecules in the hot and cold objects, and redistributing those kinetic energies randomly.2

 

     Say that only two kinetic energies are allowed: high and low. The hot object has molecules which all have high kinetic energy, and the cold object has molecules with low kinetic energy. In addition, the hot and cold objects have three molecules each. We can calculate the probability for each possible distribution of low and high kinetic energy molecules between the objects. If you do the calculation (see problem below), the distribution with the low and high kinetic energies split almost evenly between the objects is the most likely. Thus, the objects will end up with roughly the same average kinetic energy, or equivalently, temperature.
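The calculation hinted at here can be sketched by counting the ways the three HIGH energies can land in the hot object once all six molecules share them (a hypergeometric count):

```python
from math import comb

# Hot object: 3 molecules, all HIGH energy; cold object: 3 molecules, all LOW.
# Randomly redistribute the 3 HIGH energies among all 6 molecules.
N_HOT, N_COLD, N_HIGH = 3, 3, 3
total_ways = comb(N_HOT + N_COLD, N_HIGH)        # C(6,3) = 20 microstates

for k in range(N_HIGH + 1):                      # k HIGH energies stay in the hot object
    ways = comb(N_HOT, k) * comb(N_COLD, N_HIGH - k)
    print(f"hot object keeps {k} HIGH: {ways}/{total_ways}")
```

The near-even splits (1 or 2 HIGH energies in the hot object) account for 18 of the 20 microstates, while the original all-HIGH-in-the-hot-object arrangement is just 1 of 20.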

 

     This situation can be scaled up to 10²³ molecules. There are so many microstates that split the higher and lower kinetic energies nearly evenly that you will, with virtually complete certainty, see the temperature decrease in the hot object and increase in the cold object as the kinetic energies of the molecules are randomly redistributed between the two objects’ molecules.2
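This scaling effect can be illustrated numerically. The sketch below (system sizes and the ±10% “near-even” window are chosen purely for illustration) computes the probability that one half of the system ends up with a near-even share of the HIGH energies:

```python
from math import comb

def prob_near_even(n_high, n_sites, window=0.1):
    """Probability that one half of the sites holds within ±window·n_high of
    half the HIGH energies, when n_high HIGH energies are placed at random
    among n_sites molecules (a hypergeometric count)."""
    half = n_sites // 2
    total = comb(n_sites, n_high)
    lo = int(n_high * (0.5 - window))
    hi = int(n_high * (0.5 + window))
    ways = sum(comb(half, k) * comb(n_sites - half, n_high - k)
               for k in range(lo, hi + 1))
    return ways / total

probs = {n: prob_near_even(n_high=n // 2, n_sites=n) for n in (10, 100, 1000)}
for n, p in probs.items():
    print(f"{n:5d} molecules: P(near-even split) = {p:.4f}")
```

Already at a thousand molecules a near-even split is almost certain; at 10²³ the probability of anything else is negligible beyond measure.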

 

*Distributions in this discussion are thought of as histograms. For example, when we discuss distributions of kinetic energies, think of a histogram with each bin representing a specific kinetic energy that a subset of the system’s molecules have.


 

S = k ln(Ω): Basic equation for entropy

 

     With base e assumed, this equation is called the Boltzmann entropy formula (it was first written in this form by Planck). The formula is an application of statistical mechanics, where S is the entropy of the system, k is the Boltzmann constant, and Ω (omega) is the number of microstates in which the particles of the system can be found. The equation calculates the entropy of a system from its number of possible microstates: simply put, having more possible microstates increases entropy on a logarithmic scale.3

 

     That entropy depends on the logarithm of Ω is, in a sense, just a matter of convenience. That is to say, this form was not dictated by experiment but chosen because the logarithm turns multiplication into addition, which makes entropy additive (a detailed example is given below).

 

     Suppose we have two systems (containers of gas, say) with entropies and microstate counts S1, Ω1 and S2, Ω2. If we now regard them as a single system (without actually mixing the two gases), then the entropy of the new system should be S = S1 + S2, but the number of microstates will be the product Ω1·Ω2, because for each state of system 1, system 2 can be in any of its Ω2 states. Using the rule that the logarithm of a product equals the sum of the logarithms of the factors (i.e. ln(Ω1) + ln(Ω2) = ln(Ω1·Ω2)), entropy is then additive.3
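This additivity argument can be checked numerically; the microstate counts below are arbitrary illustrative values:

```python
from math import log

K_B = 1.380649e-23   # Boltzmann constant, J/K

def entropy(omega):
    """Boltzmann entropy S = k ln(omega)."""
    return K_B * log(omega)

omega1, omega2 = 1e5, 2e7            # hypothetical microstate counts
combined = entropy(omega1 * omega2)  # microstate counts multiply...
separate = entropy(omega1) + entropy(omega2)

print(combined, separate)
assert abs(combined - separate) < 1e-30   # ...but the entropies add
```

The assertion passes because ln(Ω1·Ω2) = ln(Ω1) + ln(Ω2), up to floating-point rounding.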


 

Second law of Thermodynamics

 

     The second law of thermodynamics states that the entropy of the universe never decreases. This fundamental law explains why heat always flows from the hotter body to the colder one whenever a temperature difference exists. That is to say, if you leave a cup of hot tea in a room-temperature environment, the temperature of the tea will always drop. Heat never flows backwards - if it did, the number of microstates of the combined system would decrease, so the combined hot-cold system would decrease in entropy, disobeying the second law of thermodynamics.4



[Figure: a container of four molecules, each of which can occupy the top or bottom half]

     Based on our previous discussion of statistical mechanics, this law follows directly from the probability of observing a system in a macrostate represented by a certain number of indistinguishable microstates (i.e. two microstates may share the same distribution of kinetic energies among the molecules while assigning those energies to different individual molecules). For example, if a system has 4 molecules, as in the figure, it has 2⁴ = 16 microstates. Exactly one molecule will be in the top part 4/16 of the time, and two molecules 6/16 of the time, but all four molecules will be in the top part only 1/16 of the time. As a result, the system probabilistically favours the macrostates represented by the most microstates - those with higher disorder. Thus, it is virtually impossible to observe a decrease in entropy for systems with large numbers of molecules.4
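The 16-microstate count can be reproduced by brute force:

```python
from itertools import product
from collections import Counter

# Each of 4 molecules is in the TOP ("T") or BOTTOM ("B") half of the container.
microstates = ["".join(s) for s in product("TB", repeat=4)]   # 2**4 = 16
top_counts = Counter(state.count("T") for state in microstates)

for n_top in sorted(top_counts):
    print(f"{n_top} molecules in top half: {top_counts[n_top]}/16 of microstates")
```

The counts 1, 4, 6, 4, 1 show why an even split (the high-disorder macrostate) dominates, and why the all-in-the-top arrangement is rarest.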

 

     To sum up, there are two basic principles for macro- and microstates: all microstates have the same probability of occurring, and the likelihood of a system being in a given macrostate is determined by the number of degenerate microstates it comprises (essentially, by probability).4

 

 

Third law of Thermodynamics

 

     The third law of thermodynamics states that the entropy of a perfect crystal of a pure substance approaches zero as the absolute temperature approaches zero. Temperature is a measure of the average kinetic energy of the component particles, and absolute zero is the floor for temperature, at which the kinetic energy of each individual particle is zero. At absolute zero, every molecule of a perfect crystal is stationary and perfectly aligned, so there can be no random redistribution of kinetic energy or position between molecules: only one microstate is possible, and so the entropy is zero. Thus, as the energy of this crystal approaches zero, the unique vibrations of each atom or molecule are reduced to zero.1

 

     At absolute zero, though, even with zero molecular motion, entropy can still exist for an imperfect crystal if multiple orientations are possible for the molecules once they come to rest. With more than one possible orientation, more than one microstate remains available, and so the system retains some inherent disorder.
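As an illustration (assuming the classic textbook case of a molecule with two equally likely frozen-in orientations, as in a crystal of carbon monoxide), the residual molar entropy works out to R ln 2:

```python
from math import log

# Residual entropy of an imperfect crystal: if each of N molecules can freeze
# into W equally likely orientations, then Omega = W**N and
# S = k ln(W**N) = N k ln(W).  Assumed example: W = 2, one mole of molecules.
K_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro's number, molecules per mole

molar_residual_entropy = N_A * K_B * log(2)
print(f"{molar_residual_entropy:.2f} J/(mol K)")   # ~5.76 J/(mol K)
```

Even at absolute zero, this mole of imperfect crystal keeps roughly 5.76 J/(mol·K) of entropy purely from orientational disorder.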


 

Application: Temperature

How temperature is derived from kinetic energy of molecules

 

     As was discussed above, the state of a system can be described as a macrostate that is determined by the microstates of the component atoms/molecules. One component of a system’s microstate is the kinetic energy of each molecule, a property that varies from molecule to molecule within the system. Temperature is a measure of the average kinetic energy of these atoms and molecules.1

 

     As temperature increases, the average speed (and thus kinetic energy) of the molecules increases, but the distribution broadens as well: more kinetic energies become accessible, and so there is a larger spread. This relationship is shown in the chart below:

A chart showing the relationship between the temperature of a system and the distribution of the speeds of individual molecules in the system - as temperature increases, average speed increases, but the distribution also broadens
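As a numerical illustration of the shift with temperature, the most probable speed in a Maxwell-Boltzmann gas is v_p = sqrt(2kT/m); the molecular mass below is an assumed example value (roughly that of an N2 molecule):

```python
from math import sqrt

K_B = 1.380649e-23        # Boltzmann constant, J/K
M_N2 = 4.65e-26           # kg, approximate mass of one N2 molecule (assumed example)

def most_probable_speed(temp_k, mass_kg):
    """Most probable molecular speed in a Maxwell-Boltzmann gas, v_p = sqrt(2 k T / m)."""
    return sqrt(2 * K_B * temp_k / mass_kg)

speeds = {t: most_probable_speed(t, M_N2) for t in (100, 300, 1000)}
for t, v in speeds.items():
    print(f"T = {t:4d} K: most probable speed ≈ {v:4.0f} m/s")
```

The peak of the speed distribution moves to higher speeds in proportion to the square root of temperature, consistent with the chart.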


 

Difficult Concept: Entropy vs. Disorder

 

     Remember that entropy tells us how many microstates are possible for a system, but this characterization of entropy only applies to thermodynamic systems like gas molecules. Disorder, on the other hand, is a more general term for how random a system is. Disorder does often increase with entropy. For example, when coffee cools, the distribution of the molecular speeds for molecules in the coffee and in the environment becomes more random. However, we can also talk about disorder in the context of a clean vs. messy room, but entropy and the mathematics behind it are applicable only to molecular systems.5


 

Question 1: What is the microstate when we are dealing with temperature?

  1. The distribution of temperatures possible for a system.

  2. The distribution of kinetic energies allowed for a specific molecule in the system.

  3. The distribution (i.e. “histogram”) of kinetic energies for the molecules in the system. This would not include information on which molecule has which kinetic energy.

  4. The description of which molecule has which kinetic energy.

 

Question 2: There are two systems with two molecules each. A molecule either has high or low kinetic energy. Calculate the probabilities for each distribution of kinetic energy, and how many microstates are represented by that distribution.

 

Question 3: If you combine three systems together, with each system having n possible microstates, then the combined system will have 3n microstates.

  1. True. The number of microstates is an additive property of systems.

  2. True. Entropy is an additive property of thermodynamic systems.

  3. False. The number of microstates possible is the product of the individual microstates.

  4. False. Entropy is a multiplicative property and so there are n³ possible microstates.

 

Question 4: Considering the relationship between microstates and macrostates in the Second Law of Thermodynamics, which of the following is true:

  1. Some microstates in a system have a higher probability of existence.

  2. Some macrostates in a system have a higher probability of existence.

  3. Having more degenerate microstates will result in a higher probability of some macrostates.

  4. Some macrostates of the same energy result from a microstate being counted more than once.

  5. More than one of the above are true.

 

Question 5:

True or false: Temperature is a measure of average kinetic energy, and so at absolute zero for a perfect crystal of a pure substance entropy > 0.

  1. True - average kinetic energy means some particles could be above and below absolute zero, meaning there are some unique vibrations and so there is entropy

  2. True - entropy is the randomness of a system and so even if temperature is zero there is randomness in position

  3. False - because there is no negative kinetic energy at absolute zero there would be no kinetic energy for any particle - thus entropy is zero.

  4. False - one crystal cannot have entropy because entropy is a measure of the randomness between two bodies

 

Question 6: True or False: For a system at any given temperature, every molecule has the same kinetic energy.

  1. False - temperature is a measure of average kinetic energy, so most molecules will have kinetic energy near the mean but there will be some random distribution, causing a graph of the data to take on the rough shape of a bell curve

  2. False - temperature is a measure of average kinetic energy, which means the molecules will be equally distributed across a range of kinetic energies

  3. True - temperature is a measure of the kinetic energy of the molecules within a system and so gives information about the kinetic energy of each individual molecule

  4. True - molecules within a system cannot have different kinetic energies at any temperature

 

Question 7: Difficult Concept:

Entropy is applicable to mixing a deck of cards.

  1. True. Entropy is a measure of disorder in the system.

  2. True. Calculating the different combinations of cards possible gives us the entropy.

  3. False. Entropy is applicable to thermodynamic systems and systems with thermal energy.

  4. False. Mixing a deck of cards is a thermodynamic system, but disorder is not an applicable concept.


 

Answers:

Answer 1: 4

Hint with Partial Answer 2:

For a review on permutations and combinations (which you will need in this problem), see this link: http://www.mathsisfun.com/combinatorics/combinations-permutations.html .

Essentially, you can use Pascal’s Triangle to see that distributing the kinetic energies between the molecules evenly gives the greatest number of microstates.
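A quick way to generate the relevant row of Pascal’s triangle (here for four molecules in total, each either HIGH or LOW) is with math.comb:

```python
from math import comb

# Row n of Pascal's triangle counts the microstates for each possible number
# of HIGH-energy molecules among n molecules (here n = 4, two systems of two).
N = 4
total = 2 ** N
for k in range(N + 1):
    print(f"{k} HIGH, {N - k} LOW: {comb(N, k)}/{total} of microstates")
```

The middle entry, C(4, 2) = 6, is the largest: the even split of kinetic energies is represented by the most microstates, just as the hint states.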

Answer 3: 3

Answer 4: 5

Answer 5: 3

Answer 6: 1

Answer 7: 3


Citation:

1Oxtoby, D. W.; Gillis, H. P.; Campion, A. Principles of Modern Chemistry, Seventh Edition; Cengage Learning: Belmont, 2012

2Chabay, R. W.; Sherwood, B. A. Matter and Interactions; Wiley: Hoboken, 2007.

3http://chemwiki.ucdavis.edu/Wikitexts/Simon_Fraser_Chem1%3A_Lower/Thermodynamics_of_Chemical_Equilibrium/Entropy#Entropy_and_.22disorder.22

4http://chemwiki.ucdavis.edu/Physical_Chemistry/Thermodynamics/Laws_of_Thermodynamics/Second_Law_of_Thermodynamics

5UC Davis ChemWiki. Entropy. http://chemwiki.ucdavis.edu/Wikitexts/Simon_Fraser_Chem1%3A_Lower/Thermodynamics_of_Chemical_Equilibrium/Entropy#Entropy_and_.22disorder.22 (accessed October 28, 2014).

 

 

 

 
