Entropy


Introduction

 

Entropy is a state function (a function that describes the state of the system and is consequently path-independent) that describes the directionality of spontaneous processes, or indicates in which direction a particular process will proceed without external stimulus. A peculiarity of this entropy function is that both the system of interest and its surroundings, together described as the thermodynamic universe, must be taken into account. Entropy is useful because it allows us to predict the spontaneity of a given process. The entropy change is positive in the direction of a spontaneous process.

 

A Deeper Look: The Postal Service

 

You are sending your friend in Cambodia a package from here in Ann Arbor. Unfortunately, you are using the US Postal Service, meaning your package will probably be routed around the country before being shipped over the Pacific Ocean. You know that the major processing centers for USPS are located in Chicago, Minneapolis, St. Louis, and Salt Lake City. You also know that the only way for international packages to leave the country is through three seaports: Seattle, San Francisco, and Los Angeles (see figure below).

 

Your package will receive a stamp from the seaport it passes through, so there are only three possible stamps your package can have. Because your package will pass through at least one processing center and exactly one seaport, there are many different ways your package can leave the country, but only three possible stamps it can have. Fortunately, due to budget cuts, the Postal Service runs only six routes for international packages leaving Ann Arbor, which are shown below:

 

 

The six routes can be summed up as follows:

 

Route 1: Ann Arbor → Minneapolis → Seattle

Route 2: Ann Arbor → Minneapolis → Salt Lake City → San Francisco

Route 3: Ann Arbor → Chicago → Salt Lake City → San Francisco

Route 4: Ann Arbor → St. Louis → Salt Lake City → San Francisco

Route 5: Ann Arbor → Chicago → Los Angeles

Route 6: Ann Arbor → St. Louis → Los Angeles

 

Considering that your package will leave Ann Arbor and travel down a random route, what is the probability of it passing through (and receiving a stamp from) each individual seaport? This is a relatively easy calculation. For instance, three of the six routes go to San Francisco, so the probability of the package passing through the San Francisco seaport is 3/6 = 50%. The probabilities of it passing through Seattle and Los Angeles are 1/6 ≈ 17% and 2/6 ≈ 33%, respectively.
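This calculation translates directly into a few lines of code. Here is a minimal sketch in Python; the route list simply restates the routes above, and it assumes every route is equally likely to be used:

from collections import Counter

# Each route listed earlier, as an ordered list of stops.
routes = {
    "Route 1": ["Ann Arbor", "Minneapolis", "Seattle"],
    "Route 2": ["Ann Arbor", "Minneapolis", "Salt Lake City", "San Francisco"],
    "Route 3": ["Ann Arbor", "Chicago", "Salt Lake City", "San Francisco"],
    "Route 4": ["Ann Arbor", "St. Louis", "Salt Lake City", "San Francisco"],
    "Route 5": ["Ann Arbor", "Chicago", "Los Angeles"],
    "Route 6": ["Ann Arbor", "St. Louis", "Los Angeles"],
}

# The stamping seaport is the last stop on each route.
stamp_counts = Counter(stops[-1] for stops in routes.values())

# Assuming all six routes are equally likely, each stamp's probability is its
# multiplicity (the number of routes ending there) divided by the total number of routes.
total_routes = len(routes)
for seaport, count in stamp_counts.items():
    print(f"{seaport}: {count}/{total_routes} = {count / total_routes:.0%}")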

 

Because the processing centers do not stamp the shipments passing through them, your package will arrive at your friend's door with only one stamp, from whichever seaport shipped it. We can define the stamp that your package received as its 'macrostate': the overall condition of the package once it arrives. As you can see above, there are different ways that the same macrostate can be achieved. For instance, the 'LA macrostate' can be achieved by route 5 or route 6; we can call these 'microstates', the different ways in which a macrostate can be achieved.

 

Let’s take our analysis one step further and define the number of ways a package can arrive at one seaport (i.e. how many microstates it has) as its multiplicity. This usage matches the mathematical definition of multiplicity as the number of times a value appears in a set. For instance, if we write the number 16 as 2x2x4, the multiplicity of 2 in this factorization is 2, in the same way that the multiplicity of ‘San Francisco’ for our package is 3 (route 2, route 3, route 4).

 

Now, what happens if you send your friend 2 packages? How about 5? Or even 6.022 x 10^23? Not only would you be a very generous person by sending your friend a mole of packages, but it would surely be difficult for your friend to keep track of the stamps on all of them! Let’s say that you do, in fact, send your friend a mole of packages, and he, in turn, receives a giant mountain of boxes in front of his pleasant Cambodian residence. After looking over the mountain briefly, your friend notices almost exclusively San Francisco stamps (or macrostates). Clearly, he sees mostly San Francisco stamps because the packages had the highest probability of passing through the SF seaport.

 

You know that your friend is a very picky person. In fact, he is so picky that he is worried about where his packages have been in the US. For instance, he only trusts the SF seaport to handle his packages properly, and only wants the St. Louis and Salt Lake City processing centers touching them. Painstakingly, your friend pulls out only the SF-macrostate packages, but he notices a problem: there is no way to tell which processing centers a package passed through! Calling the postal service would be of no use (he will surely be on hold for hours), so he decides that he can only find reassurance by calculating the probability that a package passed through his desired route. Because his desired route to SF (route 4) is one of three microstates for SF, the probability that a given SF package passed through his route is 1/3. “This is ridiculous,” proclaims your friend, “the post office should only ship packages one way to Cambodia.” You try to explain to your friend that Congress has already tried fixing the postal service by increasing its budget, but the postal service keeps squandering the money it receives and is on the verge of bankruptcy. Your friend, who is now furious, devises a way to convince Congress to do something about the inefficient postal service.
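His reasoning can be written out as a simple conditional probability (a short worked calculation using the route probabilities above):

P(route 4 | SF stamp) = P(route 4) / P(SF stamp) = (1/6) / (3/6) = 1/3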

 

He decides that he will find a way to measure the ‘disorder’ of the shipping system to Cambodia through the SF seaport. He starts by accepting that there are 3 different ways his package can pass through SF. Let’s call the number of microstates Ω (capital omega). Thus, we can define the probability that the package passed through any one particular microstate as 1/Ω. In general, any package that has Ω possible routes has a 1/Ω probability of having taken a specific route. Therefore,

 

f(Ω) = 1/Ω

 

We can now graph the probability function for a general situation with number of microstates on the x-axis, and probability on the y-axis:

 

 

 

Clearly, we aren’t interested in the portion of this graph where Ω ≤ 0 (we cannot have negative or fractional microstates), so our function is only defined for positive whole-number values of Ω.

 

Although this is useful information, your friend will first have to define Ω in terms of more fundamental components to see how it could possibly be changed. For instance, let’s say that Ω depends on three factors: funding for the post office (F), employee motivation (E), and highway conditions across the country (H). Your friend is able to come up with ways to measure all of these aspects; thus, we can say that the number of routes (microstates) is a function of F, E, and H: Ω(F,E,H). We now need to define this function. Perhaps your friend decides that an exponential function would be the best way to represent how Ω grows as F, E, and H change. We can then approximate the following:

 

Ω(F,E,H) = exp(R/k)

 

Where k is some constant and R is… what? Well, we can see what happens to R as F, E, and H change. As we increase funding to the post office, administrators can run more routes between cities, so the number of ways a package can arrive at a specific seaport increases (Ω increases). If we try to motivate post office employees (a difficult task indeed!), they will be more apt to run more routes in a day, which also increases Ω.

 

Finally, if the condition of the highways is improved, more routes will become accessible to large transport vehicles. R is thus related to F, E, and H indirectly, through the number of microstates they give rise to. Overall, Ω increases for increases in F, E, and H, which increases the number of paths a package can take from Ann Arbor to Cambodia and thus increases the ‘disorder’ of the whole shipping system. What value in this equation increases along with an increase in Ω? R, of course! Let’s now say that R is the ‘randomness’ of the shipping system, which grows with the number of paths a package can take. With this definition of Ω, we can now define R independently:

 

R = k ln(Ω)

 

But why do we need the natural log in the first place? We can look at the graph of this function to make some intuitive assumptions:

 

 

 

As you can see, at very high values of Ω, adding one more route does very little to change the overall ‘randomness’ of the system. At low values of Ω, such as 1 or 2, adding one more route increases the apparent randomness substantially. We can call this randomness entropy.
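A quick pair of worked numbers (using the definition R = k ln(Ω) above) shows why:

R(2) − R(1) = k ln(2) − k ln(1) ≈ 0.69k

R(1001) − R(1000) = k ln(1001/1000) ≈ 0.001k

So the very first extra route adds a great deal of randomness, while the thousand-and-first adds almost none.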

 

In Terms of Thermodynamics

 

Postal Service politics aside, the example above shows us what entropy is and how it can be used to describe a system, be it a shipping service or a quantity of gas. We use the same concepts to describe particles that we used to describe packages in the example. The macrostate of a molecule is its overall energy state, which can be expressed through different microstates (i.e. routes). A microstate is one particular way the molecule expresses that macrostate. The number of possible microstates depends on the total energy of the molecule; it is not dependent on post office employee motivation, but instead on translational, rotational, and vibrational energy. For instance, if a molecule has energy E, one microstate would be the one in which its translational energy equals E and all other energies equal zero. The same macrostate can also be expressed with translational, rotational, and vibrational energies each equal to E/3. As the energy of the molecule is increased, the disorder (entropy) of the system increases as well, the same way increasing funding to the post office increases the possible disorder of the packages.

 

It is important to realize that Ω describes the number of microstates for one macrostate. What happened to the others? This is similar to the various seaports in the previous example. In real gases, the most probable macrostate (their ‘San Francisco’) is occupied almost exclusively by all of the molecules (i.e. virtually all of the packages ‘pass’ through this macrostate). Therefore, the Ω we define for gases is the number of microstates of only the most occupied macrostate (i.e. the number of routes that pass through San Francisco), which is a very reasonable approximation.

Entropy is, indeed, defined by the following equation:

 

S = kBln(Ω)

 

where S is the entropy and kB is the Boltzmann constant (determined experimentally). It is important to note that entropy depends exclusively on the number of microstates of the system. The number of microstates can be determined experimentally, since it is related to the energies of the molecules of the gas; various methods are employed to calculate it.

 

Definition

 

To understand how entropy is related to spontaneity, we must look at the matter of interest on the microscopic molecular level, described by a microstate. Here, we leave behind classical thermodynamics, a field concerned with the macrostate of a system, defined by properties such as temperature or pressure, and delve into statistical thermodynamics. Statistical thermodynamics demonstrates that spontaneous processes can be described using probability theory to model the system on a molecular level.

 

The macrostate is representative of a probability distribution that describes the possible states of the system across all microstates. This probability distribution enumerates the probability of the system existing in each particular microstate. A microstate describes one of the possible configurations of positions, momenta, and number of particles for a given thermodynamic system, referred to as a statistical ensemble (here the number of particles remains constant, since we are only covering the topic at the microcanonical level).

               

Multiplicity refers to the total number of microstates in a given macrostate. Multiplicity is vastly important as it is the key variable that relates the microstates of a system, the system's macrostate, and entropy, one of the concepts at the heart of statistical mechanics. Multiplicity performs this amazing feat on the basis of two things: the fundamental postulate in statistical mechanics and the relatively large number of bodies in any thermodynamic system on the microscopic level. The fundamental postulate assumes that an isolated system (no energy/heat or matter exchange with the surroundings) in equilibrium has an equal probability to be in any of its accessible microstates.

 

Ralph Baierlein, a physicist, wrote in reference to statistical mechanics that "It all works because Avogadro's number is closer to infinity than to 10." This statement is not literally true, but it may as well be, which is what Baierlein was hinting at. The reason we can accurately use entropy to describe spontaneity is that outliers are rendered insignificant by the sheer size of the "sample," or the number of molecules in this case, and, consequently, the "true" macroscopic quantities of a system are consistent from one experiment to the next and with what we observe in the real world.

 

Imagine two sealed flasks, connected by a small tube, that together contain only 6 molecules of an ideal gas. What we expect to see, based on what we know about thermodynamics, the Second Law, and entropy, is that 3 molecules of gas will be found in each flask at any given time or, more precisely, in any given microstate. However, finding all 6 molecules in one particular flask would still be statistically reasonable, with a 1 in 2^6 = 1 in 64 chance. Now imagine that the flasks contain 6.022 x 10^23 molecules of that same ideal gas: 1 mole, or Avogadro's number, of molecules. The probability that all 6.022 x 10^23 molecules will be found in one flask is 1 in 2^(6.022 x 10^23), a number so large that web2.0calc.com reports it as infinite and WolframAlpha simply will not compute it. In other words, the probability of finding all of the molecules in one flask approaches zero as the number of molecules grows. According to statistical mechanics, we will virtually never find all 6.022 x 10^23 molecules in one flask, which is exactly what we observe.

An example of one possible formation of molecules in the two flasks.
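A minimal sketch of this probability calculation in Python, assuming each molecule independently chooses one of the two flasks with probability 1/2:

import math

def prob_all_in_one_flask(n):
    # Probability that all n molecules end up in one particular flask,
    # assuming each molecule independently picks either flask with probability 1/2.
    return 0.5 ** n

print(prob_all_in_one_flask(6))      # 0.015625, i.e. 1 in 64

# For a mole of molecules, 0.5**n underflows to 0.0, so work with logarithms instead:
n_avogadro = 6.022e23
log10_prob = n_avogadro * math.log10(0.5)
print(f"probability is about 10^({log10_prob:.3e})")   # roughly 10^(-1.8 x 10^23): effectively zero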

 

Second Law of Thermodynamics

 

The Second Law of Thermodynamics is an expression of the tendency of differences in properties such as temperature and pressure to equilibrate over time in an isolated physical system. In other words, an isolated system tends toward equilibrium between its constituent parts. The Second Law can also be interpreted as stating that, in an isolated system, the energy available to do useful work in the current state will always be less than in the initial state; this growing unavailability of energy is closely tied to entropy. Entropy is a measure of the disorder/multiplicity of the system, as well as of the system's thermal energy per unit temperature that is unavailable for work. Because of entropy's tendency to increase, the final state (when the system has achieved equilibrium, meaning there will be little net flow of heat energy) will be the macrostate where the entropy, and therefore the multiplicity, is greatest.

 

Microstates & Macrostates

 

Looking back at the introduction, two terms often used in describing entropy are microstate and macrostate. A microstate is a possible microscopic configuration of the individual atoms and molecules in a system, while a macrostate is a term referring to observable properties of a system such as temperature and pressure that arise from specific microstates.

 

The total possible number of microstates for a given macrostate of a system is often referred to as multiplicity, and entropy, as a measure of multiplicity, can be defined with the following equation containing the entropy of the system S, Boltzmann's constant kB (1.3806488(13) x 10^-23 J·K^-1), and the total number of possible microstates Ω:

 

S = kBlnΩ
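For concreteness, here is a small worked example using the six-molecule flasks described in the Definition section. The macrostate with 3 molecules in each flask can be realized in Ω = 6!/(3!·3!) = 20 ways, so S = kBln(20) ≈ 4.1 x 10^-23 J·K^-1, while the macrostate with all 6 molecules in one particular flask has Ω = 1 and S = kBln(1) = 0.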

 

The Second Law of Thermodynamics at Work

 

Looking at the inverse temperature equation, which relates the temperature T to the entropy S and the energy U:

 

1/T = dS/dU

 

we see that as temperature increases, the entropy gained per unit of added energy (dS/dU) decreases. Because systems have a tendency to increase in total entropy, a system with a higher temperature readily gives energy to a system with a lower temperature: the cold system gains more entropy by receiving the energy than the hot system loses by giving it up.
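This can be made explicit with a short worked expression (a minimal sketch, treating both bodies as large reservoirs whose temperatures barely change while a small amount of heat dQ flows from the hot body at Th to the cold body at Tc):

dStotal = dQ/Tc − dQ/Th = dQ (1/Tc − 1/Th)

which is positive whenever Th > Tc, so flow from hot to cold is the spontaneous direction.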

The Second Law of Thermodynamics is often used to explain why perpetual motion machines are impossible. A perpetual motion machine is a machine or system that can continue to do work indefinitely without an external source of energy. Looking at heat energy: to move heat from a colder body to a warmer body, energy must be expended, since thermal energy naturally flows from warm to cold, as shown above with the inverse-temperature equation. This restriction can be observed in a refrigerator, which requires electrical energy in order to move thermal energy from the colder interior to the warmer exterior. Similarly, it is impossible to extract an amount of heat from a hot reservoir and use all of it to do work. This can be seen in heat engines, which rely on thermal energy to operate: some amount of heat must be exhausted to a cold reservoir, because the cold reservoir is what allows heat to flow in the first place by providing a lower-temperature destination for it.
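The same entropy bookkeeping shows why some heat must always be exhausted. A minimal sketch, assuming an engine running in a cycle between a hot reservoir at Th and a cold reservoir at Tc: the engine absorbs heat Qh, does work W, and exhausts Qc = Qh − W. Because the total entropy cannot decrease, Qc/Tc ≥ Qh/Th, so Qc must be greater than zero, and the work obtainable is limited to W ≤ Qh(1 − Tc/Th).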

 

Heat and Disorder

 

There is often confusion among the terms entropy, heat, and disorder.

 

Entropy is a state function that measures which direction a process will proceed in. If there are more ways for a particular condition to occur, the system will spontaneously move toward that condition.

 

Entropy v. Heat: A change in entropy can be expressed as heat (the transfer of thermal energy between objects) divided by temperature. In classical thermodynamics, entropy is a measure of the tendency of a process to proceed in a particular direction. An example of this would be heat flowing from areas of high temperature to areas of low temperature. This process is spontaneous because there are more ways for the system to move randomly toward equilibrium than for the high-temperature area to get hotter and the low-temperature area to get colder. Entropy dictates heat flow.

 

ΔS = Q/T
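As a worked example (with hypothetical numbers): if 100 J of heat flows from a reservoir at 400 K to a reservoir at 300 K, the hot side loses 100/400 = 0.25 J/K of entropy while the cold side gains 100/300 ≈ 0.33 J/K, for a net change of about +0.08 J/K. The total entropy increases, so the flow is spontaneous.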

 

In terms of work, entropy reflects the energy of a system that cannot be used for work. When energy is converted to work, the total entropy increases, in accordance with the Second Law of Thermodynamics; the energy that becomes unavailable is dissipated in the form of heat.

 

Entropy v. Disorder: Entropy is intuitively thought of as a measure of disorder, a system’s state of randomness. When entropy drives thermal energy to flow between regions of different temperatures, the state of order of the system is reduced. Because it drives processes in this direction, entropy is commonly described as an expression of disorder.

 

 

 

 

 

 

ΔS, the change in entropy, is positive when a snowflake melts into liquid water. The entropy of the snowflake crystal is lower than that of the liquid because its molecules have fewer available microstates: they are (1) positioned in a fixed array and (2) limited in movement because they are part of a low-temperature solid.
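A rough worked example (using commonly tabulated values: the heat of fusion of ice is about 6.01 kJ/mol and the melting point is 273 K): ΔS = Q/T ≈ 6010 J/mol ÷ 273 K ≈ 22 J/(mol·K) for each mole of snow that melts, a positive change, as expected.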

 

The Third Law of Thermodynamics

 

The matter around us in our daily lives contains a measurable amount of energy. Because it contains energy, the particles that surround us are in constant translational, vibrational, and rotational motion. But what happens as the amount of energy in a substance approaches zero? The Third Law of Thermodynamics addresses this question by relating the entropy of a substance to the absence of thermal energy.

 

It has already been shown that the amount of entropy a substance has depends on the number of microstates available to the molecules of that substance at a given temperature, according to the formula S = kBln(Ω). Clearly, the number of microstates available to any molecule increases with the amount of energy contained in the system, which increases the entropy of the substance. Conversely, as the amount of energy contained in the system decreases, the number of microstates available to the molecule decreases, which decreases the overall entropy of the system.

 

By analyzing the function for entropy above, we can easily infer that the maximum entropy is achieved as the number of microstates approaches infinity, and the minimum value of entropy is achieved as the number of microstates approaches 1. At this value, the entropy of the system equals zero (since ln(1) = 0), which is the lowest possible value, because entropy must always be finite and non-negative. It is important to note that the number of microstates available to the particles of the system can never be zero, because a system without any microstates is not a system at all (it does not exist).

 

Another way to analyze the behavior of substances as their number of microstates approaches this minimum is with the Heisenberg Uncertainty Principle. We already know that the position and momentum of a particle can never both be known exactly at the same time. If a particle had no microstates available, it could not translate, rotate, or vibrate, so its momentum (zero) would be known exactly. If the particle is not moving, then we would also know its location, in violation of the principle.

 

Because the number of microstates available to a particle depends on temperature (among other things), we can conclude that when the absolute temperature of a substance is zero (i.e. essentially only one microstate remains available) its total entropy can be defined to be zero. Reaching absolute zero, however, is not possible, so zero entropy is a theoretical concept, but a useful one.

If it is possible to define zero entropy, we can define absolute entropy as the entropy of any pure substance relative to its entropy at absolute zero. Unfortunately, this condition is impossible to achieve in practice, for the reasons outlined above. If we cannot reach zero entropy in reality, then how can we do so theoretically?

 

Although it is not possible to reach absolute zero, scientists study matter at temperatures close to it using large cryocoolers such as the one shown above.

 

The answer is simple: we will start by constructing an ideal scenario where zero entropy can be achieved. The best starting point is a solid (fluids are defined by the movements of their particles). Not just any solid will do; we need one that is highly structured and does not allow particles to translate or rotate, and thus we arrive at a crystal. Crystals can clearly have energy stored in them: as energy is added to a crystal, the bonds between particles vibrate, and with enough energy the crystal eventually melts. If we take away all of the energy that allows a crystal to vibrate, we are left with a system of particles for which we can determine the exact location and momentum of each particle. Therefore, the entropy of a perfect crystal at absolute zero is zero. This theoretical perfect crystal at absolute zero is the lynchpin that defines the absolute entropies of all substances. But is it proper to use this definition of entropy for all substances even if the substance that defines zero entropy does not exist?

 

The answer is yes. It is important to remember that entropy is an intrinsic property of all substances. Just as we can define zero temperature or zero volume (a singularity), zero entropy, defined by the theoretical perfect crystal, can be applied to all substances (not only the substance of the perfect crystal).

 

Mathematical Formulation of Absolute Entropy

 

Based on these concepts, it is fairly simple to arrive at a mathematical way to determine absolute entropy. We can start with the second law of thermodynamics:

 

dS = dQ/T     (1)

The change in thermal energy (dQ) depends on the change in temperature and on the specific heat of the substance (C):

dQ = C dT     (2)

If we put (1) and (2) together, we arrive at the equation for entropy in terms of temperature and specific heat:

dS = (C/T) dT     (3)

To find the absolute entropy, we can integrate both sides of the equation (turning dS into S). To get rid of dT, we integrate with respect to T. Over what interval will we integrate? Let’s integrate from a reference temperature (say T = 0) up to some temperature T = T’. To do this properly, we must add the entropy at the reference temperature to the value of the integral. Because we chose T = 0 as our reference temperature, this additional term is the entropy at T = 0, which is S(0):

 

S(T') = S(0) + ∫[0 → T'] (C/T) dT     (4)

As we have previously determined, the entropy of a substance at absolute zero is zero, so we can ignore the S(0) term:

 

S(T') = ∫[0 → T'] (C/T) dT     (5)

Equation (5) gives us the mathematical formulation of the third law, which is absolute entropy at any given temperature T’ relative to the entropy of a perfect crystal at absolute zero.
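To make equation (5) concrete, here is a minimal numerical sketch in Python. It assumes a hypothetical low-temperature heat capacity C(T) = a·T^3 (a Debye-like form often used for crystals near absolute zero; the constant a below is made up for illustration), so the integral can also be checked analytically: S(T') = a·T'^3/3.

# Numerically evaluate S(T') = integral of C(T)/T dT from 0 to T'
# for a hypothetical heat capacity C(T) = a*T**3 ('a' is an illustrative constant).
a = 2.0e-4          # J/(mol*K^4), made-up value for illustration
T_prime = 10.0      # K

n = 100000
dT = T_prime / n
S = 0.0
for i in range(1, n + 1):
    T_lo, T_hi = (i - 1) * dT, i * dT
    # The integrand C(T)/T = a*T**2 is well behaved even as T -> 0.
    f_lo = a * T_lo**2
    f_hi = a * T_hi**2
    S += 0.5 * (f_lo + f_hi) * dT   # trapezoidal rule

print(S)                    # numerical result
print(a * T_prime**3 / 3)   # analytic check: a*T'^3/3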

 

 

 

Questions

 

  1. A small child arranges randomly scattered blocks into a symmetrical house. This results in a decrease in which of the following?

 

                    A.  Entropy of the universe

                    B.  Heat of the child

                    C.  Disorder of the blocks

                    D.  A and C

                    E.  None of the above 

 

     2. Which of the following statements are true?

 

          I.  Energy will move from the system with lower temperature to the system with higher temperature

          II. Net energy change is zero once equilibrium is reached

          III. Entropy is defined as a logarithmic function of the total possible number of macrostates for a system

 

                    A.  I

                    B.  II

                    C.  I, II

                    D.  II, III

                    E.  I, II, III

 

 

 

Works Cited

 

http://musr.physics.ubc.ca/~jess/hr/skept/Therm/node13.html

http://musr.physics.ubc.ca/~jess/hr/skept/Therm/node11.html#eq:Equilib

http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/seclaw.html

http://www4.ncsu.edu/unity/lockers/users/f/felder/public/kenny/papers/entropy.html

http://www.emc.maricopa.edu/faculty/farabee/biobk/biobookener1.html

http://www.biochem.vt.edu/modeling/stat_mechanics.html

http://www.britannica.com/EBchecked/topic/189035/entropy

 

 

 

Answers

1. C – Total entropy never decreases. When disorder decreases in a system, entropy increases elsewhere in the surroundings.

2. B – Energy moves from a system with higher temperature to a system with lower temperature (so I is false), and entropy is defined as a logarithm of the total number of microstates, not macrostates (so III is false).

 
