Entropy
Review of previous lecture: practice on your own
- Exact differentials. Is the following expression an “equation of state” (path independent)?
Bonus: what kind of system does this equation describe?
You know how to do this: check whether the mixed partial derivatives of the two coefficients are equal.
This is an equation of state! The expression is an exact differential, and it describes a Van der Waals gas.
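The original expression isn’t reproduced in these notes, so here is a small sketch of how the check goes symbolically, using a stand-in ideal-gas differential $dV = (R/P)\,dT - (RT/P^2)\,dP$ (an illustrative example, not the expression from class):

```python
# Illustrative exactness check (stand-in expression, not the one from class):
# a differential M dT + N dP is exact iff dM/dP == dN/dT.
import sympy as sp

T, P, R = sp.symbols("T P R", positive=True)
M = R / P             # coefficient of dT in dV = (R/P) dT - (R T / P**2) dP
N = -R * T / P**2     # coefficient of dP

print(sp.simplify(sp.diff(M, P) - sp.diff(N, T)) == 0)  # True -> exact differential
```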
- Lagrange multipliers. Challenging! Find the distribution $p(x)$ such that
- The quantity $S = -\int p(x) \ln p(x)\,dx$ is at a maximum.
- The mean of $x$ is given by $\int x\,p(x)\,dx = \mu$.
- The variance of $x$ is given by $\int (x - \mu)^2\,p(x)\,dx = \sigma^2$.
The Lagrangian for this system (with an extra multiplier enforcing normalization) is
$$\mathcal{L}[p] = -\int p \ln p\,dx + \lambda_0\left(\int p\,dx - 1\right) + \lambda_1\left(\int x\,p\,dx - \mu\right) + \lambda_2\left(\int (x - \mu)^2 p\,dx - \sigma^2\right).$$
Now solve the system of equations $\delta \mathcal{L} = 0$:
$$\delta \mathcal{L} = \int \left[ -\ln p(x) - 1 + \lambda_0 + \lambda_1 x + \lambda_2 (x - \mu)^2 \right] \delta p(x)\,dx = 0.$$
One obvious way to get this to work is to have the bracketed term vanish for every $x$. Solving for $p(x)$ gives
$$p(x) = \exp\!\left( \lambda_0 - 1 + \lambda_1 x + \lambda_2 (x - \mu)^2 \right).$$
We can then plug this equation into our various constraints and solve for $\lambda_0$, $\lambda_1$, and $\lambda_2$. And after tons of grunt work, we get a normal distribution!
$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x - \mu)^2 / 2\sigma^2}$$
This means that the normal distribution is the “maximum entropy” distribution for known mean and variance. But what the actual hell does that even mean? Check out this little post for more information
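As a sanity check on that claim, here is a small numerical sketch (my own illustration, with assumed values $\mu = 0$ and $\sigma^2 = 1$): it maximizes the discretized entropy subject to normalization, mean, and variance constraints, and compares the optimizer’s answer to the Gaussian with the same mean and variance.

```python
# Numerical sketch: maximize discretized entropy subject to fixed mean and
# variance, then compare the result to the matching Gaussian.
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-4, 4, 81)
dx = x[1] - x[0]
mu, var = 0.0, 1.0                      # assumed target mean and variance

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)         # avoid log(0)
    return np.sum(p * np.log(p)) * dx   # minimizing -S maximizes S

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) * dx - 1.0},                  # normalization
    {"type": "eq", "fun": lambda p: np.sum(p * x) * dx - mu},               # mean
    {"type": "eq", "fun": lambda p: np.sum(p * (x - mu) ** 2) * dx - var},  # variance
]

p0 = np.full_like(x, 1.0 / (x[-1] - x[0]))   # start from a uniform distribution
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, None)] * len(x), constraints=constraints)

gaussian = np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
print("max deviation from Gaussian:", np.abs(res.x - gaussian).max())  # small
```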
Multiplicity
Let’s go back to flipping a coin
Suppose we flip a coin 4 times. Assuming that this is a fair coin, what is the most likely
- Sequence of H’s and T’s?
- Composition of H’s and T’s?
Because we have a fair coin with independent trials, all the different sequences (there are $2^4 = 16$ of them) are equally probable. So there’s no preferred sequence.
However, there is a preferred composition. To see this, let’s write down the multiplicities of each possible composition:
- 4 H, 0 T: 1 sequence
- 3 H, 1 T: 4 sequences
- 2 H, 2 T: 6 sequences
- 1 H, 3 T: 4 sequences
- 0 H, 4 T: 1 sequence
The 2 H, 2 T combination is most probable because there are more ways to produce it than any other combination.
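A quick brute-force check of these numbers (a small sketch, not from the lecture): enumerate all $2^4$ sequences and tally the compositions.

```python
# Enumerate every length-4 sequence of H/T and count how many sequences
# produce each composition (number of heads).
from itertools import product
from collections import Counter

multiplicities = Counter(seq.count("H") for seq in product("HT", repeat=4))
for heads in sorted(multiplicities, reverse=True):
    print(f"{heads} H, {4 - heads} T: {multiplicities[heads]} sequences")
# 2 H, 2 T is the most probable composition: 6 of the 16 sequences produce it.
```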
The same principle is at work for systems of particles. There are many different microscopic states (analogous to the coin flip sequences) that can produce the same macrostate (analogous to the coin flip compositions). Let’s be more precise in our terminology.
Microstates and macrostates
Consider a system of $N$ particles in an insulated container (constant energy $E$) of volume $V$. Define a microstate $\mu$ to be the $6N$-dimensional vector that completely specifies the state of the system:
$$\mu = (\vec{r}_1, \dots, \vec{r}_N, \vec{p}_1, \dots, \vec{p}_N).$$
The macrostate of this system is defined by the macroscopic quantities that I observe, in this case $(N, V, E)$ (we can derive all other thermodynamic quantities from these, for what it’s worth). Now, what happens if we look at a different microstate $\mu'$, where we’ve flipped the sign on all the velocities?
Say the microstate $\mu$ belongs to our ensemble. Does $\mu'$ belong as well? Yes! The energy, which is the only thing that could possibly be affected here, isn’t changed by reversing the momenta, since the kinetic energy depends only on $p^2$. The point here is that there are many $\mu$’s to a single macrostate $(N, V, E)$.
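A tiny numerical illustration of that argument (assuming, for simplicity, a kinetic-only energy): reversing every velocity leaves the total energy untouched.

```python
# The energy of a (kinetic-only) microstate is unchanged when all velocities
# flip sign, because it depends on the velocities only through their squares.
import numpy as np

rng = np.random.default_rng(0)
m = 1.0                                  # particle mass, arbitrary units
v = rng.normal(size=(1000, 3))           # velocities of N = 1000 particles

E = 0.5 * m * np.sum(v ** 2)
E_reversed = 0.5 * m * np.sum((-v) ** 2)
print(np.isclose(E, E_reversed))         # True
```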
Thinking back to our coin flipping problem, we can draw an analogy between the microstates and the sequences of flips and then between the macrostates and the combinations.
If you are studying a system in equilibrium, the thermodynamic variables you observe in the lab are the variables of the macrostate. Now, here’s the really critical point: the macrostate you observe is composed of the set of most probable microstates.
But what does that have to do with entropy?
Entropy
Let’s consider a system not in equilibrium and figure out what equilibrium it reaches by maximizing multiplicities.
Suppose we have two systems, 1 and 2, that are independently in equilibrium and defined by the macrostates $(N_1, V_1, E_1)$ and $(N_2, V_2, E_2)$, respectively. We bring the two systems into thermal contact and allow energy to be exchanged. Neither particles nor volume are permitted to exchange. When the systems are brought into contact, the total energy is given by
$$E = E_1 + E_2 = \text{constant}.$$
Goal: find a condition on $E_1$ and $E_2$ that describes the new equilibrium.
Another way of approaching this is to say “what amount of energy gets transferred between the two containers? What is the change in $E_1$?”
We can find equilibrium by maximizing the multiplicity (number of microstates) of the final system. Let’s write down the multiplicity and then take the appropriate partials:
$$\Omega(E_1) = \Omega_1(E_1)\,\Omega_2(E_2), \qquad \frac{\partial \Omega}{\partial E_1} = \frac{\partial \Omega_1}{\partial E_1}\,\Omega_2 + \Omega_1\,\frac{\partial \Omega_2}{\partial E_1}.$$
Notice that we can rewrite $\frac{\partial \Omega_2}{\partial E_1} = \frac{\partial \Omega_2}{\partial E_2}\frac{\partial E_2}{\partial E_1} = -\frac{\partial \Omega_2}{\partial E_2}$, since $E_2 = E - E_1$.
Substituting this into the above expression:
$$\frac{\partial \Omega}{\partial E_1} = \frac{\partial \Omega_1}{\partial E_1}\,\Omega_2 - \Omega_1\,\frac{\partial \Omega_2}{\partial E_2}.$$
Setting this expression to zero:
$$\frac{\partial \Omega_1}{\partial E_1}\,\Omega_2 = \Omega_1\,\frac{\partial \Omega_2}{\partial E_2}.$$
Divide across by $\Omega_1 \Omega_2$:
$$\frac{1}{\Omega_1}\frac{\partial \Omega_1}{\partial E_1} = \frac{1}{\Omega_2}\frac{\partial \Omega_2}{\partial E_2} \quad\Longrightarrow\quad \frac{\partial \ln \Omega_1}{\partial E_1} = \frac{\partial \ln \Omega_2}{\partial E_2}.$$
Let’s define a quantity $S \equiv k_B \ln \Omega$. Then we can write the above expression as
$$\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2}.$$
That’s crazy! We just showed that maximizing multiplicities is equivalent to maximizing entropy! Assuming, of course, that you accept that this $S$ is the same $S$ that you see in thermodynamics. Don’t worry. We’ll get there.
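Here is a concrete (hypothetical, not from the lecture) illustration using two Einstein solids, whose multiplicity has the simple closed form $\Omega(N, q) = \binom{q + N - 1}{q}$ for $N$ oscillators and $q$ energy quanta: among all ways of splitting a fixed number of quanta between the two solids, the most probable split is the one where the entropy derivatives match.

```python
# Two Einstein solids sharing q_total energy quanta: find the most probable split
# and check that the entropy derivatives (k_B = 1, finite differences) agree there.
from math import comb, log

N1, N2, q_total = 300, 200, 100          # oscillator counts and total quanta (assumed)

def ln_omega(N, q):
    return log(comb(q + N - 1, q))       # ln of the Einstein-solid multiplicity

best_q1 = max(range(q_total + 1),
              key=lambda q1: ln_omega(N1, q1) + ln_omega(N2, q_total - q1))
print("most probable split:", best_q1, "and", q_total - best_q1)

dS1 = ln_omega(N1, best_q1 + 1) - ln_omega(N1, best_q1)
dS2 = ln_omega(N2, q_total - best_q1 + 1) - ln_omega(N2, q_total - best_q1)
print("dS1/dE1 ~", round(dS1, 4), "  dS2/dE2 ~", round(dS2, 4))   # nearly equal
```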
Two final comments:
- We can unpack even more from the expression $\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2}$. Because $E_1$ and $E_2$ are freely varying (independent) quantities, the only way for their partials to be equal is if the derivatives are constant. In other words:
$$\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2} = c,$$
where $c$ is some constant. In fact, $c$ is going to turn out to be exactly $1/T$. It shouldn’t be surprising that the constant is some function of temperature. After all, we specified that the systems were coming into thermal contact: shouldn’t the temperature be constant at thermal equilibrium?
- Entropy can also be written in terms of probabilities. Again, not surprising, because we just showed that multiplicities and probabilities are closely connected:
$$S = -k_B \sum_i p_i \ln p_i.$$
We’re going to work some examples using the above expression in the next class.
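As a warm-up for those examples, here is a minimal sketch (my own, with $k_B$ set to 1) of computing $S = -k_B \sum_i p_i \ln p_i$ for a couple of discrete distributions.

```python
# Gibbs/Shannon entropy of a discrete distribution, with k_B = 1.
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                 # use the convention 0 * ln(0) = 0
    return -np.sum(p * np.log(p))

print(entropy([0.5, 0.5]))       # fair coin: ln(2), about 0.693
print(entropy([0.9, 0.1]))       # biased coin: smaller entropy
```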
Practice on your own
- Try the first problem of your diagnostic exercise again.
- What is the maximum value of entropy that a discrete, $M$-state system can have? What is the minimum? That is, if I have a system with $M$ possible states and the probabilities associated with each of those states are $p_1, p_2, \dots, p_M$, what values of $p_i$ yield the largest value of $S$? The smallest?