Negative Temperature for Fun and Profit

Again, originally a comment I made on Metafilter, describing the physics behind objects with negative temperature. That is, things "colder than 0 Kelvin." Which sounds impossible.


To understand negative temperature, you need to know what the definition of temperature is, at least to a scientist. It turns out that the statistical mechanics definition of "temperature" does map onto your everyday understanding (at least for things with positive temperature), as it should, because otherwise why the hell would scientists use the same word for both concepts? But it is a bit confusing.

Temperature is defined as the partial derivative of the energy of a system with respect to the entropy of the system. So: what the fuck does that mean?

You have a thermodynamic system: a box of gas, say. That box of gas has a bunch of properties: for one, the energy in the motion of the gas. I'll just refer to this as the "energy" E of the system (there are lots of other energies I could care about: gravitational potential, chemical, nuclear, but since I'm not dropping this box of gas or blowing it up, I can ignore all that). It also has "entropy," which I'll call S. Entropy is, in my opinion, the most important concept in physics. It is also one of the most slippery. You can think of the entropy as "the number of ways you can rearrange the system and no one would notice the difference." So, a hot box of gas has lots of entropy, because all those particles zipping around can be swapped with each other or have their momentum altered and no one would be able to tell that the system had changed. A cold box of gas (particles nearly stationary), meanwhile, has low entropy, since if you start altering the momentum of individual particles, the system looks different ("hey, there wasn't a particle moving with high velocity before and now there is!").
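To put numbers on that "count the rearrangements" picture, here's a toy sketch. It uses an Einstein-solid-style model (N oscillators sharing q energy quanta) rather than a literal gas - my substitution, because the counting is exact there - but the moral is the same: more energy means vastly more indistinguishable rearrangements, hence more entropy.

```python
from math import comb, log

def entropy(N, q):
    # Einstein-solid toy model: N oscillators sharing q energy quanta.
    # Multiplicity = the number of ways to distribute the quanta,
    # i.e. "rearrangements no one would notice."
    omega = comb(q + N - 1, q)
    return log(omega)  # S = k ln(omega), in units where k = 1

N = 50
print(entropy(N, 10))   # "cold" box: few quanta, little to rearrange
print(entropy(N, 100))  # "hot" box: many quanta, far more entropy
```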

I know that definition of entropy sounds really anthropocentric. I'm sorry. It's not on a fundamental level - there are ways of rigorously defining all this stuff that remove what seems like the human element of "not noticing the difference," but when describing science I find myself often having to make certain analogies that may make things seem more arbitrary than they are. If you don't trust that statement, I strongly encourage you to go take a physics course on thermodynamics and stat mech and get your hands dirty with all of this. It's fascinating.

It's important to understand that entropy always increases, which makes sense given its definition. There are more states with high entropy than low (by definition), so given the opportunity, everything eventually finds itself in a state of high entropy - the probability of not doing so is just insanely low (which is why this field is called statistical mechanics; it's all about probability). Now, onto the definition of temperature.
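(A quick aside on just how "insanely low" those probabilities get. As a toy example of my own: put 100 gas particles in a box and ask how they split across the midline.)

```python
from math import comb

N = 100        # particles, each equally likely to be in either half
total = 2**N   # total number of equally likely microstates

p_even = comb(N, N // 2) / total  # 50/50 split: enormously many microstates
p_all_left = 1 / total            # all in the left half: ONE microstate

print(p_even)      # ~0.08, the single most likely split
print(p_all_left)  # ~7.9e-31, i.e. never going to happen
```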

Our day-to-day understanding of temperature is that if something is hot and we touch it to something cold, the cold thing heats up and the hot thing cools down. Let's be specific and say I take my box of hot gas, and touch it to a box of cold gas. We know that, eventually, the two objects will reach the same temperature. 

Why does that happen? Well, a little bit of energy leaves the hot gas, reducing its entropy by some amount. This would violate the 2nd Law of Thermodynamics (entropy always increases), except that the energy enters the cold gas, and increases its entropy MORE than the hot gas entropy decreases. That is,

ΔS_hot + ΔS_cold > 0

What this means is that, by adding energy to the cold thing, you've moved up the total S of the universe, because, while there would be more entropy in the hot gas system if it could keep all its energy to itself, the universe doesn't "care" about maximizing the entropy of a single object. It will find its way to maximize the total entropy, and to do that it will rob the hot object of energy to feed the cold one, because this transaction wins entropically.

So, how do you tell which object is hot and which object is cold? You compare the two systems and ask: if I remove some energy from one system, how does its entropy change? Is that entropy change greater or less than the entropy change to the second system if I remove a bit of energy? The "hot" system is the one where:

ΔS_hot/ΔE < ΔS_cold/ΔE

Which is to say, the entropy increase in the cold system is greater than the entropy loss in the hot system, so energy flows from the hot to the cold because the Universe is a bastard and is always going to increase entropy.
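You can do this bookkeeping explicitly. Here's a minimal sketch (my numbers, chosen arbitrarily) using the standard ΔS = ΔE/T for a small transfer between two baths:

```python
# Move a small amount of heat Q from a hot bath to a cold bath.
# For a small transfer at (nearly) fixed temperature, dS = dE / T.
Q = 1.0                       # joules transferred
T_hot, T_cold = 400.0, 300.0  # kelvin

dS_hot = -Q / T_hot    # hot bath loses a little entropy
dS_cold = +Q / T_cold  # cold bath gains MORE entropy
print(dS_hot + dS_cold)  # > 0: the Universe comes out ahead
```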

If you turn those "Deltas" into derivatives (which is to say, just ask what happens when you change the system by a tiiiinnyyy amount), and define temperature T as

T = ∂E/∂S,

you'll see that my (totally obvious) explanation of what a hot object is says that, for these two systems:

T_hot > T_cold

So I've rediscovered the obvious fact that hot things have a higher temperature than cold things.
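If you want to see T = ∂E/∂S actually do something, here's a sketch. It uses the entropy of a monatomic ideal gas, keeping only the energy dependence, S = (3N/2) ln E plus constants (in units where k = 1), and recovers the textbook result T = 2E/3N by a finite difference:

```python
from math import log

N = 1000  # number of gas particles

def S(E):
    # Monatomic ideal gas entropy vs. energy, dropping the
    # E-independent terms (units where Boltzmann's constant k = 1).
    return 1.5 * N * log(E)

E, dE = 500.0, 1e-3
T = dE / (S(E + dE) - S(E))  # T = dE/dS: tiny energy change over entropy change
print(T)                # numerically ~1/3
print(2 * E / (3 * N))  # the textbook T = 2E/(3N): also 1/3
```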

But I've learned something. I've learned that the transparent fact (to us) that there is something called temperature, and that it has something to do with whether I get burned or get frostbite if I touch "hot" or "cold" things, comes down to this totally non-intuitive definition in terms of entropy.

So now I can finally describe a negative temperature object.

In day to day life, adding energy to an object increases its entropy. The box of gas, in my previous example, gets more disordered when you add energy. But, just from the equation:

T = ∂E/∂S

you can ask: what would happen if you had something that lost entropy when you added energy? Well, ΔS would be negative for positive ΔE, so T is negative. It's hard to imagine how that would happen, but basically this is saying that there are fewer ways to have high energy than there are to have low energy. This "never" happens in normal life, but the scientists here have found a way to finagle a system that does have this property.
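The classic toy system with this property is a bunch of two-level particles: once more than half of them are excited, adding energy shrinks the number of possible arrangements. A sketch, with my own made-up numbers:

```python
from math import comb, log

N = 100  # two-level particles; energy = number excited (k = 1)

def S(n):
    # Entropy when exactly n of the N particles are excited.
    return log(comb(N, n))

def T(n):
    # 1/T = dS/dE, approximated by a one-quantum finite difference.
    return 1.0 / (S(n + 1) - S(n))

print(T(10))  # low energy: entropy rises with energy, T > 0
print(T(90))  # high energy: entropy FALLS with energy, T < 0
```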

What happens if you touch a "heat bath" of negative temperature material? (A heat bath is a theoretical construct in thermodynamics, defined as an object of a certain temperature that is so big that you can never remove or add enough energy to noticeably change its temperature. For example, if I jump into the ocean, the ocean gains energy from me as it cools me down, but you'll never see the temperature of the ocean rise because I jumped in.) Well, if I touch a -T object, energy that flows from me into it would reduce my entropy (as I have positive T) and also reduce its entropy (by definition: gaining energy causes -T objects to lose S). However, if energy flows from it into me, I gain entropy AND the -T object gains entropy. So, if the -T object were big enough and stable enough, it would just continue to dump energy into ANY positive-T object in thermal contact. Thus, it is "hotter," in our day-to-day understanding of "hot," than every object with positive temperature.
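The same entropy bookkeeping as before makes the direction of flow obvious. A sketch with arbitrary numbers:

```python
# Heat Q flowing OUT of a negative-temperature bath into an ordinary
# positive-temperature object; dS = dE / T for each side (k = 1).
Q = 1.0
T_bath = -200.0    # the negative-temperature heat bath
T_object = 300.0   # me, at positive temperature

dS_bath = -Q / T_bath      # bath loses energy yet GAINS entropy
dS_object = +Q / T_object  # I gain energy and gain entropy too
print(dS_bath, dS_object)  # both positive: this flow always wins
```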

Which implies to me that these objects are massively unstable; you must continually dump energy into the system (by some process that feeds low-entropy energy in somewhere in your lab apparatus), otherwise your little -T object will just dump all its energy out and you'll destroy the delicate set-up that allowed you to cheat the normal relation between energy change and entropy change.

In conclusion, temperature is a land of contrasts.