The 2nd Law of Thermodynamics

Introduction

This page is the second part of a series that describes entropy and the fabled second law of thermodynamics; the previous part is "The Definitions of Entropy". You should read that page before this one, if you are not already satisfied that you know what entropy is (you might read it anyway, and find out that you really didn't know!).

The 2nd law of thermodynamics is the law that constrains how the entropy of a thermodynamic system is allowed to change under a given process or set of circumstances. The true meaning of the 2nd law is most commonly lost in confusion over the true meaning of entropy, or in a failure to appreciate the distinction between "closed" and "open" systems. The entropy problem is, I hope, solved by reading the companion piece already noted above. Here I will go on to show how the 2nd law was derived, and how it works in nature.

The Second Law of Thermodynamics (in Classical Thermodynamics)

Here is how Maxwell explains the second law (Theory of Heat, p. 153).

Admitting heat to be a form of energy, the second law asserts that it is impossible, by the unaided action of natural processes, to transform any part of the heat of a body into mechanical work, except by allowing heat to pass from that body into another at a lower temperature. Clausius, who first stated the principle of Carnot in a manner consistent with the true theory of heat, expresses this law as follows: -

It is impossible for a self-acting machine, unaided by any external agency, to convey heat from one body to another at a higher temperature.

Thomson gives it a slightly different form: -

It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects.

And again, later on in The Theory of Heat (page 328), Maxwell has this to say:
One of the best established facts in thermodynamics is that it is impossible in a system enclosed in an envelope which permits neither change of volume nor passage of heat, and in which both the temperature and pressure are everywhere the same, to produce any inequality of temperature or of pressure without the expenditure of work.
A more modern statement of this classical second law may look more complicated, but means the same thing:
Processes in which the entropy of an isolated system would decrease do not occur, or, in every process taking place in an isolated system, the entropy of the system either increases or remains constant.
That version of the 2nd law comes from the textbook An Introduction to Thermodynamics, the Kinetic Theory of Gases, and Statistical Mechanics (2nd edition), by Francis Weston Sears, Addison-Wesley, 1950, 1953, page 111 (Chapter 7, "the Second Law of Thermodynamics").

The phrase isolated system means that neither energy nor matter may enter or leave the system; it is an embodiment of the word "unaided" as used by Maxwell & Clausius. If the system is not isolated, then energy can get in, and so can "aid". Hence, isolation is required to uphold the restriction "unaided". The manner in which the "transition" is accomplished is irrelevant; all possible transitions are allowed.

In the earlier paper On the Definition of Entropy, we already encountered the equation that defines the change in classical entropy, ΔS = Q/T. The 2nd law constrains the change in entropy (ΔS) so as to give us the fundamental equation for the 2nd law in classical thermodynamics:

ΔS ≥ 0
Equation 1

In classical thermodynamics, the change in entropy indicates the difference between the amount of energy that is available to perform mechanical work and the amount that is not. A larger entropy means less energy available for work, so this forced increase in entropy means that the supply of energy available for work can only shrink. Carried to the obvious conclusion, one would presume that eventually there would be no energy at all available for work. But if the system is not isolated, if it is open, the "work energy" reservoir can be replenished, and new energy made available for work. So the restriction that the system be "isolated" is a necessary part of the second law; in classical thermodynamics, it is always wrong to argue that the entropy of a non-isolated (open) system must always increase.
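As a minimal numerical sketch (the temperatures and heat value here are illustrative, not from the text), consider heat flowing irreversibly from a hot body to a cold one. The hot body loses entropy Q/T_hot, the cold body gains Q/T_cold, and because T_cold < T_hot the total change is positive, as ΔS ≥ 0 demands:

```python
# Sketch: entropy change when heat Q flows irreversibly from a hot
# body to a cold body (both treated as large reservoirs at fixed
# temperature). All numbers are illustrative.

def delta_S(Q, T_hot, T_cold):
    """Total entropy change: the hot body loses Q/T_hot,
    the cold body gains Q/T_cold."""
    return -Q / T_hot + Q / T_cold

# 1000 J flowing from a 500 K body to a 300 K body:
dS = delta_S(1000.0, 500.0, 300.0)
print(round(dS, 3))  # 1.333 J/K -- positive, as the 2nd law requires
```

Reversing the flow (heat moving from cold to hot unaided) would make the same expression negative, which is exactly what the 2nd law forbids.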

It is also important to remember that in classical thermodynamics, entropy is defined at all only for systems in thermodynamic equilibrium. So, naturally, the change in entropy that is restricted by the second law can only be determined when the system moves from one state of thermodynamic equilibrium to another.

The second law in classical thermodynamics is derived from Carnot cycles. The perfect quasistatic processes involved are an idealization that can be approached, but never achieved, in practical reality. Hence, in practice, the change in entropy for an isolated system can only be greater than zero; it is equal to zero only in the case of an ideally reversible Carnot cycle. The derivation is standard fare in any thermodynamics textbook.
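The reversible limit can be sketched numerically (again with illustrative values). In an ideal Carnot cycle the heat rejected satisfies Q_cold/T_cold = Q_hot/T_hot, so the entropy bookkeeping around one full cycle comes out exactly zero:

```python
# Sketch: entropy bookkeeping around an ideal, reversible Carnot cycle.
# In the reversible case Q_hot / T_hot == Q_cold / T_cold, so the net
# entropy change over one cycle is zero -- the equality case of dS >= 0.

T_hot, T_cold = 500.0, 300.0      # reservoir temperatures (K), illustrative
Q_hot = 1000.0                    # heat absorbed from the hot reservoir (J)
Q_cold = Q_hot * T_cold / T_hot   # heat rejected in the reversible case

dS_cycle = Q_hot / T_hot - Q_cold / T_cold
efficiency = 1.0 - T_cold / T_hot  # the Carnot efficiency

print(dS_cycle)    # 0.0
print(efficiency)  # 0.4
```

Any real engine rejects more heat than the reversible minimum, which pushes dS_cycle above zero, in line with the strict inequality that holds in practice.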

A New Concept: Phase Space vs Coordinate Space

Before we go on to explore how the second law works in statistical mechanics, we need to understand a concept that will likely be a new one for most readers, the idea of a phase space. The kind of "space" we are used to working with, the kind that represents the volume of a box (what we might call "physical space"), is called a coordinate space in physics. A coordinate space is characterized by positional coordinates. We could line up rulers along the sides of the box, and use the readings from the rulers to establish the (x,y,z) location of a point anywhere inside the box. Here we are using the word space not in its colloquial sense, but in a restrictive sense as mathematical jargon. A set of numbers has to satisfy certain defining characteristics in order to qualify in mathematics as a space. I will not deal with those details here, because everything that I will call "space" does in fact meet those qualifications. The volume of the box qualifies as "space", but so do the momentum & position of particles moving inside the box.

Consider that box to be full of gas molecules. Each molecule can be located in the box by 3 position coordinates (x,y,z), and the momentum of each particle also has 3 components (Lx, Ly, Lz; I will use "L" to represent momentum). So the combined position & momentum of a particle together constitute a point in a space with 6 dimensions, as opposed to our normal concept of physical space, which has only 3. That 6 dimensional space is called the phase space for the gas molecules.
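The bookkeeping is easy to make concrete. As a small sketch (the particle count and random values are purely illustrative), each particle's state is just six numbers, three for position and three for momentum:

```python
import numpy as np

# Sketch: the states of N gas particles as points in a 6-dimensional
# phase space -- 3 position coordinates plus 3 momentum components per
# particle. N is kept tiny purely for illustration.

rng = np.random.default_rng(0)
N = 5
positions = rng.uniform(0.0, 1.0, size=(N, 3))  # (x, y, z) inside a unit box
momenta = rng.normal(0.0, 1.0, size=(N, 3))     # (Lx, Ly, Lz), the text's "L"

# Each row is one particle's point in phase space:
phase_points = np.hstack([positions, momenta])
print(phase_points.shape)  # (5, 6): N points, 6 coordinates each
```

Dropping the momentum columns recovers the ordinary 3-dimensional coordinate-space picture; the extra three columns are what make phase space the natural home for entropy.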

Remember that the ability to do mechanical work is a key concept for understanding entropy. Particles do work by virtue of their motion, which is expressed by their momentum (I might have used kinetic energy, but momentum is what you usually find in the textbooks). So the entropy of the gas is determined by the behavior of the particles in their 6 dimensional phase space, and not in the 3 dimensional coordinate space ("physical space") of the box. Furthermore, while the volume of coordinate space filled by the gas is just the physical volume of the box, the volume of phase space occupied by the gas is the 6 dimensional space occupied by one particle, times the total number of particles. When you consider that one cubic centimeter of sea level air holds about 10^19 molecules, you can see that the phase space occupied by the gas is conceptually quite large when compared to the physical space of the box.

Now suppose that our box is really just one half of a larger box, with a partition down the middle. When we remove the partition, the physical space (coordinate space) of the box increases by a factor of 2. But what happens to the volume of phase space occupied by the gas? It goes up by a factor of 2^N, where N is the total number of particles in the box (and the 2 would be a 3 if we tripled the physical volume instead of doubling it). If that box was a one cubic centimeter box of sea level air, while the physical volume doubles, the phase space volume goes up by a factor of 2^(10^19) = 10^(0.3x10^19), quite a bit larger than 2.
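A number like 2^(10^19) cannot be computed directly, but its base-10 logarithm can, which is how the figure above is obtained. A small sketch (function name and arguments are my own, for illustration):

```python
import math

# Sketch: the phase-space growth factor 2**N for N ~ 1e19 particles is
# far too large to evaluate directly, but its base-10 logarithm is easy:
# log10(2**N) = N * log10(2), and log10(2) is about 0.3.

def log10_growth_factor(N, volume_ratio=2):
    """Return log10 of volume_ratio**N, the phase-space growth factor
    when the physical volume grows by volume_ratio."""
    return N * math.log10(volume_ratio)

# Doubling the volume of 1 cm^3 of sea-level air (~1e19 molecules):
print(log10_growth_factor(1e19))  # ~3.01e18, i.e. a factor of 10**(0.3e19)

# The same doubling for a mole-sized sample (~1e23 molecules):
print(log10_growth_factor(1e23))  # ~3.01e22
```

The second call also reproduces the mole-scale figure used later in the discussion of the paradox of irreversibility.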

The enormous change in phase space is an important concept, which will come back to rescue us (we hope) from the paradox of irreversibility, which I will discuss after we deal with the second law in statistical mechanics.

The Second Law of Thermodynamics (in statistical mechanics, I)

The 2nd law is no different in statistical mechanics than it is in classical thermodynamics. It remains a fact that ΔS ≥ 0, as previously derived by the founders of classical thermodynamics. But in classical thermodynamics, heat and entropy are treated like fluids that flow from one system to another. The asymmetrical 2nd law forces them to always flow in one direction, but not the other. But gravity asymmetrically forces water, for instance, to always flow downhill. So the asymmetry of the 2nd law, and the need to pump heat or entropy "uphill", as one would for water, do not seem like a fundamental problem.

But the transition to statistical mechanics, and the idea that a gas is made up of moving particles, creates a large, fundamental problem. The motion of the particles is governed by Newton's equations, and they are all time symmetric: they work just as well "backwards" as they do "forwards". Indeed, there is no intrinsic difference between the two at all. That means that a gas should be able to run "backwards" as well as "forwards", and entropy, as a state variable for a gas system, should be able to do likewise. But the 2nd law forbids it, setting up a fundamental conflict. Why should an essentially Newtonian system not be time symmetric? The problem has earned a name: the Paradox of Irreversibility.

Consider a tank of gas. It is full of gas molecules, the motion of each one describing a trajectory in both physical space, and in phase space. Each particle moves in accordance with the laws of Newtonian mechanics, which are fully reversible in time; a "time-backward" trajectory cannot be distinguished from a "time-forward" trajectory, in the absence of some outside context used to define "backward" and "forward". However, when those particles are all gathered together in the ensemble of a gas tank, their combined motion is not time reversible! We know this because the second law of thermodynamics enforces it.
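The time symmetry of a single Newtonian trajectory is easy to demonstrate numerically. The sketch below (a simple harmonic oscillator, not a gas, chosen purely for illustration) integrates the motion forward, reverses the velocity, and integrates again: the particle retraces its path back to the starting state.

```python
# Sketch: time-reversal symmetry of Newtonian motion, using a harmonic
# oscillator and a velocity-Verlet (leapfrog) integrator, which is
# itself a time-reversible scheme. Run forward, flip the velocity, run
# again: the "backward" trajectory returns to the initial state.

def leapfrog(x, v, steps, dt=0.001):
    """Integrate x'' = -x (unit mass, unit spring constant)."""
    a = -x
    for _ in range(steps):
        v += 0.5 * dt * a
        x += dt * v
        a = -x
        v += 0.5 * dt * a
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = leapfrog(x0, v0, 5000)    # forward in time
x2, v2 = leapfrog(x1, -v1, 5000)   # reverse the velocity, run "backward"

print(abs(x2 - x0) < 1e-6, abs(v2 + v0) < 1e-6)  # True True
```

Nothing in the equations distinguishes the two runs; only the sign of the velocity differs. The paradox is that an ensemble of ~10^19 such reversible trajectories nevertheless behaves irreversibly.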

Here is how William Thomson (Lord Kelvin) describes the problem (Proceedings of the Royal Society of Edinburgh, 8, 325 (1874)):

The essence of Joule's discovery is the subjection of physical phenomena to dynamical law. If, then, the motion of every particle of matter in the universe were precisely reversed at any instant, the course of nature would be simply reversed for ever after. The bursting bubble of foam at the foot of the waterfall would reunite and fall into the water ... Physical processes, on the other hand, are irreversible: for example, the friction of solids, conduction of heat, and diffusion. Nevertheless, the principle of dissipation of energy [read irreversible behavior] is compatible with a molecular theory in which each particle is subject to the laws of abstract dynamics.
This apparent paradox caused much controversy in the early days of statistical mechanics, and in fact remains a topic of open interest & controversy today. Maxwell, Thomson, and mostly Boltzmann together offered an explanation that satisfied them, but not necessarily everyone else.

As shown above, in the section on phase space & coordinate space, a mere doubling of the physical volume of one cubic centimeter of air will enlarge the phase space of the gas by a factor of 10^(3x10^18). So the random odds that the gas in the tank might return to one of the previous phase space volumes (one that corresponds to all of the gas molecules suddenly gathering in one half of the box) are about 1 in 10^(3x10^18). Those are pretty slim odds in any practical sense, and they are even slimmer for larger ensembles. An ensemble of closer to a mole, say 10^23 molecules, will have its phase space enlarged by a factor of 10^(3x10^22), and so on. This argument explains why violations of the second law are not encountered: they are just too unlikely. But, while the argument works (maybe), it is not very satisfying to many that it is built around "unlikely", rather than some more fundamental concept.

But maybe there is a better answer: "chaos". Otherwise known as "nonlinear dynamics", but popularly (and misleadingly) called "chaos", this field of mathematical physics comes out of the pioneering work of the French mathematician Henri Poincaré. Poincaré published the first description of chaotic motion in 1890, but it was not until the latter half of the 20th century that the scope of application of chaos theory to Newtonian mechanics was fully realized. Chaos theory would explain the second law of thermodynamics by virtue of the extreme sensitivity to initial conditions that characterizes such systems. The gas molecules cannot return to the previous small phase space volume, because they form a chaotic system, which cannot recover its initial conditions.
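That sensitivity to initial conditions can be seen in even the simplest chaotic system. The sketch below uses the logistic map, a standard textbook toy model (my choice, not from the text, and not a model of the gas itself): two starting points differing by one part in 10^12 end up completely different.

```python
# Sketch: sensitive dependence on initial conditions in the logistic
# map x -> 4x(1 - x), a standard toy chaotic system. Nearby starting
# points separate roughly exponentially, doubling each iteration.

def logistic_orbit(x, steps):
    """Iterate the chaotic logistic map 'steps' times from x."""
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

a = logistic_orbit(0.3, 60)
b = logistic_orbit(0.3 + 1e-12, 60)

# After 60 iterations the separation has grown from 1e-12 to order one;
# the two orbits are, for practical purposes, unrelated.
print(abs(a - b))
```

Running the map "backward" to recover the exact starting point would require knowing the final state to a precision that no physical measurement can supply, which is the heart of the chaos-based explanation.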

Jean Bricmont, mathematician & physicist from l'Université catholique de Louvain, in Belgium, holds to this point of view ("Science of Chaos or Chaos in Science?", J. Bricmont, Annals of the New York Academy of Sciences 775: 131-175, 1996). Likewise Joel Lebowitz, of the Department of Physics and Astronomy at Rutgers University ("Microscopic Origins of Irreversible Macroscopic Behavior", J.L. Lebowitz, Physica A 263(1-4): 516-527, February 1 1999, or "Microscopic Reversibility and Macroscopic Behavior: Physical Explanations and Mathematical Derivations", from a lecture on nonequilibrium statistical mechanics in 1994).

The "loyal opposition" is none other than Ilya Prigogine, the godfather of nonequilibrium thermodynamics himself. He does not like the statistical argument, and claims that the second law can be derived directly from the underlying mathematics (for instance, "Laws of Nature, Probability and Time Symmetry Breaking", Physica A 263(1-4): 528-539, February 1 1999).

Not all questions in science have definitive answers (yet), and this may well be one of them. Perhaps the paradox is a real paradox, though it does not look that way. Though Bricmont & Prigogine disagree on the solution, they both offer compelling arguments for their points of view. It may well be a paradox with too many solutions, as opposed to a paradox with no solutions at all. But it is important to keep in mind that science does not consist of nothing but questions with canned answers. The paradox of irreversibility remains an open issue in statistical mechanics, and a field of active exploration.

The Second Law of Thermodynamics (in statistical mechanics, II)

Now let's go back to the definition of entropy that we learned in the earlier chapter on the definition of entropy,

S = -k·Σ [Pi·log(Pi)]
Equation 2

Those probabilities Pi are calculated in phase space, not in coordinate space. That is why I tried my hand at organizing the section on phase space above, so the reader might gain some understanding that can be applied to the paradox of irreversibility. Under physically normal conditions, one can easily encounter changes in phase space volume that are on the order of 10^60, which certainly emphasizes the meaning of "unlikely". One chance in 10^60 is slim odds in anyone's book.
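The equation above is straightforward to evaluate once the probabilities are in hand. A small sketch (the distributions are illustrative; the logarithm here is the natural log): for W equally likely states the sum collapses to k·ln(W), so enlarging the accessible phase space volume directly raises the entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs, k=k_B):
    """Equation 2: S = -k * sum(p_i * ln(p_i)), skipping zero terms."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Uniform probabilities over W equally likely states give S = k * ln(W):
W = 8
S_uniform = gibbs_entropy([1.0 / W] * W)
print(math.isclose(S_uniform, k_B * math.log(W)))  # True

# A sharply peaked distribution (most probability on one state) has a
# lower entropy than the uniform one:
S_peaked = gibbs_entropy([0.9] + [0.1 / (W - 1)] * (W - 1))
print(S_peaked < S_uniform)  # True
```

The uniform case is the link back to phase space volume: more accessible states means larger W, and larger W means larger S.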

So the real essence of making the transition from classical thermodynamics to statistical mechanics is that the 2nd law itself makes a transition, from something thought to be comfortably absolute to something that is uncomfortably probabilistic, even if the odds of a violation are that small.

Go on to Entropy and the Second Law in Open Systems
Go back to The Definitions of Entropy
Go back to The Collected Writings of Tim Thompson