*Before you read this, I suggest you read* posts 16.34, 16.35 and 16.37.

In post 16.34 entropy was introduced as the quantity of heat flowing into an object divided by its temperature (on the Kelvin scale). This definition leads to the idea that the entropy of a closed system tends to increase spontaneously. The idea of entropy arises simply from observation and requires no theories about the nature of heat or that stuff is made of atoms and molecules. There is nothing wrong with considering entropy or the second law of thermodynamics (post 16.34) in this way. Indeed, when I taught an introductory course on thermodynamics to physics students at the University of Aberdeen, I used exactly this approach in order to demonstrate the logical consistency of the subject and its basis in simple observation.

However, in the late nineteenth century, physicists, especially Maxwell (posts 16.11, 16.15 and 16.35) and Boltzmann (posts 16.22 and 16.35), wanted to investigate the nature of heat – concluding that it was kinetic energy of the motion of particles (atoms in elements [post 16.27]; molecules in compounds [post 16.30]). Entropy could then be defined in terms of the number of ways in which things or their energy could be distributed; this led to the introduction of Boltzmann’s constant – to ensure that entropy defined in this way had the units of J.K^{-1} to conform with the original definition (see previous paragraph; for information on writing units see post 16.13; for a definition of J see post 16.21).

This way of thinking about entropy provides another interpretation of heat flow from hot objects to cold objects. Suppose a hot object contains *n*_{h} particles and a cold object contains *n*_{c} particles. If the hot object keeps all its heat, it is shared among only *n*_{h} particles. But, if its heat flows to the cold object, it is shared between *n*_{h} + *n*_{c} particles. As a result *W* in the equation

*S* = *k*.log_{e}*W*

(post 16.35) increases and so the entropy increases. In this interpretation, it is highly improbable that all the energy will stay in the hot object – the relationship between *W* and probability is discussed in post 16.35.
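We can make this counting concrete with a rough sketch in Python. The particle numbers, the number of energy quanta and the Einstein-solid-style way of counting *W* (the number of ways to share *q* indistinguishable quanta among *n* particles) are all illustrative assumptions, not anything taken from the posts above – but any sensible way of counting shows the same trend: sharing the same energy among more particles gives a larger *W*, and therefore a larger *S* = *k*.log_{e}*W*.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in J.K^-1

def multiplicity(q, n):
    """Number of ways to share q energy quanta among n particles
    (an Einstein-solid-style count, used here purely as an illustration)."""
    return math.comb(q + n - 1, n - 1)

def entropy(q, n):
    """S = k.log_e(W) for q quanta shared among n particles."""
    return k * math.log(multiplicity(q, n))

n_h, n_c, q = 50, 50, 100          # hypothetical particle counts and quanta

s_hot_only = entropy(q, n_h)       # all the heat kept in the hot object
s_shared = entropy(q, n_h + n_c)   # heat shared between hot and cold objects

# More particles -> more ways to distribute the energy -> higher entropy
assert s_shared > s_hot_only
```

The assertion at the end is the whole point: the shared state has a larger *W*, so energy spontaneously spreading from hot to cold is simply the system moving to a more probable arrangement.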

Entropy is sometimes considered as a measure of disorder. The second law of thermodynamics could then be considered as the idea that closed systems become spontaneously more disordered. Although this idea can be helpful, it can also be misleading.

The question that naturally arises from this idea of entropy is “what’s so special about disorder?” I think the answer to this question is “nothing”. To explain my answer, let’s think about four discs (of equal radius) arranged on a surface so that every disc must touch at least one other disc. I can think of three ways of doing this that lead to what we might consider an ordered structure (see diagram above). Perhaps you can think of more – it may depend on exactly what we mean by “order”. I am thinking of a regular arrangement.

Now let’s think about all the irregular or disordered ways in which we can arrange the discs – some of them are shown in the diagram above. You will be able to draw lots more! So if our four discs arrange themselves spontaneously, with the only condition that each one has to touch at least one other, there is a higher probability that they will be arranged in what we might consider a disordered arrangement than an ordered arrangement. So we are back to our idea of entropy being related to the number of possibilities that exist and so to probability.

Disordered systems are more likely to form than ordered systems simply because they are more probable.
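The same probability argument can be shown with an even simpler toy count (a two-box model of my own choosing, not the disc picture above): if each of four particles can sit in box A or box B, only two of the sixteen equally likely arrangements are "ordered" (all particles in one box), while fourteen are mixed.

```python
from itertools import product

# Toy model: each of 4 particles can sit in box A or box B.
arrangements = list(product("AB", repeat=4))   # 2**4 = 16 equally likely states

ordered = [a for a in arrangements if len(set(a)) == 1]    # all in one box
disordered = [a for a in arrangements if len(set(a)) > 1]  # spread over both

print(len(ordered), len(disordered))  # prints: 2 14
```

Nothing favours the mixed arrangements individually – every single arrangement is equally likely. There are just far more of them, which is exactly the sense in which "disorder" is more probable.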

Now let’s think about a crystalline solid (post 16.37) at a temperature of 0 K – the lowest temperature possible, at which atoms have no kinetic energy (see posts 16.34 and 16.35). The atoms are in fixed positions in a defined arrangement from which they can’t move. So *W*, the number of possible states of the system, is 1. Since log_{e}(1) = 0,

*S* = *k*.log_{e}*W* = 0.

This result, that the entropy of a pure crystalline solid at a temperature of 0 K is equal to zero, is called the *third law of thermodynamics*. The first and second laws of thermodynamics can be stated, and used to solve problems, without knowing anything about atoms, molecules, crystals or the nature of heat. The third law needs us to know about these things. It is associated with a newer way of thinking about thermodynamics (see the second paragraph) that is called *statistical thermodynamics* (because it depends on ideas about probability) or, simply, *statistical mechanics*.

There is a problem hidden in the last paragraph. We know the exact positions of all the atoms in our pure crystalline solid at 0 K. We also know that they all have zero kinetic energy, and so zero momentum. Does this contradict Heisenberg’s uncertainty principle (post 16.29), which states that we cannot simultaneously know the exact position and the exact momentum of an object? Perhaps it doesn’t matter – Heisenberg’s uncertainty principle has its origins in quantum theory, and Newton’s laws of motion make useful predictions that quantum theory states are impossible (post 16.2). I think it does matter, because we can think of Newton’s laws as approximations to the predictions of quantum mechanics that work for large objects (post 16.2). In contrast, it doesn’t make sense to formulate scientific laws that contradict each other (see post 16.2). We can reconcile the third law of thermodynamics and Heisenberg’s uncertainty principle if the atoms in our pure crystalline solid have some energy, that is not kinetic energy, called the *zero-point energy*.

So, starting with the ideas of “heat” and “temperature” and thinking about the nature of solids has given us a lot more to think about!

*Related posts*

16.37 Solids, liquids and gases

16.35 Heat

16.34 Temperature

16.30 Molecules

16.27 Atoms

16.21 Energy

16.2 Scientific laws

*Follow-up posts*

18.26 Diffusion

18.27 Diffusion through membranes, osmosis and dialysis

18.28 Applying the ideal gas equation to solutions

18.29 Reverse osmosis

18.30 Heat pumps

19.30 Maxwell’s demon