Entropy

Entropy, always denoted by a capital "S", measures the degree of disorder in a system in a quantitative way. It thus comes up with a number for disorder.
How can you put a number on the disorder in your daughter's room? You certainly know the room is disorderly (and that she doesn't have that from you), and you can assess it qualitatively on a scale going from "almost good" via "clean up or else" to "I'm now going to see my lawyer to strike you from my will". You make assessments like that all the time, but how can you come up with a number?
Well, to be honest, for complex systems like your daughter's room, you could come up with a number in principle but in reality you cannot. But for much simpler systems, like a large bunch of iron atoms, it is perfectly possible to find definitions that can and will give you a meaningful number.
The key to putting a number on disorder lies in looking first at the age-old definition of order:
 

Definition of Order:
There is one place for everything,
and everything is in its place

 
What this means is that there is only one possibility to have order in a system; there is only one way to arrange things orderly. Here is a picture of the opposite:
 
[Figure: Disorder — two happy kids who have distributed a can of white paint all over themselves and the room. Source: Sorry, don't know. This picture floated around the Net.]
 
Perfect order implies that the white paint is in the can and nowhere else. Two happy kids have found a way to distribute a lot of paint in many places where paint does not belong. Disorder in the room has most definitely increased. Note also that order is boring.
Let's look at two model systems to grasp how we define disorder now.
  • System 1:
    Your bedroom (or bathroom) and nSo dirty socks. If nSo = 14, it means there are 14 single dirty socks; if they always come in matching pairs, you have 7 dirty pairs. Then you have Nroom places where you could find your dirty socks (e.g. Nroom = 4: under the bed, on the floor, in the wrong drawer, in the hamper). We might call that the {nSo; Nroom} system. With the numbers given we have the special {nSo = 14; Nroom = 4} system, but we could assign other numbers to the two variables if we like.
  • System 2:
    A crystal and nV single vacancies or missing atoms. We have NCrystal places where you can find a vacancy. Since any one of the crystal atoms could be missing, NCrystal equals the number of atoms in the crystal. We might call that the {nV; NCrystal} system, and we can assign all kinds of numbers to the two variables.
Now ask yourself: In how many ways can I distribute my nSo dirty socks on the available Nroom places? My nV vacancies on the available NCrystal places?
From the way I set this up you probably figure that these two questions have the same general answer. Not really. Figure again!
All right, I'll give you a hint. In your bedroom, the red sock might be under the bed and the blue sock in the wrong drawer. Or it could be the other way around. You can definitely distinguish these two arrangements (on top of wherever the other socks are), and you must count them as two possibilities.
However, you cannot distinguish whether one vacancy sits on place 15 and another one on place 324, or whether it is the other way around. Both arrangements are identical in this case.
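If you'd rather see the counting done than just talked about, here is a rough Python sketch of both cases (the socks use the numbers from above; NCrystal = 1000 is an assumption of mine, picked just to keep the toy crystal small):

```python
from math import comb

# System 1: distinguishable socks. Each of the n_so socks independently
# picks one of the N_room places, so P = N_room ** n_so.
n_so, N_room = 14, 4
P_socks = N_room ** n_so            # 4**14 = 268,435,456 arrangements

# System 2: indistinguishable vacancies. Only *which* n_v of the
# N_crystal sites are empty matters, so P = "N_crystal choose n_v".
n_v, N_crystal = 14, 1000           # N_crystal = 1000 picked for illustration
P_vac = comb(N_crystal, n_v)        # about 1e31 arrangements

print(f"P (socks)     = {P_socks:,}")
print(f"P (vacancies) = {P_vac:.3e}")
```

Swapping two socks between their places gives a new arrangement; swapping two vacancies gives the same one. That is why the two systems need different counting formulas.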
Oh f...! This is getting messy. It gets even messier if you consider all the possibilities of being able / unable to distinguish between elements / arrangements. For the brave: here is the link; the faint of heart need not apply.
Don't worry, be happy! You certainly don't want to get immersed in that kind of detail, and the good news is that it is not necessary. All you need to know is that there are unambiguous answers to those questions. There is a definite number of possibilities to distribute nSo dirty socks on the available Nroom places, or nV vacancies on the available NCrystal places, or God knows what else. Even better, there are some nerds out there who can calculate those numbers; they probably have nothing better to do.
Let's use some age-old magic and deal with this vaguely frightening stuff by giving it a name. Let's call the number of possibilities to distribute whatever we want to distribute on the places available: P, short for possibilities. There. Now we can talk about it without really knowing what it is we talk about.
If P is large, the disorder is large. Remember: Perfect order means P = 1, there is only one possibility to put things away in perfect order.
Now comes major magic: The entropy of a disordered state with P possibilities to create the disorder is
 

S  =  k · ln P

 
That is Boltzmann's famous equation for the entropy of a system.
In words: The entropy S is proportional to the natural logarithm of the number of possibilities P to arrange the things in the system we are looking at. The proportionality constant is Boltzmann's constant k.
The letter "k" always denotes Boltzmann's constant: k = R/NA = 1,380 6503 ·10–23 J·K–1 = 8,617269 · 10–5 eV · K–1
R = 8.314 472 J·mol⁻¹·K⁻¹ is the "universal gas constant" and NA = 6.022 141 99 · 10²³ mol⁻¹ is the Avogadro number, the number of atoms or molecules in one mol of a substance.
A mol of a substance is defined as that amount of a substance that contains NA atoms or molecules. You always get 1 mol if you take as many grams of a substance as the number giving the atomic weight of its particles (H = 1, C = 12, Si = 28.09, Fe = 55.85, and so on).
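If you want to check those numbers, here is a quick sketch (the 10 g of iron are an arbitrary example of mine; the constants are the ones given above):

```python
R   = 8.314472              # universal gas constant in J/(mol·K)
N_A = 6.02214199e23         # Avogadro's number in 1/mol

# Boltzmann's constant really is R divided by Avogadro's number:
print(f"k = R/N_A = {R / N_A:.7e} J/K")   # 1.3806503e-23 J/K, as promised

# The gram-to-mol rule: grams / atomic weight = mols.
mass_g, atomic_weight_Fe = 10.0, 55.85    # grams; grams per mol of Fe
mols  = mass_g / atomic_weight_Fe         # = 0.179 mol
atoms = mols * N_A                        # = 1.078e23 iron atoms
print(f"{mass_g} g Fe = {mols:.3f} mol = {atoms:.3e} atoms")
```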
If it doesn't come back to you now, because you either never learned about the very basics of the world around you or forgot it, I feel sorry for you. More than that I can't do. Some are born to be wild, some are born to be Materials Scientists, but none are born to remain stupid. Get to it!
Boltzmann's constant is one of the few truly fundamental and universal constants. No theory exists that can calculate it from something more fundamental; we have to measure the numbers for those constants. Other fundamental constants are, for example, the speed of light c, and Planck's constant h.
Boltzmann's entropy law S = k · ln P is just as fundamental and important as Einstein's famous E = mc², linking mass m and energy E via the speed of light, or Planck's E = hν, linking energy and frequency via Planck's constant h = 4.1356 · 10⁻¹⁵ eV·s.
So if you have P possibilities for arranging the elements of your system, you know the degree of disorder. Take the natural logarithm of that number and multiply by the Boltzmann constant.
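Put as a sketch, using the constants from above and the two toy systems we already counted, the recipe is a one-liner:

```python
from math import comb, log

k_eV = 8.617269e-5                 # Boltzmann's constant in eV/K

def entropy(P, k=k_eV):
    """Boltzmann's equation: S = k * ln(P)."""
    return k * log(P)

P_socks = 4 ** 14                  # the {n_So = 14; N_room = 4} system
P_vac = comb(1000, 14)             # 14 vacancies on 1000 sites, as before

print(f"S(socks)     = {entropy(P_socks):.2e} eV/K")   # ~1.7e-03 eV/K
print(f"S(vacancies) = {entropy(P_vac):.2e} eV/K")     # ~6.2e-03 eV/K
```

The values come out tiny because k is tiny; only for astronomically large P, as in real crystals, does the entropy amount to something.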
Note that the number you get increases only slowly with P:
  • ln 1 = 0,
  • ln 10 = 2.30,
  • ln 100 = 4.6,
  • ln 1000 = 6.9,
  • ..., well, use your own pocket calculator!
We are done. We can calculate the entropy of a system with a bit of combinatorics. Since we can also calculate the energy, we can combine the two in the free energy and use that to calculate the nirvana conditions of the system.
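For a real crystal, with NCrystal around 10²³, you can forget about ever writing down P itself, but ln P is perfectly manageable: Stirling's approximation, here packaged in the log-gamma function, delivers it directly. A sketch of that trick, with an assumed vacancy concentration of one in a million:

```python
from math import lgamma

k = 1.3806503e-23                  # Boltzmann's constant in J/K

def ln_binom(N, n):
    """ln of the binomial coefficient 'N choose n', computed via ln Gamma(x)
    so it works even for astronomically large N, where P = C(N, n) could
    never be formed (let alone written down) explicitly."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

N = 6.02214199e23                  # one mol of lattice sites
n = 1e-6 * N                       # one vacancy per million sites (assumed)

S = k * ln_binom(N, n)             # S = k * ln P, without ever forming P
print(f"S = {S:.3e} J/K")          # ~1.2e-04 J/K for this mol of sites
```

That is roughly the kind of calculation the "Making vacancies" module mentioned below goes through in full, gruesome detail.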
  If that doesn't sound like something you want to spend your free time on, you are a guy like me. A bit lazy, maybe, but not crazy. Thank Boltzmann (and others): we don't need to go through all those steps. They have done that for us in full generality once and for all, coming up with ingenious ways to get what we want in a far simpler if more abstract way. This module gives a glimpse.
Now that we can calculate the entropy of some system, we can go places. For example to:
  • The science module "Second law".
  • The science module "Making vacancies". That's were we really go and calculate the entropy of a system down to the most gruesome detail.
     



© H. Föll (Iron, Steel and Swords script)