**General Remarks and State Functions**

Let's face it: Thermodynamics is not easy! It is not possible to learn it by just reading through this module.

However, if you have fought your way through thermodynamics proper at least once, and thus are able to look at it from a distance without getting totally confused by the "details" (which you don't have to know anymore, but must be able to understand when they come up), it's not so difficult either.

It gets even easier by restricting ourselves to solids, which means that most of the time we don't have to worry about the pressure anymore – it is simply constant. We nevertheless specify it here for the sake of general validity.

In this primer we will review the most important issues necessary for understanding defects, including defects in semiconductors. In order to stay simple, we must "cut corners". This means:

We will usually not show the functional relationships by listing the variables. We thus simply write G for the free enthalpy, and not G(T, p, n_{i}), which would show that G is a function of the temperature T, the pressure p, and the particle numbers n_{i}.

In the same spirit, we will omit the indexes showing what stays constant for partial derivatives, i.e. we write for the chemical potential µ_{i} of the particle sort i the simple form

µ_{i} = ∂G/∂n_{i}

In full splendor it should be

µ_{i} = (∂G/∂n_{i})_{T, p, n_{j≠i}}

We are also sloppy about standards. n_{i} may refer to particle numbers or to concentrations; in the latter case particles/cm^{3} or mol/cm^{3} – you must know what is meant from the context. You are also supposed to know that if Boltzmann's constant k comes up in an equation, we are working with properties per particle, whereas the gas constant R signifies properties per mol.
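The per-particle vs. per-mole bookkeeping can be made concrete with a minimal sketch; the constants are the standard CODATA values:

```python
# Sketch: Boltzmann's constant k (per particle) and the gas constant R
# (per mole) describe the same physics, linked by Avogadro's number N_A.
k   = 1.380649e-23    # J/K   (Boltzmann's constant)
N_A = 6.02214076e23   # 1/mol (Avogadro's number)
R   = k * N_A         # J/(mol*K), comes out as ~8.314

print(f"R = {R:.4f} J/(mol K)")

# The same thermal energy, expressed per particle and per mole, at 300 K:
T = 300.0
print(f"kT = {k*T:.3e} J per particle")
print(f"RT = {R*T:.1f} J per mole")
```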

This, admittedly, is dangerous. But multi-indexed quantities are confusing (and not easily written in HTML, anyway)! Let's stay simple and refer to complications whenever they come up.

If we restrict ourselves to crystals, it is rather easy to consider the concepts behind the all-important thermodynamic quantities Internal Energy, Enthalpy, Entropy, Free Energy and Free Enthalpy. We start with the internal energy U of a crystal.

Neglecting external energies (e.g. the gravitational potential) and internal energies that never change (e.g. the energy of the inner electrons), we are essentially left with the internal energy U being contained in the vibrations of the crystal atoms (or molecules), which express themselves in the temperature T of the system:

U = (f/2) · kT

With U = average energy per atom; f = degrees of freedom for "investing" energy in an atom (f = 6 for crystals: 3 for the kinetic energy in v_{x}, v_{y}, v_{z}, and 3 for the potential energy at (x, y, z)); k = Boltzmann's constant; and T = (absolute) temperature.
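As a quick numeric check of the relation above (a minimal sketch; only standard constants are assumed):

```python
k = 1.380649e-23   # J/K, Boltzmann's constant (per particle)
R = 8.314          # J/(mol K), gas constant (per mole)
f = 6              # degrees of freedom per atom in a crystal (3 kinetic + 3 potential)
T = 300.0          # K

u     = 0.5 * f * k * T   # average internal energy per atom, (f/2)kT = 3kT
U_mol = 0.5 * f * R * T   # the same quantity per mole, 3RT

print(f"u = {u:.3e} J/atom")
print(f"U = {U_mol:.0f} J/mol")
# Note: dU/dT per mole is 3R ~ 24.9 J/(mol K), the classical
# Dulong-Petit value for the heat capacity of a solid.
```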

The (macro)state of the system is thus given by the number of atoms N, the pressure p, and the temperature T. Knowing these numbers is all there is to know about the system on a macroscopic base.

We can change the state of the system by adding or removing heat Q, putting mechanical work W into the system or taking it out, and by changing the number of atoms (or more generally, particles) by some ΔN.

Since at this point we keep the number of atoms in our crystal constant, we only have to consider Q and W if we change the state. The following basic equation (a formulation of the 1st law of thermodynamics) holds:

dU = dQ – dW

With the changes written in differential form. Note that the regular "d" here is not the sign for (partial) derivatives (that would be ∂) but for "delta": dU, e.g., stands for the total change of U. Note that sometimes the d indicates a total differential (e.g. in the case of dU), sometimes it does not (e.g. in the case of dQ or dW).

If we reserve the letter "d" for total differentials only, the equation above actually should have been written as dU = δQ – δW.

Any mechanical work must change the volume (something must move); for the normal conditions encountered with crystals, where the pressure stays constant, it can always be expressed as

dW = p · dV

This pdV term is cumbersome as long as only situations involving crystals under constant pressure are considered. We thus introduce a new state function called enthalpy H and define it as

H = U + pV

If we again change the state of the system by adding or subtracting heat Q and mechanical work W, we now obtain for the total change in enthalpy dH

dH = dU + p · dV + V · dp

With V · dp = 0, because the pressure p is constant, and dU = dQ – p · dV, we obtain

dH = dQ

This simple relation is always best suited for systems under constant pressure; it also clarifies why we tend to think of enthalpy as heat.
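Since we restrict ourselves to solids at (essentially) constant pressure, it may help to see just how small the pV term in H = U + pV actually is for a crystal. A rough sketch; the molar volume is an approximate value for copper, used for illustration only:

```python
# Why H and U are nearly the same for solids at atmospheric pressure:
# compare the pV term with a Dulong-Petit estimate of U.
p     = 1.013e5   # Pa, atmospheric pressure
V_mol = 7.1e-6    # m^3/mol, approximate molar volume of copper (assumed)
R     = 8.314     # J/(mol K)
T     = 300.0     # K

pV = p * V_mol    # pressure-volume term of the enthalpy
U  = 3 * R * T    # rough internal energy per mole (Dulong-Petit, 3RT)

print(f"pV ~ {pV:.2f} J/mol, U ~ {U:.0f} J/mol, ratio ~ {pV/U:.1e}")
```

The pV term is smaller than U by roughly four orders of magnitude, which is why neglecting pressure effects in solids rarely hurts.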

dH is a measure of the energy needed to form a substance in a given state; it is occasionally also called the heat of formation (always referring to the difference between two states).

Of course, not much happens if the substance is just heated a bit but does not change its chemical nature – let's say we look at a mixture of H_{2} and O_{2} which we heat up a bit. All the fun comes from chemical reactions (or phase changes) – in our example it would be the formation of H_{2}O in a somewhat violent fashion.

It was thought that the sign of dH would indicate if a reaction should or should not occur. A negative sign would mean that the reaction would transfer energy to the surroundings and thus could easily happen, whereas a positive sign would tell us that energy would have to be pumped into the system – nothing would happen by itself.

It's not that simple! While this point of view was true enough for relatively large dH (let's say dH > 100 kcal/mol), the criterion often does not work for smaller changes of dH.

The reason, of course, is that we neglected the change of the entropy S of the system, dS, that occurs parallel to dH.

Purely mechanical systems (consisting of non-interacting mass points) would be in equilibrium for the lowest possible internal energy, i.e. for a minimum in their potential energy and no movement – just lying still at the lowest possible point. But thermodynamic systems, consisting of many interacting particles under some externally fixed condition (e.g. a constant temperature), are in equilibrium if the best possible balance between a small energy and a large entropy is achieved!

We just take that as an article of faith (or law of nature) at this point.

Often, both quantities are opposed to each other: high entropies mean high energies and vice versa. The entropy part becomes more important at high temperatures, and the thermodynamic potential which has to be minimized for systems under constant pressure is the free enthalpy G (also called Gibbs energy). It is defined as

G = H – TS

With S = entropy; dS = dQ_{rev}/T in classical thermodynamics (the suffix "rev" refers to reversible processes).
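The balance between the enthalpy term and the entropy term in G = H – TS can be made concrete with a toy calculation. The numbers below are purely hypothetical: a process that costs enthalpy (dH > 0) but gains entropy (dS > 0) becomes favourable above the temperature T* = dH/dS where dG changes sign.

```python
# Hypothetical reaction numbers, for illustration only.
dH = 40_000.0   # J/mol, assumed enthalpy change (costs energy)
dS = 100.0      # J/(mol K), assumed entropy change (gains entropy)

def dG(T):
    """Change of the free enthalpy, dG = dH - T*dS, at temperature T."""
    return dH - T * dS

for T in (300.0, 400.0, 500.0):
    print(f"T = {T:.0f} K: dG = {dG(T):+.0f} J/mol")

T_star = dH / dS   # temperature where dG changes sign
print(f"dG = 0 at T* = {T_star:.0f} K")
```

Below T* the process does not run by itself; above T* the entropy gain wins.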

If you have a system with constant volume (and variable pressure), the best suited state function is the free energy F (also called Helmholtz energy). It is defined as

F = U – TS

Before turning to the entropy, a word on the choice of state functions. We now already have four: U, H, G, F – but for a given system, there is only one state. Two things are important in this context:

State functions, by definition, must describe the state of a system no matter how this state developed – they must, in other words, meet all the requirements for potentials and thus are thermodynamic potentials. We have not proved if this is the case for U, H, G, F – turn to the potential module for some input on this question – but they really are potentials.

Any state function or thermodynamic potential can be used to describe any system (always for equilibrium, of course), but for a given system some are more convenient than others. The most convenient (and thus important) one for crystals (usually under constant pressure) is the free enthalpy.

**Entropy - Statistical Consideration**

The key question is:

**What is entropy?**

There is a classical answer, but here we only use the statistical definition, where entropy is a measure of the "probability" w of a given macrostate or, essentially the same thing, of the number P of microstates possible for the given macrostate.

Not too helpful: What is a microstate or a macrostate? Or the probability of a macrostate?

Well, any particular arrangement of atoms (or more generally, particles) where we look only at average quantities is a macrostate, while any individual arrangement defining the properties (e.g. location and momentary velocity) of all the particles for a given macrostate is a microstate.

In other words, and somewhat simplified: For a microstate it matters what individual particles do, for the macrostate it does not.

The difference between microstates and macrostates is best illustrated for a gas in a closed container. We can define many possible macrostates, e.g.:

**1.** All molecules are in the left half of the container.
**2.** **70 %** of the molecules are in the left half of the container, **30 %** in the right half.
**3.** Equal (average) distribution of the molecules.

All of these macrostates may have the same H (or U).

However, the probability of experimentally finding one or the other of those macrostates is very different. The probabilities of macrostates 1 and 2 are certainly much, much smaller than the probability of macrostate 3.

For all the possible macrostates, the state function tells us which one will be realized (= is most probable) in thermal equilibrium.

How do we calculate the probability of a macrostate? Let's see:

For every possible macrostate we can think of, there are many microstates to realize it. It's exactly like playing dice: let's assume you have 3 dice. A macrostate would be some possible number you may throw, e.g. 9. The corresponding microstates are the possible combinations of the individual dice. For throwing 9 we have

9 = 1 + 2 + 6;  9 = 1 + 3 + 5;  9 = 2 + 3 + 4;  9 = 3 + 3 + 3

- and so on. You get the picture.

The probability of such a macrostate would be the number of its microstates divided by the number of all possible combinations of the dice (which is a constant). We can see off-hand that the macrostates "3" and "18" are the most unlikely ones, having only one microstate at their disposal, while 9, 10, or 12 are more likely to occur.
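The dice counting above is easy to reproduce; a small sketch enumerating all microstates:

```python
from itertools import product
from collections import Counter

# Count the microstates (ordered triples of dice values) behind each
# macrostate (the sum thrown), as in the dice example above.
counts = Counter(sum(dice) for dice in product(range(1, 7), repeat=3))
total = 6 ** 3   # number of all possible microstates

for s in (3, 9, 10, 18):
    print(f"sum {s:2d}: {counts[s]:2d} microstates, "
          f"probability {counts[s] / total:.3f}")
```

The sums 3 and 18 indeed have exactly one microstate each, while 9 and 10 have 25 and 27, respectively.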

Now we know what the number P of possible ways to generate the same macrostate means, and why the "probability" w of a given macrostate is "almost" the same thing.

An example just as easy as playing dice comes from our friend, the vacancy. We simply ask: how many ways (= microstates P) are there to arrange n vacancies (= macrostate) in a crystal of N atoms?

When we figure that out, we can use the equilibrium condition to select the most likely macrostate, and this gives us the number of vacancies in equilibrium.
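Under the simplifications used here – n vacancies distributed over N lattice sites, so P = N!/(n!(N – n)!), an assumed formation energy Ef per vacancy, and ignoring everything except the configurational entropy – this program sketches the procedure: it minimizes G = n·Ef – T·S with S = k · ln P and compares the result with the familiar Boltzmann-factor formula. Ef and T are hypothetical illustration values.

```python
import math

k  = 8.617e-5   # eV/K, Boltzmann's constant
Ef = 1.0        # eV, assumed vacancy formation energy (illustration)
T  = 1000.0     # K
N  = 10**6      # number of lattice sites (kept small so the search is cheap)

def G(n):
    """Configurational free enthalpy G = n*Ef - T*k*ln(P), P = C(N, n)."""
    # lgamma(x) = ln((x-1)!) lets us evaluate ln of huge factorials.
    lnP = math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
    return n * Ef - T * k * lnP

n_eq = min(range(1, 200), key=G)   # brute-force search for the minimum

print(f"minimised: n/N = {n_eq / N:.2e}")
print(f"formula:   n/N = {math.exp(-Ef / (k * T)):.2e}")
```

Both numbers agree, which is the point of the exercise module mentioned below: minimizing G over the macrostates reproduces n/N ≈ exp(–Ef/kT).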

The fundamental point now is that just knowing the internal energy U of a system with a constant volume and temperature is not good enough to tell us what the equilibrium configuration will be, because we could think of many macrostates with the same U (and mother nature, to be sure, can come up with lots more).

That's why just minimizing U (or H) is not good enough; we have to minimize F = U – TS or G = H – TS to find the equilibrium configuration of the system, and for that we have to know the entropy, because now we can interpret these formulas:

Of all the many macrostates possible for a given H (or U), the one with the largest entropy at the given temperature will be the one that the system adopts.

Obviously, we need to be able to calculate the entropy of a certain macrostate, and this is done by employing the statistical definition of the entropy S, the famous Boltzmann entropy equation:

S = k · ln P

With w = probability of a macrostate and P = number of microstates for a macrostate.
||||||||||||||||||||||

If you feel that the ambiguity with respect to taking w or P is a bit puzzling – that's because it is! You should consult the link to see that at least it is nothing to worry about. Whatever you choose to work with, the results you will get in the end will be the same.
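That w and P give the same answer is easy to verify numerically: since w = P/P_total, ln w and ln P differ only by the constant ln P_total, which cancels in any entropy difference. A sketch using the dice example from above:

```python
import math
from itertools import product
from collections import Counter

# Entropy differences computed with P (number of microstates) and with
# w = P / P_total (probability) are identical, because the constant
# ln(P_total) drops out of the difference.
counts = Counter(sum(d) for d in product(range(1, 7), repeat=3))  # 3 dice
P_total = 6 ** 3

P9, P10 = counts[9], counts[10]
dS_P = math.log(P10) - math.log(P9)                       # using P
dS_w = math.log(P10 / P_total) - math.log(P9 / P_total)   # using w

print(f"dS/k via P: {dS_P:.6f}")
print(f"dS/k via w: {dS_w:.6f}")
```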

Entropy S, by the way, is not a state function (TS would be one).

We used the statistical definition of entropy and the minimization of the free enthalpy in chapter 2.1; an exercise module shows in detail how to apply it to derive the formula for the vacancy concentration.

2.1.1 Simple Vacancies and Interstitials

Vagaries in the Statistical Definition of the Entropy

Internal Energy, Enthalpy, Entropy and Free Enthalpy

Boltzmann's Constant and Gas Constant

Solution to Basic Exercise 2.1-4

© H. Föll (Defects - Script)