This module replaces the old German module "Elektrische Leitfähigkeit", which is, however, still available under "Illustrations". But there are substantial differences between the new and the old module!
A new module in German that covers roughly the same content is available in another hyperscript.
If you have trouble with the content (no matter whether in German or in English), you should also consult the two German modules "Temperatur, Gleichverteilungssatz etc. – die Grundlagen" and "Beweglichkeit und Diffusion", or work through the essentials of thermodynamics in another hyperscript right away.
In this subchapter we will look at the classical treatment of the movement of electrons inside a material in an electrical field.  
In the preceding subchapter we obtained the most basic formulation of Ohm's law, linking the specific conductivity σ to two fundamental material parameters:

σ = q · n · µ
For a homogeneous and isotropic material (e.g. polycrystalline metals or single crystals of cubic semiconductors), the concentration of carriers n and their mobility µ have the same value everywhere in the material, and the specific conductivity σ is a scalar.
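As a numerical illustration (not part of the original text), the relation σ = q · n · µ can be evaluated with a short script; the copper-like values for n and µ below are assumed round numbers, not measured data:

```python
# Evaluate sigma = q * n * mu for a copper-like metal.
# n and mu are assumed round numbers for illustration only.
E_CHARGE = 1.602e-19        # elementary charge q in C

def conductivity(n, mu, q=E_CHARGE):
    """Specific conductivity sigma = q * n * mu in S/m."""
    return q * n * mu

n_cu = 8.5e28               # free-electron density in 1/m^3 (about 1 per atom)
mu_cu = 4.3e-3              # electron mobility in m^2/(V*s)

sigma = conductivity(n_cu, mu_cu)
print(f"sigma = {sigma:.2e} S/m")   # a few 1e7 S/m, the right order for metals
```

The result lands in the 10^7 S/m range typical of good metals – a quick sanity check of the formula.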
This is boring, however. So let's look at useful complications:  
In general terms, we may have more than one kind of carrier (this is the common situation in semiconductors) and n and µ could be functions of the temperature T, the local field strength E_{loc} resulting from an applied external voltage, the detailed structure of the material (e.g. the defects in the lattice), and so on.  
We will see that these complications are the essence of advanced electronic materials (especially semiconductors), but in order to make life easy we first will restrict ourselves to the special class of ohmic materials.  
We have seen before that this requires n and µ to be independent of the local field strength. However, we still may have a temperature dependence of σ; even commercial ohmic resistors, after all, do show a more or less pronounced temperature dependence – their resistance increases roughly linearly with T.
In short, we are treating metals, characterized by a constant density of one kind of carriers (= electrons) in the order of 1...3 electrons per atom in the metal.  
Basic Equations and the Nature of the "Frictional Force"  
We consider the electrons in the metal to be "free", i.e. they can move freely in any direction – the atoms of the lattice thus by definition do not impede their movement.
The (local) electrical field E_{loc} then exerts a force F=– e · E_{loc} on any given electron and thus accelerates the electrons in the field direction (more precisely, opposite to the field direction because the field vector points from + to – whereas the electron moves from – to +).  
In the fly swarm analogy, the electrical field would correspond to a steady airflow – some wind – that moves the swarm about with constant drift velocity.
Now, if a single electron with the (constant) mass m and momentum p is subjected to a force F, the equation of motion from basic mechanics is

F = dp/dt = m · (dv/dt)
Note that p does not have to be zero when the field is switched on.  
If this were all, the velocity of a given electron would acquire an ever-increasing component in field direction and eventually approach infinity. This is obviously not possible, so we have to bring in a mechanism that prevents an unlimited increase in v.
In classical mechanics this is done by introducing a frictional force F_{fr} that is proportional to the velocity:

F_{fr} = –k_{fr} · v
Here, k_{fr} is some friction constant. But this, while mathematically sufficient, is devoid of any physical meaning with regard to the moving electrons.  
There is no "friction" on an atomic scale! Think about it! Where should a friction force come from? An electron feels only forces from two kinds of fields – electromagnetic and gravitational (neglecting strange stuff from particle physics).
It thus makes no sense to complement the differential equation above with a friction term – we have to look for a better approach.
All that friction does to big classical bodies is to dissipate ordered kinetic energy of the moving body to the environment. Any ordered movement gets slowed down to zero surplus speed, and the environment gets somewhat hotter instead, i.e. unordered movement has increased.  
This is called energy dissipation, and that is what we need: mechanisms that take kinetic energy away from an electron and "give" it to the crystal at large. The science behind that is called (Statistical) Thermodynamics – we have encountered it before.
The best way to think about this is to assume that the electron, flying along with increasing velocity, will hit something else along its way every now and then; it has a collision with something else, or, as we will say from now on, it will be scattered by something else.  
This collision or scattering event will change its momentum, i.e. the magnitude and the direction of v, and thus also its kinetic energy E_{kin}, which is always given by

E_{kin} = (1/2) · m · v²
In other words, we consider collisions with something else, i.e. other particles (including "pseudo" particles), where the total energy and momentum of all the particles is preserved, but the individual particle loses its "memory" with respect to its velocity before the collision, and starts with a new momentum after every collision.
What are the "partners" for collisions of an electron, or put in standard language, what are the scattering mechanisms? There are several possibilities:  
Other electrons. While this may happen, it is not the most important process in most cases. It also does not decrease the total energy contained in the electron movement – the losses of some electrons are the gains of others.
Defects, e.g. foreign atoms, other point defects (i.e. vacancies, interstitials) or dislocations. This is a more important scattering mechanism; moreover, it is one by which the electron can transfer its surplus energy (obtained through acceleration in the electric field) to the atoms of the lattice, which means that the material heats up.
Phonons, i.e. "quantized" lattice vibrations traveling through the crystal. This is the most important scattering mechanism.  
The last aspect is a bit strange. While we (hopefully) have no problem imagining a crystal lattice with all atoms vibrating merrily, there is no immediate reason to consider these vibrations as being localized (whatever that means) and particle-like.
You are right – but nevertheless: The lattice vibrations indeed are best described by a bunch of particle-like phonons careening through the crystal.
This follows from a quantum mechanical treatment of lattice vibrations. It can then be shown that these vibrations, which contain the thermal energy of the crystal, are quantized and show typical properties of (quantum) particles: They have a momentum, and an energy given by hν (h = Planck's constant, ν = frequency of the vibration).
Phonons are a first example of "pseudo" particles; but there is no more "pseudo" to phonons than there is to photons. (Both of them are bosons, by the way.)  
We will not go into more details here. All we need to know is that a hot crystal has more phonons and more energetic phonons than a cold crystal, and treating the interaction of an electron with the lattice vibration as a collision with a phonon gives not only correct results, it is the only way to get results at all.  
At this point comes a crucial insight: It would be far from the truth to assume that only accelerated electrons scatter; scattering happens all the time to all the electrons moving randomly about because they all have some thermal energy. Generally, scattering is the mechanism to achieve thermal equilibrium and equidistribution of the energy of the crystal.  
If electrons are accelerated in an electrical field and thus gain energy in excess of thermal equilibrium, scattering is the way to transfer this surplus energy to the lattice which then will heat up. If the crystal is heated up from the outside, scattering is the mechanism to turn heat energy contained in lattice vibrations to kinetic energy of the electrons.  
Again: Even without an electrical field, scattering is the mechanism to transfer thermal energy from the lattice to the electrons (and back). Generally, scattering is the mechanism to achieve thermal equilibrium and equidistribution of the energy of the crystal.  
Our free electrons in metals behave very much like a gas in a closed container. They careen around with some average velocity that depends on the energy contained in the electron gas, which is – in classical terms – a direct function of the temperature.  
Averaging over Random Scattering Events  
Let's look at some figures illustrating the scattering processes.  

Shown here is the magnitude of the velocity v_{±x} of an electron in +x and –x direction without an external field. The electron moves with constant velocity until it is scattered, then it continues with some new velocity.
The scattering processes, though unpredictable for any single event, must lead to averages of the velocity that are characteristic for the material and its conditions.
The plural in "averages" is intentional: there are different averages of the velocity!  
Whereas the vector average <v> = 0, the average magnitude <|v|> has a finite value (consult the "fly swarm module" if you are unsure about this); this is also true for <v_{+x}> and <v_{–x}>, where the averages are taken either over the positive or over the negative values only (see drawing). Here it holds that <v_{+x}> = –<v_{–x}>, since due to the randomness of the scattering events there is no difference between the two directions.
From classical thermodynamics we know that the (classical) electron gas in thermal equilibrium with the environment contains the energy E_{kin} = (1/2)kT per particle and degree of freedom, with k = Boltzmann's constant and T = absolute temperature. If you forgot all about this, check this link, too. The three degrees of freedom are the possible movements in x-, y- and z-direction, so we have (considering just one of them)

(1/2) · m · <v_{x}²> = (1/2) · kT

For the other directions we have exactly the same relations, of course. For the total energy we obtain

E_{kin} = (3/2) · kT = (1/2) · m · v_{0}²

with

v_{0}² = <v_{x}²> + <v_{y}²> + <v_{z}²>,   i.e.   v_{0} = (3kT/m)^{1/2}
Note that by using classical thermodynamics to derive this result, all processes occurring in an ideal gas (here formed by the free electrons) are included. This means that electron–electron scattering is already covered by this expression. Therefore, from now on electron–electron scattering events do not play any role anymore.
At this point you should stop a moment and think about just how fast those electrons will be careening around at room temperature (300 K) – without plugging numbers into the equation!
Got a feeling for it? Probably not. So look at the exercise question (and the solution) further down!  
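For checking your feeling afterwards, here is a minimal sketch (not from the original text): it evaluates v_{0} = (3kT/m)^{1/2}, which follows directly from the (1/2)kT per degree of freedom stated above.

```python
# Classical thermal velocity v0 = sqrt(3*k*T/m) of a free electron,
# following from E_kin = (1/2)kT per degree of freedom.
import math

K_B = 1.381e-23      # Boltzmann's constant in J/K
M_E = 9.109e-31      # electron mass in kg

def v0_classical(T):
    """Mean thermal velocity of a classical electron gas at temperature T."""
    return math.sqrt(3 * K_B * T / M_E)

print(f"v0(300 K) = {v0_classical(300):.2e} m/s")   # about 1.2e5 m/s
```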
Now you should stop another moment and become very aware of the fact that this equation comes from purely classical physics. It is absolutely true for classical particles – which electrons are not, actually. Electrons obey the Pauli principle, i.e. they behave about as non-classically as possible. This should make you feel a bit uncomfortable. Maybe the equation from above is not correct for electrons then? Indeed, it isn't. Why this is so we will see later, and also how we can "repair" the situation!
Now let's turn on an electrical field. It will accelerate the electrons between the collisions (which now are collisions with defects and phonons only, since we stick to the classical treatment from above). Their velocity in field direction then increases linearly from whatever value it had right after a collision to some larger value right before the next collision.
In our diagram from above this looks like this: 

Here we have an electrical field that accelerates electrons in +x-direction (and "brakes" them in –x-direction). Between collisions, the electron gains velocity in +x-direction at a constant rate (= identical slope).
The average velocity in +x-direction, <v_{+x}>, now has a larger absolute value than that in –x-direction, <v_{–x}>.
However, beware of the pitfalls of schematic drawings: For real electrons the difference is very small as we shall see shortly; the slope in the drawing is very exaggerated.  
The drift velocity is contained in the difference <v_{+x}> – <v_{–x}>; it is completely described by the velocity gain between collisions. For obtaining a value, we may neglect the instantaneous velocities right after a scattering event because they average to zero anyway, and just plot the velocity gain in a simplified picture, always starting from zero after a collision.

The picture now looks quite simple; but remember that it contains some not so simple averaging.  
At this point it is time to define a very meaningful new average quantity to describe the influence of the scattering processes on the drift velocity:  
A certain mean time τ between collisions, which for certain reasons (becoming clear only later) is defined as the mean time for reaching the drift velocity v_{D} in the simplified diagram. We also call τ the mean scattering time or just scattering time for short.
This is most easily illustrated by simplifying the scattering diagram once more: We simply use just one time – the average – for the time that elapses between scattering events and obtain:

This is the standard diagram illustrating the scattering of electrons in a crystal usually found in textbooks; the definition of the scattering time τ is included.
It is highly idealized (if not to say just wrong) if you compare it to the correct picture above. Of course, the average velocity of both pictures will give the same value, but that's like saying that the average speed v_{a} of all real cars driving around in a city is the same as the average speed of ideal model cars, which are going at v_{a} all the time.  
Note that τ is only half of the average time between collisions.
So, while this diagram is not wrong, it is a highly abstract rendering of the underlying processes obtained after several averaging procedures. From this diagram only, no conclusion whatsoever can be drawn as to the average velocities of the electrons without the electrical field!  
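The simplified diagram can also be checked numerically. The sketch below (an illustration with assumed values for τ and E, not part of the original text) lets the velocity gain grow linearly from zero and resets it after the full time 2τ between collisions; the time average then equals e · E · τ/m, i.e. exactly the drift velocity.

```python
# Numerical sketch of the simplified scattering diagram: after each
# collision the velocity gain restarts at zero and grows at the constant
# rate a = e*E/m until the next collision a time 2*tau later.
# tau and E_field are assumed illustration values.
E_CHARGE = 1.602e-19     # C
M_E = 9.109e-31          # kg

def drift_velocity_numeric(E_field, tau, steps=1000):
    """Time-average of the velocity gained between two collisions."""
    a = E_CHARGE * E_field / M_E   # acceleration between collisions
    t_free = 2 * tau               # full time between two collisions
    dt = t_free / steps
    total, t = 0.0, 0.0
    for _ in range(steps):
        total += a * t * dt        # velocity grows linearly with time t
        t += dt
    return total / t_free

tau = 2.5e-14                      # assumed scattering time in s
E_field = 100.0                    # assumed field in V/m
v_num = drift_velocity_numeric(E_field, tau)
v_analytic = E_CHARGE * E_field * tau / M_E
print(v_num, v_analytic)           # the two values agree closely
```

Note the factor-of-two bookkeeping: the average over a linear ramp of duration 2τ is exactly the value reached after τ, which is why τ is defined as half the time between collisions.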
New Material Parameters and Classical Conductivity  
With the scattering concept, we now have two new (closely related) material parameters:  
The mean (scattering) time τ between two collisions as defined before.
The mean free path l between collisions; i.e. the distance travelled by an electron (on average) before it collides with something else and changes its momentum. We have

l = 2 · τ · (v_{0} + v_{D}) ≈ 2 · τ · v_{0}
Note that v_{0} enters the defining equation for l, and that we have to take twice the scattering time τ because it only refers to half the time between collisions!
After we have come to this point, we now can go on: Using τ as a new parameter, we can rewrite Newton's equation from above for an electron (q = –e) as follows:

m · (dv/dt) = m · (v_{D} / τ) = –e · E
We now only consider what happens to the electron as long as it doesn't hit anything. Then it is possible to equate the differential quotient with the difference quotient, because the velocity change is constant. After a scattering event has taken place, the process is completely interrupted and starts under "virgin" conditions again.  
We obtain immediately the relation between the drift velocity v_{D} and the applied field E:

v_{D} = –(e · τ / m) · E
Inserting this equation for v_{D} in the old definition of the current density j = –n · e · v_{D} and invoking the general version of Ohm's law, j = σ · E, yields

σ · E = (n · e² · τ / m) · E
This gives us the final result

σ = (n · e² · τ) / m
This is the classical formula for the conductivity of a classical "electron gas" material, i.e. metals. The conductivity contains the density n of the free electrons and their mean classical scattering time τ as material parameters.
We have a good idea about n, but we do not yet know τ_{class}, the mean classical scattering time for classical electrons. However, since we know the order of magnitude for the conductivity of metals, we may turn the equation around and use it to calculate the order of magnitude of τ_{class}. If you do the exercise further down, you will see that the result is:

τ_{class} ≈ (10^{–14} ... 10^{–13}) s
"Obviously" (as stated in many text books), this is a value that is far too small and thus the classical approach must be wrong. But is it really too small? How can you tell without knowing a lot more about electrons in metals?  
Let's face it: you can't! So let's look at the mean free path l instead. We have

l = 2 · τ · (v_{0} + v_{D}) ≈ 2 · τ · v_{0}   with   v_{0} = (3kT/m)^{1/2}

The last equation gives us a value v_{0} ≈ 10^{5} m/s at room temperature! Now we need v_{D}, which we can estimate from the equation given above to be of the order of millimeters per second (or less) for typical current densities – completely negligible compared to v_{0}.
We thus can rewrite the equation for the conductivity and obtain

σ = (n · e² · l) / (2 · m · v_{0})
Knowing σ from experiments, but not l, makes it possible to determine l. The mean free path l between collisions (for v_{D} ≈ 0) for a typical metal thus is

l ≈ (10^{–9} ... 10^{–8}) m
And this is certainly too small!  
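The two estimates can be reproduced with a few lines; σ and n below are assumed round numbers for a good, copper-like metal, so only the orders of magnitude matter:

```python
# Turn sigma = n*e^2*tau/m around to get tau, then l = 2*tau*v0.
# sigma and n are assumed round numbers for a copper-like metal.
import math

E_CHARGE = 1.602e-19   # C
M_E = 9.109e-31        # kg
K_B = 1.381e-23        # J/K

sigma = 6.0e7          # conductivity in S/m
n = 8.5e28             # free-electron density in 1/m^3

tau = sigma * M_E / (n * E_CHARGE**2)     # classical scattering time
v0 = math.sqrt(3 * K_B * 300 / M_E)       # classical thermal velocity, 300 K
l = 2 * tau * v0                          # mean free path

print(f"tau = {tau:.1e} s, l = {l:.1e} m")   # tau ~ 1e-14 s, l ~ a few nm
```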
But before we discuss these results, let's see if they are actually true by doing an exercise:  
 
Now to the important question: Why is a mean free path of just a few nanometers too small?
Well, think about the scattering mechanisms. The distance between lattice defects is certainly much larger, and a phonon itself is "larger", too.  
It does not pay to spend more time on this. Whichever way you look at it, whatever tricky devices you introduce to make the approximations better (and physicists have tried very hard!), you will not be able to solve the problem: The mean free paths are never even coming close to what they need to be, and the conclusion which we will reach – maybe reluctantly, but unavoidably – must be:
The classical treatment of electrons in metals cannot be correct!
Scattering and a New Look on Mobility  
Somewhere on the way, we have also indirectly found that the mobility µ as defined before is just another way to look at scattering mechanisms. Let's see why.  
All we have to do is to compare the equation σ = (n · e² · τ)/m for the conductivity from above with the master equation σ = q · n · µ.
This gives us immediately

q · n · µ = (n · e² · τ) / m
In other words:

µ = (e · τ) / m
The mobility µ thus is a basic material property, well-defined even without electrical fields, and just another way to characterize the scattering processes taking place by a single number.
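As a quick cross-check (using the assumed classical scattering time from the estimate above, an illustration value), µ = e · τ/m indeed lands in the few-times-10^{–3} m²/Vs range known for electrons in good metals:

```python
# Mobility mu = e*tau/m; tau is the assumed classical scattering time
# estimated above for a copper-like metal.
E_CHARGE = 1.602e-19   # C
M_E = 9.109e-31        # kg

def mobility(tau):
    """Electron mobility mu = e*tau/m in m^2/(V*s)."""
    return E_CHARGE * tau / M_E

mu = mobility(2.5e-14)
print(f"mu = {mu:.1e} m^2/(V*s)")   # a few 1e-3 m^2/(V*s)
```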
We even can go one stage further with this: If we envision the movement of an electron again, as described above in many words, analogies ("fly swarm"), graphs and equations, we "see" exactly the same thing we envisioned when we looked at a diffusing particle or vacancy when we learned about diffusion and random walk.
"Something" bounces around in a random manner, and everything important about that "something" was captured in its diffusion coefficient D. This diffusion coefficient was either defined via Fick's laws (e.g. Fick's first law), or directly via the random walk of the diffusing particle.
You should now have a certain feeling that all this old stuff about diffusion and what we just learned about the random bouncing around of electrons must be somehow connected. After all, we always have the element of something moving around (mostly) at random.
Right you are! Again, it was Einstein (and independently Smoluchowski) who found the proper relation, the Einstein–Smoluchowski relation hinted at a chapter ago:

µ = (e / kT) · D
The mobility µ thus is "almost" the same as the diffusion coefficient D; for a given temperature T they are proportional to each other.
How do we obtain this simple relation? Well – we won't derive it at this point. It's not all that difficult to derive, but it is no accident either that it is named after Einstein (it goes back to his famous 1905 work on Brownian motion).
If you are not satisfied with that, check this link for a derivation, or this one for an alternative way. More to the relation between diffusion coefficient and mobility in this (German) link.  
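As a sketch (the mobility is the assumed illustration value from above), the Einstein–Smoluchowski relation turned around gives the diffusion coefficient of the electrons:

```python
# Einstein-Smoluchowski relation mu = e*D/(k*T), turned around:
# D = mu*k*T/e. mu is an assumed illustration value.
K_B = 1.381e-23        # J/K
E_CHARGE = 1.602e-19   # C

def diffusion_coefficient(mu, T):
    """Diffusion coefficient D = mu*k*T/e in m^2/s."""
    return mu * K_B * T / E_CHARGE

D = diffusion_coefficient(4.4e-3, 300)
print(f"D = {D:.1e} m^2/s")   # about 1.1e-4 m^2/s
```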

Mobility and Speed of Electronic Devices  
In the equations above slumbers an extremely important aspect of semiconductor technology:
In all electronic devices carriers have to travel some distance before a signal can be produced. A MOS transistor, for example, switches currents on or off between its "Source" and "Drain" terminals depending on what voltage is applied to its "Gate". Source and drain are separated by some distance l_{SD}, and the "Drain" only "feels" the "on" state after the time it takes the carriers to run the distance l_{SD} .  
How long does that take if the voltage between Source and Drain is U_{SD}?  
Easy. If we know the mobility µ of the carriers, we also know their (average) velocity v_{SD} in the source–drain region, which by definition is

v_{SD} = µ · E = µ · (U_{SD} / l_{SD})
The traveling time t_{SD} between source and drain for obvious reasons roughly defines the maximum frequency f_{max} the transistor can handle; we have

t_{SD} = l_{SD} / v_{SD},   i.e.   f_{max} ≈ 1 / t_{SD} = (µ · U_{SD}) / l_{SD}²
The maximum frequency of a MOS transistor thus is directly proportional to the mobility of the carriers in the material it is made from (always provided there are no other limiting factors). And since we used a rather general argument, we should not be surprised that pretty much the same relation is also true for most electronic devices, not just MOS transistors.  
This is a momentous statement: We linked a prime material parameter, the material constant µ, to one of the most important parameters of electronic circuits. We would like µ to be as large as possible, of course, and now we know what to do about it!  
Actually, we do not really know what to do, but other people do  and act on it. See the link to find out how it is done.  
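To get an order-of-magnitude feeling for f_{max} ≈ µ · U_{SD}/l_{SD}², here is a sketch with assumed device values (a silicon-like electron mobility, 1 µm source–drain distance, 1 V); none of these numbers come from the text:

```python
# Rough maximum frequency of a MOS transistor from the transit time:
# v_SD = mu*U_SD/l_SD, t_SD = l_SD/v_SD, f_max ~ 1/t_SD.
# All device values are assumed for illustration only.
mu = 0.14        # electron mobility in m^2/(V*s), silicon-like
U_SD = 1.0       # source-drain voltage in V
l_SD = 1.0e-6    # source-drain distance in m (1 um)

v_SD = mu * U_SD / l_SD      # average carrier velocity
t_SD = l_SD / v_SD           # transit time
f_max = 1.0 / t_SD           # rough maximum frequency

print(f"f_max = {f_max:.1e} Hz")   # order of 1e11 Hz for these numbers
```

A larger µ or a shorter l_{SD} raises f_{max}, which is the link between material choice and circuit speed stated above.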
A simple exercise is in order to see the power of this knowledge:
 

© H. Föll (MaWi 2 Skript)