STATISTICAL PHYSICS
TODD KARIN
Abstract. This summary reviews some important results and assumptions of statistical physics. I made this summary during a graduate-level statistical physics course at the University of Washington with Prof. Boris Spivak. Much of this summary is copied straight from an undergraduate statistical physics book [3]. Some of the notation has been made to follow Landau (the course text) [4]. Reif is another extremely useful text for certain concepts [5].

Date: October 2012. Contact: Todd Karin, University of Washington, tkarin (at) uw (dot) edu.
1. Entropy and Temperature

1.1. Accessible States and Entropy. An accessible state is a quantum state compatible with all the information known about the system (e.g. energy, volume, etc.). Sometimes, a state may be inaccessible because of initial conditions. For example, consider a box with a divider in the center and a gas on the right side. The volume accessible to the gas is clearly only the volume of the right side, even though if the gas were distributed over the whole box the energy would be the same.

This can be understood more rigorously by inspecting the proof of the H theorem (Reif [5] p. 625). The H theorem describes how a quantum system will tend to move towards a distribution where all the accessible states are equally likely. The proof relies on a non-zero transition probability between accessible states, which agrees with our intuition that a system will never move to a state if the transition probability is zero.

We will call the number of accessible states g the statistical weight or multiplicity of the system.

1.2. The random walk. Random walk statistics are very common. In the random walk, a drunk starts out at a lamppost and starts stumbling around [5]. In fact, he is so drunk that every time he takes a step he immediately forgets which way he was going. After taking N = n₊ + n₋ steps, he finds himself a distance n = n₊ − n₋ away from where he started. The number of ways he can get there is

g(n, N) = N!/(n₊! n₋!).

He is not likely to get very far. For large N it is often helpful to use Stirling's approximation,

log N! ≈ N log N − N,

to expand the factorial.
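It is straightforward to show with Stirling's approximation that for n ≪ N the multiplicity is approximately Gaussian. Writing n₊ = (N + n)/2 and n₋ = (N − n)/2 and expanding log g to second order in n/N,

g(n, N) ≈ 2^N √(2/πN) e^{−n²/2N},

so the walker is overwhelmingly likely to be found within a few √N steps of the lamppost.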
1.3. Postulate of equal a priori probabilities. A fundamental assumption is that a closed system in equilibrium is equally likely to be in any of its accessible states. Therefore, the probability of finding the system in a particular quantum state s is

(1) P(s) = 1/g,

where g is the number of accessible states. This justifies naming g the statistical weight. This postulate is justified by the H-theorem, which explains that a system can only be in equilibrium if all accessible states are equally likely.

1.4. Entropy. The entropy σ is defined as the logarithm of the number of accessible states,

(2) σ(N, U) = log g(N, U).

The fundamental temperature τ = k_B T is defined by

(3) 1/τ = (∂σ/∂U)_N.

In this equation, we explicitly hold the number of particles N constant, but the derivative must hold other constraints on the system (except of course the energy) constant as well. We will often use the shorthand β = 1/τ.

1.5. Law of increase of entropy. The entropy of a closed system tends to remain constant or increase when a constraint internal to the system is removed.

2. Boltzmann Distribution and Thermodynamic Quantities

2.1. Boltzmann Factor. Since the Boltzmann factor (also known as the Gibbs distribution) is so important, we will derive it here. Consider a closed quantum system interacting with a reservoir. The total energy U₀ of the system plus reservoir is a constant. We are interested in the probability that the system is in a quantum state s₁ or s₂ with energies ε₁ or ε₂. Since we have specified the state of the system precisely, the number of ways to have this configuration is just the multiplicity of the reservoir. Using (1) and (2),

P(s₁)/P(s₂) = g_R(U₀ − ε₁)/g_R(U₀ − ε₂) = e^{σ_R(U₀−ε₁)}/e^{σ_R(U₀−ε₂)}.
Taylor expanding σ_R about U₀, we get the Boltzmann factor:

(4) P(s₁)/P(s₂) = exp(−ε₁/τ)/exp(−ε₂/τ),

describing the relative probabilities of occupation for two quantum states of a system in equilibrium with a reservoir.
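The expansion step is worth making explicit. To first order in the small energy ε,

σ_R(U₀ − ε) ≈ σ_R(U₀) − ε (∂σ_R/∂U) = σ_R(U₀) − ε/τ,

using the definition (3) of the reservoir temperature. The common factor e^{σ_R(U₀)} cancels in the ratio, leaving (4).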
2.2. Partition Function. From (4) and probability normalization, we conclude that the probability a state s is occupied is given by

(5) P(s) = (1/Z) e^{−ε_s/τ},

where we defined the partition function,

(6) Z = Σ_s e^{−ε_s/τ}.

If we were to consider a system classically, the partition function comes from integrating over the accessible region of phase space,

(7) Z = ∫ e^{−βE(q₁,p₁,…,q_n,p_n)} dq₁ dp₁ ⋯ dq_n dp_n/(2πℏ)^n.

It is straightforward to show that the partition function of two independent subsystems satisfies Z = Z₁ Z₂.
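As an example of (7), take a one-dimensional classical harmonic oscillator, E = p²/2m + mω²q²/2. The two Gaussian integrals factor, giving

Z = (1/2πℏ) √(2πm/β) √(2π/βmω²) = τ/ℏω,

so that U = −∂ log Z/∂β = τ, in agreement with the equipartition theorem below.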
2.3. Equipartition Theorem. The equipartition theorem specifies the thermal average energy of a classical system. Each independent quadratic term of the classical Hamiltonian (be it a momentum or a position) has an average value of τ/2 in equilibrium. For example, the Hamiltonian

H = p²/2m + kx²/2

has a thermal average energy of ⟨H⟩ = τ because there are two quadratic degrees of freedom.
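The proof is a one-line Gaussian integral. For a quadratic term H₁ = ax²/2,

⟨H₁⟩ = ∫ (ax²/2) e^{−βax²/2} dx / ∫ e^{−βax²/2} dx = −(∂/∂β) log √(2π/βa) = 1/2β = τ/2,

independent of the stiffness a; the identical calculation applies to each quadratic momentum term.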
2.4. Example of partition function. The partition function of a single atom of mass m confined in a cubical box of volume V is

(8) Z₁ = n_Q V,

where the quantum concentration is

(9) n_Q = (mτ/2πℏ²)^{3/2}.

Whenever the density of a gas is much less than the quantum concentration, the gas is in the classical regime.
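Equation (8) follows from the classical expression (7) with E = p²/2m: the position integral gives V, and each momentum integral is Gaussian,

Z₁ = [V/(2πℏ)³] ∫ e^{−p²/2mτ} d³p = V (2πmτ)^{3/2}/(2πℏ)³ = n_Q V.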
2.5. Heat Capacity. The heat capacity C_V of a system at constant volume is defined as

(10) C_V = (∂U/∂τ)_V = τ (∂σ/∂τ)_V.
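For example, for the monatomic ideal gas U = 3Nτ/2 (Section 4), so C_V = 3N/2, that is, 3Nk_B/2 in conventional units.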
2.6. Pressure. The average pressure of a system is

(11) p = −(∂U/∂V)_σ = τ (∂σ/∂V)_U.
2.7. Thermodynamic Identity. This identity is very useful for proving thermodynamic results. First, consider the differential of σ(U, V):

dσ(U, V) = (∂σ/∂U)_V dU + (∂σ/∂V)_U dV.

Using (3) and (11) we get the thermodynamic identity:

(12) dU = τ dσ − p dV.

2.8. Helmholtz Free Energy. The Helmholtz free energy is defined as

(13) F = U − τσ.

Taking a differential and using (12) we quickly get

(14) σ = −(∂F/∂τ)_V,  p = −(∂F/∂V)_τ.

We can physically understand the free energy as balancing how a system tends toward lower energy and higher entropy. We can rewrite the pressure as

(15) p = −(∂U/∂V)_τ + τ (∂σ/∂V)_τ.

The terms on the right hand side are called the energy pressure and the entropy pressure. For a process at constant temperature, a system pushes towards lower energy and higher entropy.

The Helmholtz free energy is at a minimum for a system in thermal contact with a reservoir, if the system is at constant temperature and volume. This is easy to show using the definition of temperature:

dF = dU − τ dσ = dU − dU = 0.

2.9. Identities regarding the partition function. The partition function is an extraordinarily useful construct. From knowledge of the partition function, we can extract any thermodynamic quantity. We can write the mean energy

U = (1/Z) Σ_s ε_s e^{−ε_s/τ}

as a logarithmic derivative of the partition function,

(16) U = τ² (∂/∂τ) log Z = −(∂/∂β) log Z,

with β = 1/τ. Using the definition of the free energy and (14), we can show U = −τ² (∂/∂τ)(F/τ). Comparison with (16) yields

(17) F = −τ log Z.

The other thermodynamic quantities can be found using Table 1.
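As a quick check of (16) and (17), take a two-level system with energies 0 and ε. Then Z = 1 + e^{−βε}, so

U = −(∂/∂β) log Z = ε e^{−βε}/(1 + e^{−βε}) = ε/(e^{ε/τ} + 1),

which vanishes as τ → 0 and tends to ε/2 as τ → ∞, and F = −τ log(1 + e^{−ε/τ}).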
3. Chemical Potential and Gibbs Distribution

3.1. Definition. The chemical potential

(18) μ(τ, V, N) = (∂F/∂N)_{τ,V}

is defined so that two systems in diffusive and thermal equilibrium will have the same chemical potential. We can see this by recalling that F is a minimum for a system at constant temperature and volume. If we split our system into two parts and write the differential of F for an infinitesimal transfer of particles, with dN₂ = −dN₁:

dF = 0 = (∂F₁/∂N₁)_τ dN₁ + (∂F₂/∂N₂)_τ dN₂ = (∂F₁/∂N₁)_τ dN₁ − (∂F₂/∂N₂)_τ dN₁.

So for systems in diffusive and thermal equilibrium, μ₁ = μ₂.
3.2. Internal and Total Chemical Potential. The chemical potential is the tool to describe how a particle's tendency toward lower energy can be balanced by its tendency toward higher entropy.

As an example, consider two boxes of gas at different heights in a gravitational field in diffusive and thermal equilibrium. The chemical potential of the lower box is a sum of two parts:

μ₁ = (∂F₁/∂N₁)_{τ,V} = (∂U₁/∂N₁)_{τ,V} − τ (∂σ₁/∂N₁)_{τ,V},

which are called the external and internal chemical potentials respectively. We call the second term the internal chemical potential, because it is the chemical potential that would exist if no external field were present. (The internal chemical potential of an ideal gas is given in (27).)

Since the total free energy is a minimum, we can physically understand the external chemical potential in the usual way: particles are pushed towards regions of lower energy. On the other hand, the internal chemical potential physically describes how gas molecules are more likely to be spread over a greater volume. The internal chemical potential is equivalent to a true potential energy; it is equal to the energy difference required to counter the entropic tendency of the gas to expand.

If we want to know the difference in densities between the top and bottom boxes, we simply set the two chemical potentials equal:

U₁(0) + τ log(n₁/n_Q) = U₂(h) + τ log(n₂/n_Q),

balancing the energy and entropy forces.
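If the boxes contain ideal gas, the external potential energy per particle satisfies U₂(h) − U₁(0) = mgh, and solving for the density gives the barometric distribution

n₂ = n₁ e^{−mgh/τ}.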
3.3. Thermodynamic Identity. The thermodynamic identity of a system in which the number of particles is allowed to change is

(19) dU = τ dσ − p dV + μ dN.

3.4. Gibbs Factor. We consider a system in diffusive and thermal equilibrium with a reservoir. The system can be as small as a single quantum state, or macroscopic. Similarly to the Boltzmann factor, the probability that there are N₁ particles in a state with energy ε₁ is given by

P(N₁, ε₁) = (1/𝒵) exp[(N₁μ − ε₁)/τ],

where the Gibbs sum or grand sum is

𝒵 = Σ_{N=0}^{∞} Σ_{s(N)} exp[(Nμ − ε_{s(N)})/τ].

We will abbreviate the double sum above as Σ_{ASN}, which means to carry the sum over all states of the system and all numbers of particles possible in those states.

The total number of particles in the system can vary, but its average is

(20) ⟨N⟩ = (1/𝒵) Σ_{ASN} N exp[(Nμ − ε_s)/τ] = τ (∂ log 𝒵/∂μ).

It is often convenient to use the shorthand

λ = exp(μ/τ),

called the absolute activity. For an ideal gas, λ is proportional to the concentration.
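Indeed, combining λ = e^{μ/τ} with the ideal-gas chemical potential (27) below gives λ = n/n_Q, which is proportional to the concentration n.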
3.5. Gibbs Free Energy. The Gibbs free energy

G = U − τσ + pV

is a minimum for a system at constant pressure in thermal equilibrium with a reservoir. This is easy to show using the thermodynamic identity. If only one species of particle is present, the chemical potential is the Gibbs free energy per particle:

(21) G(N, p, τ) = N μ(p, τ).

3.6. Maximum work done by a system in a reservoir. Consider a system in a reservoir. We are interested in the amount of work R required to move a system from one state to another (defined positive if the system gains energy). If we apply a work R on the system using some object, the system will change its energy by

ΔE = R + P₀ ΔV₀ − T₀ Δσ₀,

where the subscript indicates a quantity pertaining to the reservoir. Suppose further that the total volume of the reservoir and the system is constant. Also, we know that entropy can only increase, so Δσ + Δσ₀ ≥ 0. This means that

R ≥ ΔE − T₀ Δσ + P₀ ΔV,

with equality for a reversible process:

|R|_min = Δ(E − T₀σ + P₀V).

Note that if temperature and volume are held constant, then |R|_min = ΔF. If temperature and pressure are held constant, then |R|_min = ΔG.
Consider the total energy of the reservoir plus system as a function of the total entropy, σ_t = σ_t(E_t). We can expand

Δσ_t = −(dσ_t/dE_t) ΔE_t.

If work is done on the system, the total energy changes by ΔE_t = R + T₀ Δσ_t. Suppose now that this transition is reversible, so Δσ_t = 0. Then

(22) Δσ_t = −R_min/τ,

giving the total change in entropy in some reversible process.

3.7. Thermodynamic Potential. The Landau potential is

Ω = U − τσ − μN.

If only one species of particle is present, G = μN and we get

Ω = −pV.

The thermodynamic potential is related to the grand sum by

Ω = −τ log 𝒵.

For a gas of non-interacting atoms, the thermodynamic potential is related to the energy by (Landau [4] p. 163)

Ω = −(2/3) U.
3.8. Enthalpy. The enthalpy

H(σ, p, N) = U + pV

is an extremum for a process at constant entropy and pressure. For example, the evaporation of a liquid from an open vessel is a process at constant pressure in which the entropy doesn't change. However, the liquid may change temperature during evaporation. The heat required to evaporate a liquid is the change in enthalpy.

3.9. Summary of thermodynamic relations and free variables. One of the more confusing aspects of statistical physics is dealing with derivatives where certain parameters are held constant. This comes from the fact that a thermodynamic system is completely specified by three parameters (for example U, V, N). The other thermodynamic quantities can be expressed as functions of these independent parameters.

We have freedom to choose the three independent variables of the system to suit our convenience. For example, if in an experiment we can externally dial the temperature and pressure, it would be most convenient to choose τ and p as independent variables. In this case, the Gibbs free energy would be the natural choice for energy since it is minimized for systems at constant τ and p.

We often want to convert between different thermodynamic quantities. Many can be expressed in terms of one another as shown in Table 1. Some of the Maxwell relations can be easily derived by taking cross derivatives in Table 1.
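For example, equating the mixed second derivatives of F(τ, V, N) in (14),

∂²F/∂V∂τ = ∂²F/∂τ∂V  ⟹  (∂σ/∂V)_τ = (∂p/∂τ)_V,

which is one of the Maxwell relations.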
3.10. Spin Susceptibility. In general, the magnetization M may be an arbitrary function of the magnetic field H. However, we are often only interested in the small-H limit, and define the linear susceptibility χ using M = χH.

In a paramagnetic material an applied magnetic field induces a magnetization in the same direction as the field, χ > 0. For a diamagnet, χ < 0. Van Leeuwen's theorem states that there is no diamagnetism in classical physics.

There are two components of the susceptibility: one due to spin and another due to orbital angular momentum. The angular momentum contribution for a 2D electron gas comes from Landau levels. The spin susceptibility is the contribution of the spin degree of freedom to the susceptibility; it is paramagnetic.

The energy of a spin in a magnetic field is

E = −μ · B.

A classical spin is a magnetic moment that can point in any direction. The energy of a quantized angular momentum in a field in the z direction is

E = −g μ_B J_z B_z/ℏ,

where the Bohr magneton is (CGS)

μ_B = eℏ/2m_e c.

The g-factors are different for different angular momenta. For an electron spin, g_e = 2.0023193…; for electron orbital angular momentum, g_L = 1.

The magnetization density is

M = −(1/V) (∂F/∂H)_{T,V,N} = −(1/V) (∂Ω/∂H)_{T,V,μ}.

However, other definitions are used as well [1],

M = −(1/V) (∂U/∂H),

and they are equal at τ = 0.

The linear susceptibility can be extracted by differentiation:

χ = −(1/V) (∂²Ω/∂H²)_{T,V,μ} evaluated at H = 0.
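As a concrete example, consider N independent moments μ in volume V, treating H ≈ B as appropriate for a dilute system. Each moment has Z = 2 cosh(μB/τ), so F = −Nτ log[2 cosh(μB/τ)] and

M = −(1/V) ∂F/∂B = (N/V) μ tanh(μB/τ).

For μB ≪ τ the tanh is approximately its argument, giving the Curie law χ = Nμ²/Vτ.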
4. Ideal Gas

An ideal gas is a gas of noninteracting atoms in the limit of low concentration. One way to find the ideal gas equation of state is to take the low-occupation limit of the Fermi-Dirac or the Bose-Einstein distribution.
Table 1. Summary of thermodynamic relations showing τ, p, μ, σ, V and N as derivatives of σ, U, F, G and Ω in terms of their natural independent variables. Some of these relations are definitions. There are always three independent variables. The other thermodynamic quantities can be written as derivatives of the potentials with respect to the independent variables. These can be derived in a straightforward way by taking differentials and using the thermodynamic identity. In each row, the quantities not listed are the independent variables themselves.

σ(U, V, N):   1/τ = (∂σ/∂U)_{V,N};    p/τ = (∂σ/∂V)_{U,N};    −μ/τ = (∂σ/∂N)_{U,V}
U(σ, V, N):   τ = (∂U/∂σ)_{V,N};     −p = (∂U/∂V)_{σ,N};     μ = (∂U/∂N)_{σ,V}
F(τ, V, N):   −σ = (∂F/∂τ)_{V,N};    −p = (∂F/∂V)_{τ,N};     μ = (∂F/∂N)_{τ,V}
G(τ, p, N):   −σ = (∂G/∂τ)_{p,N};    V = (∂G/∂p)_{τ,N};      μ = (∂G/∂N)_{τ,p}
Ω(τ, V, μ):   −σ = (∂Ω/∂τ)_{V,μ};    −p = (∂Ω/∂V)_{τ,μ};     −N = (∂Ω/∂μ)_{τ,V}
4.1. Fermi-Dirac Distribution. We can derive the Fermi-Dirac distribution quickly from the Gibbs factor. We consider the average occupancy of a single orbital that can have either 0 or 1 fermions:

(23) 𝒵 = 1 + exp[(μ − ε)/τ].

Using (20), we quickly conclude that the average occupancy for a state with energy ε is

(24) f_FD(ε) = 1/(exp[(ε − μ)/τ] + 1).
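Spelling out the step from (23) to (24):

⟨n⟩ = τ (∂ log 𝒵/∂μ) = e^{(μ−ε)/τ}/(1 + e^{(μ−ε)/τ}) = 1/(e^{(ε−μ)/τ} + 1).

At ε = μ the occupancy is exactly 1/2, at any temperature.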
The Sommerfeld expansion is useful for performing integrals that include the Fermi function at low temperature:

∫_{−∞}^{∞} H(ε) f_FD(ε) dε = ∫_{−∞}^{μ} H(ε) dε + (π²/6) τ² H′(μ) + ⋯ .

4.2. Bose-Einstein Distribution. For bosons, we must do an infinite sum to find the average occupancy. This gives

(25) f_BE(ε) = 1/(exp[(ε − μ)/τ] − 1),

which has a fantastically different low-ε behavior.

4.3. Classical Limit: Ideal Gas. If the average occupancy of all levels is small, both the Fermi-Dirac and Bose-Einstein distributions become

(26) f(ε) = λ exp(−ε/τ).

The chemical potential of the ideal gas comes from writing the total number of particles. Using (8), we quickly find that

(27) μ = τ log(n/n_Q).

The free energy comes from integrating an equation in Table 1:

(28) F = Nτ [log(n/n_Q) − 1].
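As a check, (28) reproduces the ideal gas law. With n = N/V,

p = −(∂F/∂V)_{τ,N} = Nτ/V,

that is, pV = Nτ (pV = Nk_B T in conventional units).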
The energy of an ideal gas can be obtained easily from the equipartition theorem:

U = (3/2) Nτ.

The entropy of the ideal gas is

σ = N [log(n_Q/n) + 5/2].

5. Fluctuations

5.1. The Gaussian Distribution. If we write the entropy of a system in terms of the energy of the subsystems, the probability distribution is proportional to e^σ [4]. Since this fact does not use any properties of the energy, we can make a similar argument about any thermodynamic quantity x. Writing the entropy in terms of the thermodynamic quantity x, we find the probability density for x to be

(29) w(x) = constant · e^{σ(x)}.

Since we expect the entropy to be maximal at the equilibrium value of x, we can expand the entropy about x̄ to find the Gaussian distribution:

(30) w(x) = √(α/2π) e^{−α(x−x̄)²/2}.
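Here α = −(∂²σ/∂x²) evaluated at x̄, since the expansion reads σ(x) ≈ σ(x̄) − α(x − x̄)²/2. The variance of the fluctuation is then ⟨(x − x̄)²⟩ = 1/α.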
5.2. Fluctuations of thermodynamic quantities. In thermodynamics, we are interested in fluctuations due to the statistical nature of the system, and not due to quantum mechanical uncertainty. A thermodynamical system will occupy the configuration which is most likely, or has highest entropy. This means that the entropy is a maximum with respect to some coordinate x. The probability distribution function for x is given by the number of ways to arrange the system to give the value x, that is:

w(x) ∝ e^{−R_min/τ},

where R_min = ΔE − τΔσ + PΔV from (22). This formula allows us to derive the fluctuations of any thermodynamic quantity. To do this, expand the entropy about its maximum value:

S − S₀ = −R_min/τ = −(1/2) Σ_{i,j} β_ij x_i x_j.

Then the fluctuations are given by the matrix inverse of β,

⟨x_i x_j⟩ = (β⁻¹)_ij.

Starting from the Gibbs distribution, we can write an expression for the mean number of particles. Differentiating this with respect to the chemical potential gives the mean square fluctuation of the number of particles:

(31) ⟨(ΔN)²⟩ = τ (∂N/∂μ)_{τ,V}.
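For the classical ideal gas, (27) gives N = n_Q V e^{μ/τ}, so τ (∂N/∂μ)_{τ,V} = N and

⟨(ΔN)²⟩ = N;

the relative fluctuation falls off as 1/√N.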
6. Physical Examples

6.1. Landau Levels. Landau levels are the solution to the quantum mechanical problem of a charged particle confined to a plane in a perpendicular magnetic field [2]. The Hamiltonian

(32) H = (1/2m) (p − (e/c) A(x))²

can be rewritten as a harmonic oscillator. This yields the spectrum

E_n = ℏω_c (n + 1/2),   (33) ω_c = eB/mc.

The degeneracy of a level is

(34) N = (2e/ℏc) B L²,

where L is the size of the box.
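To see the oscillator explicitly, choose the Landau gauge A = (0, Bx, 0). Then p_y commutes with H, and for a state of fixed p_y,

H = p_x²/2m + (mω_c²/2)(x − x₀)²,   x₀ = c p_y/eB,

a one-dimensional harmonic oscillator of frequency ω_c centered at x₀, which gives the spectrum above.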
7. Questions

On page 60, how can ΔS_t = 0 for a reversible process, but on page 62, ΔS_t ≠ 0?
References

[1] Neil W. Ashcroft and N. David Mermin. Solid State Physics. Brooks Cole, 1st edition, January 1976.
[2] Kurt Gottfried and Tung-Mow Yan. Quantum Mechanics: Fundamentals. Springer, 2003.
[3] Charles Kittel and Herbert Kroemer. Thermal Physics. W. H. Freeman, 2nd edition, January 1980.
[4] L. D. Landau and E. M. Lifshitz. Statistical Physics, Part 1 (Course of Theoretical Physics, Volume 5). Butterworth-Heinemann, 3rd edition, January 1980.
[5] F. Reif. Fundamentals of Statistical and Thermal Physics. McGraw-Hill, 1st edition, June 1965.