Chapter 2: Statistical Thermodynamics
1- Introduction
- The object of statistical thermodynamics is to present a particle theory leading to an interpretation of the equilibrium properties of macroscopic systems.
- The foundation upon which the theory rests is quantum mechanics.
- A satisfactory theory can be developed using only the quantum-mechanical concepts of quantum states and energy levels.
- A thermodynamic system is regarded as an assembly of submicroscopic entities in an enormous number of ever-changing quantum states. We use the term assembly or system to denote a number N of identical entities, such as molecules, atoms, electrons, photons, oscillators, etc.
- The macrostate of a system, or configuration, is specified by the number of particles in each of the energy levels of the system. Nj is the number of particles that occupy the jth energy level. If there are n energy levels, then

Σ_{j=1}^{n} Nj = N

A macrostate is defined by (N1, N2, …, Nj, …, Nn).
- A microstate is specified by the number of particles in each
quantum state. In general, there will be more than one quantum
state for each energy level, a situation called degeneracy.
- In general, there are many different microstates corresponding
to a given macrostate.
- The number of microstates leading to a given macrostate is called
the thermodynamic probability. It is the number of ways in which
a given macrostate can be achieved.
- The thermodynamic probability is an “unnormalized” probability,
an integer between one and infinity, rather than a number between
zero and one.
- For the kth macrostate, the thermodynamic probability is denoted ωk.
- A true probability pk can be obtained as

pk = ωk / Ω

where Ω = Σ_k ωk is the total number of microstates available to the system.
2- Coin model example and the most probable distribution
- We assume that we have N=4 coins that we toss on the floor and
then examine to determine the number of heads N1 and the number
of tails N2 = N-N1.
- Each macrostate is defined by the number of heads and the number
of tails.
- A microstate is specified by the state, heads or tails, of each coin.
- We are interested in the number of microstates for each macrostate (i.e., the thermodynamic probability).
- The coin-tossing model assumes that the coins are distinguishable.
Macrostate label k   Macrostate (N1, N2)   Microstates (Coin 1, Coin 2, Coin 3, Coin 4)   ωk   pk
1                    (4, 0)                HHHH                                            1   1/16
2                    (3, 1)                HHHT, HHTH, HTHH, THHH                          4   4/16
3                    (2, 2)                HHTT, HTHT, HTTH, TTHH, THTH, THHT              6   6/16
4                    (1, 3)                HTTT, THTT, TTHT, TTTH                          4   4/16
5                    (0, 4)                TTTT                                            1   1/16
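The counts in the table can be checked by brute-force enumeration of all 2^4 microstates. A minimal Python sketch (illustrative, not part of the original notes):

```python
from itertools import product
from collections import Counter

# Enumerate all 2^N microstates of N distinguishable coins.
N = 4
microstates = list(product("HT", repeat=N))

# Group microstates into macrostates by the number of heads N1.
omega = Counter(s.count("H") for s in microstates)

Omega = sum(omega.values())  # total number of microstates
print(dict(sorted(omega.items(), reverse=True)))  # {4: 1, 3: 4, 2: 6, 1: 4, 0: 1}
print(Omega)                                      # 16
```

Dividing each ωk by Ω = 16 reproduces the true probabilities pk in the last column.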
The average occupation number is

N̄j = (Σ_k Njk ωk) / (Σ_k ωk) = (Σ_k Njk ωk) / Ω = Σ_k Njk pk

where Njk is the occupation number of the jth level in the kth macrostate.
For our coin-tossing experiment, the average number of heads is therefore

N̄1 = (1/16) [(4 × 1) + (3 × 4) + (2 × 6) + (1 × 4) + (0 × 1)] = 2

Similarly, N̄2 = 2. Then N̄1 + N̄2 = 4 = N.
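This weighted average can be verified numerically. A short Python sketch, assuming the ωk values from the 4-coin table:

```python
# Occupation numbers N_1k (heads) and thermodynamic probabilities omega_k
# for the five macrostates of the 4-coin experiment.
heads = [4, 3, 2, 1, 0]
omega = [1, 4, 6, 4, 1]

Omega = sum(omega)  # 16 microstates in total
avg_heads = sum(n * w for n, w in zip(heads, omega)) / Omega

print(avg_heads)  # 2.0
```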
N1 k
1
1
4
2
6
3
4
6
Figure. Thermodynamic probability
versus the number of heads for a
coin-tossing experiment with 4 coins.
5
Omega k
0
7
4
3
2
1
0
4
1
0
1
2
3
4
N1
Suppose we want to perform the coin-tossing experiment with a
larger number of coins. We assume that we have N distinguishable
coins.
Question: How many ways are there to select from the N candidates
N1 heads and N-N1 tails?
The answer is given by the binomial coefficient

(N choose N1) = N! / [N1! (N − N1)!]    (1)
Example for N = 8:

N1:  0  1   2   3   4   5   6  7  8
ωk:  1  8  28  56  70  56  28  8  1

Figure. Thermodynamic probability versus the number of heads for a coin-tossing experiment with 8 coins. The peak has become considerably sharper.
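Equation (1) reproduces these values directly; a quick check in Python using the standard-library `math.comb` (a sketch, not part of the original notes):

```python
import math

# Binomial coefficients C(N, N1) = N! / (N1! (N - N1)!) for N = 8.
N = 8
omega = [math.comb(N, n1) for n1 in range(N + 1)]

print(omega)  # [1, 8, 28, 56, 70, 56, 28, 8, 1]
```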
What is the maximum value of the thermodynamic probability (ωmax) for N = 8 and for N = 1000?
The peak occurs at N1 = N/2. Thus, Equation (1) gives

For N = 8:     ωmax = 8! / (4! 4!) = 70
For N = 1000:  ωmax = 1000! / (500! 500!) = ???

For such large numbers we can use Stirling's approximation:

ln(n!) ≈ n ln(n) − n
ωmax = 1000! / (500! 500!)  ⇒  ln(ωmax) = ln(1000!) − 2 ln(500!)

ln(ωmax) ≈ [1000 ln(1000) − 1000] − 2 [500 ln(500) − 500]
ln(ωmax) ≈ 1000 ln(1000/500) = 1000 ln(2) ≈ 693

log10(ωmax) = log10(e) · ln(ωmax) ≈ 0.4343 × 693 ≈ 300
ωmax ≈ 10^300

For N = 1000 we find that ωmax is an astronomically large number.
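The Stirling estimate can be compared against an exact evaluation. A Python sketch (illustrative; `math.lgamma(n + 1)` gives the exact ln(n!)):

```python
import math

# Crude Stirling's approximation: ln(n!) ~ n ln(n) - n.
def ln_factorial_stirling(n):
    return n * math.log(n) - n

# ln(omega_max) = ln(1000!) - 2 ln(500!)
approx = ln_factorial_stirling(1000) - 2 * ln_factorial_stirling(500)
exact = math.lgamma(1001) - 2 * math.lgamma(501)  # exact ln(1000!/(500! 500!))

print(round(approx, 1))  # 693.1  (= 1000 ln 2)
print(round(exact, 1))   # 689.5  (the crude form is close on this scale)
```

The small discrepancy comes from the √(2πn) factor that the two-term Stirling formula drops; it is negligible next to a number of order 10^300.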
Figure. Thermodynamic probability ωk versus the number of heads N1 for a coin-tossing experiment with 1000 coins. The peak, of height ωmax ≈ 10^300, sits in a totally random central region (the most probable distribution is a macrostate for which we have a maximum number of microstates), flanked on either side by "ordered regions".

The most probable distribution is that of total randomness. The "ordered regions" almost never occur: there, ω is extremely small compared with ωmax.
For N very large:

Ω = Σ_k ωk ≈ ωmax

Generalization of Equation (1)
Question: How many ways can N distinguishable objects be arranged if they are divided into n groups, with N1 objects in the first group, N2 in the second, etc.?
Answer:

ω(N1, N2, …, Nj, …, Nn) = ωk = N! / (N1! N2! ⋯ Nj! ⋯ Nn!) = N! / Π_j Nj!
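This multinomial count is easy to compute directly. A minimal Python sketch (the function name is ours, for illustration):

```python
import math
from functools import reduce

# Multinomial coefficient: N! / (N1! N2! ... Nn!) ways to split N
# distinguishable objects into groups of sizes N1, ..., Nn.
def multinomial(*group_sizes):
    N = sum(group_sizes)
    return math.factorial(N) // reduce(lambda a, b: a * b,
                                       (math.factorial(g) for g in group_sizes))

print(multinomial(4, 0))        # 1   binomial special case, cf. the 4-coin table
print(multinomial(2, 2))        # 6   C(4, 2)
print(multinomial(2, 0, 0, 1))  # 3   three labeled particles split as (2, 0, 0, 1)
```

With two groups this reduces to the binomial coefficient of Equation (1).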
3- System of distinguishable particles
The constituents of the system under study (a gas, liquid, or solid)
are considered to be:
- a fixed number N of distinguishable particles

Σ_{j=1}^{n} Nj = N    (conservation of particles)

- occupying a fixed volume V.
We seek the distribution (N1, N2, …, Nj, …, Nn) among energy levels (ε1, ε2, …, εj, …, εn) for an equilibrium state of the system.
We limit ourselves to isolated systems that do not exchange energy in any form with the surroundings. This implies that the internal energy U is also fixed:

Σ_{j=1}^{n} Nj εj = U    (conservation of energy)

Example: Consider three particles, labeled A, B, and C, distributed among four energy levels, 0, ε, 2ε, 3ε, such that the total energy is U = 3ε.
a) Tabulate the 3 possible macrostates of the system.
b) Calculate ωk (the number of microstates) and pk (the true probability) for each of the 3 macrostates.
c) What is the total number of available microstates, Ω, for the system?
Macrostate label k   Macrostate (N0, N1, N2, N3)   Microstates (A, B, C)                Thermod. prob. ωk   True prob. pk
1                    (2, 0, 0, 1)                  (0, 0, 3ε), (0, 3ε, 0), (3ε, 0, 0)    3                   0.3
2                    (1, 1, 1, 0)                  (0, ε, 2ε), (0, 2ε, ε), (ε, 0, 2ε),   6                   0.6
                                                   (ε, 2ε, 0), (2ε, 0, ε), (2ε, ε, 0)
3                    (0, 3, 0, 0)                  (ε, ε, ε)                             1                   0.1

Ω = 3 + 6 + 1 = 10
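These counts can be verified by enumerating every assignment of the three labeled particles to the four levels. A Python sketch (not part of the original notes; energies are measured in units of ε):

```python
from itertools import product
from collections import Counter

# Energy levels in units of epsilon: 0, 1, 2, 3. Total energy U = 3 epsilon.
levels = (0, 1, 2, 3)
U = 3

# A microstate assigns one level to each labeled particle (A, B, C).
microstates = [m for m in product(levels, repeat=3) if sum(m) == U]

# The macrostate is the occupation-number tuple (N0, N1, N2, N3).
def macrostate(m):
    return tuple(m.count(level) for level in levels)

omega = Counter(macrostate(m) for m in microstates)

print(omega[(2, 0, 0, 1)], omega[(1, 1, 1, 0)], omega[(0, 3, 0, 0)])  # 3 6 1
print(sum(omega.values()))                                            # 10
```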
- The most "disordered" macrostate is the state of highest probability.
- This state is sharply defined and is the observed equilibrium state of the system (for a very large number of particles).
4- Thermodynamic probability and Entropy
In classical thermodynamics: as a system proceeds toward a state
of equilibrium the entropy increases, and at equilibrium the entropy
attains its maximum value.
In statistical thermodynamics (our statistical model): a system tends to change spontaneously from states with low thermodynamic probability to states with high thermodynamic probability (a large number of microstates).
It was Boltzmann who made the connection between the classical concept of entropy and the thermodynamic probability:

S = f(Ω)
S and Ω are properties of the state of the system (state variables).
Consider two subsystems, A and B:

Subsystem A: SA = f(ΩA)        Subsystem B: SB = f(ΩB)

The entropy is an extensive property: it is doubled when the mass or number of particles is doubled.
Consequence: the combined entropy of the two subsystems is simply the sum of the entropies of each subsystem:

Stotal = SA + SB,   or   f(Ωtotal) = f(ΩA) + f(ΩB)    (3)
One subsystem configuration can be combined with the other to give the configuration of the total system. That is,

Ωtotal = ΩA ΩB    (4)

Example of coin-tossing experiment: suppose that the two subsystems each consist of two distinguishable coins.

Macrostate (N1, N2)   Subsystem A (Coin 1, Coin 2)   Subsystem B (Coin 1, Coin 2)   ωkA   ωkB   pkA   pkB
(2, 0)                HH                             HH                              1     1    1/4   1/4
(1, 1)                HT, TH                         HT, TH                          2     2    2/4   2/4
(0, 2)                TT                             TT                              1     1    1/4   1/4

Ωtotal = ΩA × ΩB = 4 × 4 = 16
Thus Equation (4) holds, and therefore

f(Ωtotal) = f(ΩA ΩB)    (5)

Combining Equations (3) and (5), we obtain

f(ΩA) + f(ΩB) = f(ΩA ΩB)

The only function for which this statement is true is the logarithm. Therefore

S = k ln(Ω)

where k is a constant with the units of entropy. It is, in fact, Boltzmann's constant:

k = 1.38 × 10⁻²³ J·K⁻¹
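The logarithm's defining property, ln(xy) = ln(x) + ln(y), is exactly what makes S additive while Ω is multiplicative. A quick numerical sketch in Python, using the two-subsystem coin example with ΩA = ΩB = 4:

```python
import math

k_B = 1.38e-23  # Boltzmann's constant, J/K

def entropy(omega):
    # Boltzmann relation S = k ln(Omega)
    return k_B * math.log(omega)

omega_A, omega_B = 4, 4          # microstate counts of the two subsystems
omega_total = omega_A * omega_B  # Omega_total = Omega_A * Omega_B = 16

# Additivity: S_total = S_A + S_B
print(math.isclose(entropy(omega_total), entropy(omega_A) + entropy(omega_B)))  # True
```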
5- Quantum states and energy levels