Chapter 11 Quantum statistics

11.1 Thermal wavelength

We consider a gas of atoms or molecules at temperature $T$. In Chapter 9 we defined the thermal wavelength $\lambda_T$,
$$\lambda_T = \frac{h}{\sqrt{2\pi m k_B T}},$$
as the wavelength of a "wave packet" associated with each gas particle (atom or molecule) of mass $m$.

At high temperatures, the gas particles can be described as billiard balls, since their size is much smaller than the interparticle distance. As the temperature is lowered, the particles' wave packets gain in importance: the quantum nature of the particles cannot be neglected! When the wave packets of the particles begin to overlap with each other, quantum effects must be taken into account.

⋆ For a system to be in the classical regime, $\lambda_T$ should be much smaller than the average interparticle distance $r_0$ (Fig. 11.1),
$$r_0 \sim \frac{1}{n^{1/3}},$$
with $n$ being the density of particles. The gas is in the classical regime when
$$r_0 \gg \lambda_T \quad\Leftrightarrow\quad \frac{1}{n^{1/3}} \gg \lambda_T \quad\Leftrightarrow\quad \lambda_T^3\, n \ll 1. \tag{11.1}$$

Figure 11.1: Thermal wavelength $\lambda_T$ and interparticle distance $r_0$.

Quantum regime: quantum effects become important when
$$\lambda_T^3\, n \approx 1.$$
When this condition is fulfilled, the wave functions of different particles begin to overlap and the system has to be treated according to quantum mechanics. The condition $n\lambda_T^3 = 1$,
$$n\left(\frac{2\pi\hbar^2}{m k_B T}\right)^{3/2} = 1,$$
defines a line in the $T$–$n$ plane that sets the division between the classical and the quantum regimes. We define the degeneracy temperature $T_0$ by
$$k_B T_0 = \frac{2\pi\hbar^2}{m}\, n^{2/3}.$$
$T_0$ can have very different values, depending on the physical system under study.

Examples of quantum degeneracy temperatures:

H$_2$ gas: $n = 2\times10^{19}\ \mathrm{cm^{-3}}$, $T_0 = 0.05$ K
Liquid $^4$He: $n = 2\times10^{22}\ \mathrm{cm^{-3}}$, $T_0 = 2$ K
Electrons in a metal: $n = 10^{22}\ \mathrm{cm^{-3}}$, $T_0 = 10000$ K

We observe that, for instance, a gas can be described classically at room temperature, whereas electrons in a metal are in the extreme quantum region.
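As a quick numerical check, the definition $k_B T_0 = (2\pi\hbar^2/m)\,n^{2/3}$ can be evaluated for the three systems in the table above. The sketch below is ours, not part of the original text: the helper names `degeneracy_temperature` and `thermal_wavelength` are our choices, and the simple formula reproduces the order of magnitude (not the exact digits) of the tabulated values.

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34    # reduced Planck constant, J s
k_B  = 1.380649e-23       # Boltzmann constant, J / K
m_u  = 1.66053906660e-27  # atomic mass unit, kg

def degeneracy_temperature(m, n):
    """T_0 from k_B T_0 = (2 pi hbar^2 / m) n^(2/3); n in m^-3."""
    return 2 * math.pi * hbar**2 * n**(2 / 3) / (m * k_B)

def thermal_wavelength(m, T):
    """lambda_T = h / sqrt(2 pi m k_B T), in metres."""
    h = 2 * math.pi * hbar
    return h / math.sqrt(2 * math.pi * m * k_B * T)

# The three systems from the table (densities converted from cm^-3 to m^-3)
systems = {
    "H2 gas":               (2 * m_u,    2e19 * 1e6),
    "liquid 4He":           (4 * m_u,    2e22 * 1e6),
    "electrons in a metal": (9.109e-31,  1e22 * 1e6),
}
for name, (m, n) in systems.items():
    print(f"{name}: T0 ~ {degeneracy_temperature(m, n):.3g} K")
```

By construction, $n\lambda_T^3 = 1$ exactly at $T = T_0$, which `thermal_wavelength` lets one verify directly.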
Liquid helium has a degeneracy temperature in between: at 2.17 K it makes a transition to a quantum phase that exhibits superfluidity.

11.2 Fundamentals of quantum statistics

Classically, the complete description of a physical system is given by
- the phase space $(q, p)$: $\Gamma$, and
- the Hamilton equations of motion.

The solution of the equations of motion defines a trajectory in phase space. One has to employ statistical methods in order to describe the macroscopic state out of the (incomplete) microscopic information. By saying "incomplete", we mean that we don't necessarily know the initial conditions for the trajectory of each particle of the system; that is why we treat the system statistically.

In quantum mechanics, on top of the incomplete knowledge of the microscopic information, we have quantum mechanical indeterminism: even though the state of the system may be completely known in some special cases (pure state), we still cannot predict the result of a measurement, since any measurement perturbs the system. This lack of knowledge manifests itself in the statistical interpretation of the wave function and in the uncertainty relation between $q_i$ and $p_i$, which can no longer be sharply measured at the same time. The concepts of phase space and phase trajectory lose their meaning.

11.3 Statistical operator: density matrix

In quantum mechanics, we classify the states of a system into pure and mixed states.

11.3.1 Pure state

Consider a complete set of commuting observables. A state prepared through measuring these observables is a pure state and is represented by a vector $|\Psi\rangle$ in the Hilbert space. Now, if $\hat B$ is an operator associated with one of the observables in this complete set, with
$$\hat B\,|b_n\rangle = b_n\,|b_n\rangle, \qquad \langle b_n | b_m\rangle = \delta_{nm},$$
then
$$|\Psi\rangle = \sum_n c_n\,|b_n\rangle, \qquad c_n = \langle b_n | \Psi\rangle.$$
One can think of the expectation value $\langle\hat B\rangle$ of $\hat B$ as the averaged result of
1. successive measurements of $B$ done on the same system, or
2. measurements of $B$ done simultaneously on copies of the system.

The second interpretation reminds us of the concept of the ensemble that we introduced in statistics.
$$\langle\hat B\rangle = \langle\Psi|\hat B|\Psi\rangle = \sum_n \langle\Psi|\hat B|b_n\rangle\,\underbrace{\langle b_n|\Psi\rangle}_{c_n} = \sum_n b_n\,|c_n|^2.$$

11.3.2 Mixed state

When the information about the system is incomplete, i.e., no measurement through a complete set of commuting observables was possible, then the system is in a mixed state. A mixed state cannot be represented by a Hilbert-space vector. For a mixed state, we only know the probabilities that the system is in a given pure state, while the information about the relative phases is missing. The system can be found in pure states $|\Psi_m\rangle$, $m = 1, 2, \ldots$, with probabilities $0 \le P_m \le 1$, so that
$$\sum_m P_m = 1, \qquad \langle\Psi_n|\Psi_m\rangle = \delta_{nm}.$$
The expectation value of an operator $\hat B$ is then given by
$$\langle\hat B\rangle = \sum_m P_m\,\langle\Psi_m|\hat B|\Psi_m\rangle.$$
Calculation of this expectation value includes two distinct operations:
1) calculation of the quantum mechanical expectation values $\langle\Psi_m|\hat B|\Psi_m\rangle$, and
2) statistical averaging over the states $|\Psi_m\rangle$ weighted by $P_m$, which is a consequence of the incomplete information about the system.

The central operator in quantum statistics which includes both averages is the density matrix operator $\hat\rho$:
$$\hat\rho = \sum_m P_m\,|\Psi_m\rangle\langle\Psi_m|.$$

11.4 Properties of $\hat\rho$

1. The average value of an operator $\hat B$ is given as $\langle\hat B\rangle = \mathrm{Tr}(\hat\rho\hat B)$.
Proof:
$$\langle\hat B\rangle = \sum_m P_m\,\langle\Psi_m|\hat B|\Psi_m\rangle = \sum_{m,ij} P_m\,\langle\Psi_m|b_i\rangle\langle b_i|\hat B|b_j\rangle\langle b_j|\Psi_m\rangle = \sum_{ij}\Big(\sum_m P_m\,\langle b_j|\Psi_m\rangle\langle\Psi_m|b_i\rangle\Big)\langle b_i|\hat B|b_j\rangle = \sum_{ij}\hat\rho_{ji}\,\hat B_{ij} = \sum_j(\hat\rho\hat B)_{jj} = \mathrm{Tr}(\hat\rho\hat B).$$

2. $\hat\rho$ is Hermitian: $\hat\rho = \hat\rho^\dagger$, since the projector $|\Psi_m\rangle\langle\Psi_m|$ is Hermitian.

3. $\mathrm{Tr}\,\hat\rho = 1$.

4. $\hat\rho$ is non-negative. Proof: given a state $|\varphi\rangle$,
$$\langle\varphi|\hat\rho|\varphi\rangle = \sum_m P_m\,|\langle\varphi|\Psi_m\rangle|^2 \ge 0.$$

5. The eigenvalues of $\hat\rho$ are real.

6. A pure state $|\Psi\rangle$ can be represented as $\hat\rho_\Psi \equiv P(\Psi) = |\Psi\rangle\langle\Psi|$.

7.
$$\hat\rho^2 = \sum_m P_m^2\,|\Psi_m\rangle\langle\Psi_m|, \qquad \mathrm{Tr}\,\hat\rho^2 = \sum_m P_m^2 \le \sum_m P_m = 1,$$
since $0 \le P_m \le 1$.

8. Time dependence: in the Schrödinger picture,
$$i\hbar\,\frac{\partial\hat\rho}{\partial t} = [H, \hat\rho] \qquad \text{(QM I)},$$
which is the quantum mechanical analogue of the Liouville equation.

11.5 Statistical operator in thermal equilibrium

In thermal equilibrium, the expectation value of a macroscopic observable $B$, $\langle\hat B\rangle = \mathrm{Tr}(\hat\rho(t)\hat B)$, has to be time independent. This is only possible if $\hat\rho(t)$ is time independent:
$$\frac{\partial\hat\rho}{\partial t} = -\frac{1}{i\hbar}[\hat\rho, H] = 0.$$
This implies that $\hat\rho$ is a conserved quantity:
$$\hat\rho = \hat\rho(H) \qquad \text{(Liouville theorem)}.$$

11.6 Correspondence principle

In this section, we want to establish the correspondence between classical statistical physics and quantum statistical physics.

Statistical ensemble: a set of exact copies of a real system for which there is no complete knowledge, i.e., of a system in a mixed state. Each member of the ensemble is in a possible state $|\Psi_m\rangle$ in which the real system could be. The ensemble is described by an incoherent set of states, as the members of the ensemble don't interact with each other and their states don't interfere with each other. This definition of the statistical ensemble is valid both in classical statistical physics and in quantum statistical physics.

Correspondences (classical ↔ quantum mechanical):

1. Phase-space function $B(q, p)$ ↔ operator $\hat B$.
2. Probability distribution function $\rho(q, p)$ ↔ statistical operator $\hat\rho$.
3. Poisson brackets
$$[B, G]_L = \sum_j \left(\frac{\partial B}{\partial q_j}\cdot\frac{\partial G}{\partial p_j} - \frac{\partial G}{\partial q_j}\cdot\frac{\partial B}{\partial p_j}\right)$$
↔ commutators
$$\frac{1}{i\hbar}\big[\hat B, \hat G\big] = \frac{1}{i\hbar}\big(\hat B\hat G - \hat G\hat B\big).$$
4. Phase-space integration
$$\frac{1}{h^{3N} N!}\int d^{3N}q\, d^{3N}p\ \ldots$$
↔ trace $\mathrm{Tr}(\ldots)$.
5. Ensemble in equilibrium (stationary state): $[\rho, H]_L = 0$.
6. Statistical average of an observable $B$:
$$\langle B\rangle = \frac{1}{h^{3N} N!}\int d^{3N}q\, d^{3N}p\ \rho(q, p)\,B(q, p), \qquad 1 = \frac{1}{h^{3N} N!}\int d^{3N}q\, d^{3N}p\ \rho(q, p).$$
7. Equation of motion — the Liouville equation:
$$\frac{\partial\rho}{\partial t} = [H, \rho]_L, \qquad H:\ \text{Hamilton function}.$$
On the quantum side, items 5–7 read
$$[\hat\rho, \hat H]_- = 0, \qquad \langle\hat B\rangle = \mathrm{Tr}(\hat\rho\hat B),\quad 1 = \mathrm{Tr}\,\hat\rho, \qquad \frac{\partial\hat\rho}{\partial t} = -\frac{i}{\hbar}\big[\hat H, \hat\rho\big]_-, \qquad \hat H:\ \text{Hamilton operator}.$$

11.7 Microcanonical ensemble

Consider an isolated system in thermodynamical equilibrium, with $N$ particles and energy between $E$ and $E + \Delta$. Since the corresponding ensemble is characterized by a stationary distribution, i.e., $[\hat\rho, \hat H] = 0$, $\hat\rho$ and $\hat H$ have a common set of eigenstates:
$$\hat H\,|E_n\rangle = E_n\,|E_n\rangle, \qquad \langle E_n | E_m\rangle = \delta_{nm}, \qquad \langle E_n|\hat H|E_m\rangle = E_m\,\delta_{nm}, \qquad \langle E_n|\hat\rho|E_m\rangle \sim \delta_{nm}.$$
For an isolated system, the principle of "a priori" probability states that all possible states of the system have the same probability. Therefore, we can write
$$\hat\rho_{\rm micro} = \sum_m P_m\,|E_m\rangle\langle E_m|,$$
with
$$P_m = \begin{cases} \text{const} & \text{if } E < E_m < E + \Delta, \\ 0 & \text{otherwise}, \end{cases}$$
where the constant can be obtained from the normalization condition $\mathrm{Tr}\,\hat\rho = 1$. We define
$$\Gamma(E) = \mathrm{Tr}\left(\sum_{\substack{m \\ E < E_m < E+\Delta}} |E_m\rangle\langle E_m|\right), \tag{11.2}$$
which is the quantum mechanical analogue of the classical phase-space volume. In the energy representation, eq. (11.2) can be rewritten as
$$\Gamma(E) = \sum_{\substack{m \\ E < E_m < E+\Delta}} 1 = \text{number of states with energies between } E \text{ and } E + \Delta.$$
Then
$$P_m = \frac{1}{\Gamma(E)} \qquad \forall\, m \text{ with } E < E_m < E + \Delta,$$
and the expectation value of an operator $\hat B$ in the microcanonical ensemble is given as
$$\langle\hat B\rangle = \frac{1}{\Gamma(E)}\,\mathrm{Tr}\left(\sum_{\substack{m \\ E < E_m < E+\Delta}} |E_m\rangle\langle E_m|\,\hat B\right).$$
⋆ Note that $\sum_m$ is the sum over states!

The internal energy $U$ is now
$$U \equiv \langle\hat H\rangle = \frac{1}{\Gamma(E)}\,\mathrm{Tr}\left(\sum_{\substack{m \\ E < E_m < E+\Delta}} |E_m\rangle\langle E_m|\,\hat H\right) = \frac{1}{\Gamma(E)}\sum_{\substack{m \\ E < E_m < E+\Delta}} E_m \approx E,$$
and the entropy is
$$S = k_B \ln \Gamma(E).$$
⋆ Note that with the definition of $\Gamma(E)$ given in equation (11.2), there is no Gibbs paradox, since the correct counting of states is provided by the equation.

In analogy with classical statistics, we can also define the phase-space volume $\Phi(E)$ as
$$\Phi(E) = \sum_{\substack{m \\ E_m \le E}} 1$$
(the number of eigenstates of the Hamiltonian operator with energies smaller than or equal to $E$), with
$$\Gamma(E) = \Phi(E + \Delta) - \Phi(E).$$
For $\Delta \to 0$, we can define the density of states
$$\Omega(E) = \frac{d\Phi}{dE}$$
and write $S = k_B \ln \Omega(E)$.

11.8 Third law of thermodynamics

The third law of thermodynamics is of quantum mechanical nature: "The entropy of a thermodynamical system at $T = 0$ is a universal constant that can be chosen to be zero, and this choice is independent of the values taken by the other state variables."

For a system with a discrete energy spectrum, there is a lowest energy state, the ground state. At $T \to 0$, the system will go into this state. If the ground state is $n$-fold degenerate, the entropy of the system at $T = 0$ is
$$S(T = 0) = k_B \ln n, \qquad n:\ \text{degeneracy} \equiv \text{multiplicity}.$$
For $n = 1$ (no degeneracy), $S = 0$. But for $n > 1$, $S$ apparently doesn't fulfill the third law of thermodynamics. However, there is no paradox: when $n > 1$, the ground state is degenerate due to the existence of internal symmetries of the Hamiltonian (for instance, spin rotational symmetry); at $T = 0$ the symmetry gets broken through a phase transition that lets the entropy go to zero. In the case of spin rotational symmetry, a phase transition to a ferromagnetic state breaks the symmetry, and the system goes into one of the many possible degenerate states.

11.9 Example in the microcanonical ensemble: two-level system and the concept of negative temperature

Consider an isolated system at energy $E$ of $N$ particles, each with spin $s = 1/2$, $s_z = \pm 1/2$. Due to the two possible spin orientations, each particle can have two energy states $\pm\epsilon$. The number of particles in the energy state $+\epsilon$ is $N_+$, and the number of particles in the energy state $-\epsilon$ is $N_-$, such that
$$N = N_+ + N_- \qquad \text{and} \qquad E = (N_+ - N_-)\,\epsilon.$$
Obviously,
$$N_+ = \frac{1}{2}\left(N + \frac{E}{\epsilon}\right) \qquad \text{and} \qquad N_- = \frac{1}{2}\left(N - \frac{E}{\epsilon}\right).$$
The number of possible states with energy $E$ and particle number $N$ is given by the density of states $\Omega(E, N)$:
$$\Omega(E, N) = \frac{N!}{N_+!\,N_-!} = \frac{N!}{\left[\frac{1}{2}\left(N + \frac{E}{\epsilon}\right)\right]!\,\left[\frac{1}{2}\left(N - \frac{E}{\epsilon}\right)\right]!}$$
⇒ There are exactly $N!$ possibilities to count the possible microstates (the particles are indistinguishable!).
But all microstates obtained by interchanging particles in the state $+\epsilon$ are equivalent, and the same holds for the particles in the state $-\epsilon$. Therefore, in order to avoid multicounting, $N!$ has to be divided by $N_+!$ and $N_-!$. The entropy is then:
$$S(E, N) = k_B \ln \frac{N!}{N_+!\,N_-!}.$$
Now we make use of the Stirling formula ($N \gg 1$, $N_+ \gg 1$, $N_- \gg 1$) and get
$$S(E, N) = k_B N \ln N - k_B\,\frac{1}{2}\left(N + \frac{E}{\epsilon}\right)\ln\left[\frac{1}{2}\left(N + \frac{E}{\epsilon}\right)\right] - k_B\,\frac{1}{2}\left(N - \frac{E}{\epsilon}\right)\ln\left[\frac{1}{2}\left(N - \frac{E}{\epsilon}\right)\right].$$
The temperature of the system is then obtained as
$$\frac{1}{T} = \frac{\partial S}{\partial E} = \beta k_B = \frac{k_B}{2\epsilon}\,\ln\left(\frac{N - \frac{E}{\epsilon}}{N + \frac{E}{\epsilon}}\right) = \frac{k_B}{2\epsilon}\,\ln\frac{N_-}{N_+}.$$

Situations:

1. If $E < 0$, we are in the situation where there are more particles in the lower energy level $-|\epsilon|$ than in the higher energy level $+|\epsilon|$, and $T > 0$.
2. For $T \to \infty$, the particles distribute equally between the two levels: $N_+ = N_-$ ⇒ $E = 0$.
3. Suppose that through a special mechanism particles can be excited to the higher energy level (for instance, by light irradiation, as in a laser). If the particles remain in the excited state for a certain period of time (metastable state), the situation $N_+ \gg N_-$ can be realized. This phenomenon is called inversion. From $E = (N_+ - N_-)|\epsilon| > 0$ it follows that $T < 0$, i.e., we are in a situation of negative temperature.

The three situations can be presented graphically (Fig. 11.2).

Figure 11.2: Phase diagram for a two-level system.

1. For $\frac{E}{N\epsilon} = -1$, all the particles are in the lower energy level. The slope $\frac{\partial S}{\partial E} \propto \beta$ is infinite and $T = 0$.
2. For equal distribution, $N_+ = N_-$, $\frac{E}{N\epsilon} = 0$ and $\beta = 0$. The temperature $T$ jumps from $+\infty$ to $-\infty$ as soon as more particles are in the higher than in the lower level.
3. When all the particles are in the higher level, then $\frac{E}{N\epsilon} = 1$ and $T = 0$.

Such systems with an inverted state are described in terms of negative temperatures.

⋆ Note that the concept of negative temperature only makes sense for systems with an upper bound for the energy.
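The sign change of $\beta$ at $E = 0$ can be checked numerically. The sketch below is illustrative (function names and parameter values are our choices; units with $k_B = \epsilon = 1$): it evaluates the Stirling-formula entropy and compares the closed-form slope with a finite difference.

```python
import math

def entropy(E, N, eps=1.0, kB=1.0):
    """Stirling-approximated entropy S(E, N) of the two-level system."""
    Np = 0.5 * (N + E / eps)   # occupation of the +eps level
    Nm = 0.5 * (N - E / eps)   # occupation of the -eps level
    return kB * (N * math.log(N) - Np * math.log(Np) - Nm * math.log(Nm))

def beta(E, N, eps=1.0):
    """beta = (1/kB) dS/dE = (1/(2 eps)) ln(N-/N+)."""
    Np = 0.5 * (N + E / eps)
    Nm = 0.5 * (N - E / eps)
    return math.log(Nm / Np) / (2 * eps)

N = 10**6
for E in (-0.5 * N, 0.0, 0.5 * N):
    # finite-difference check of beta = dS/dE (with kB = 1)
    h = 1e-3 * N
    b_fd = (entropy(E + h, N) - entropy(E - h, N)) / (2 * h)
    print(f"E/(N eps) = {E/N:+.1f}:  beta = {beta(E, N):+.3e}  (finite diff {b_fd:+.3e})")
```

For $E < 0$ the slope is positive ($T > 0$), it vanishes at $E = 0$, and it turns negative for the inverted population $E > 0$, exactly as in the discussion above.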
11.10 Canonical ensemble

We consider a quantum system $A_1$ in thermal equilibrium with a reservoir $A_2$. Our goal is to obtain the density matrix operator $\hat\rho$ for $A_1$. The canonical ensemble contains copies of the system $A_1$, which are in the possible allowed states $|\Psi_m\rangle$ for $A_1$, with
$$\hat H_1\,|\Psi_m\rangle = E_m\,|\Psi_m\rangle.$$
If we neglect the interaction $H_{12}$ between the systems $A_1$ and $A_2$, the Hamiltonian of the isolated total system $A = A_1 \cup A_2$ is just the sum of their individual Hamiltonians:
$$\hat H = \hat H_1 + \hat H_2.$$
Then the total energy of the isolated system $A$ is $E = E_m + E_n$, and the number of states of $A$ is
$$\Gamma(E) = \sum_{E_1} \Gamma_1(E_1)\,\Gamma_2(E - E_1). \tag{11.3}$$
If $A_1$ is in the state with $E_1 = E_m$, the number of possible states of the total system is $\Gamma_2(E - E_m)$, all with equal probabilities (principle of "a priori" probability). Then,
$$P_m \sim \Gamma_2(E - E_m).$$
Since $A_1 \ll A_2$ and $E_m \ll E$, we can perform a Taylor expansion of the previous expression:
$$\ln\Gamma_2(E - E_m) = \ln\Gamma_2(E) - E_m\,\frac{\partial\ln\Gamma_2(E_2)}{\partial E_2}\bigg|_{E_2 = E} + \ldots \approx \ln\Gamma_2(E) - \frac{E_m}{k_B T} + \ldots$$
Then, for $P_m$ we have
$$P_m \sim \Gamma_2(E - E_m) \sim e^{-\beta E_m},$$
and for the density matrix operator $\hat\rho$:
$$\hat\rho = c\sum_m e^{-\beta E_m}\,|E_m\rangle\langle E_m| = c\,e^{-\beta\hat H_1}\sum_m |E_m\rangle\langle E_m| = c\,e^{-\beta\hat H_1},$$
which after normalization becomes
$$\hat\rho = \frac{e^{-\beta\hat H_1}}{\mathrm{Tr}\,e^{-\beta\hat H_1}}, \qquad [\hat\rho, \hat H_1] = 0.$$
We drop the subscript "1" from now on. We define the canonical partition function as
$$Z(T) = \mathrm{Tr}\,e^{-\beta\hat H}, \qquad Z(T) = Z(T, V, N) = Z_N(T, V).$$
In the energy representation,
$$Z(T) = \sum_n e^{-\beta E_n}.$$
Then the expectation value of an observable $\hat B$ is given as
$$\langle\hat B\rangle = \mathrm{Tr}(\hat\rho\hat B) = \frac{\mathrm{Tr}\big(e^{-\beta\hat H}\hat B\big)}{\mathrm{Tr}\,e^{-\beta\hat H}}.$$
For the internal energy $U = \langle\hat H\rangle$ we thus have
$$U = \langle\hat H\rangle = -\frac{\partial}{\partial\beta}\ln Z_N(T, V).$$
One can also show that the energy fluctuations are given by
$$\sqrt{\frac{\langle\hat H^2\rangle - \langle\hat H\rangle^2}{\langle\hat H\rangle^2}} = \frac{\sqrt{C_V k_B T^2}}{U} \sim \frac{1}{\sqrt N}.$$
⋆ Note the analogy of the derived expressions with the corresponding ones from classical statistical physics.

⇒ For a macroscopic system, $N \to \infty$, almost all of its copies in the canonical ensemble have the same energy $E = \langle\hat H\rangle$.
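As an aside, the relation $U = -\partial\ln Z/\partial\beta$ is easy to verify numerically for any finite spectrum. The three-level spectrum below is a made-up example, not taken from the text:

```python
import numpy as np

def partition_function(E, beta):
    """Z = sum_n exp(-beta E_n) for a finite spectrum E."""
    return np.sum(np.exp(-beta * E))

def internal_energy(E, beta):
    """U = <H> = sum_n E_n exp(-beta E_n) / Z."""
    w = np.exp(-beta * E)
    return np.sum(E * w) / np.sum(w)

# Hypothetical three-level spectrum; beta in units of inverse energy
E = np.array([0.0, 1.0, 2.5])
beta = 0.7

# Check U = -d ln Z / d beta by a central finite difference
h = 1e-6
U_direct = internal_energy(E, beta)
U_deriv = -(np.log(partition_function(E, beta + h))
            - np.log(partition_function(E, beta - h))) / (2 * h)
print(U_direct, U_deriv)  # the two values agree to finite-difference accuracy
```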
This means that for $N \to \infty$ the canonical ensemble is statistically equivalent to the microcanonical ensemble.

Additivity of the free energy $F$:
$$F(T, V, N) = -k_B T \ln Z_N(T, V).$$
For two systems 1 and 2 in thermal equilibrium, such that $\hat H = \hat H_1 + \hat H_2$ and $\hat H_{12} \approx 0$, the partition function of the total system $1 \cup 2$ is
$$Z(T) = \sum_{n_1}\sum_{n_2} e^{-\beta(E_{n_1} + E_{n_2})} = Z_1(T)\,Z_2(T),$$
which means that in this case the free energy is an additive quantity:
$$F(T) = F_1(T) + F_2(T).$$

11.11 Example in the canonical ensemble: N quantum mechanical harmonic oscillators

We consider $N$ localized, independent linear harmonic oscillators with frequency $\omega$ in thermal equilibrium at temperature $T$. Such a system would describe, for instance, the radiation energy of a black body. The energy of one oscillator is:
$$\hat H_1\,\Psi_n = \varepsilon_n\,\Psi_n, \qquad \varepsilon_n = \hbar\omega\left(n + \frac{1}{2}\right),$$
where $n$ is the occupation number. Then the partition function in the canonical ensemble for one oscillator is given by
$$Z(T, V, 1) = \mathrm{Tr}\,e^{-\beta\hat H_1} = \sum_{n=0}^\infty e^{-\beta\varepsilon_n} = e^{-\frac{\beta\hbar\omega}{2}}\sum_{n=0}^\infty \left(e^{-\beta\hbar\omega}\right)^n.$$
This is a geometric series:
$$\sum_{n=0}^\infty r^n = \frac{1}{1 - r}.$$
Therefore,
$$Z(T, V, 1) = \frac{e^{-\frac{\beta\hbar\omega}{2}}}{1 - e^{-\beta\hbar\omega}} = \frac{1}{e^{\frac{\beta\hbar\omega}{2}} - e^{-\frac{\beta\hbar\omega}{2}}} = \frac{1}{2\sinh\frac{\beta\hbar\omega}{2}}.$$
For $N$ particles,
$$Z(T, V, N) = [Z(T, V, 1)]^N = \left[2\sinh\frac{\beta\hbar\omega}{2}\right]^{-N}.$$
The above expression is valid only because we assumed that the oscillators don't interact with each other and that they are distinguishable.

The free energy is
$$F(T, V, N) = N k_B T \ln\left(2\sinh\frac{\beta\hbar\omega}{2}\right) = \underbrace{N\,\frac{\hbar\omega}{2}}_{\text{zero-point energy of } N \text{ oscillators}} + N k_B T \ln\left(1 - e^{-\beta\hbar\omega}\right).$$
$$\mu = \left(\frac{\partial F}{\partial N}\right)_{T,V} = \frac{F}{N}, \qquad P = -\left(\frac{\partial F}{\partial V}\right)_{T,N} = 0 \quad \text{(since the oscillators are localized)}.$$
The entropy:
$$S = -\left(\frac{\partial F}{\partial T}\right)_{V,N} = N k_B\left[\frac{\beta\hbar\omega}{2}\coth\frac{\beta\hbar\omega}{2} - \ln\left(2\sinh\frac{\beta\hbar\omega}{2}\right)\right] = N k_B\left[\frac{\beta\hbar\omega}{e^{\beta\hbar\omega} - 1} - \ln\left(1 - e^{-\beta\hbar\omega}\right)\right].$$
The internal energy:
$$U = N\hbar\omega\left(\frac{1}{2} + \frac{1}{e^{\beta\hbar\omega} - 1}\right). \tag{11.4}$$
In fact, $U = N\langle\epsilon_n\rangle$, with
$$\langle\epsilon_n\rangle = \hbar\omega\left(\frac{1}{2} + \frac{1}{e^{\beta\hbar\omega} - 1}\right) = \hbar\omega\left(\frac{1}{2} + \langle n\rangle\right),$$
where
$$\langle n\rangle = \frac{1}{e^{\beta\hbar\omega} - 1}$$
is the average quantum number, describing the average occupation at temperature $T$. We will see this expression again when we introduce the Bose gas.

⋆ Note that for $T \to 0$,
$$U \sim \frac{N\hbar\omega}{2},$$
which is the zero-point energy of a system of $N$ quantum oscillators. For $T \to \infty$,
$$U \sim N k_B T,$$
as we obtained for a system of $N$ classical oscillators.

11.12 Grand canonical ensemble

We consider a system $A_1$ that interchanges energy and particles with a reservoir $A_2$. $A = A_1 \cup A_2$ is isolated, with $V = V_1 + V_2$. In thermal equilibrium the system has temperature $T$ and chemical potential $\mu$. We consider the case when $E_1 \ll E_2$ and $N_1 \ll N_2$.

In the grand canonical ensemble, the copies of the system $A_1$ are in possible eigenstates $|E_m(N_1)\rangle$ of $\hat H_1$ and $\hat N_1$, with $[\hat H_1, \hat N_1] = 0$:
$$\hat H_1\,|E_m(N_1)\rangle = E_m(N_1)\,|E_m(N_1)\rangle, \qquad \hat N_1\,|E_m(N_1)\rangle = N_1\,|E_m(N_1)\rangle,$$
$$E = E_2(N_2) + E_1(N_1), \qquad N = N_1 + N_2.$$
We want to find the density matrix operator for the system $A_1$ in the grand canonical ensemble:
$$\hat\rho = \sum_{N_1}\sum_m P_m(N_1)\,|E_m(N_1)\rangle\langle E_m(N_1)|.$$
Since $A$ is an isolated system, the phase-space volume of $A$ is
$$\Gamma_N(E, V) = \sum_{N_1}\sum_m \Gamma^{(1)}_{N_1}\big(E_m(N_1), V_1\big)\,\Gamma^{(2)}_{N - N_1}\big(E - E_m(N_1), V_2\big).$$
If $A_1$ is in the state $|E_m(N_1)\rangle$, there are $\Gamma^{(2)}_{N - N_1}(E - E_m(N_1), V_2)$ possible states for $A_2$ and therefore for $A$. Since all these states are equally probable,
$$P_m(N_1) \sim \Gamma^{(2)}_{N - N_1}\big(E - E_m(N_1), V_2\big).$$
For $E_m \ll E$ and $N_1 \ll N$, we can perform a Taylor expansion of $\Gamma^{(2)}_{N - N_1}$:
$$\ln\Gamma^{(2)}_{N - N_1}(E - E_m, V_2) \approx \frac{1}{k_B}\,S_2(E, N, V_2) - \frac{E_m(N_1)}{k_B}\left.\frac{\partial S_2}{\partial E_2}\right|_{\substack{E_2 = E \\ N_2 = N}} - \frac{N_1}{k_B}\left.\frac{\partial S_2}{\partial N_2}\right|_{\substack{E_2 = E \\ N_2 = N}},$$
with
$$\left.\frac{\partial S_2}{\partial E_2}\right|_{\substack{E_2 = E \\ N_2 = N}} = \frac{1}{T}, \qquad \left.\frac{\partial S_2}{\partial N_2}\right|_{\substack{E_2 = E \\ N_2 = N}} = -\frac{\mu}{T},$$
so that
$$P_m(N_1) = c\,e^{-\beta(E_m(N_1) - \mu N_1)}.$$
Then the density matrix operator is
$$\hat\rho \sim \sum_{N_1}\sum_m e^{-\beta(E_m(N_1) - \mu N_1)}\,|E_m(N_1)\rangle\langle E_m(N_1)| = e^{-\beta(\hat H_1 - \mu\hat N_1)}\sum_{N_1}\sum_m |E_m(N_1)\rangle\langle E_m(N_1)|.$$
Normalizing $\hat\rho$, we get
$$\hat\rho = \frac{e^{-\beta(\hat H - \mu\hat N)}}{\mathrm{Tr}\,e^{-\beta(\hat H - \mu\hat N)}}$$
(we dropped the subindex "1").
The partition function in the grand canonical ensemble is then
$$Z(T, V, \mu) = \mathrm{Tr}\,e^{-\beta(\hat H - \mu\hat N)}.$$
In the particle-number representation, $Z$ is given as
$$Z(T, V, \mu) = \sum_{N=0}^\infty z^N Z_N(T, V),$$
where $z = e^{\beta\mu}$ is the fugacity and $Z_N$ the canonical partition function. The average value of an operator $\hat B$ is
$$\langle\hat B\rangle = \mathrm{Tr}(\hat\rho\hat B) = \frac{\mathrm{Tr}\big(e^{-\beta(\hat H - \mu\hat N)}\hat B\big)}{\mathrm{Tr}\,e^{-\beta(\hat H - \mu\hat N)}},$$
and in the energy–particle representation:
$$\langle\hat B\rangle = \frac{1}{Z}\sum_{N=0}^\infty\sum_m e^{-\beta(E_m(N) - \mu N)}\,B_{mm}(N), \qquad B_{mm}(N) = \langle E_m(N)|\hat B|E_m(N)\rangle.$$
If $\langle\hat B\rangle_c$ is the average value of $\hat B$ in the canonical ensemble, then
$$\langle\hat B\rangle = \frac{\displaystyle\sum_{N=0}^\infty z^N Z_N(T, V)\,\langle\hat B\rangle_c}{\displaystyle\sum_{N=0}^\infty z^N Z_N(T, V)}.$$
The grand canonical potential is
$$\Omega(T, V, \mu) = -k_B T \ln Z(T, V, \mu),$$
and all the properties of the system are derived analogously to what was done in classical statistics.

11.13 Entropy and the density matrix operator

We want to show that for all three ensembles, the relation between entropy and the density matrix operator $\hat\rho$ is
$$S = -k_B\,\mathrm{Tr}(\hat\rho\ln\hat\rho) = -k_B\,\langle\ln\hat\rho\rangle, \tag{11.5}$$
i.e., the entropy is given by the expectation value of the logarithm of the density matrix operator $\hat\rho$.

⋆ Note that since the eigenvalues of $\hat\rho$ are probabilities, i.e., they are between 0 and 1, the logarithm in eq. (11.5) will be negative and $S$ will be positive.

Microcanonical ensemble
$$\hat\rho_{\rm micro} = \frac{1}{\Gamma(E)}\sum_{\substack{m \\ E < E_m < E+\Delta}} |E_m\rangle\langle E_m|,$$
with the eigenvalue equation
$$\hat\rho_{\rm micro}\,|E_m\rangle = \begin{cases} \dfrac{1}{\Gamma(E)}\,|E_m\rangle & \text{if } E < E_m < E + \Delta, \\ 0 & \text{otherwise}. \end{cases}$$
$$\mathrm{Tr}(\hat\rho_{\rm micro}\ln\hat\rho_{\rm micro}) = \sum_i \langle E_i|\hat\rho_{\rm micro}\ln\hat\rho_{\rm micro}|E_i\rangle = \sum_{\substack{i \\ E < E_i < E+\Delta}} \frac{1}{\Gamma(E)}\ln\frac{1}{\Gamma(E)}\,\langle E_i|E_i\rangle = -\ln\Gamma(E).$$
Since $S$ was defined as $S = k_B\ln\Gamma(E)$,
$$S = -k_B\,\mathrm{Tr}(\hat\rho_{\rm micro}\ln\hat\rho_{\rm micro}).$$

Canonical ensemble
$$\hat\rho_{\rm cano} = \frac{1}{Z}\,e^{-\beta\hat H}, \qquad Z = \mathrm{Tr}\,e^{-\beta\hat H} = e^{-\beta F}.$$
Then $\hat\rho_{\rm cano} = e^{\beta F}e^{-\beta\hat H}$ and
$$-k_B\langle\ln\hat\rho_{\rm cano}\rangle = -k_B\,\beta\big(F - \langle\hat H\rangle\big) = -\frac{1}{T}(F - U) = -\frac{1}{T}(-TS) = S.$$

Grand canonical ensemble
$$\hat\rho_{\rm gc} = \frac{1}{Z}\,e^{-\beta(\hat H - \mu\hat N)}, \qquad Z = e^{-\beta\Omega}.$$
Then $\hat\rho_{\rm gc} = e^{\beta\Omega}e^{\beta(-\hat H + \mu\hat N)}$ and
$$-k_B\langle\ln\hat\rho_{\rm gc}\rangle = -\frac{1}{T}\big(\Omega - \langle\hat H\rangle + \mu\langle\hat N\rangle\big) = -\frac{1}{T}(F - U) = S.$$
11.14 Information theory

In this section, we want to provide a short introduction to information theory, which is an alternative way of interpreting statistical physics. In the following, we shall concentrate on the canonical ensemble and present a derivation of this ensemble based purely on probabilities and an extremum principle.

We consider a system $A$ with constant mean energy $E$. Let us suppose that an ensemble of this system is such that $a_r$ copies of the system are in state $r$. Then,
$$\sum_r a_r = a \tag{11.6}$$
and
$$\frac{1}{a}\sum_r a_r E_r = E. \tag{11.7}$$
The number $\Gamma(a_1, a_2, \ldots)$ of distinct possible ways of selecting a total of $a$ distinct systems in such a way that $a_1$ of them are in state $r = 1$, $a_2$ are in state $r = 2$, etc., is given by
$$\Gamma = \frac{a!}{a_1!\,a_2!\,a_3!\cdots}.$$
The logarithm of $\Gamma$ is then
$$\ln\Gamma = \ln a! - \sum_r \ln a_r!. \tag{11.8}$$
For $a \gg 1$ and $a_r \gg 1$, we can apply Stirling's approximation, $\ln a_r! = a_r\ln a_r - a_r$, so that eq. (11.8) becomes
$$\ln\Gamma = a\ln a - a - \sum_r a_r\ln a_r + \sum_r a_r = a\ln a - \sum_r a_r\ln a_r.$$
Extremum condition: for what set of numbers $a_1, a_2, a_3, \ldots$, subject to conditions (11.6) and (11.7), will $\Gamma$ be maximal? Or, in other words, for which $a_r$ will the equality
$$\delta\ln\Gamma = -\sum_r\big(\delta a_r + \ln a_r\,\delta a_r\big) = 0,$$
subject to
$$\sum_r \delta a_r = 0 \tag{11.9}$$
and
$$\sum_r E_r\,\delta a_r = 0, \tag{11.10}$$
hold? From the above expressions, it follows that one has
$$\sum_r \ln a_r\,\delta a_r = 0,$$
subject to eqs. (11.9) and (11.10). In order to handle this extremum condition subject to constraints, we make use of Lagrange multipliers:
$$\sum_r\big(\ln a_r + \alpha + \beta E_r\big)\,\delta a_r = 0,$$
where $\alpha$, $\beta$ are the Lagrange multipliers. With a proper choice of $\alpha$ and $\beta$, all $\delta a_r$ can be regarded as independent, and therefore each coefficient of $\delta a_r$ must separately vanish. We denote by $\bar a_r$ the value of $a_r$ when $\Gamma$ is maximal. Then,
$$\ln\bar a_r + \alpha + \beta E_r = 0, \qquad \bar a_r = e^{-\alpha}e^{-\beta E_r}.$$
$\alpha$ is determined by the normalization condition (11.6),
$$e^{-\alpha} = a\left(\sum_r e^{-\beta E_r}\right)^{-1},$$
and $\beta$ by condition (11.7):
$$\frac{\sum_r E_r\,e^{-\beta E_r}}{\sum_r e^{-\beta E_r}} = E$$
$$\Rightarrow\quad \bar P_r \equiv \frac{\bar a_r}{a} = \frac{e^{-\beta E_r}}{\sum_r e^{-\beta E_r}}.$$
We obtained the probability distribution in the canonical ensemble as corresponding to the distribution of systems in the ensemble which makes the number $\Gamma$ of possible configurations a maximum. In terms of $P_r = \frac{a_r}{a}$, $\Gamma$ can be written as
$$\ln\Gamma = a\ln a - \sum_r aP_r\ln(aP_r) = a\ln a - a\sum_r P_r\big(\ln a + \ln P_r\big) = a\ln a - a\ln a\sum_r P_r - a\sum_r P_r\ln P_r.$$
Therefore,
$$\ln\Gamma = -a\sum_r P_r\ln P_r.$$
Since $\sum_r P_r = 1$ and $0 < P_r < 1$, $\ln\Gamma$ is positive because $\ln P_r < 0$.

⋆ The canonical distribution $P_r$ is characterized by the fact that it makes the quantity $-\sum_r P_r\ln P_r$ a maximum subject to a given value $\sum_r P_r E_r = E$ of the mean energy.

The entropy is given by
$$S = \frac{k_B}{a}\ln\Gamma = -k_B\sum_r P_r\ln P_r.$$
An increase in $\ln\Gamma$ reflects a more random distribution of systems over the available states. The maximum possible value of $\Gamma$ gives the entropy of the final equilibrium state.

⋆ The function $-\sum_r P_r\ln P_r$ can be used as a measure of the information available about the systems in the ensemble. This function plays an important role as a measure of "information" in problems of communication. For instance, if an event (state) has probability 1, we get no information from the occurrence of the event: $I(1) = 0$; the event is completely predictable.

"Shannon's entropy" quantifies the information contained in a message (for instance, in units of bits). A fair coin has an entropy of one bit, $-\log_2(\frac{1}{2}) = 1$. However, if the coin is not fair, then the uncertainty is lower: if asked to bet on the next outcome, we would bet preferentially on the most frequent result, and thus the Shannon entropy is lower. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, July and October 1948.
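The extremum property derived above — that the canonical $P_r$ maximizes $-\sum_r P_r\ln P_r$ at fixed normalization and fixed mean energy — can also be probed numerically. In this sketch (the spectrum, inverse temperature, and step size are arbitrary illustrative choices), we perturb the canonical distribution only along directions compatible with both constraints and check that the entropy never increases:

```python
import numpy as np

rng = np.random.default_rng(0)

E = np.array([0.0, 1.0, 2.0, 3.5])    # hypothetical level energies
beta = 0.8
P = np.exp(-beta * E)
P /= P.sum()                          # canonical distribution
Ebar = P @ E                          # its mean energy

def neg_sum_plogp(p):
    """-sum_r p_r ln p_r for strictly positive p."""
    return -np.sum(p * np.log(p))

best = neg_sum_plogp(P)
ones = np.ones_like(E)
for _ in range(1000):
    d = rng.normal(size=E.size)
    # project out the two constraint directions (normalization and energy)
    d -= (d @ ones) / (ones @ ones) * ones
    e = E - (E @ ones) / (ones @ ones) * ones
    d -= (d @ e) / (e @ e) * e
    q = P + 1e-3 * d                  # feasible perturbed distribution
    assert abs(q.sum() - 1) < 1e-12 and abs(q @ E - Ebar) < 1e-12
    assert neg_sum_plogp(q) <= best + 1e-12
print("canonical distribution maximizes -sum p ln p at fixed mean energy")
```

Since $-\sum_r p_r\ln p_r$ is strictly concave, any feasible perturbation away from the canonical distribution lowers it, which is exactly what the assertions verify.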
Intuitively, entropy quantifies the uncertainty involved when encountering a random variable (see, for instance, "Entropy and Information Theory", Robert M. Gray, Springer-Verlag, 2008).
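The coin example can be made concrete. This is a minimal sketch of Shannon's entropy in bits (the helper name `shannon_entropy` is our choice):

```python
import math

def shannon_entropy(probs):
    """H = -sum_r p_r log2 p_r, in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: less than 1 bit
print(shannon_entropy([1.0]))       # certain event: 0 bits
```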