Dr. Eman Zakaria Hegazy, Quantum Mechanics and Statistical Thermodynamics, Lecture 21
Statistical Thermodynamics
Introduction and Definitions
Statistical thermodynamics is the application of probability theory, which provides mathematical tools for dealing with large populations, to the field of mechanics, which is concerned with the motion of particles subjected to forces.
- It provides a framework for relating the microscopic properties of
individual atoms and molecules to the macroscopic or bulk
properties of materials that can be observed in everyday life,
thereby explaining thermodynamics as a natural result of statistics
and mechanics (classical and quantum) at the microscopic level.
- It provides a molecular-level interpretation of thermodynamic
quantities such as work, heat, free energy, and entropy, allowing the
thermodynamic properties of bulk materials to be related to the
spectroscopic data of individual molecules.
- Statistical thermodynamics was born in 1870 with the work of
Austrian physicist Ludwig Boltzmann, much of which was
collectively published in Boltzmann's 1896 Lectures on Gas Theory.[2]
Boltzmann's original papers on the statistical interpretation of
thermodynamics, the H-theorem, transport theory, thermal
equilibrium, the equation of state of gases, and similar subjects,
occupy about 2,000 pages in the proceedings of the Vienna Academy
and other societies. The term "statistical thermodynamics" was
proposed for use by the American thermodynamicist and physical
chemist J. Willard Gibbs in 1902. According to Gibbs, the term
"statistical", in the context of mechanics, i.e. statistical mechanics,
was first used by the Scottish physicist James Clerk Maxwell in 1871.
Some definitions that are needed in statistical thermodynamics:
The first law of thermodynamics states that energy can be transformed (changed from one form to another), but it can be neither created nor destroyed.
Heat is a process by which energy is either added to a system from a high-temperature source or lost from a system to a low-temperature sink. Energy may also be lost by the system when it does work on its surroundings, or, conversely, gained as a result of work done on it by its surroundings.
The internal energy of a thermodynamic system, or a body with
well-defined boundaries, denoted by U, or sometimes E, is the total of
the kinetic energy due to the motion of molecules (translational,
rotational, vibrational) and the potential energy associated with the
vibrational and electric energy of atoms within molecules or crystals.
It includes the energy in all of the chemical bonds, and the energy of
the free, conduction electrons in metals.
- The increase in the internal energy of a system is equal to the
amount of energy added by heating the system, minus the amount
lost as a result of the work done by the system on its
surroundings.
The first law can be stated mathematically as:

dU = δQ − δW

where dU is a small change in the internal energy of the system, δQ is a small amount of heat added to the system, and δW is a small amount of work done by the system. The sign convention here is that δQ < 0 if energy is lost from the system as heat, but δW > 0 if energy is lost from the system as work. Note that some textbooks (e.g., Greiner, Neise and Stöcker) alter the sign convention for W and formulate the first law as:
dU = δQ + δW

where δW is the work done on the system. This amounts to again taking δQ < 0 if energy leaves the system as heat, but now taking δW < 0 if energy leaves the system as work.[1] So, when a system (e.g., a gas) expands, the work done on it is −PdV, whereas in the previous formulation of the first law, the work done by the gas while expanding is PdV. In any case, both conventions give the same result when the first law is written explicitly as:

dU = δQ − PdV
In other words, δW = PdV, where P is pressure and V is volume. Also, for a reversible process, the total amount of heat added to a system can be expressed as δQ = TdS, where T is temperature and S is entropy. Therefore, for a reversible process:

dU = TdS − PdV
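A quick numerical sanity check may help here; the following sketch (added for illustration, with made-up numbers) confirms that the two sign conventions give the same internal-energy change for a system that absorbs heat while doing work on its surroundings.

```python
# Sketch: the two first-law sign conventions agree.
# All numbers are illustrative, not from the lecture.

Q = 500.0            # heat added to the system, J (Q > 0)
W_by_system = 200.0  # work done BY the system on its surroundings, J

# Convention 1: dU = Q - W, with W = work done BY the system.
dU_1 = Q - W_by_system

# Convention 2: dU = Q + W, with W = work done ON the system.
# When the system expands, the work done on it is negative.
dU_2 = Q + (-W_by_system)

assert dU_1 == dU_2
print(f"dU = {dU_1} J under both conventions")
```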
The second law of thermodynamics is an expression of the universal
law of increasing entropy, stating that the entropy of an isolated
system which is not in equilibrium will tend to increase over time,
approaching a maximum value at equilibrium.
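To make the entropy-increase statement concrete, here is a small numerical sketch (added for illustration, with made-up heat capacities and temperatures): two bodies inside an isolated system exchange heat until their temperatures equalize, and the total entropy change comes out positive.

```python
import math

# Two bodies in an isolated system; parameters are made up.
C1, T1 = 100.0, 400.0  # heat capacity (J/K) and initial temperature (K), hot body
C2, T2 = 100.0, 300.0  # heat capacity and initial temperature, cold body

# Final common temperature from energy conservation: C1*(T1 - Tf) = C2*(Tf - T2)
Tf = (C1 * T1 + C2 * T2) / (C1 + C2)

# Entropy change of a body heated or cooled reversibly: dS = C dT / T
dS_hot = C1 * math.log(Tf / T1)   # hot body cools: negative
dS_cold = C2 * math.log(Tf / T2)  # cold body warms: positive

print(f"Tf = {Tf} K, total dS = {dS_hot + dS_cold:.3f} J/K (> 0)")
```

The hot body loses entropy, but the cold body gains more than the hot body loses, so the entropy of the isolated system as a whole increases, as the second law requires.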
In statistical thermodynamics, entropy is defined as

S = kB ln Ω

where kB is Boltzmann's constant, 1.38066×10−23 J K−1, and Ω is the number of microstates corresponding to the observed thermodynamic macrostate.
Statistical mechanics explains entropy as the amount of
uncertainty which remains about a system, after its observable
macroscopic properties have been taken into account. For a given
set of macroscopic variables, like temperature and volume, the
entropy measures the degree to which the probability of the
system is spread out over different possible quantum states. The
more states available to the system with appreciable probability,
the greater the entropy. More specifically, entropy is a
logarithmic measure of the density of states. In essence, the most
general interpretation of entropy is as a measure of our
uncertainty about a system. The equilibrium state of a system
maximizes the entropy because we have lost all information about
the initial conditions except for the conserved variables;
maximizing the entropy maximizes our ignorance about the
details of the system.[6] This uncertainty is not of the everyday
subjective kind, but rather the uncertainty inherent to the
experimental method and interpretative model.
A common mistake is taking this formula as a hard general definition of
entropy. This equation is valid only if each microstate is equally
accessible (each microstate has an equal probability of occurring).
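As a concrete check of this caveat, the sketch below (added for illustration) compares S = kB ln Ω with the more general Gibbs entropy, S = −kB Σ_i p_i ln p_i; the two agree exactly when each of the Ω microstates has probability 1/Ω, and the Gibbs value is smaller for any uneven distribution.

```python
import math

kB = 1.38066e-23  # Boltzmann's constant, J/K (value quoted above)

def boltzmann_entropy(omega):
    """S = kB ln(Omega): valid only if all Omega microstates are equally probable."""
    return kB * math.log(omega)

def gibbs_entropy(probs):
    """S = -kB sum(p ln p): the general form for any microstate probabilities."""
    return -kB * sum(p * math.log(p) for p in probs if p > 0)

omega = 4
equal = [1.0 / omega] * omega  # each microstate equally accessible
skewed = [0.7, 0.1, 0.1, 0.1]  # same four states, unequal probabilities

assert math.isclose(boltzmann_entropy(omega), gibbs_entropy(equal))
print(gibbs_entropy(equal))    # maximal entropy for four states
print(gibbs_entropy(skewed))   # smaller: less uncertainty about the state
```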
Boltzmann Distribution
If the system is large, the Boltzmann distribution can be used (the Boltzmann distribution is an approximate result):

p_i = e^(−E_i/kBT) / Q

where p_i is the probability of finding the system in the microstate with energy E_i, and Q = Σ_j e^(−E_j/kBT) is the partition function.
This can now be used with the general (Gibbs) form of the entropy,

S = −kB Σ_i p_i ln p_i

which, on substituting the Boltzmann distribution, expresses the entropy in terms of the partition function:

S = U/T + kB ln Q

where U = Σ_i p_i E_i is the average internal energy.
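The last identity can be checked numerically; the sketch below (added for illustration, with made-up energy levels) builds the Boltzmann probabilities for four levels and verifies that −kB Σ_i p_i ln p_i equals U/T + kB ln Q.

```python
import math

kB = 1.38066e-23  # Boltzmann's constant, J/K
T = 300.0         # temperature, K (illustrative)

# Made-up energy levels, in joules, on the order of kB*T.
E = [0.0, 1.0e-21, 2.0e-21, 5.0e-21]

# Boltzmann distribution: p_i = exp(-E_i / (kB*T)) / Q
Q = sum(math.exp(-Ei / (kB * T)) for Ei in E)       # partition function
p = [math.exp(-Ei / (kB * T)) / Q for Ei in E]

U = sum(pi * Ei for pi, Ei in zip(p, E))            # average internal energy
S_gibbs = -kB * sum(pi * math.log(pi) for pi in p)  # S = -kB sum(p ln p)
S_from_Q = U / T + kB * math.log(Q)                 # S = U/T + kB ln Q

assert math.isclose(S_gibbs, S_from_Q)
print(f"S = {S_gibbs:.3e} J/K")
```

The assertion holds to floating-point precision, which makes this a handy consistency check whenever the distribution is implemented by hand.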