Entropy and the end of it all
In thermodynamics entropy was defined as
ΔS = ∫ dQ/T,
where the integral is evaluated along a reversible path.
Consider now several examples of how to use this
definition to find entropy changes in systems.
For a phase transition,
ΔS = mL/Tc ,
where Tc is the (absolute) temperature of the phase
transition and L is the heat of transition.
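As a quick numeric illustration (my own sketch, not part of the lecture; the mass and latent heat below are assumed textbook values for melting ice):

# Entropy change for a phase transition: dS = m*L / Tc
m = 1.0          # mass of ice melted, kg (assumed)
L = 3.34e5       # latent heat of fusion of water, J/kg (approximate)
Tc = 273.15      # melting temperature of ice, K

delta_S = m * L / Tc
print(f"dS = {delta_S:.1f} J/K")   # about 1222.8 J/K

The entropy increases as the ice melts into the more disordered liquid phase.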
To take another example, we show in class for an ideal gas
undergoing a reversible process from state i to state f (using
the above definition, the ideal gas law, dEint = n cV dT, and
the first law of thermodynamics):
ΔS = n cV ln(Tf/Ti) + n R ln(Vf/Vi).
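To see what this formula gives in a concrete case, here is a sketch of my own, with assumed values for one mole of a monatomic ideal gas expanding isothermally to twice its volume:

import math

R = 8.314               # gas constant, J/(mol K)
n = 1.0                 # moles of gas (assumed)
cV = 1.5 * R            # molar heat capacity at constant volume, monatomic gas

Ti, Tf = 300.0, 300.0   # initial and final temperatures, K (isothermal, assumed)
Vi, Vf = 1.0, 2.0       # volume doubles (assumed, arbitrary units)

# dS = n*cV*ln(Tf/Ti) + n*R*ln(Vf/Vi)
dS = n * cV * math.log(Tf / Ti) + n * R * math.log(Vf / Vi)
print(f"dS = {dS:.2f} J/K")   # isothermal doubling: n*R*ln(2), about 5.76 J/K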
Thermodynamics also gives us a statement of the second
law,
ΔSisolated ≥ 0,
where the equality holds only for reversible processes.
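To make the inequality concrete, here is a small sketch (my own, with assumed reservoir temperatures): when heat Q flows from a hot reservoir at Th to a cold one at Tc, the hot reservoir loses entropy Q/Th while the cold one gains Q/Tc, for a net increase whenever Th > Tc.

Q = 1000.0    # heat transferred, J (assumed)
Th = 400.0    # hot reservoir temperature, K (assumed)
Tc = 300.0    # cold reservoir temperature, K (assumed)

dS_hot = -Q / Th          # hot reservoir loses heat
dS_cold = Q / Tc          # cold reservoir gains heat
dS_total = dS_hot + dS_cold
print(f"dS_total = {dS_total:.3f} J/K")   # about +0.833 J/K, consistent with dS >= 0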
However, you have no doubt heard that entropy is a
measure of the disorder or randomness of a system. The
above thermodynamic definition of entropy seems far
removed from entropy as randomness. We will now relate
entropy to the disorder of a system using ideas from
statistical mechanics.
Statistical mechanics is a deeper look into the thermal
properties of a system than thermodynamics provides. In
statistical mechanics entropy is defined as
S = kB ln W,
where W is the number of microstates consistent with the
system's macrostate. If the system is highly disordered, W is large.
an example in class that will make sense of the statistical
interpretation of entropy and introduce new terms such as
microstate, macrostate, and probabilities.
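As a preview of that example (a sketch of my own using the standard coin-flip model, which may differ from the one given in class), take the macrostate to be the number of heads in N coin flips; W counts the distinct flip sequences (microstates) that realize it:

import math

kB = 1.381e-23   # Boltzmann constant, J/K
N = 100          # number of coins (assumed)

for heads in (0, 25, 50):
    W = math.comb(N, heads)   # number of microstates in this macrostate
    S = kB * math.log(W)      # statistical entropy S = kB ln W
    print(f"{heads:3d} heads: W = {W:.3e}, S = {S:.3e} J/K")

The all-tails macrostate (0 heads) has a single microstate, so S = 0, while the evenly split, most disordered macrostate has the largest W and hence the largest entropy.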
A bit of philosophy.
Some physicists believe the second law of thermodynamics
implies the universe will end in a state of uniform temperature
and maximum entropy, 'the heat death of the universe'.
Everybody goes to hell, so to speak.
Other physicists counter this argument with the fact that
Newtonian dynamical laws are time-reversal invariant: you
couldn't tell whether you were watching a movie of an
event running backwards or the real thing going forward in
time. In the late 1800s the famous French physicist Henri
Poincaré proved a theorem which says that a finite system
obeying Newtonian dynamics will always return arbitrarily
close to its initial state (provided you wait long enough).
The waiting time is called a Poincaré cycle.
Finally, no one knows the answer to these dilemmas,
because we now realize that Newtonian mechanics is only
partly right. You have to study quantum mechanics if you
really want to know how things work, or at least how we
currently think they work.
Examples [in class]