Statistical Interpretation of Entropy
By: Rama Arora
Associate Professor
Govt. College for Girls, Sector 11, Chandigarh
Statistical Definition of Entropy

Statistically, the entropy S of a system is the product of the Boltzmann constant k and the natural logarithm of the thermodynamic probability W:
S = k ln W
(k = 1.38 x 10^-16 erg/K)
Since a system in equilibrium is in its most probable state, the entropy of the system in equilibrium is
S = k ln Wmax
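
A minimal Python sketch (illustrative, not part of the slides) evaluating S = k ln W in the CGS units quoted above; the values of W are arbitrary assumptions chosen only to show the behaviour of the formula:

import math

k = 1.38e-16  # Boltzmann constant in erg/K, as given in the slide

def entropy(W):
    """Statistical entropy S = k ln W for thermodynamic probability W."""
    return k * math.log(W)

print(entropy(1.0))    # W = 1 gives S = 0 (a single microstate)
print(entropy(1e23))   # an assumed, illustrative value of W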
Change of Entropy of a System

The thermodynamic probability W is a function of the number of particles n (which is determined by the amount of substance µ), the total number of available phase-space cells (which in turn depends upon the volume V), and the energy U of the system, so
S = f (U, V, µ)
Therefore the entropy S is also a function of U, V and µ, and the entropy of the system can be changed by changing the energy U, the volume V or the amount of substance µ.
First Law of Thermodynamics

From the first law of thermodynamics,
δQ = dU + P dV
At constant volume (dV = 0),
δQ = dU
Since δQ is infinitesimally small, the system is assumed to remain in its most probable (equilibrium) state, so
dS = δQ / T
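
As an illustration (not from the slides), the sketch below integrates dS = δQ/T numerically for an ideal gas heated at constant volume, where δQ = n Cv dT; the amount of gas and the temperatures are assumed values chosen for the example:

import math

R = 8.314               # gas constant, J/(mol K)
n = 1.0                 # moles of gas (assumed)
Cv = 1.5 * R            # molar heat capacity at constant volume, monatomic ideal gas
T1, T2 = 300.0, 600.0   # initial and final temperatures in K (assumed)

# Numerical integration of dS = delta_Q / T = n * Cv * dT / T (midpoint rule)
steps = 100000
dT = (T2 - T1) / steps
dS = sum(n * Cv * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(dS)                          # numerical entropy change
print(n * Cv * math.log(T2 / T1))  # closed form n*Cv*ln(T2/T1) for comparison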
Additive Nature of Entropy

Let S1 and S2 be the entropies of the sub-systems A and B respectively. Since the thermodynamic probability of the combined system is the product W = W1 W2, and ln(W1 W2) = ln W1 + ln W2, the entropy of the combined system is
S = S1 + S2
In general, the entropy of the system is given by the sum of the entropies of all the subsystems of the system, i.e.
S = Σ_{i=1}^{k} Si
[Figure: two sub-systems A and B making up the combined system]
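
A small Python check (illustrative, not from the slides) of this additivity: because the thermodynamic probability of the combined system is the product W1 W2, k ln(W1 W2) equals k ln W1 + k ln W2. The values of W1 and W2 are assumed, illustrative numbers:

import math

k = 1.38e-16  # Boltzmann constant, erg/K

W1, W2 = 2.5e10, 7.0e12   # assumed thermodynamic probabilities of sub-systems A and B

S1 = k * math.log(W1)
S2 = k * math.log(W2)
S_combined = k * math.log(W1 * W2)

print(S_combined)
print(S1 + S2)
print(math.isclose(S_combined, S1 + S2))  # True: entropy is additive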
Third Law of Thermodynamics

The thermodynamic probability W decreases as the temperature decreases.
The minimum possible temperature is called absolute zero, i.e. 0 kelvin.
At absolute zero the system settles into its single most ordered state, so W = 1 and
S = k ln W = k ln 1 = 0
Reversible Process

A process in which a system can be brought back to its initial state by reversing the controlling factor is called a reversible process.
Example: the Carnot cycle
Irreversible Process

Physical processes that proceed in one direction but not the other.
Such a process tends towards equilibrium, and equilibrium is reached only at the end of the process.
Examples of irreversible processes:
Thermal conduction [figure: heat dQ flows from a hot body to a cold body]
Diffusion of gases
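
To see why thermal conduction is irreversible, here is a short Python sketch (not from the slides, with assumed temperatures and heat) of the total entropy change when heat dQ passes from a hot body to a cold body: the cold body gains dQ/T_cold while the hot body loses dQ/T_hot, so the net change is positive whenever T_hot > T_cold:

T_hot, T_cold = 500.0, 300.0   # temperatures of the two bodies in K (assumed values)
dQ = 100.0                     # small amount of heat conducted, in J (assumed)

dS_hot = -dQ / T_hot           # entropy lost by the hot body
dS_cold = dQ / T_cold          # entropy gained by the cold body
dS_total = dS_hot + dS_cold

print(dS_total)        # positive: the entropy of hot body plus cold body increases
print(dS_total > 0)    # True whenever T_hot > T_cold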
Principle of Increase of Entropy

A natural process always takes place in such a direction as to cause an increase in the entropy of the system and its surroundings.
In an isolated system the entropy of the system always tends to increase.
Ice melting is an example of entropy increasing.
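
A short Python sketch (illustrative, not from the slides) of the ice-melting example: the entropy gained by the ice as it melts at 273 K, ΔS = mL/T with latent heat of fusion L ≈ 334 J/g, exceeds the entropy lost by the warmer surroundings that supply the same heat, so the total entropy increases. The mass of ice and the room temperature are assumed values:

m = 10.0        # grams of ice (assumed)
L = 334.0       # latent heat of fusion of ice, J/g
T_melt = 273.0  # melting point of ice, K
T_room = 293.0  # temperature of the surroundings, K (assumed)

Q = m * L                      # heat absorbed by the ice while melting
dS_ice = Q / T_melt            # entropy gained by the ice
dS_surroundings = -Q / T_room  # entropy lost by the surroundings
dS_total = dS_ice + dS_surroundings

print(dS_ice, dS_surroundings, dS_total)  # dS_total > 0: total entropy increases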
Entropy and Disorder

Entropy is a measure of molecular disorder.
It is a law of nature that disorder is more probable than order.
[Image caption: "Is this your room? Then you already know about entropy."]
A system (such as a room) is in a state of high entropy when its degree of disorder is high.
As the order within a system increases, its entropy decreases.