Basic ideas of statistical physics
Statistics is a branch of science which deals with
the collection, classification and interpretation
of numerical facts. When statistical concepts are
applied to physics, the resulting branch of science
is called Statistical Physics.
Trial → an experiment, e.g. the tossing of a coin
Event → an outcome of the experiment
Sample space: the set of all possible distinct
outcomes of an experiment.
S = {H, T} for the tossing of a coin
Exhaustive events: the total number of
possible outcomes in any trial.
For the tossing of a coin, the exhaustive events = 2.
Favorable events: the outcomes of a trial which
entail the occurrence of the event in question.
The number of cases favorable to drawing a king from a
pack of cards is 4.
Mutually exclusive events: no two of them
can occur simultaneously,
e.g. either head up or tail up in the tossing of a coin.
Equally likely events: no event is preferred
over any other,
e.g. head up or tail up.
Independent events: the occurrence of one
event is independent of the other,
e.g. the tossing of two coins.
Simultaneous events: two or more events
occurring at the same time,
e.g. the tossing of two coins together.
Probability
The probability of an event = (number of times the event occurs) / (total number of trials)
If m is the number of cases in which an event occurs and
n is the number of cases in which it fails, then
Probability of occurrence of the event = m / (m + n)
Probability of failure of the event = n / (m + n)
m / (m + n) + n / (m + n) = 1
The sum of these two probabilities i.e. the total
probability is always one since the event may either
occur or fail.
Principle of equal a priori probability
The principle of assuming equal probability for
events which are equally likely is known as the
principle of equal a priori probability.
A priori really means something which exists in
our mind prior to and independently of the
observation we are going to make.
For mutually exclusive events:
For two mutually exclusive events A and B, the
probability of occurrence of either event A or B is
= P(A)+P(B)
For independent events:
For two independent events A and B, the probability
that both the events occur is
= P(A)×P(B)
For n mutually exclusive events:
P = P1 + P2 + ... + Pn
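As a quick illustration of these rules (not part of the original slides), here is a minimal Python sketch using the coin and card examples above:

from fractions import Fraction

# Mutually exclusive events: either head or tail in a single toss
p_head = Fraction(1, 2)
p_tail = Fraction(1, 2)
p_head_or_tail = p_head + p_tail      # P(A or B) = P(A) + P(B) = 1

# Independent events: heads on each of two coins tossed together
p_two_heads = p_head * p_head         # P(A and B) = P(A) x P(B) = 1/4

# Favorable cases: drawing a king from a pack of 52 cards
p_king = Fraction(4, 52)              # 4 favorable cases out of 52 = 1/13

print(p_head_or_tail, p_two_heads, p_king)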
Distribution of 4 different Particles in
two Compartments of equal sizes
Particles must go in one of the compartments.
Both the compartments are exactly alike.
The particles are distinguishable. Let the four particles
be called a, b, c and d.
The total number of particles in the two compartments is 4,
i.e. n1 + n2 = Σ ni = 4 (i = 1, 2).
The meaningful ways in which these four particles can
be distributed between the two compartments are shown
in the table below.
Macrostate
The specification of the number of particles in each
compartment is called macrostate of the system.
Or
The arrangement of the particles of a system without
distinguishing them from one another is called
macrostate of the system.
In this example, if 4 particles are distributed
in 2 compartments, the number of possible macrostates = 4 + 1 = 5.
If n particles are distributed in 2 compartments,
the number of macrostates = n + 1.
Microstate
The distinct arrangement of the particles of a system is
called its microstate.
For example, if four distinguishable particles
are distributed in two compartments, then
the number of possible microstates = 2^4 = 16.
If n particles are distributed in 2 compartments,
the number of microstates = 2^n = (compartments)^(particles).
Thermodynamic probability or frequency
(W)
The number of microstates in a given macrostate is
called the thermodynamic probability or frequency of that
macrostate.
For distribution of 4 particles in 2 identical
compartments
W(4,0) =1
W(3,1) =4
W(2,2) = 6
W(1,3) = 4
W(0,4) =1
W depends on the distinguishable or indistinguishable
nature of the particles. For indistinguishable particles,
W=1
For indistinguishable particles, each of the five macrostates contains only one arrangement:

Macrostate (Comp 1, Comp 2)   Frequency W   Probability
(4, 0)                        1             1/5
(3, 1)                        1             1/5
(2, 2)                        1             1/5
(1, 3)                        1             1/5
(0, 4)                        1             1/5
All the microstates of a system have equal a priori
probability.
Probability of a microstate = 1 / (total no. of microstates) = 1/2^4 = 1/16 (in general, 1/2^n)
Probability of a macrostate = (no. of microstates in that macrostate) ×
(probability of one microstate)
P = W × (1/2^4) = W/16 (in general, W/2^n)
= thermodynamic probability × probability of one microstate
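To make these numbers concrete, here is a short Python sketch (an illustration added to the notes) that enumerates all 2^4 = 16 microstates of the four distinguishable particles a, b, c, d and tallies the frequency W and probability of each macrostate:

from itertools import product
from collections import Counter

particles = ['a', 'b', 'c', 'd']

# A microstate assigns each particle to compartment 1 or compartment 2
microstates = list(product([1, 2], repeat=len(particles)))   # 2**4 = 16 microstates

# A macrostate only records how many particles are in each compartment
W = Counter((m.count(1), m.count(2)) for m in microstates)

for macro in sorted(W, reverse=True):
    print(macro, 'W =', W[macro], ' P =', W[macro], '/', len(microstates))
# (4, 0): W = 1, (3, 1): W = 4, (2, 2): W = 6, (1, 3): W = 4, (0, 4): W = 1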
Constraints
Restrictions imposed on a system are called constraints.
Example
Total no. of particles in the two compartments = 4.
Only the 5 macrostates (4,0), (3,1), (2,2), (1,3), (0,4) are possible.
Macrostates such as (1,2), (4,2), (0,1), (0,0) are not possible.
Accessible and inaccessible states
The macrostates / microstates which are allowed under
given constraints are called accessible states.
The macrostates/ microstates which are not allowed
under given constraints are called inaccessible states
The greater the number of constraints, the smaller the number
of accessible microstates.
Distribution of n Particles in 2
Compartments
Consider n distinguishable particles in the two
equal-sized compartments of a box.
The (n+1) macrostates are
(0, n), (1, n−1), (2, n−2), ..., (n1, n2), ..., (n, 0).
Out of these macrostates, let us consider a particular
macrostate (n1, n2) such that
n1 + n2 = n
The total no. of microstates = 2^n
•n particles can be arranged among themselves in nPn = n! ways.
•These arrangements include meaningful as well as meaningless
arrangements: interchanging particles between the two compartments
gives a new arrangement (meaningful), while permuting particles within
the same compartment gives nothing new (meaningless).
•Total number of ways = (no. of meaningful ways) × (no. of meaningless ways), so
no. of meaningful ways = (total number of ways) / (no. of meaningless ways)
•n1 particles in comp. 1 can be arranged in = n1! meaningless ways.
•n2 particles in comp. 2 can be arranged in = n2! meaningless ways.
•n1 particles in comp. 1 and n2 particles in comp. 2 can therefore be
arranged in = n1! × n2! meaningless ways.
•Hence
no. of meaningful ways = n! / (n1! n2!) = n! / (n1! (n − n1)!)   since n = n1 + n2
•Now, the number of meaningful arrangements in a given
macrostate is equal to the number of microstates in that
macrostate.
•The number of microstates in a given macrostate is called
Thermodynamic probability (W).
•Therefore the thermodynamic probability of macrostate (n1, n2) is
W(n1, n2) = n! / (n1! n2!) = n! / (n1! (n − n1)!)
•Now, the probability of a macrostate is equal to the ratio of
thermodynamic probability to total number of microstates in the
system
•Therefore the probability of macrostate (n1, n2) is given by
P(n1, n2) = W(n1, n2) / 2^n = [n! / (n1! n2!)] · (1/2^n) = [n! / (n1! (n − n1)!)] · (1/2^n)
where the total no. of microstates = 2^n.
•Probability of the distribution (r, n − r):
P(r, n − r) = [n! / (r! (n − r)!)] · (1/2^n) = C(n, r) · (1/2^n)
Maximum probability:
When r = n/2 (n even),
Pmax = [n! / ((n/2)! (n/2)!)] · (1/2^n)
Minimum probability:
When r = 0 or r = n,
Pmin = [n! / (0! n!)] · (1/2^n) = 1/2^n
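These formulas are easy to evaluate numerically; a minimal Python sketch (the value n = 10 is chosen purely for illustration):

from math import comb

n = 10                                # total number of particles (illustrative)

def P(r):
    # Probability of the macrostate (r, n - r) = C(n, r) / 2**n
    return comb(n, r) / 2**n

W = comb(n, n // 2)                   # thermodynamic probability of (n/2, n/2)
p_max = P(n // 2)                     # maximum probability, r = n/2 (n even)
p_min = P(0)                          # minimum probability, r = 0 or r = n, i.e. 1/2**n
print(W, p_max, p_min)                # 252, ~0.246, ~0.00098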
Stirling’s formula
ln n! ≈ n ln n − n
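A quick numerical check of this approximation (an illustrative Python snippet; lgamma(n + 1) gives ln n! exactly):

from math import lgamma, log

for n in (10, 100, 1000, 10_000):
    exact = lgamma(n + 1)        # ln(n!)
    approx = n * log(n) - n      # Stirling's approximation
    print(n, exact, approx)      # the relative error shrinks as n grows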
Deviation from the state of Maximum
probability
The probability of the macrostate (r, n − r) is
P(r, n − r) = [n! / (r! (n − r)!)] · (1/2^n)
When n particles are distributed in two comp., the
number of macrostates
= (n+1)
The macrostate (r, n r) is of maximum probability if r =
n/2, provided n is even.
The prob. of the most probable macrostate
n n
 , 
2 2
Pmax
n!
1

 n
n n
! ! 2
2 2
Let the macrostate deviate slightly from the most probable
state by x particles (x << n).
The new macrostate will then be (n/2 + x, n/2 − x), and its probability is
Px = [n! / ((n/2 + x)! (n/2 − x)!)] · (1/2^n)
Dividing by Pmax,
Px = Pmax · [(n/2)!]^2 / [(n/2 + x)! (n/2 − x)!]
Using Stirling's formula
ln n! ≈ n ln n − n
and Taylor's theorem
ln(1 + y) = y − y^2/2 + y^3/3 − ...   (provided |y| < 1)
this reduces to
Px = Pmax · exp(−f^2 n / 2)
where f = x / (n/2) = 2x/n is the fractional deviation from the most
probable number of particles in a compartment.
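For completeness, here are the intermediate steps of this result, written out as a sketch (not part of the original slides), using only the Stirling and Taylor formulas quoted above:

\begin{aligned}
\ln\frac{P_x}{P_{\max}}
  &= 2\ln\Big(\frac{n}{2}\Big)! - \ln\Big(\frac{n}{2}+x\Big)! - \ln\Big(\frac{n}{2}-x\Big)! \\
  &\approx -\Big[\Big(\frac{n}{2}+x\Big)\ln\Big(1+\frac{2x}{n}\Big)
               +\Big(\frac{n}{2}-x\Big)\ln\Big(1-\frac{2x}{n}\Big)\Big]
     \quad \text{(Stirling; the } n\ln\tfrac{n}{2} \text{ terms cancel)} \\
  &\approx -\Big[\,2x\,y - \frac{n}{2}\,y^{2}\Big]_{y=2x/n}
     \quad \text{(Taylor expansion of } \ln(1\pm y) \text{ to order } y^{2}\text{)} \\
  &= -\frac{2x^{2}}{n} = -\frac{f^{2}n}{2}, \qquad f = \frac{2x}{n},
\end{aligned}

so that Px = Pmax · exp(−f^2 n / 2), as stated.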
Discussion:
Consider a fractional deviation of the order of 0.1%, i.e. f = 10^−3.
Then Px/Pmax = exp(−f^2 n/2) gives:

n          Px/Pmax
10^3       0.999
10^6       0.607
10^8       1/e^50
10^10      1/e^5000
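The entries in this table follow directly from the formula; a small Python sketch reproducing them (the ratios for n = 10^8 and 10^10 underflow to zero in floating point, so only the exponent is printed):

from math import exp

f = 1e-3                                    # fractional deviation of 0.1%
for n in (1e3, 1e6, 1e8, 1e10):
    exponent = -f**2 * n / 2                # Px/Pmax = exp(-f**2 * n / 2)
    print(f"n = {n:.0e}:  Px/Pmax = exp({exponent:g})")
# exp(-0.0005) = 0.999..., exp(-0.5) = 0.607..., exp(-50), exp(-5000)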
[Figure: Px/Pmax plotted against 2x/n (from −0.2 to 0.2) for three particle numbers n1 > n2 > n3; the larger n is, the narrower the peak about the most probable state.]
•Thus we conclude that as n increases, the prob.
of a macrostate decreases more rapidly even for
small deviations w.r.t. the most probable state.
•If a graph is plotted between Px/Pmax and f, the
probability distribution curve becomes
narrower and narrower as n increases.
•Thus if n is very large, the system stays
very near to the most probable state.
Static and Dynamic systems
Static systems: if the particles of a system remain
at rest in a particular microstate, it is called a static
system.
Dynamic systems: if the particles of a system
are in motion and can move from one microstate to
another, it is called a dynamic system.
Equilibrium state of a dynamic system
A dynamic system continuously changes from one
microstate to another.
Since all microstates of a system have equal a priori
probability, the system should spend the same
amount of time in each microstate.
If tobs is the total time of observation and N the total number of
microstates, the time spent by the system in a particular microstate is
tm = tobs / N
Let the macrostate (n1, n2) have frequency W(n1, n2).
The time spent in macrostate (n1, n2) is
t(n1, n2) = (average time spent in each microstate) × (no. of microstates in the macrostate)
          = (tobs / N) × W(n1, n2)
          = tobs × P(n1, n2)
so that
P(n1, n2) = t(n1, n2) / tobs
That is, the fraction of the time spent by a dynamic
system in a macrostate is equal to the probability
of that macrostate.
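This can be checked with a small simulation (a sketch added to the notes, not from the original slides): four particles hop at random between the two compartments, and the fraction of time spent in each macrostate approaches W/2^n:

import random
from collections import Counter

n = 4
compartment = [1] * n                        # current compartment (1 or 2) of each particle
time_in_macrostate = Counter()
steps = 100_000

for _ in range(steps):
    i = random.randrange(n)                  # pick a particle at random
    compartment[i] = random.choice([1, 2])   # move it to a random compartment
    macro = (compartment.count(1), compartment.count(2))
    time_in_macrostate[macro] += 1           # one time step spent in this macrostate

for macro in sorted(time_in_macrostate, reverse=True):
    print(macro, 'fraction of time ~', round(time_in_macrostate[macro] / steps, 3))
# expected: (4,0) ~ 1/16, (3,1) ~ 4/16, (2,2) ~ 6/16, (1,3) ~ 4/16, (0,4) ~ 1/16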
Equilibrium state of dynamic system
•The macrostate having maximum probability is termed
as most probable state.
•For a dynamic system consisting of a large number of
particles, the probability of deviation from the most
probable state decreases very rapidly.
•So for the majority of the time the system stays in the most
probable state.
• If the system is disturbed, it again tends to go towards
the most probable state because the probability of
staying in the disturbed state is very small.
• Thus, the most probable state behaves as the
equilibrium state to which the system returns again and
again.
Distribution of n distinguishable
particles in k compartments of unequal
sizes
The thermodynamic probability for the macrostate (n1, n2, n3, ..., nk) is
W(n1, n2, ..., nk) = n! / (n1! n2! ... nk!) = n! / Π ni!   (product over i = 1 to k)
Let comp. 1 be divided into g1 cells. Then
Particle 1 can be placed in comp. 1 in = g1 ways
Particle 2 can be placed in comp. 1 in = g1 ways
...
Particle n1 can be placed in comp. 1 in = g1 ways
So the n1 particles in comp. 1 can be placed in the cells in = g1^n1 ways
Similarly, the n2 particles in comp. 2 can be placed in = g2^n2 ways,
..., and the nk particles in comp. k in = gk^nk ways.
The total no. of ways in which the n particles in the k compartments
can be arranged in the cells of these compartments is given by
g1^n1 · g2^n2 · g3^n3 · ... · gk^nk = Π gi^ni   (product over i = 1 to k)
The thermodynamic probability for the macrostate is therefore
W(n1, n2, ..., nk) = [n! / (n1! n2! ... nk!)] · g1^n1 · g2^n2 · ... · gk^nk
                   = n! · Π (gi^ni / ni!)   (product over i = 1 to k)