PES 2130 Fall 2014, Spendier
Lecture 7/Page 1
Lecture today: Chapter 20 (included in exam 1)
1) Entropy
2) Second Law of Thermodynamics
3) Statistical View of Entropy
Announcements:
Next week Wednesday Exam 1!
- Hand out: what will not be covered on exam
- Equation Sheet
Question regarding HW2: Monatomic vs diatomic
Indeed, hydrogen is a diatomic gas. This is because monatomic hydrogen is very reactive
and unstable: it will react with almost any other chemical. Hydrogen gas = H2.
Oxygen is similar. The most stable form is diatomic oxygen, O2. Next is triatomic oxygen
(ozone), O3, and the least stable (very unstable) is monatomic oxygen. Monatomic oxygen
is a very short-lived species, combining easily (due to its very high electronegativity)
with a large number of elements, including another oxygen atom.
Examples of monatomic gases: helium, neon, argon, krypton, and radon
Last lecture:
Reversible and Irreversible Processes (Fundamentals of Physics book)
•Any time there is heat flow through a finite temperature drop, it is an irreversible
process.
•If you want a reversible process, heat only should flow when the temperatures are
infinitesimally different (which would take forever).
Heat Engines
The efficiency of a heat engine is defined as the ratio of the work done to the amount of
heat brought in from the hot reservoir:
ε = W / Q_H

For a cyclic process: ε = 1 − Q_L/Q_H
Heat Pump (Refrigerators)
Device that does work in order to move heat from cold to hot. Refrigerators are nothing
more than engines run in reverse: they take heat out of a cold reservoir, and use work to
put more heat into the hot reservoir.
Coefficient of Performance:
K = Q_L / W

For a cyclic process: K_C = Q_L / (Q_H − Q_L)
The Carnot Cycle
Carnot Engine Efficiency
ε_C = 1 − Q_L/Q_H = 1 − T_L/T_H
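The efficiency and coefficient-of-performance formulas above can be checked numerically. A minimal sketch in Python, using assumed example reservoir temperatures (500 K and 300 K, not from the lecture):

```python
# Carnot limits for a heat engine and a refrigerator operating
# between a hot reservoir at T_hot and a cold reservoir at T_cold.

def carnot_efficiency(T_hot, T_cold):
    """Maximum engine efficiency: eps_C = 1 - T_L/T_H (temperatures in K)."""
    return 1.0 - T_cold / T_hot

def carnot_cop(T_hot, T_cold):
    """Maximum refrigerator coefficient of performance:
    K_C = Q_L/(Q_H - Q_L) = T_L/(T_H - T_L)."""
    return T_cold / (T_hot - T_cold)

T_H, T_L = 500.0, 300.0  # assumed example temperatures in kelvin
print(carnot_efficiency(T_H, T_L))  # 0.4
print(carnot_cop(T_H, T_L))         # 1.5
```

Note that as T_L approaches T_H the efficiency goes to zero while the coefficient of performance diverges: the less temperature difference a refrigerator has to fight, the less work it needs per joule of heat moved.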
DEMO: Stirling engine
The Stirling engine (heat cycle) will run when it is placed on a dish of crushed ice or on a
mug of hot water. The working volume may be filled with helium which makes the
engine run faster.
http://www.animatedengines.com/stirling.html
Stirling Engine:
Comparison with the Carnot cycle shows that each engine has isothermal heat transfers at
temperatures TH and TL. However, the two isotherms of the Stirling engine cycle are
connected, not by adiabatic processes as for the Carnot engine but by constant-volume
processes. To increase the temperature of a gas at constant volume reversibly from TL to
TH (process da) requires a transfer of energy as heat to the working substance from a
thermal reservoir whose temperature can be varied smoothly between those limits. Also,
a reverse transfer is required in process bc. Thus, reversible heat transfers (and
corresponding entropy changes) occur in all four of the processes that form the cycle of a
Stirling engine, not just two processes as in a Carnot engine.
Now we are ready to make the connection to entropy!
Entropy
“Any method involving the notion of entropy, the very existence of which depends on the
second law of thermodynamics, will doubtless seem to many far-fetched, and may repel
beginners as obscure and difficult of comprehension.” - Willard Gibbs, Graphical
Methods in the Thermodynamics of Fluids (1873)
Let us start our discussion of entropy by looking at one of the last steps in our derivation
of the Carnot efficiency:
Q_L/Q_H = T_L/T_H

Q_L/T_L − Q_H/T_H = 0
Now, if we define a variable S, such that:
dS = dQ/T        (T is constant)
and since ∫_i^f dQ = Q, where Q is the total energy transferred as heat during the process,

∫_i^f dS = (1/T) ∫_i^f dQ

ΔS = S_f − S_i = Q/T
Then, we can see that over the entire Carnot cycle,
Q_L/T_L − Q_H/T_H = 0
the variable S, which we now refer to as the entropy, returns to the same value. In other
words the change in entropy adds to zero over the entire cycle.
The net entropy change per cycle: ΔS = ΔS_L + ΔS_H = 0
(In a Carnot engine there are two reversible energy transfers as heat, and thus two
changes in the entropy of the working substance - one at temperature TH and one at TL.)
Entropy as a State Variable
• Now, perhaps that is just some artifact of the Carnot cycle, but, using calculus, we can
show that over any cycle, the entropy returns to its original value (see proof in book).
• Therefore, entropy is a state variable! It can describe the state of a system (like the
temperature, volume, pressure, Eint, etc.). A state function only depends on a state and not
the path the system takes (unlike W and Q).
This means that if we want to calculate the entropy of an irreversible process we can use:
ΔS_irr,gas = S_f − S_i = ΔS_rev,gas = ∫_i^f dQ/T
In other words, we use a corresponding reversible path to do the integration.
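For example, in a free expansion (an irreversible process) no heat flows and no work is done, yet the entropy of the gas increases. We can compute that increase along a reversible isothermal path between the same end states, where dS = dQ/T = nR dV/V. A minimal sketch, with an assumed 1 mol of gas doubling its volume:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_isothermal(n, V_i, V_f):
    """Entropy change of an ideal gas between volumes V_i and V_f at
    constant T, integrated along a reversible isothermal path:
    Delta S = integral of nR dV/V = nR ln(V_f/V_i)."""
    return n * R * math.log(V_f / V_i)

# Irreversible free expansion of 1 mol doubling its volume (assumed example):
print(delta_S_isothermal(1.0, 1.0, 2.0))  # ~= 5.76 J/K
```

The actual free expansion is not isothermal-reversible, but because S is a state variable, any reversible path with the same initial and final states gives the same ΔS.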
The 2nd Law of Thermodynamics
All heat engines have a maximum efficiency that is much less than 100% because of the
second law of thermodynamics.
No engine can exceed Carnot efficiency, because heat does not flow spontaneously from
cold to hot.
“No process is possible whose sole result is the transfer of heat from a body of lower
temperature to a body of higher temperature.” – Rudolf Clausius (1854)
“No process is possible in which the sole result is the absorption of heat from a reservoir
and its complete conversion into work.” – Lord Kelvin (1851)
In other words, since objects and their environment will always reach a thermal
equilibrium eventually, you can't keep getting work out of the system! "There is no such
thing as a perpetual motion machine!"
Statements above are concerned with work and are considered the first statements of the
second law before entropy was defined.
Entropy - Measures the amount of disorder in a system.
Let’s examine now an infinitesimal isothermal expansion:
since dE_int = 0 (dT = 0, b/c of constant temperature)

dQ = dW = p dV = (nRT/V) dV

dV/V = dQ/(nRT) ∝ dS
nRT
Now, it turns out that the incremental fractional volume increase (for added heat) is
directly related to the increase in disorder of the gas. Since there is more space to occupy,
there are more ways in which the molecules can be arranged while the gas remains in the
same macroscopic state.
The second law of thermodynamics can be stated in terms of entropy: No process is
possible in which the total entropy of an isolated system decreases.
S  0
The term “isolated system” here is crucial. The entropy of some object can decrease, if it
is in contact with another object whose entropy increases just as much or more.
It is certainly possible for something to have more order than it did at some point in the
past, but it just means that something else, in contact with it, now has less order.
irreversible process: ΔS_irr > 0
reversible process: ΔS_rev = 0
We just talked about entropy being a state variable and that if we want to calculate the
entropy of an irreversible process we can use a corresponding reversible path to do the
integration:
ΔS_irr,gas = S_f − S_i = ΔS_rev,gas = ∫_i^f dQ/T
But doesn't this violate the 2nd law of thermodynamics? No, since we have not
considered the whole system, only the gas. The second law applies to the system as a
whole, not just one component. We would also need to calculate the changes in entropy
of the surroundings which can be difficult.
Example:
A 250g potato at 293K is thrown into a 2.00 L pot of water at 373K.
a) What is the final equilibrium temperature of the potato-water system?
b) What is the change in entropy of this system, assuming it is isolated from the
surroundings?
Specific heat of potato: c_p = 3430 J kg^-1 K^-1
Specific heat of water: c_w = 4187 J kg^-1 K^-1
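The example above can be sketched numerically. This is one possible solution, assuming 2.00 L of water has mass 2.00 kg, constant specific heats, and no heat exchange with the surroundings; the entropy change of each component is computed along a reversible path, ΔS = mc ln(T_f/T_i):

```python
import math

m_p, c_p, T_p = 0.250, 3430.0, 293.0  # potato: kg, J/(kg*K), K
m_w, c_w, T_w = 2.00, 4187.0, 373.0   # water: kg (2.00 L ~ 2.00 kg), J/(kg*K), K

# (a) Equilibrium temperature from energy conservation:
#     heat lost by water = heat gained by potato.
T_f = (m_p*c_p*T_p + m_w*c_w*T_w) / (m_p*c_p + m_w*c_w)

# (b) Entropy change of each component along a reversible path:
#     dS = mc dT/T  ->  Delta S = mc ln(T_f / T_i)
dS_p = m_p * c_p * math.log(T_f / T_p)  # potato warms: positive
dS_w = m_w * c_w * math.log(T_f / T_w)  # water cools: negative

print(T_f)           # roughly 366 K
print(dS_p + dS_w)   # positive, roughly +21 J/K
```

The potato's entropy gain exceeds the water's entropy loss, so the total change is positive, as the second law demands for an irreversible equilibration.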
A Statistical View of Entropy and the Arrow of Time
You can slide a book across a table. It stops due to energy lost to friction. But have
you ever seen a book start sliding by taking energy from the thermal energy of the table?
You could start out with all of the air in this room in one half and vacuum in the other
half, separated by a partition. Then remove the partition and the air would fill the room.
But, you wouldn’t expect for a room filled with air to spontaneously separate into half
filled and half empty.
Neither of these scenarios is prohibited by the conservation laws or by the first law of
thermodynamics. Why don’t many things that happen spontaneously in nature also
happen in reverse?
Macroscopic and Microscopic States
In order to understand this apparent direction in time, we need to understand how to find a
particular macro-state given the statistics of the micro-states which make it up. As an
example, let's look at a simpler problem: a coin toss.
Let’s say we have one coin. For simplicity, it is motionless and can only be in one place.
Its “state”, which we now specify as its “macro-state”, is defined as whether it is heads
or tails.
How many ways can you make each macro-state? i.e. what is the number of possible
micro-states (called "multiplicity" in your textbook). Only one way for each state - one
micro-state for each way. So, if we toss a single coin in the air, what is the probability
that it will land in each macro-state? 1H or 1T?
P 1H  
# microstates 1H 
 microstates

1
1

1 1 2
 microstates  microstates 1H   microstates 1T 
PES 2130 Fall 2014, Spendier
P 1T  
# microstates 1T 
 microstates

Lecture 7/Page 7
1
1

1 1 2
Now, let's do the same with two coins.
There are 3 different macro-states, but for each macro-state, there is a different number of
ways to make it. For one tail and one head, there are 2 ways to make it.
So, if we toss 2 coins in the air, what is the probability that they will come up all heads?
P(2H) = #microstates(2H) / Σ microstates = 1/4
What is the probability that they will come up 1 head and 1 tail?
P(1H, 1T) = #microstates(1H, 1T) / Σ microstates = 2/4 = 1/2
Notice that the total number of micro-states = 2^N = Σ microstates,
where N is the number of coins.
So, for 100 coins, there are 2^100 micro-states and the probability of throwing 100 heads is
1 in 1,267,650,600,228,229,401,496,703,205,376!
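The coin-toss counting above can be sketched in a few lines of Python: the multiplicity of the macro-state "k heads out of N coins" is the binomial coefficient C(N, k), and the total number of micro-states is 2^N.

```python
from math import comb
from fractions import Fraction

def p_heads(N, k):
    """Probability of exactly k heads in N fair coin tosses:
    multiplicity C(N, k) divided by the total number of micro-states 2^N."""
    return Fraction(comb(N, k), 2**N)

print(p_heads(1, 1))      # 1/2  (one coin, one heads micro-state out of two)
print(p_heads(2, 2))      # 1/4  (two coins, all heads)
print(p_heads(2, 1))      # 1/2  (two coins, 1H 1T: two micro-states out of four)
print(p_heads(100, 100))  # 1/1267650600228229401496703205376
```

Exact rational arithmetic (via `Fraction`) keeps the 100-coin result from underflowing to zero, which is exactly what would happen in floating point.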
Now, the number of molecules in this room is approximately 1 × 10^29. Consider the
macro-states as (heads) if a molecule is in one half of the room and (tails) if it is
in the other half. What is the probability that, on a random sampling of the molecules'
positions, we find them all in one half of the room?
So, this tendency towards disorder is just a
consequence of the probability of finding a
particular macro-state given the statistics of the
micro-states which make it up! This is the
underlying nature of entropy, and the
underlying nature of the direction of time.
Calculating Entropy
One can calculate the entropy, first expressed by Ludwig Boltzmann (1844 – 1906), an
Austrian physicist and philosopher, as:
S = k_B ln(w),
where k_B = 1.38 × 10^-23 J K^-1 is the Boltzmann constant and w is the number of all
micro-states available to the system.
Or, since only differences in entropy are ever used in calculations:

ΔS = S_f − S_i = k_B ln(w_f) − k_B ln(w_i) = k_B ln(w_f / w_i)
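The statistical and thermodynamic definitions of entropy agree. A minimal sketch, assuming 1 mol of ideal gas doubling its volume: each molecule then has twice as many places to be, so w_f/w_i = 2^N and ΔS = k_B ln(w_f/w_i) = N k_B ln 2, which should match the thermodynamic result nR ln 2 from a reversible isothermal path.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

n = 1.0          # mol of gas (assumed example)
N = n * N_A      # number of molecules

# Statistical entropy change: w_f/w_i = 2^N is astronomically large,
# so take the logarithm analytically: k_B ln(2^N) = N k_B ln 2.
dS_statistical = N * k_B * math.log(2)

# Thermodynamic entropy change for the same end states: nR ln 2, with R = k_B * N_A.
dS_thermodynamic = n * (k_B * N_A) * math.log(2)

print(dS_statistical, dS_thermodynamic)  # both roughly 5.76 J/K
```

The two numbers coincide, which is the point of Boltzmann's formula: counting micro-states reproduces the entropy defined earlier through dS = dQ/T.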