Chemistry
Entropy Notes
1. What is entropy?
How many ways can you arrange the particles and energy on the inside of a system while the
system maintains the same outward appearance? This number of arrangements is called the
entropy of the system. Actually, because the number of arrangements of atoms/molecules/ions
is huge, entropy (S) is defined as the natural logarithm of the number of arrangements (W)
multiplied by a constant (k):
S = k ln W
Here k is Boltzmann’s constant, and the equation bears the name “Boltzmann’s equation.” The
equation appears on Ludwig Boltzmann’s tomb in Vienna, Austria.
A high value of entropy means that there are many possible arrangements that yield the same
state of the system. This means that high-entropy states are more likely – more probable – than
low-entropy states.
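
As a rough numerical sketch (the value of W below is invented purely for illustration, not taken from these notes), Boltzmann’s equation turns an astronomically large count of arrangements into a modest entropy in joules per kelvin:

import math

k_B = 1.380649e-23                # Boltzmann's constant, in J/K

# Assume, purely for illustration, a system with W = 10**(1e24) arrangements.
# W itself is far too large to store, so we work with ln W directly:
log10_W = 1e24                    # assumed value, not a measured one
ln_W = log10_W * math.log(10)     # ln W = log10(W) * ln(10)

S = k_B * ln_W                    # Boltzmann's equation: S = k ln W
print(f"S = {S:.1f} J/K")         # roughly 31.8 J/K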
2. How does entropy relate to chemical reactions?
Entropy is the key variable that helps us understand whether chemicals will react completely, react only partially, or fail to react. What we call the “system” is the process we’re interested in – something like the mixing of HCl and NaOH solutions in a single beaker.
What we call the “surroundings” is the environment that surrounds the process of interest – the air in the room, for instance. Together, the “system” and the “surroundings” make up the
“universe.” The second law of thermodynamics tells us what we need to know about entropy.
“Any spontaneous chemical or physical change (that is, a change that happens by itself
(although we might add a catalyst)) increases the entropy of the universe.” If a change
decreases the entropy of the universe, the change won’t be spontaneous – it won’t happen on
its own.
Another approach to entropy in chemical reactions is to say that spontaneous changes result in
the conversion of less-probable states to more-probable states. This is why the changes occur.
Entropy is the overall impetus behind chemical reactions.
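
A sketch of the second-law bookkeeping with made-up numbers (these are assumptions for illustration, not values from the notes): a reaction whose system entropy decreases can still be spontaneous if the heat it releases raises the entropy of the surroundings by more.

T = 298.0                         # temperature, K
dS_system = -100.0                # assumed entropy change of the system, J/K
q_released = 50_000.0             # assumed heat released to the surroundings, J

dS_surroundings = q_released / T              # entropy gained by the surroundings
dS_universe = dS_system + dS_surroundings     # the sum the second law cares about

print(f"dS_universe = {dS_universe:+.1f} J/K")  # about +67.8 J/K, so spontaneous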
3. What sort of generalizations can we make about the relative entropies of materials?
a) Gases have much higher entropies than liquids, and liquids have higher entropies than solids, as the quick comparison after this list illustrates. (Solids act as an entropy reference, at least in theory. The third law of thermodynamics states, “The entropy of a perfect crystal at 0 Kelvin is zero.”)
b) Mixtures have higher entropies than pure substances.
c) Generally, entropy increases with molar mass.
d) Generally, entropy increases with temperature.
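
A quick comparison illustrating generalization (a), using rounded textbook values quoted from memory (approximate, not from the table in these notes):

# Approximate standard molar entropies at 25 °C, in J/(mol·K):
S_water_liquid = 70.0             # H2O(l), rounded
S_water_gas = 189.0               # H2O(g), rounded

print(S_water_gas - S_water_liquid)   # about +119 J/(mol·K) gained on vaporization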
4. Can we calculate entropy values for materials and reactions?
Sure. The same table you have of standard heats of formation also contains standard entropy (S°) values.
5. Does it help us to calculate entropy values for chemical reactions in the way we calculated ∆H° values?
Not really. The entropy values in your table of thermodynamic values apply only to the system, but the second law applies to the sum of the system and the surroundings. A reaction can lower the system’s entropy and still be spontaneous, as long as the heat it releases raises the entropy of the surroundings by more.
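
For reference, here is what that system-only calculation looks like, done products-minus-reactants just like ∆H° (the S° values are approximate textbook numbers quoted from memory; remember that the result describes only the system):

# Approximate standard entropies at 25 °C, in J/(mol·K):
S_standard = {"N2(g)": 191.6, "H2(g)": 130.7, "NH3(g)": 192.5}

# N2(g) + 3 H2(g) -> 2 NH3(g): sum over products minus sum over reactants
dS_rxn = 2 * S_standard["NH3(g)"] - (S_standard["N2(g)"] + 3 * S_standard["H2(g)"])

print(f"dS°(system) = {dS_rxn:.1f} J/(mol·K)")   # about -199 J/(mol·K)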
6. Is there something we can calculate for the system that will be more useful than entropy?
Yes. Your table of thermodynamic values also contains ∆G° values. ∆G° is the standard change in the Gibbs free energy of a system, defined as ∆G° = ∆H° − T∆S°. It combines the system’s entropy change with the system’s heat change (heat released by the system increases the entropy of the surroundings). So ∆G° is a useful predictor of whether or not reactions will occur, because it accounts for changes in both the system and the surroundings. If we calculate ∆G° for a reaction and the result is negative, the reaction will be spontaneous. If the result is positive, the reaction will be non-spontaneous – it won’t happen. The presence of a supposed catalyst won’t matter; you can’t speed up a reaction that won’t happen in the first place.
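
A sketch of the ∆G° = ∆H° − T∆S° arithmetic, continuing the ammonia example above with approximate textbook values:

dH = -92.2e3          # ∆H° for N2 + 3 H2 -> 2 NH3, in J (about -92.2 kJ, approximate)
dS = -198.7           # ∆S° for the system, in J/K (from the sketch above)
T = 298.0             # standard temperature, K

dG = dH - T * dS      # ∆G° = ∆H° - T∆S°
print(f"dG° = {dG/1000:.1f} kJ")   # about -33 kJ: negative, so spontaneous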
7. Does a reaction’s ∆G° value tell us anything else about the reaction?
Yes! A negative ∆G° value tells us the maximum amount of work that can be extracted from a chemical reaction. Because of entropy and the second law of thermodynamics, we know that not all of the heat from a reaction can be converted to work. The ∆G° value of a reaction, because it accounts for entropy, tells us how many kilojoules of work the reaction can accomplish at most.
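
As a concrete sketch with rounded textbook values (approximate, quoted from memory): burning hydrogen in a fuel cell, H2(g) + ½ O2(g) → H2O(l), releases about 286 kJ of heat per mole of H2, but only about 237 kJ of that (the size of ∆G°) can ever be recovered as work.

dH = -285.8                  # ∆H° per mole of H2, in kJ (approximate)
dG = -237.1                  # ∆G° per mole of H2, in kJ (approximate)

max_work = -dG               # at most |∆G°| is available as work
fraction = max_work / -dH    # the rest is unavoidably lost as heat

print(f"max work ≈ {max_work} kJ, or {fraction:.0%} of the heat of reaction")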
8. What happens when the work capacity of a chemical reaction is exhausted? (What happens when the ∆G for a reaction reaches zero?)
A system not at thermal equilibrium can be made to do work. Once thermal equilibrium is
obtained, no more work can be done. In the same way, a system not at chemical equilibrium
can be made to do work. Once chemical equilibrium is reached, no more work can be done.
When the work capacity of a chemical reaction is exhausted, the reaction has reached chemical
equilibrium.
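
One standard way to watch ∆G shrink to zero (this relation is not covered in these notes, so treat the numbers as an illustration only) is ∆G = ∆G° + RT ln Q, where the reaction quotient Q grows toward the equilibrium constant K as the reaction proceeds:

import math

R, T = 8.314, 298.0               # gas constant, J/(mol·K), and temperature, K
dG_standard = -33.0e3             # in J, reusing the ammonia value from above
K = math.exp(-dG_standard / (R * T))     # equilibrium constant implied by ∆G°

for Q in (K / 1000, K / 10, K):   # hypothetical reaction quotients
    dG = dG_standard + R * T * math.log(Q)
    print(f"Q = {Q:.3g}:  dG = {dG/1000:+.1f} kJ")
# dG stays negative while Q < K and reaches zero at Q = K (equilibrium)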
9. What does chemical equilibrium look like?
Reactions that have reached equilibrium look like they have “run out of gas.” The system stops
changing. When the chemical reaction in a battery reaches chemical equilibrium, the battery is
dead.