Entropy
Entropy is a concept applied across physics, information theory, mathematics and other
branches of science and engineering. The following definition is shared across all these
fields:
$$ S = -k \sum_i p_i \ln p_i $$

where S is the conventional symbol for entropy. The sum runs over all microstates
consistent with the given macrostate, and p_i is the probability of the ith microstate. The
constant of proportionality k depends on what units are chosen to measure S. When SI
units are chosen, we have k = k_B = Boltzmann's constant = 1.380649×10⁻²³ J K⁻¹. If units
of bits are chosen, then k = 1/ln(2), so that

$$ S = -\sum_i p_i \log_2 p_i . $$
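As a minimal sketch of this definition (in Python; the probability distribution below is illustrative), the same function yields entropy in J/K or in bits depending only on the choice of k:

```python
import math

def entropy(probs, k=1.380649e-23):
    """S = -k * sum(p_i * ln p_i) over microstate probabilities.
    Default k is Boltzmann's constant, giving S in J/K."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Illustrative macrostate with four equally likely microstates
p = [0.25, 0.25, 0.25, 0.25]
print(entropy(p))                      # S in J/K
print(entropy(p, k=1 / math.log(2)))   # S in bits: 2.0
```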
Entropy is central to the second law of thermodynamics. The second law in conjunction
with the fundamental thermodynamic relation places limits on a system's ability to do
useful work.
The second law can also be used to predict whether a physical process will proceed
spontaneously. Spontaneous changes in isolated systems occur with an increase in
entropy.
Correspondence
The statistical definition of entropy matches the thermodynamic formula for
calculating entropy: adding heat to a system increases its classical thermodynamic
entropy, and it also increases the system's thermal fluctuations, reducing our
information about the exact microscopic state of the system, i.e. increasing the
statistical mechanical entropy.
The thermodynamic approach to entropy is less general, because it only applies to
systems where energy and temperature are well defined. In contrast, the statistical notion
of entropy applies to all of thermodynamics, as well as to other fields such as
cryptography, data compression and pattern recognition, where energy and temperature
may be irrelevant or undefinable.
Entropy versus heat and temperature
Loosely speaking, when a system's energy is divided into its "useful" energy (energy that
can be used, for example, to push a piston), and its "useless energy" (that energy which
cannot be used to do external work), then entropy can be used to estimate the "useless",
"stray", or "lost" energy, which depends on the entropy of the system and the absolute
temperature of the surroundings. As the "useful" and "useless" energy both depend on the
surroundings, neither one is a function of the state of the system, and both can be quite
tricky to quantify. This stands in contrast to the system's Gibbs free energy, Helmholtz
free energy, entropy, and temperature, all of which are well-defined functions of state.
The Gibbs and Helmholtz free energies depend on the temperature of the system (not the
surroundings), and do not purport to measure the "useful" energy.
When heat is added to a system at high temperature, the increase in entropy is small.
When heat is added to a system at low temperature, the increase in entropy is great. This
can be quantified as follows: in thermal systems, changes in entropy can be
determined by observing the temperature while tracking the changes in energy. This is
restricted to situations where thermal conduction is the only form of energy transfer (in
contrast to frictional heating and other dissipative processes). It is further restricted to
systems at or near thermal equilibrium. In systems held at constant temperature, the
change in entropy, ΔS, is given by the equation

$$ \Delta S = \frac{Q}{T} $$

where Q is the amount of heat absorbed by the system in an isothermal and
reversible process in which the system goes from one state to another, and T is the
absolute temperature at which the process is occurring.
If the temperature of the system is not constant, then the relationship becomes a
differential equation:

$$ dS = \frac{\delta q_{\mathrm{rev}}}{T} $$

Then the total change in entropy for a transformation is:

$$ \Delta S = \int \frac{\delta q_{\mathrm{rev}}}{T} $$
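If the heat capacity C is roughly constant over the temperature range, the integral evaluates to ΔS = C ln(T₂/T₁). A minimal Python sketch under that constant-C assumption (the substance and numbers are illustrative):

```python
import math

def entropy_change(C, T1, T2):
    """dS = integral of C dT / T = C * ln(T2/T1) for constant heat capacity C,
    for reversible heating between absolute temperatures T1 and T2."""
    return C * math.log(T2 / T1)

# Illustrative: 1 kg of water (C ~ 4184 J/K) warmed from 280 K to 300 K
print(entropy_change(4184.0, 280.0, 300.0))  # ~ 289 J/K
```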
This thermodynamic approach to calculating the entropy is subject to several narrow
restrictions which must be respected. In contrast, the fundamental statistical definition of
entropy applies to any system, including systems far from equilibrium, and including
experiments where "heat" and "temperature" are undefinable. In situations where the
thermodynamic approach is valid, it can be shown to be consistent with the fundamental
statistical definition.
In any case, the statistical definition of entropy remains the fundamental definition, from
which all other definitions and all properties of entropy can be derived.
Entropy in chemical thermodynamics
Main article: Chemical thermodynamics
Thermodynamic entropy is central in chemical thermodynamics, enabling changes to be
quantified and the outcome of reactions predicted. The second law of thermodynamics
states that entropy in the combination of a system and its surroundings (or in an isolated
system by itself) increases during all spontaneous chemical and physical processes.
Spontaneity in chemistry means “by itself, or without any outside influence”, and has
nothing to do with speed. The Clausius relation, ΔS = δq_rev/T, introduces the
measurement of entropy change, ΔS. Entropy change describes the direction and
quantifies the magnitude of simple changes such as heat transfer between systems –
always from hotter to cooler spontaneously. Thus, when a mole of substance at 0 K is
warmed by its surroundings to 298 K, the sum of the incremental values of q_rev/T
constitutes each element's or compound's standard molar entropy, a fundamental physical
property and an indicator of the amount of energy stored by a substance at 298 K.
Entropy change also measures the mixing of substances as a summation of their relative
quantities in the final mixture.
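One standard way to quantify this, assuming ideal mixing (an assumption not stated in the text above), is ΔS_mix = −nR Σ x_i ln x_i over the mole fractions x_i. A short sketch:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_mixing_entropy(n_total, mole_fractions):
    """dS_mix = -n * R * sum(x_i * ln x_i) for ideal mixing of n_total moles."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Illustrative: equimolar binary mixture, 1 mol total
print(ideal_mixing_entropy(1.0, [0.5, 0.5]))  # ~ 5.76 J/K
```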
Entropy is equally essential in predicting the extent of complex chemical reactions, i.e.
whether a process will go as written or proceed in the opposite direction. For such
applications, ΔS must be incorporated in an expression that includes both the system and
its surroundings: ΔS_universe = ΔS_surroundings + ΔS_system. This expression becomes, via some
steps, the Gibbs free energy equation for reactants and products in the system: ΔG [the
Gibbs free energy change of the system] = ΔH [the enthalpy change] − TΔS [the entropy
change].
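As an illustration of how this expression predicts direction, a short Python sketch (the ΔH and ΔS values below are hypothetical, not taken from any data table):

```python
def gibbs_change(dH, T, dS):
    """dG = dH - T*dS; a negative dG means the process is spontaneous as written."""
    return dH - T * dS

# Hypothetical reaction at 298 K: dH = -92.0 kJ/mol, dS = -0.199 kJ/(mol*K)
dG = gibbs_change(-92.0, 298.0, -0.199)
print(f"dG = {dG:.1f} kJ/mol ->", "spontaneous" if dG < 0 else "non-spontaneous")
```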
Entropy balance equation for open systems
In chemical engineering, the principles of thermodynamics are commonly applied to
"open systems", i.e. those in which heat, work, and mass flow across the system
boundary. In a system in which there are flows of both heat (Q̇) and work, i.e. Ẇ_S (shaft
work) and P(dV/dt) (pressure-volume work), across the system boundaries, the heat flow,
but not the work flow, causes a change in the entropy of the system. This rate of entropy
change is

$$ \frac{dS}{dt} = \frac{\dot{Q}}{T} $$

where T is the absolute thermodynamic temperature of the system at the
point of the heat flow. If, in addition, there are mass flows across the system boundaries,
the total entropy of the system will also change due to this convected flow.
During steady-state continuous operation, an entropy balance applied to an open system
accounts for system entropy changes related to heat flow and mass flow across the system
boundary.
To derive a generalized entropy balance equation, we start with the general balance
equation for the change in any extensive quantity Θ in a thermodynamic system, a
quantity that may be either conserved, such as energy, or non-conserved, such as entropy.
The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the
system, equals the rate at which Θ enters the system at the boundaries, minus the rate at
which Θ leaves the system across the system boundaries, plus the rate at which Θ is
generated within the system. Using this generic balance equation, with respect to the rate
of change with time of the extensive quantity entropy S, the entropy balance equation
for an open thermodynamic system is:

$$ \frac{dS}{dt} = \sum_{k=1}^{K} \dot{M}_k \hat{S}_k + \frac{\dot{Q}}{T} + \dot{S}_{\mathrm{gen}} $$

where
Σ_k Ṁ_k Ŝ_k = the net rate of entropy flow due to the flows of mass into and out of the
system (where Ŝ = entropy per unit mass),
Q̇/T = the rate of entropy flow due to the flow of heat across the system boundary,
Ṡ_gen = the rate of internal generation of entropy within the system.
Note, also, that if there are multiple heat flows, the term Q̇/T is to be replaced by
Σ_j Q̇_j/T_j, where Q̇_j is the heat flow and T_j is the temperature at the jth heat flow port into
the system.
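At steady state the accumulation term dS/dt vanishes, so the balance can be rearranged to solve for the internal generation term. A minimal Python sketch under that steady-state assumption (the stream and heat-flow values are hypothetical):

```python
def entropy_generation(mass_flows, heat_flows):
    """Steady-state entropy balance for an open system:
    0 = sum(M_k * S_k) + sum(Q_j / T_j) + S_gen, solved for S_gen.
    mass_flows: (mass flow in kg/s, entropy per unit mass in J/(kg*K));
                inflows positive, outflows negative.
    heat_flows: (heat flow in W, absolute temperature in K) at each port.
    Returns S_gen in W/K; the second law requires S_gen >= 0."""
    s_mass = sum(m * s for m, s in mass_flows)
    s_heat = sum(q / t for q, t in heat_flows)
    return -(s_mass + s_heat)

# Hypothetical: a 2 kg/s stream enters at s = 1000 J/(kg*K) and leaves at
# 1100 J/(kg*K), while 50 kW of heat enters at 400 K.
print(entropy_generation([(2.0, 1000.0), (-2.0, 1100.0)],
                         [(50e3, 400.0)]))  # 75.0 W/K
```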
Entropy and energy dispersal
The concept of entropy can be described qualitatively as a measure of energy dispersal at
a specific temperature. Similar terms have been in use from early in the history of
classical thermodynamics, and with the development of statistical thermodynamics and
quantum theory, entropy changes have been described in terms of the mixing or
"spreading" of the total energy of each constituent of a system over its particular
quantized energy levels.
Ambiguities in the terms disorder and chaos, which usually have meanings directly
opposed to equilibrium, contribute to widespread confusion and hamper comprehension
of entropy for most students. As the second law of thermodynamics shows, in an isolated
system internal portions at different temperatures will tend to adjust to a single uniform
temperature and thus produce equilibrium. A recently developed educational approach
avoids ambiguous terms and describes such spreading out of energy as dispersal, which
leads to loss of the differentials required for work even though the total energy remains
constant in accordance with the first law of thermodynamics. Physical chemist Peter
Atkins, for example, who previously wrote of dispersal leading to a disordered state, now
writes that "spontaneous changes are always accompanied by a dispersal of energy", and
has discarded 'disorder' as a description.