Entropy
Depending on the topic and the context in which it is used, the term entropy describes a
number of related phenomena. The word entropy was introduced in 1865 by the German
physicist Rudolf Clausius. Two main areas, thermodynamic entropy (including statistical
mechanics) and information entropy, are discussed here.
The concept of thermodynamic entropy is central to the second law of thermodynamics,
which deals with physical processes and whether they occur spontaneously. In a general
sense, the second law says that temperature differences between systems in contact with
each other tend to even out, that work can be obtained from these non-equilibrium
differences, and that some energy is inevitably lost as heat, with a corresponding increase in entropy, when that work is done (1).
Thermodynamic entropy provides a comparative measure of the amount of this decrease
in internal energy of the system and the corresponding increase in internal energy of the
surroundings at a given temperature (1). Spontaneous changes tend to smooth out
differences in temperature, pressure, density, and chemical potential that may exist in a
system, and entropy is thus a measure of how far this smoothing-out process has
progressed (1). This contrasts with the first law of thermodynamics, which states that
energy is conserved. The thermodynamic definition of entropy is only valid for a system
in equilibrium (1). The statistical definition of entropy applies to any system, and as such
is commonly referred to as the fundamental definition of entropy (1).
Statistical mechanics is the application of probability theory, which includes
mathematical tools for dealing with large populations, to the field of mechanics (4). It
provides a framework for relating the microscopic properties of individual atoms and
molecules to the macroscopic or bulk properties of materials that can be observed in
everyday life, thereby explaining thermodynamics as a natural result of statistics and
mechanics (classical and quantum) at the microscopic level (4). Entropy in statistical
mechanics can be described as the measure of uncertainty remaining about a system, after
the observable macroscopic properties have been taken into account (1). The equilibrium
state of a system maximizes the entropy because information about the initial conditions
has been lost.
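To make the statistical picture concrete, here is a small illustrative sketch (not taken from the original text) that evaluates the Gibbs form of the statistical entropy, S = -k Σ p_i ln p_i, for two made-up sets of microstate probabilities; the uniform distribution, which corresponds to knowing the least about the microstate, yields the larger entropy.

```python
import math

# Boltzmann's constant in joules per kelvin (exact SI value).
K_B = 1.380649e-23

def gibbs_entropy(probabilities):
    """Statistical (Gibbs) entropy S = -k * sum(p_i * ln p_i), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Hypothetical 4-microstate system: a peaked vs. a uniform distribution.
peaked = [0.7, 0.2, 0.05, 0.05]
uniform = [0.25, 0.25, 0.25, 0.25]

print(gibbs_entropy(peaked))   # smaller: less uncertainty about the microstate
print(gibbs_entropy(uniform))  # larger: uncertainty is maximal at equilibrium
```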
In information theory, information entropy or Shannon entropy is a measure of the
uncertainty associated with a random variable. It can be interpreted as the average shortest message
length, in bits, that can be sent to communicate the true value of the random variable to a
recipient. This represents a fundamental mathematical limit on the best possible lossless
data compression of any communication: the shortest average number of bits that can be
sent to communicate one message out of all the possibilities is the Shannon entropy (5).
Equivalently, the Shannon entropy is a measure of the average information content the
recipient is missing when they do not know the value of the random variable (5).
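As a minimal sketch of this definition (the four-symbol source below is invented purely for illustration), the Shannon entropy H = -Σ p_i log2 p_i of a discrete distribution gives the smallest average number of bits per symbol achievable by any lossless code:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A hypothetical four-symbol source: 'a' is far more likely than the rest.
dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

h = shannon_entropy(dist.values())
print(f"H = {h} bits per symbol")  # 1.75: no lossless code can average fewer bits
```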
In calculations, entropy is symbolized by S; it is a measure of the state of a system at a
particular instant, that is, a state function. For heat Q transferred reversibly at a constant
absolute temperature T, the entropy supplied is S = Q/T. More often, a change in entropy,
symbolized by ΔS, is given in relation to a small transfer of heat, δQ (1). Entropy is also a
factor in determining the free energy of a system. In statistical mechanics, the entropy S of
a macroscopic state is defined as S = k log W, where k is Boltzmann's constant and W is
the number of microstates consistent with that macroscopic state (2). S is extensive,
increases through collisions between the molecules, and in the special case of local
equilibrium coincides with the previously defined entropy (2).
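Both formulas can be illustrated with a short numerical sketch; the melting example uses a standard textbook value for the latent heat of fusion of ice and is not drawn from the original text:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

# Clausius form: entropy gained when Q joules of heat are absorbed
# reversibly at a constant absolute temperature T (isothermal process).
def delta_S_thermo(Q, T):
    return Q / T

# Boltzmann form: S = k * log W for W equally probable microstates.
def S_boltzmann(W):
    return K_B * math.log(W)

# Example: melting 1 kg of ice at 273.15 K (latent heat ~3.34e5 J/kg,
# a stock textbook value used here only for illustration).
print(delta_S_thermo(3.34e5, 273.15))  # ~1.22e3 J/K

# Example: a toy system with 10^23 accessible microstates.
print(S_boltzmann(1e23))               # ~7.3e-22 J/K
```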
Over the years, the meanings and uses of entropy have changed. When first used, entropy
referred to the energy losses from mechanical machines that cannot operate at 100%
efficiency when converting energy into work. In the late 19th century the word "disorder"
was used by Ludwig Boltzmann in developing statistical views of entropy using
probability theory to describe the increased molecular movement on the microscopic
level (3). Through most of the 20th century, textbooks tended to describe entropy as
"disorder", following Boltzmann's early conceptualization of the motional energy of
molecules (3). Describing entropy as the "dispersal of energy" is now often preferred. Since
particles carry energy, entropy can also be used to describe the dispersal of the particles
themselves; in a mixture, energy and particles may disperse at different rates.
Entropy has a relatively short history that begins with Lazare Carnot and his
Fundamental Principles of Equilibrium and Movement, published in 1803. He proposed
that the motion of the parts in a machine entails losses of moment of activity (1).
Lazare's son Sadi Carnot published his own work, Reflections on the Motive Power of
Fire, in 1824; it was an early glimpse of the second law of thermodynamics. Rudolf
Clausius questioned some of the work by the younger Carnot, and this led to the creation
of the term entropy. Since then, Boltzmann, Gibbs and Maxwell have given entropy a
statistical basis (1).
(1) Entropy, http://en.wikipedia.org/wiki/Entropy.
(2) Paul H. E. Meijer, ed., Views of a Physicist: Selected Papers of N.G. van Kampen,
World Scientific, 2000.
(3) Introduction to Entropy, http://en.wikipedia.org/wiki/Introduction_to_entropy.
(4) Statistical Mechanics, http://en.wikipedia.org/wiki/Statistical_mechanics.
(5) Information Entropy, http://en.wikipedia.org/wiki/Information_entropy.