Entropy


Entropy

Robert Holmes
From the Journal of Chemical Education, a forthright article on the misuse of
entropy:

http://jchemed.chem.wisc.edu/Journal/Issues/1999/Oct/abs1385.html

I came across the article because I was trying to recall the advice that von
Neumann gave Shannon about naming the property -∑p.log(p). Von Neumann told
him "Call it entropy. No one knows what entropy really is, so in a debate
you will always have the advantage."
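
For concreteness, the quantity being named is H(p) = -∑ p_i log p_i over a
discrete distribution p. A minimal sketch in Python (the function name and
the base-2 default are illustrative choices of mine, not anything from the
article):

    import math

    def shannon_entropy(p, base=2.0):
        """Shannon entropy H(p) = -sum_i p_i * log(p_i).

        `base` sets the units: base 2 gives bits, base e gives nats.
        Terms with p_i == 0 are skipped, per the 0*log(0) -> 0 convention.
        """
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0.0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(shannon_entropy([1.0]))        # certain outcome: 0.0 bits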




Entropy

David Eric Smith
Guys, please don't just accept this:

There are those -- among whom I count most people who have thought
well about the subject -- who would say that the information-theoretic
definition of entropy is _the_ definition.  The interesting question
is why a large collection of interesting phenomena in physics and
chemistry exist in circumstances where information-theoretic
descriptions of measurement are appropriate.  
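
To make that claim concrete (a standard identity, not a quote from anyone
in this thread): the Gibbs entropy of statistical mechanics,

    S = -k_B * sum_i p_i ln p_i,

is Shannon's H(p) in nats, rescaled by Boltzmann's constant.  The k_B --
and with it the connection to temperature -- enters only as a conversion
of units to joules per kelvin.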

Granted, most people do talk sloppily about entropy most of the time,
and deserve to be smacked for it, and many of his examples may be
cases of this, and valid as such.  On the other hand, his own
statements that physical temperature is essential to the definition of
entropy are just silly.  They remind me of an ex-boss, a civil
engineer, who criticized me for writing down solutions to differential
equations in terms of natural logs and explained that ``natural
logs are what you use to characterize natural phenomena; engineered
phenomena are described by log10'' (which of course he wouldn't know
to call log10).  Also, his statement that thermal motion and
rearrangement are intrinsic to creating ``real'' entropy is wrong.
There he is confusing mechanisms that can sometimes create ergodic
sampling of distributions with properties of those distributions,
whether created by ergodic evolution or otherwise.
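
On the log-base point, a toy check (the names and numbers are mine; a
sketch, not anything from the article):

    import math

    x = 42.0
    # Change of base: ln(x) = ln(10) * log10(x).  Any two bases differ by
    # a constant factor -- a choice of units, not of phenomena.
    assert abs(math.log(x) - math.log(10.0) * math.log10(x)) < 1e-12

    # The same rescaling relates entropy units: one fair-coin bit is
    # ln(2) nats, whatever mechanism produced the distribution.
    print(math.log(2.0))   # ~0.6931 nats per bit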

Sorry I can't take time to treat this long, long topic correctly, and
I don't even understand it as well as the better people do.  But one
can get trustworthy statements from Gell-Mann and, though his
polemic is a bit of a nuisance, from E.T. Jaynes.

In some sense, the field has gone way beyond the discussions in this
article, and it has some of the anachronistic feel of an old
guild-member convinced that the peculiar set of circumstances with
which he has concerned himself for a lifetime are the only _real_
circumstances in the world.  There really is an interesting question of
why the interactions of atoms and molecules so often produce samplings
of state distributions that are reasonably unambiguous, and with them
``natural'' definitions of entropies.  This is related, though perhaps
peripherally in atomic/molecular physics, to the work on decoherence.
But the whole nonsense mess in discussions of entropy change in living
systems arises from the fact that those systems do not produce the
same sampling strategies, not that they change the laws of
thermodynamics.  Confusing the two, as this guy does, is the source of
that whole unnecessary wrong turn.  It does seem a shame if a
generation of chemists is raised to miss the point unnecessarily,
when their grandfathers may not really have had an alternative.
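
To illustrate the sampling distinction with a toy sketch (all names and
numbers are mine, assuming a simple three-state system): the entropy is a
functional of a distribution; an ergodic process is merely one mechanism
for realizing that distribution.

    import math
    import random
    from collections import Counter

    def entropy_nats(p):
        """H(p) = -sum_i p_i * ln(p_i) for a discrete distribution."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

    # A fixed three-state distribution: its entropy is defined with no
    # reference to temperature, thermal motion, or any sampling process.
    p = [0.5, 0.3, 0.2]
    print(entropy_nats(p))            # ~1.0297 nats

    # An "ergodic sampler" that visits states with those frequencies.
    random.seed(0)
    draws = random.choices(range(3), weights=p, k=100_000)
    counts = Counter(draws)
    p_hat = [counts[i] / len(draws) for i in range(3)]
    print(entropy_nats(p_hat))        # converges to the same value: the
                                      # sampler realizes p, it does not
                                      # define p's entropy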

Sorry for the rant,

Eric

