Posted by Tom Carter
URL: http://friam.383.s1.nabble.com/Notions-of-entropy-tp7584006p7584011.html
All --
Ah, "entropy" . . .
A couple of references:
http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf
http://charlottewerndl.net/Entropy_Guide.pdf

A couple of quotes from the E. T. Jaynes article (the gibbs.vs.boltzmann reference above):
"It is interesting that although this field has long been regarded as one of the most puzzling and controversial parts of physics, the difficulties have not been mathematical."
and
"From this we see that entropy is an anthropomorphic concept, not only in the well-known sense that it measures the extent of human ignorance as to the microstate. (\em Even at the purely phenomenological level, entropy is an anthropomorphic concept.} For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it."
With respect to Gibbs ("physics") entropy and Shannon ("information theory") entropy, in the discrete case they have the same expression, up to a constant: sum_i p_i log(1/p_i). I'll observe that Boltzmann's constant (k), which is the constant in the Gibbs formulation, is a "bridge" between micro-states and macro-states -- which, if you think about it, reinforces the notion that "entropy" is an "anthropomorphic concept," because, for us, a "macro-state" is a "human-sized state" (see the pedagogical question below . . . :-)

Note, though, that changing the base of the logarithm also introduces a constant in the "dimensionless" Shannon formulation . . .
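To make the "same expression, up to a constant" point concrete, here is a minimal Python sketch (mine, not something from the references; the distribution and function names are just illustrative). It computes the Shannon entropy in bits and the Gibbs entropy in J/K for the same discrete distribution, and shows that their ratio is the fixed constant k * ln 2, regardless of the distribution:

import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann's constant, in J/K

def shannon_entropy(probs, base=2):
    """Shannon entropy H = sum_i p_i * log_base(1/p_i); in bits for base 2."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = k * sum_i p_i * ln(1/p_i), in J/K."""
    return BOLTZMANN_K * sum(p * math.log(1.0 / p) for p in probs if p > 0)

# An arbitrary example distribution over four micro-states:
probs = [0.5, 0.25, 0.125, 0.125]

H_bits = shannon_entropy(probs, base=2)   # 1.75 bits
S = gibbs_entropy(probs)                  # k * ln(2) * 1.75 J/K

print(H_bits)                             # 1.75
print(S / H_bits)                         # ~9.57e-24 J/K per bit
print(BOLTZMANN_K * math.log(2))          # same value: k * ln 2

So the two notions differ only in the choice of logarithm base and the multiplicative constant k, which is exactly the "up to a constant" claim above.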
Thanks . . .
Tom Carter
p.s. Pedagogical question: An exercise I do in class from time to time is to ask this question: "What would Avogadro's Number have been if the French Revolution had failed? (Justify your answer . . .)" (Hint: step 1: what possible relation might those have to each other? :-)
On Oct 11, 2013, at 1:00 PM, Nick Thompson <[hidden email]> wrote:
> At Friam today, we had our first discussion of entropy in a while. It was like old times. I really enjoyed it.
>
> But the following disagreement came up. I am, I think, a bit of what philosophers call an essentialist. In other words, I assume that when people use the same words for two things, it ain't for nothing, that there is something underlying the surface that makes those two things the same. So, underlying all the uses of the word “entropy” is a common core, and …. Here’s the tricky bit … that that common core could be expressed mathematically. However, I thought my fellow discussants disagreed with this naïve intuition and agreed that the physical and the information-theoretic uses of the word “entropy” were “not mathematically equivalent”, which I take to mean that no mathematical operation could be devised that would turn one into the other. That the uses of the word entropy were more like members of a family than they were like expressions of some essence.
>
> I wonder what you-all think about that.
>
> Nick
>
> Nicholas S. Thompson
> Emeritus Professor of Psychology and Biology
> Clark University
>
> http://home.earthlink.net/~nickthompson/naturaldesigns/
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com