Re: Notions of entropy

Posted by Steve Smith on
URL: http://friam.383.s1.nabble.com/Notions-of-entropy-tp7584006p7584016.html

On 10/11/13 2:00 PM, Nick Thompson wrote:

At Friam today, we had our first discussion of entropy in a while.   It was like old times.  I really enjoyed it. 

 

But the following disagreement came up. I am, I think, a bit of what philosophers call an essentialist. In other words, I assume that when people use the same words for two things, it ain't for nothing: there is something underlying the surface that makes those two things the same. So, underlying all the uses of the word “entropy” is a common core, and … here's the tricky bit … that common core could be expressed mathematically. However, I thought my fellow discussants disagreed with this naïve intuition and agreed that the physical and the information-theoretical uses of the word “entropy” were “not mathematically equivalent”, which I take to mean that no mathematical operation could be devised that would turn one into the other. That the uses of the word entropy were more like members of a family than they were like expressions of some essence.

 

I wonder what you-all think about that.

He's Baaack! 

I think your question, whether naive (as you seem to suggest) or deeply astute, is good for us to consider. To the extent that we share a common interest in "complex systems", I believe that the regimes of Shannon/information entropy, Gibbs/thermodynamic entropy, and von Neumann/QM entropy overlap in non-ergodic systems... I think that the biggest distinction between these measures of entropy occurs in near-equilibrium systems, where the ergodic hypothesis is (most) relevant.
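On Nick's "common core that could be expressed mathematically": at least formally, there is one. For a discrete distribution over microstates, the Shannon and Gibbs formulas are the same sum, differing only in the log base and Boltzmann's constant. A minimal sketch (the distribution here is arbitrary, chosen just for illustration):

```python
import math

# Boltzmann's constant in J/K (CODATA value).
K_B = 1.380649e-23

def shannon_entropy_bits(p):
    """Shannon entropy H = -sum p_i log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    """Gibbs entropy S = -k_B sum p_i ln(p_i), in J/K."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

# Any probability distribution over microstates works here.
p = [0.5, 0.25, 0.25]
H = shannon_entropy_bits(p)
S = gibbs_entropy(p)

# Term by term the two sums are identical up to the constant k_B * ln(2),
# so a "mathematical operation turning one into the other" is just a
# change of units.
assert abs(S - K_B * math.log(2) * H) < 1e-30
```

Of course, the formal identity doesn't settle the deeper question in the thread: whether the p_i in the two settings refer to the same kind of thing, which is exactly where the ergodic hypothesis comes in.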

Our patron Saint Guerin likes to invoke Spontaneous Symmetry Breaking, which I believe is a consequence or indication of the ergodic hypothesis being broken in macroscopic systems.

As our technologists move further into the nanoscale realm, designing and building systems at the atomic or molecular level, we will see a stronger practical overlap of Shannon and Gibbs (and von Neumann?) entropy.
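The von Neumann entropy fits the same pattern: S(ρ) = −Tr(ρ ln ρ) is just the Shannon entropy (in nats) of the density matrix's eigenvalues. A small self-contained sketch for a 2×2 density matrix, with the eigenvalues computed by hand (the example matrices are arbitrary, picked only to show the pure vs. mixed contrast):

```python
import math

def eigvals_2x2_sym(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]]."""
    mean = (a + c) / 2
    d = math.sqrt(((a - c) / 2) ** 2 + b * b)
    return mean - d, mean + d

def von_neumann_entropy(a, b, c):
    """S(rho) = -Tr(rho ln rho), evaluated on the eigenvalues of rho."""
    return -sum(l * math.log(l)
                for l in eigvals_2x2_sym(a, b, c) if l > 1e-12)

# A pure state (a rank-1 projector) carries zero entropy...
S_pure = von_neumann_entropy(1.0, 0.0, 0.0)

# ...while a mixed state carries the Shannon entropy (in nats)
# of its spectrum, strictly between 0 and ln(2) for a qubit.
S_mixed = von_neumann_entropy(0.75, 0.25, 0.25)
```

So in the classical (diagonal) limit the von Neumann and Shannon formulas coincide outright, which is one concrete sense in which the nanoscale overlap could become practical rather than merely formal.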

I *do* look forward to a rousing round of discussion on the topic here.

Welcome back,
 - Steve




============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com