
entropy and uncertainty, REDUX

Posted by Nick Thompson on Aug 05, 2010; 2:04am
URL: http://friam.383.s1.nabble.com/entropy-and-uncertainty-REDUX-tp5375070.html

A month ago, an interesting correspondence occurred between Grant, Glen, Roger, and some others concerning entropy.  In particular, Grant wrote:

There are two distinct meanings of the "entropy" floating around; and they are generally conflated and confused to the detriment of us all.  Until these two meanings are identified and separated, any conversations we have about "entropy" will be confusing.

Before I came to Santa Fe, my knowledge of the notion of entropy came mostly from the study of animal communication and social organization. The idea of signal entropy is introduced in E. O. Wilson's Sociobiology, where he proposes to measure the influence of one animal's behavior upon another's in bits of information … reduction of uncertainty. In this framework, the animals are cast as "sender" and "receiver". One uses the Shannon-Wiener formula to compute the uncertainty in the distribution of the receiver's behavior before and after the sender has made his own behavioral choice. The difference is the amount of information transmitted by the sender's behavior.
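
For concreteness, here is a minimal sketch of that computation (Python; the before-and-after distributions are invented for illustration, not Wilson's numbers):

import math

def shannon_entropy(probs):
    """Shannon uncertainty H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical receiver: four possible behaviors, equiprobable before
# the sender acts, sharply skewed after.
before = [0.25, 0.25, 0.25, 0.25]   # H = 2 bits
after = [0.70, 0.10, 0.10, 0.10]    # the signal narrows the odds

# Information transmitted = reduction of uncertainty, in bits.
print(f"H(before) = {shannon_entropy(before):.3f} bits")
print(f"H(after)  = {shannon_entropy(after):.3f} bits")
print(f"transmitted = {shannon_entropy(before) - shannon_entropy(after):.3f} bits")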

Social organization can be measured in a similar way. A group might be said to be organized just to the extent that different elements of the species' behavioral repertoire are allocated to different types of individuals. In that case, knowing something about a member's identity decreases our uncertainty concerning its likely behavior. So if, for instance, monkey groups are organized and we know that a given monkey is a female, we know that the range of its activities is narrower than if we did not know its sex. This change in our uncertainty about its behavior could be measured in bits. Similarly, if we know that a honey bee is a worker, rather than a queen or a drone, or, even more specifically, a young worker, as opposed to an older one, this information greatly reduces our uncertainty concerning what that bee is doing. (Young workers may usually be found helping maintain brood cells.)
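
In the same toy spirit (all the numbers below are invented; real caste and behavior frequencies would have to come from field data), the organization of a colony can be read off as the gap between the unconditional uncertainty about behavior and the uncertainty that remains once caste is known:

import math

def H(probs):
    """Shannon uncertainty in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical caste frequencies, and for each caste a distribution
# over three behaviors: brood care, foraging, other.
castes = {"worker": 0.90, "drone": 0.09, "queen": 0.01}
behavior_given_caste = {
    "worker": [0.5, 0.5, 0.0],
    "drone":  [0.0, 0.0, 1.0],
    "queen":  [0.0, 0.0, 1.0],
}

# Unconditional behavior distribution: the mixture over castes.
p_behavior = [sum(castes[c] * behavior_given_caste[c][i] for c in castes)
              for i in range(3)]

h_uncond = H(p_behavior)
h_cond = sum(castes[c] * H(behavior_given_caste[c]) for c in castes)
print(f"H(behavior)       = {h_uncond:.3f} bits")
print(f"H(behavior|caste) = {h_cond:.3f} bits")
print(f"knowing caste buys us {h_uncond - h_cond:.3f} bits")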

An analogous trick can be used to measure the groupiness of a group. If we understand a group of animals as a set of animals that maintains proximity, then our uncertainty concerning the direction of movement of any group member is greatly reduced by knowing the movements of the others. Something similar can be attempted with dispersion: how much is our uncertainty about the location of an animal decreased by knowing the location of another member of the same group?
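
Groupiness, so construed, is just the mutual information between members' movements. A sketch (again with an invented joint distribution, here over four compass headings for two animals):

import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution over the headings (N, E, S, W) of two
# animals in the same group; in a cohesive group the headings correlate.
joint = [
    [0.20, 0.02, 0.01, 0.02],  # animal A heads N
    [0.02, 0.20, 0.02, 0.01],  # A heads E
    [0.01, 0.02, 0.20, 0.02],  # A heads S
    [0.02, 0.01, 0.02, 0.20],  # A heads W
]

p_a = [sum(row) for row in joint]                        # marginal for A
p_b = [sum(row[j] for row in joint) for j in range(4)]   # marginal for B

# Mutual information: how many bits B's heading tells us about A's.
mi = H(p_a) + H(p_b) - H([p for row in joint for p in row])
print(f"H(A) = {H(p_a):.3f} bits; knowing B removes {mi:.3f} of them")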

However, Grant goes on to write:

Meaning #1 of "entropy": This is the "static" meaning. It represents the degree to which a system is "structured" or "organized". If a system is "highly disorganized" or "unstructured", then it has a lot of this kind of entropy. Or, if a system is highly "dispersed" or "dissipated", then it has a high degree of this kind of entropy.  …

And continues,
Meaning #2 of "entropy": This meaning is most clearly seen in Claude Shannon's book on the Mathematical Theory of Communications. Here "entropy" is independent of the notion of "organization/disorganization".


Thus, it should be clear that, from my point of view, these two notions of entropy, dispersion and organization, should be entirely compatible. In fact, the former is just a special case of the latter. And my problem is how to square my view of the situation, which is admittedly pretty primitive, with Grant's.
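
The textbook identification I am leaning on here (standard fare, not anything Grant or Glen wrote) is just that the Gibbs entropy of thermodynamics and the Shannon uncertainty have the same form:

  S = -k_B \sum_i p_i \ln p_i        (Gibbs, thermodynamic)
  H = -\sum_i p_i \log_2 p_i         (Shannon, in bits)

so that for one and the same distribution p_i, S = (k_B \ln 2) H. The two differ only by a constant and by what the p_i range over: microstates of a physical system in the one case, messages or behaviors in the other.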

One of the reasons this discussion interests me is that it relates to conversations I have had over the years with some folks in FRIAM about the relation between information and organization. At the risk of overstating their perspective, they seem to be saying that Nature is trying to increase entropy and seeks the best means for doing so. When entropy is low … i.e., when there is a particularly steep gradient in some part of the universe … Nature contrives a means for degrading that gradient, and that means, paradoxically, is to DECREASE entropy in some localized portion of the universe in order to more efficiently increase it over all.

Consider, for a moment, that much overworked example, the Bénard cell: hot plate at the bottom, cold plate at the top, fluid in the middle, gradually increasing supply of energy to the hot plate. In the beginning, the molecules of the fluid near the heat source knock against each other and the molecules above them, and the molecular motion diffuses upward through the fluid in a disorganized sort of way. When I say "disorganized" I mean that although molecular agitation is spreading upward, the molecules themselves are not behaving in a patterned way.

However, as the gradient is increased, a tipping point is reached at which the molecules themselves begin to march up and down in ordered columns. Seen from the top of the liquid, the contents of the vessel have organized themselves into hexagonal cells of molecules rising and descending through the height of the liquid. This organization of the fluid consists in a tremendous increase in what can be inferred from the motion of one molecule about the motion of other molecules in its vicinity, and it is accompanied by a much more rapid dissipation of the gradient.
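
To see that increase in miniature, here is a toy computation (Python with numpy; a cartoon of correlated motion, not fluid dynamics): in the disordered regime two neighboring "molecules" move independently, while in the ordered regime both ride a shared convective roll, and the mutual information between their motions jumps accordingly.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def mutual_info(x, y, bins=16):
    """Histogram estimate of the mutual information of x and y, in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Disordered regime: two neighboring "molecules" moving independently.
x_dis = rng.normal(size=n)
y_dis = rng.normal(size=n)

# Ordered regime: both ride the same convective roll (a shared slow
# oscillation) plus a little independent jitter.
roll = np.sin(np.linspace(0, 200 * np.pi, n))
x_ord = roll + 0.3 * rng.normal(size=n)
y_ord = roll + 0.3 * rng.normal(size=n)

print(f"MI, disordered: {mutual_info(x_dis, y_dis):.3f} bits")
print(f"MI, ordered:    {mutual_info(x_ord, y_ord):.3f} bits")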

All of this, it seems to me, can be accommodated by – indeed requires – a common language between information entropy and physics entropy, the very language which Glen seems to argue is impossible. 

Explanations?

The attachment is a Word copy of this comment, along with the correspondence that provoked it. I am working with a new version of the Great Satan's Word, but this file should be backward compatible. Get on to me if it isn't.

Nick

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/nthompson

http://www.cusf.org


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Attachment: entropy and uncertainty.docx (41K)