
Re: entropy and uncertainty, REDUX

Posted by Grant Holland on Aug 05, 2010; 6:09pm
URL: http://friam.383.s1.nabble.com/entropy-and-uncertainty-REDUX-tp5375070p5377670.html

Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree of unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree of uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematical theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization. Thus, in the world of living systems, unpredictability and disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems example" we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. If we ask this question, then we are asking about a probability distribution over states that have varying degrees of organization associated with them. And since we now have a probability distribution, we can ask "what is its Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").
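Just to make that concrete, here is a toy sketch - not my actual formalism; the state labels, organization scores, and probabilities are invented purely for the example:

import math

# Toy next-state distribution from one "disorganized" state at time t.
# Each entry: (label, illustrative organization score in [0,1], probability).
next_states = [
    ("highly organized",   0.9, 0.25),
    ("somewhat organized", 0.6, 0.25),
    ("still disorganized", 0.2, 0.50),
]

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [p for (_, _, p) in next_states]
print("unpredictability (Shannon entropy):", shannon_entropy(probs), "bits")  # 1.5 bits
print("expected organization of next state:",
      sum(score * p for (_, score, p) in next_states))                        # 0.475

The entropy measures how unpredictable the transition is; the organization scores sit on a separate axis entirely - which is exactly the independence I am after.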

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Gibbs especially talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, or states, are defined by position and momentum). So he is interested in the probabilities of various "organizations" of molecules. And the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability"?) In fact, I believe it is reasonable to say that the Gibbs formula measures the unpredictability of which "arrangements" will obtain.

In fact, the Gibbs formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in the Gibbs formula. They are isomorphic! However, they speak to different domains. Gibbs is modeling a physics phenomenon, and Shannon is modeling a mathematical-statistics phenomenon. The second law applies to Gibbs' conversation - but not to Shannon's.
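For the record, here are the two formulas side by side in their standard textbook forms (nothing specific to my theory):

  Shannon:  H(p) = -\sum_i p_i \log_2 p_i        (bits)
  Gibbs:    S(p) = -k_B \sum_i p_i \ln p_i       (joules per kelvin)

Same functional form over a probability distribution; the Boltzmann constant k_B (together with the choice of logarithm base) is the only difference.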

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant

glen e. p. ropella wrote:
Nicholas Thompson wrote circa 08/05/2010 08:30 AM:
  
All of this, it seems to me, can be accommodated by – indeed requires –
a common language between information entropy and physics entropy, the
very language which GRANT seems to argue is impossible.
    

OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that because we use a common language to talk about the two concepts,
the concepts are erroneously conflated.  I.e. Grant not only admits the
possibility of a common language, he _laments_ the common language
because it facilitates the conflation of the two different concepts ...
unless I've misinterpreted what he's said, of course.

  
I would like to apologize to everybody for these errors.  I am beginning
to think I am too old to be trusted with a distribution list.  It’s not
that I don’t go over the posts before I send them … and in fact, what I
sent represented weeks of thinking and a couple of evenings of drafting
… believe it or not!  It seems that there are SOME sorts of errors I
cannot see until they are pointed out to me, and these seem to be, of
late, the fatal ones.
    

We're all guilty of this.  It's why things like peer review and
criticism are benevolent gifts from those who donate their time and
effort to criticize others.  It's also why e-mail and forums are more
powerful and useful than the discredit they usually receive would
suggest.  While it's
true that face-to-face conversation has higher bandwidth, e-mail,
forums, and papers force us to think deeply and seriously about what we
say ... and, therefore think.  So, as embarrassing as "errors" like this
feel, they provide the fulcrum for clear and critical thinking.  I say
let's keep making them!

Err with Gusto! ;-)

  

-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org