entropy and uncertainty, REDUX

entropy and uncertainty, REDUX

Nick Thompson

A month ago, an interesting correspondence occurred between Grant, Glen, Roger, and some others concerning entropy.  In particular, Grant wrote:

There are two distinct meanings of the "entropy" floating around; and they are generally conflated and confused to the detriment of us all.  Until these two meanings are identified and separated, any conversations we have about "entropy" will be confusing.

Before I came to Santa Fe, my knowledge of the notion of entropy came mostly from the study of animal communication and social organization.  The idea of signal entropy is introduced in E.O. Wilson’s Sociobiology, where he proposes to measure the influence of one animal’s behavior upon another’s in bits of information … reduction of uncertainty.  In this conceptualization, the animals are cast as “sender” and “receiver”.  One uses the Shannon-Wiener formula to compute the uncertainty in the distribution of the receiver’s behavior before and after the sender has made his own behavioral choice.  The difference is the amount of information transmitted by the sender’s behavior. 
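(To make that calculation concrete, here is a minimal sketch in Python.  The behavior categories and probabilities are invented purely for illustration; the point is only the arithmetic of “bits of uncertainty removed.”)

    from math import log2

    def shannon_entropy(probs):
        """Shannon-Wiener uncertainty, in bits, of a probability distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical distribution over four of the receiver's behavior categories,
    # before and after the sender makes its own behavioral choice.
    before = [0.25, 0.25, 0.25, 0.25]
    after = [0.70, 0.10, 0.10, 0.10]

    information_transmitted = shannon_entropy(before) - shannon_entropy(after)
    print(round(information_transmitted, 3), "bits")   # about 0.643 bits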

Social organization can be measured in a similar way.  A group might be said to be organized just to the extent that different elements of the species’ behavioral repertoire are allocated to different types of individuals.  In that case, knowing something about a member’s identity decreases our uncertainty concerning its likely behavior.  So, if, for instance, monkey groups are organized and we know that a given monkey is a female, we know that the range of its activities is less than if we did not know its sex.  This change in our uncertainty about its behavior could be measured in bits.  Similarly, if we know that a honey bee is a worker, rather than a queen or a drone, or, even more specifically, a young worker, as opposed to an older one, this information greatly reduces our uncertainty concerning what that bee is doing.  (Young workers may usually be found helping maintain brood cells.) 
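(The same bookkeeping works here.  A sketch with invented numbers - three castes, four unnamed behavior categories - just to show what “bits gained by knowing the caste” means:)

    from math import log2

    def H(probs):
        """Shannon-Wiener uncertainty, in bits."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical caste frequencies and behavior distributions for each caste.
    p_caste = {"worker": 0.90, "queen": 0.01, "drone": 0.09}
    p_behavior_given_caste = {
        "worker": [0.50, 0.30, 0.20, 0.00],
        "queen":  [0.00, 0.00, 0.05, 0.95],
        "drone":  [0.05, 0.05, 0.85, 0.05],
    }

    # Behavior distribution for a bee of unknown caste (mixture over castes).
    p_behavior = [sum(p_caste[c] * p_behavior_given_caste[c][j] for c in p_caste)
                  for j in range(4)]

    # Average uncertainty remaining once the caste is known.
    H_given_caste = sum(p_caste[c] * H(p_behavior_given_caste[c]) for c in p_caste)

    print(round(H(p_behavior) - H_given_caste, 3), "bits gained by knowing the caste")
    # roughly 0.22 bits with these made-up numbers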

An analogous trick can be used to measure the groupiness of a group.  If we understand a group of animals as a set of animals that maintains proximity, then our uncertainty concerning the direction of movement of any group member is greatly reduced by knowing the movements of the others.  Something similar can be attempted with dispersion: how much is our uncertainty about the location of an animal decreased by knowing the location of another member of the same group? 

However, Grant goes on to write:

Meaning #1 of "entropy": This is the "static" meaning. It represents the degree to which a system is "structured" or "organized". If a system is "highly disorganized" or "unstructured", then it has a lot of this kind of entropy. Or, if a system is highly "dispersed" or "dissipated", then it has a high degree of this kind of entropy.  …

And continues,
Meaning #2 of "entropy": This meaning is most clearly seen in Claude Shannon's book on the Mathematical Theory of Communications. Here "entropy" is independent of the notion of "organization/disorganization".


Thus, it should be clear that, from my point of view, these two notions of entropy, dispersion and organization, should be entirely compatible.  In fact, the former is just a special case of the latter.  And my problem is how to square my view of the situation, which is admittedly pretty primitive, with Grant’s. 

One of the reasons this discussion interests me is that it relates to conversations I have had over the years with some folks in FRIAM about the relation between information and organization.  At the risk of overstating their perspective, they seem to be saying that Nature is trying to increase entropy and seeks the best means for doing so.  When entropy is low … i.e., when there is a particularly steep gradient in some part of the universe, Nature contrives a means for reducing that entropy and that means, paradoxically is to INCREASE entropy in some localized portion of the universe in order to more efficiently decrease it over all.  Consider, for a moment, that much overworked example, the Bénard cell: hot plate at the bottom, cold plate at the top, fluid in the middle, gradually increasing supply of energy to the hot plate.  In the beginning, the molecules of the fluid near the heat source knock against each other and against the molecules above them, and the molecular motion diffuses upward through the fluid in a disorganized sort of way.  When I say “disorganized,” I mean that although molecular activity is moving upward, the molecules themselves are not behaving in a patterned way.

However, as the gradient is increased, a tipping point is reached at which the molecules themselves begin to march up and down in ordered columns.  Seen from the top of the liquid, the contents of the vessel have organized themselves into hexagonal cells of molecules rising and descending through the height of the liquid.  This organization of the fluid consists in a tremendous increase in what can be inferred from the motion of one molecule about the motion of other molecules in its vicinity, and it is accompanied by a much more rapid dissipation of the gradient. 
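(For the curious, here is a toy illustration in Python - made-up numbers, not a fluid simulation - of what I mean by an increase in what one molecule’s motion tells you about its neighbor’s.  Each molecule is crudely scored as rising, U, or sinking, D.)

    from math import log2

    def H(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    def mutual_information(joint):
        """Bits of information one variable carries about the other, from a joint
        distribution given as a dict {(x, y): probability}."""
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        return H(px.values()) + H(py.values()) - H(joint.values())

    # Joint distribution of (molecule, neighbor) vertical motion in the two regimes.
    disorganized = {("U", "U"): 0.25, ("U", "D"): 0.25, ("D", "U"): 0.25, ("D", "D"): 0.25}
    convecting   = {("U", "U"): 0.45, ("U", "D"): 0.05, ("D", "U"): 0.05, ("D", "D"): 0.45}

    print(mutual_information(disorganized))             # 0.0 bits: a neighbor tells you nothing
    print(round(mutual_information(convecting), 3))     # ~0.531 bits: neighbors move together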

All of this, it seems to me, can be accommodated by – indeed requires – a common language between information entropy and physics entropy, the very language which Glen seems to argue is impossible. 

Explanations?

The attachment is a Word copy of this comment, along with the correspondence that provoked it.  I am working with a new version of the Great Satan’s Word, but this file should be downward compatible.  Get on to me if it isn’t. 

Nick

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/nthompson

http://www.cusf.org

 

 

 


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Attachment: entropy and uncertainty.docx (41K)

Re: entropy and uncertainty, REDUX

Russ Abbott
Nick, Good to hear from you. Nice description of entropy in animal behavior!

I agree with pretty much everything you say. I'm concerned about this sentence, though.

When entropy is low … i.e., when there is a particularly steep gradient in some part of the universe, Nature contrives a means for reducing that entropy and that means, paradoxically is to INCREASE entropy in some localized portion of the universe in order to more efficiently decrease it over all. 

By a particularly steep gradient, I assume you mean any sharp change of some property. The gradient is in some property--not any particular property, and especially not necessarily in entropy.

I'm guessing you meant to say that Nature contrives a means for increasing that entropy.  (You did say it started out low. I agree with that. Smoothing out the gradient would increase the entropy.)

That would increase entropy locally and globally.

-- Russ




Re: entropy and uncertainty, REDUX

Stephen Guerin-3
Yep, that's what Nick meant, I assure you. He's slapping his forehead thrice for making that reversal.

-S
_____________________________________________________________
[hidden email]
(m) 505-216-6226 (o) 505-995-0206
sfcomplex.org | simtable.com | ambientpixel.com | redfish.com


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Russ Abbott

Crap!  You are right.  I said it backwards. 

 

When entropy is low … i.e., when there is a particularly steep gradient in some part of the universe, Nature contrives a means for INCREASING that entropy and that means, paradoxically, is to DECREASE entropy in some localized portion of the universe in order to more efficiently INCREASE it over all. 

 

I HATE when that happens.

 

Nick  

 

 


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Stephen Guerin-3
My forehead is bruised with slapping.  


Re: entropy and uncertainty, REDUX

glen e. p. ropella-2
In reply to this post by Nick Thompson
Nicholas Thompson wrote  circa 08/04/2010 07:04 PM:
> All of this, it seems to me, can be accommodated by – indeed requires –
> a common language between information entropy and physics entropy, the
> very language which Glen seems to argue is impossible.

This part confuses me.  As I understand it, Grant was on one side of the
dialectic, arguing that organization is (or at least can be) independent
of uncertainty.  I was on the _other_ side, claiming that they aren't
independent, but can be distinct.  I.e. Grant's claiming they are
fundamentally different things.  I'm claiming they are distinct aspects
of the same thing.

Grant resolved this (off-list) with his further explanation that he is
treating both meanings (organization vs. uncertainty) as distinct
measures of the behavior of the same system.  As measures, he defines
(or wants to define) them with independent co-domains so that they are
_allowed_ to vary independently, if that's how it all turns out when he
applies the measures to the system.  That's not to say that, with any
particular system, they will or won't... just that they _might_.  Then,
if he studies a huge sample of systems and, in all cases, they vary in a
correlated way, I can step in and make my assertion that they are
aspects of the same thing.  If not, then he can step in and make an
assertion that they really measure different things.  But until we have
the separated (not conflated) measures for the two separated concepts,
we will stay lost in the conflation.

But, to my knowledge, neither of us has made the case that the
language used to express the measures is fundamentally different, much
less impossible.  In fact, I think the original irritant for Grant was
that the language used to describe the two is so similar that it
leads directly to the conflation between the two concepts.  So Grant is
lamenting the fact that the two (independent) concepts are expressed in
the same language.  I would take it even further and say that the two
(distinct but intimately related) concepts _should_ be expressed in the
same language because they measure the same thing, just in different ways.

So, I'm confused why you think I argue that the common language between
the two would be impossible.

--
glen e. p. ropella, 971-222-9095, http://agent-based-modeling.com



Re: entropy and uncertainty, REDUX

Nick Thompson

Dear Glen,

 

This is another error on my part.  The text should read:

 

All of this, it seems to me, can be accommodated by – indeed requires – a common language between information entropy and physics entropy, the very language which GRANT seems to argue is impossible. 

 

In fact, I did see you as “on my side” in the argument, although, I confess, I did not wholly understand how you reached your conclusion. 

 

Right now I am trying to get ready for a trip, so can’t deal with the substance of your argument, but look forward to doing so in a few days. 

 

I would like to apologize to everybody for these errors.  I am beginning to think I am too old to be trusted with a distribution list.  It’s not that I don’t go over the posts before I send them … and in fact, what I sent represented weeks of thinking and a couple of evenings of drafting … believe it or not!  It seems that there are SOME sorts of errors I cannot see until they are pointed out to me, and these seem to be, of late, the fatal ones.

 

All the best to everybody,

 

Nick

 


Re: entropy and uncertainty, REDUX

glen e. p. ropella-2
Nicholas Thompson wrote  circa 08/05/2010 08:30 AM:
> All of this, it seems to me, can be accommodated by – indeed requires –
> a common language between information entropy and physics entropy, the
> very language which GRANT seems to argue is impossible.

OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that it's because we use a common language to talk about the two
concepts that the concepts are erroneously conflated.  I.e. Grant not only
admits the possibility of a common language, he _laments_ the common
language because it facilitates the conflation of the two different
concepts ... unless I've misinterpreted what he's said, of course.

> I would like to apologize to everybody for these errors.  I am beginning
> to think I am too old to be trusted with a distribution list.  It’s not
> that I don’t go over the posts before I send them … and in fact, what I
> sent represented weeks of thinking and a couple of evenings of drafting
> … believe it or not!  It seems that there are SOME sorts of errors I
> cannot see until they are pointed out to me, and these seem to be, of
> late, the fatal ones.

We're all guilty of this.  It's why things like peer review and
criticism are benevolent gifts from those who donate their time and
effort to criticize others.  It's also why e-mail and forums are more
powerful and useful than they are usually given credit for.  While it's
true that face-to-face conversation has higher bandwidth, e-mail,
forums, and papers force us to think deeply and seriously about what we
say ... and, therefore, think.  So, as embarrassing as "errors" like this
feel, they provide the fulcrum for clear and critical thinking.  I say
let's keep making them!

Err with Gusto! ;-)

--
glen e. p. ropella, 971-222-9095, http://agent-based-modeling.com



Re: entropy and uncertainty, REDUX

Grant Holland
Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've  thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree of unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree of uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematical theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization.  Thus, in the world of living systems, unpredictability and disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems" example we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. If we ask this question, then we are asking about a probability distribution over states that have varying degrees of organization associated with them. And since we now have a probability distribution involved, we can ask "what is its Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").
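(A toy sketch of that joint space in Python - the state labels, transition probabilities and organization scores are all invented - just to show that the unpredictability of the distribution and the expected organization of where it lands are two separate numbers:)

    from math import log2

    def shannon_entropy(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical next-state distribution from some current, disorganized state.
    # Each possible next state carries its own degree-of-organization score in [0, 1].
    next_states = {
        # state:             (probability, organization score)
        "highly_organized":  (0.10, 0.9),
        "loosely_organized": (0.20, 0.6),
        "disorganized":      (0.65, 0.2),
        "fully_dispersed":   (0.05, 0.0),
    }

    probs = [p for p, _ in next_states.values()]
    unpredictability = shannon_entropy(probs)       # Shannon entropy of the transition
    expected_organization = sum(p * org for p, org in next_states.values())

    print(round(unpredictability, 3), "bits of unpredictability")
    print(round(expected_organization, 3), "expected organization")
    # Shifting all the probability onto "highly_organized" would drive the
    # unpredictability to 0 while raising the expected organization to 0.9 -
    # the two quantities are free to move independently.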

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Gibbs especially talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, or states, are defined by position and momentum). So he is interested in probabilities of various "organizations" of molecules. And the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability"?) In fact, I believe it is reasonable to say that the Gibbs formula measures the unpredictability of which "arrangements" will obtain.

In fact, the Gibbs formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in the Gibbs formula. They are isomorphic! However, they are speaking to different domains. Gibbs is modeling a physical phenomenon, and Shannon is modeling a phenomenon of mathematical statistics. The second law applies to Gibbs' conversation - but not to Shannon's.
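(The isomorphism in one glance, as Python - the two formulas differ only by Boltzmann's constant and the base of the logarithm:)

    from math import log, log2

    K_B = 1.380649e-23          # Boltzmann constant, J/K

    def shannon_H(probs):
        """Shannon: H = -sum p * log2(p), in bits."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def gibbs_S(probs):
        """Gibbs: S = -k_B * sum p * ln(p), in J/K."""
        return -K_B * sum(p * log(p) for p in probs if p > 0)

    p = [0.5, 0.25, 0.125, 0.125]          # any distribution over microstates
    print(shannon_H(p))                    # 1.75 bits
    print(gibbs_S(p) / (K_B * log(2)))     # ~1.75 again: same formula up to the constant k_B * ln 2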

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant

glen e. p. ropella wrote:
Nicholas Thompson wrote  circa 08/05/2010 08:30 AM:
  
All of this, it seems to me, can be accommodated by – indeed requires –
a common language between information entropy and physics entropy, the
very language which GRANT seems to argue is impossible.
    

OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that it's because we use a common language to talk about the two
concepts, the concepts are erroneously conflated.  I.e. Grant not only
admits the possibility of a common language, he _laments_ the common
language because it facilitates the conflation of the two different
concepts ... unless I've misinterpreted what he's said, of course.

  
I would like to apologize to everybody for these errors.  I am beginning
to think I am too old to be trusted with a distribution list.  It’s not
that I don’t go over the posts before I send them … and in fact, what I
sent represented weeks of thinking and a couple of evenings of drafting
… believe it or not!  It seems that there are SOME sorts of errors I
cannot see until they are pointed out to me, and these seem to be, of
late, the fatal ones.
    

We're all guilty of this.  It's why things like peer review and
criticism are benevolent gifts from those who donate their time and
effort to criticize others.  It's also why e-mail and forums are more
powerful and useful than the discredit they usually receive.  While it's
true that face-to-face conversation has higher bandwidth, e-mail,
forums, and papers force us to think deeply and seriously about what we
say ... and, therefore think.  So, as embarrassing as "errors" like this
feel, they provide the fulcrum for clear and critical thinking.  I say
let's keep making them!

Err with Gusto! ;-)

  

-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759


Re: entropy and uncertainty, REDUX

Russ Abbott
Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator. So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high-entropy randomness behind them. 

This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.
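(Here is roughly the contrast I have in mind, as a toy Python sketch with arbitrary numbers:)

    from math import log2
    import random

    def H_bernoulli(p):
        """Shannon entropy, in bits, of a single 0/1 signal with P(1) = p."""
        return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

    N_BITS = 8            # a short row of bits
    P_ONE = 0.5           # each bit is driven by an unbiased random signal

    # "Structural" side: the layout never changes, so describing it costs a small,
    # fixed amount no matter how long we watch.
    structural_description = f"{N_BITS} identical cells in a row"

    # "Behavioral" side: every snapshot of the row is maximally unpredictable,
    # but the unpredictability comes from the signal source, not from the row itself.
    behavioral_entropy_per_snapshot = N_BITS * H_bernoulli(P_ONE)    # 8 bits per snapshot

    snapshot = [random.randint(0, 1) for _ in range(N_BITS)]
    print(structural_description)
    print(behavioral_entropy_per_snapshot, "bits of uncertainty per snapshot")
    print(snapshot)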

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what? Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

One other thought: Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle, one heavier than the other, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.


-- Russ




Re: entropy and uncertainty, REDUX

Grant Holland
Russ - Yes.

I use the terms "organizational" and "predictable", rather than "structural" and "behavioral", because of my particular interests. They amount to the same ideas. Basically they are two orthogonal dimensions of certain state spaces as they change.

I lament the fact that the same term "entropy" is used to apply to both meanings, however - especially since few realize that these two meanings are being conflated in the same word. Von Foerster actually defined the word "entropy" in two different places within the same book of essays to mean each of these two meanings! Often the word "disorder" is used, and people don't know whether "disorder" refers to "disorganization" or to "unpredictability". This word has fostered further unfortunate confusion.

It seems few people make the distinction that you have. This conflation causes no end of confusion. I really wish there were 2 distinct terms. In my work, I have come up with the acronym "DOUPBT" - Degree Of UnPredictaBiliTy - for the "unpredictable" meaning of entropy (or "behavioral", as you call it). I actually use Shannon's formula for this meaning.

This all came about because 1) Clausius invented the term entropy to mean "dissipation" (a kind of dis-organization, in my terms). 2) But then Gibbs came along and started measuring the degree of unpredictability involved in knowing the "arrangements" (positions and momenta) of molecules in an ideal gas. The linguistic problem was that Gibbs (and Boltzmann) used the same term - entropy - as had Clausius, even though Clausius emphasized a structural (dissipation) idea, whereas Gibbs emphasized an unpredictability idea (admittedly, unpredictability of "structural" change).

To confuse things even more, Shannon came along and defined entropy in purely probabilistic terms - as a direct measure of unpredictability. So, historically, the term went from a purely structural meaning, to a mixture of structure and unpredictability, to a pure unpredictability meaning. No wonder everyone is confused.

Another matter is that Clausius, Boltzmann and Gibbs were all doing Physics. But Shannon was doing Mathematics.

My theory is Mathematics. I'm not doing Physics. So I strictly need Shannon's meaning. My "social problem" is that every time I say "entropy", too many people assume I'm talking about "dissipation" when I am not. I'm always talking about "disorganization" when I use the term in my work. So, I have gone to using the phrase "Shannon's entropy", and never the word in its naked form. (Admittedly, I eventually also combine in a way similar to Gibbs :-[ . But I do not refer to the combined result as "entropy".)

:-P
Grant



-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Russ Abbott

Comments below …

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Russ Abbott
Sent: Saturday, August 07, 2010 1:03 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

 

But behavior IS a structure … a structure of acts, and “acts” are a structure of muscle twitches.  It’s structures all the way down.  So, are we talking about a distinction between structures in space and structures in time?  I wish Grant would get back into this discussion. 



Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator.

 

Does it matter where the disorder comes from?  It’s disordered. 

 

So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high entropy randomness behind them. 

 

This example reminds me of Dennett’s contribution to the Emergence book, but I don’t have it with me and can’t remember it well enough to trot it out.  Do you remember it, Russ?



This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what?

 

Well, I think the straightforward example is that there is more entropy in the case of two-dimensional uncertainty. 
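(A toy illustration, with invented numbers: if the two dimensions are independent, the uncertainties simply add, so the joint uncertainty is at least as large as either dimension alone.)

    from math import log2

    def H(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical: an animal's height class is nearly certain (low entropy),
    # while its horizontal position over 16 equally likely cells is not.
    p_height = [0.95, 0.04, 0.01]
    p_horizontal = [1 / 16] * 16

    # For independent dimensions the joint entropy is just the sum of the parts.
    joint = [ph * px for ph in p_height for px in p_horizontal]
    print(round(H(p_height), 3), round(H(p_horizontal), 3), round(H(joint), 3))
    # about 0.322 + 4.0 = 4.322 bits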

 

Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

 

One other thought Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle, one heavier than another, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

 

This example reminds me of Roger C.’s example of two vessels equalizing in a vacuum jar.  How did that go?  I wonder if these two examples could be combined in a thought experiment.


What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.

 

Does this relate to what Grant was getting at?  I wish Steve would put his oar in.  And what about gravity?  Don’t piles tend to accrete? 

 

Nick

 

 




-- Russ

 

On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland <[hidden email]> wrote:

Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've  thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematic theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization.  Thus, in the world of  living systems,  unpredictability and  disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems example" we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. But if we ask this question, then we are asking about a probability distribution of states that have varying degrees of organization associated with them. But, we also have a probability distribution involved now, so we can ask "what is it's Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Especially Gibbs talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, states, are defined by position and momentum). So he is interested in probabilities of various "organizations" of molecules. And, the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability". In fact, I believe reasonable to say that Gibbs formula measures "the unpredictability of being able to talk about which "arrangements" will obtain."

In fact, Gibbs formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in Gibbs formula. They are isomorphic! However, they are speaking to different domains. Gibbs is modeling a physics phenomena, and Shannon is modeling a mathematical statistics phenomena. The second law applies to Gibbs conversation - but not to Shannon's.

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant



glen e. p. ropella wrote:

Nicholas Thompson wrote  circa 08/05/2010 08:30 AM:
  
All of this, it seems to me, can be accommodated by – indeed requires –
a common language between information entropy and physics entropy, the
very language which GRANT seems to argue is impossible.
    
OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that it's because we use a common language to talk about the two
concepts, the concepts are erroneously conflated.  I.e. Grant not only
admits the possibility of a common language, he _laments_ the common
language because it facilitates the conflation of the two different
concepts ... unless I've misinterpreted what he's said, of course.
 
  
I would like to apologize to everybody for these errors.  I am beginning
to think I am too old to be trusted with a distribution list.  It’s not
that I don’t go over the posts before I send them … and in fact, what I
sent represented weeks of thinking and a couple of evenings of drafting
… believe it or not!  It seems that there are SOME sorts of errors I
cannot see until they are pointed out to me, and these seem to be, of
late, the fatal ones.
    
We're all guilty of this.  It's why things like peer review and
criticism are benevolent gifts from those who donate their time and
effort to criticize others.  It's also why e-mail and forums are more
powerful and useful than the discredit they usually receive.  While it's
true that face-to-face conversation has higher bandwidth, e-mail,
forums, and papers force us to think deeply and seriously about what we
say ... and, therefore think.  So, as embarrassing as "errors" like this
feel, they provide the fulcrum for clear and critical thinking.  I say
let's keep making them!
 
Err with Gusto! ;-)
 
  

 

-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759



Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Grant Holland

Grant –

 

Glad you are on board, here.  I will read this carefully. 

 

Does this have anything to do with the Realism/Idealism thing?  Predictability requires a person to be predicting; organization is there even if there is no one there to predict one part from another.

 

N

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 2:06 PM
To: [hidden email]; The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ - Yes.

I use the terms "organizational" and "predictable", rather than "structural" and "behavioral", because of my particular interests. They amount to the same ideas. Basically they are two orthogonal dimensions of certain state spaces as they change.

I lament the fact that the same term "entropy" is used for both meanings, however. Especially since few realize that these two meanings are being conflated with the same word. Von Foerster actually defined the word "entropy" in two different places within the same book of essays to mean each of these two meanings! Often the word "disorder" is used. And people don't know whether "disorder" refers to "disorganization" or whether it refers to "unpredictability". This word has fostered further unfortunate confusion.

It seems few people make the distinction that you have. This conflation causes no end of confusion. I really wish there were 2 distinct terms. In my work, I have come up with the acronym "DOUPBT" for the "unpredictable" meaning of entropy (or "behavioral", as you call it); it stands for Degree Of UnPredictaBiliTy. I actually use Shannon's formula for this meaning.

This all came about because 1) Clausius invented the term entropy to mean "dissipation" (a kind of dis-organization, in my terms), 2) but then Gibbs came along and started measuring the degree of unpredictability involved in knowing the "arrangements" (positions and momenta) of molecules in an ideal gas. The linguistic problem was that Gibbs (and Boltzmann) used the same term - entropy - as Clausius had, even though Clausius emphasized a structural (dissipation) idea, whereas Gibbs emphasized an unpredictability idea (admittedly, unpredictability of "structural" change).

To confuse things even more, Shannon came along and defined entropy in purely probabilistic terms - as a direct measure of unpredictability. So, historically, the term went from a purely structural meaning, to a mixture of structure and unpredictability, to a purely unpredictability-based meaning. No wonder everyone is confused.

Another matter is that Clausius, Boltzmann and Gibbs were all doing Physics. But Shannon was doing Mathematics.

My theory is Mathematics. I'm not doing Physics. So I strictly need Shannon's meaning. My "social problem" is that every time I say "entropy", too many people assume I'm talking about "dissipation" when I am not. I'm always talking about "disorganization" when I use the term in my work. So, I have gone to using the phrase "Shannon's entropy", and never the word in its naked form. (Admittedly, I eventually also combine in a way similar to Gibbs :-[ . But I do not refer to the combined result as "entropy".)

:-P
Grant


Russ Abbott wrote:

Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator. So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high entropy randomness behind them. 
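
Here is a rough Python sketch of that contrast (the row of bits and the random driving signal are both simulated, and the sizes are arbitrary; it only illustrates the two measurements):

    import math, random

    def entropy_bits(samples):
        """Empirical Shannon entropy, in bits, of a sequence of symbols."""
        counts = {}
        for s in samples:
            counts[s] = counts.get(s, 0) + 1
        h = 0.0
        for c in counts.values():
            p = c / len(samples)
            h -= p * math.log2(p)
        return h

    N, T = 16, 10000
    row = [0] * N   # the static row: all zeros, trivially describable

    # "Structural" view: the arrangement of the row itself.
    print("structural entropy of the row:", entropy_bits(row))            # 0.0 bits

    # "Behavioral" view: each bit is re-driven by an independent random signal.
    history = [[random.randint(0, 1) for _ in range(T)] for _ in range(N)]
    behavioral = sum(entropy_bits(h) for h in history) / N
    print("behavioral entropy per bit:  ", round(behavioral, 3), "bits/step")  # ~1.0

The first number is low because the arrangement is simple; the second is high only because of the randomness we wired in behind the bits.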

This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what? Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

One other thought: Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle, one heavier than the other, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.


-- Russ

 


Re: entropy and uncertainty, REDUX

Russ Abbott
If you call it behavioral rather than predictable, it doesn't require a predictor. It's just an arrangement in time.

-- Russ




Re: entropy and uncertainty, REDUX

Grant Holland
Russ, Nick,

You both make an interesting point about one of these dimensions (unpredictability) requiring an observer, while the other (organization, or structure) does not.

However, Heinz von Foerster, I believe, would disagree: he would say that BOTH require an observer!

Another way to think about this is: Science is by definition an empirical enterprise. It requires dispassionate observation, refutability (Popper), etc. Science therefore requires an observer. The very idea of components being "related", or not, into an "organization" or "structure" is itself an abstraction on the part of an observer.

Of course this thread of thought takes us away from our main point of discussion about entropy...which I will continue in my next missive. ;-)

Grant


Re: entropy and uncertainty, REDUX

lrudolph
In reply to this post by Nick Thompson
On 7 Aug 2010 at 15:14, Nicholas  Thompson wrote:

> Predictibility requires a person to be predicting; organization is there
> even if there is no one there to predict one part from another.

That's hardly obvious, and I don't think it's true.
I'm happy to concede the statement about "predictability"
without further proof (because it agrees with my bias);
but I can't see any principled way to accept it, and
not deny the statement about "organization"--my bias
is to say that organization, too, requires "a person"
to be assessing the system as organized or not (and,
if organized, maybe "how much").  Convince me that
the asymmetry you purport is justifiable.




Re: entropy and uncertainty, REDUX

Russ Abbott
In reply to this post by Grant Holland
That seems to me to be a different point--and one that Glen made about entropy a while ago.  Scientific realists assume that what one sees is what there is, more or less--that structure in any dimension is part of the universe, and that as observers we just see what is.  (I know that's oversimplified, but that's the basic idea.)  Predictability is different in that it's a matter of predicting something unknown at the time the prediction is made.

-- Russ






Re: entropy and uncertainty, REDUX

Roger Critchlow-2
Introducing another thread, the measure of diversity used in ecology
is Shannon's entropy.
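
For example, with some invented abundance counts, the index is just Shannon's formula applied to the species proportions (ecologists commonly use natural logs):

    import math

    # Invented species counts for a single sample.
    counts = {"oak": 40, "pine": 30, "juniper": 20, "aspen": 10}

    total = sum(counts.values())
    proportions = [c / total for c in counts.values()]

    # Shannon diversity index H'.
    H = -sum(p * math.log(p) for p in proportions)
    print(f"Shannon diversity H' = {H:.3f} nats")              # about 1.280
    print(f"effective number of species = {math.exp(H):.2f}")  # about 3.60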

-- rec --

---------------------
http://www.semcoop.com/book/9780226562261
---------------------
Biology's First Law: The Tendency for Diversity and Complexity to
Increase in Evolutionary Systems (Paperback)

Description

Life on earth is characterized by three striking phenomena that demand
explanation: adaptation—the marvelous fit between organism and
environment; diversity—the great variety of organisms; and
complexity—the enormous intricacy of their internal structure. Natural
selection explains adaptation. But what explains diversity and
complexity? Daniel W. McShea and Robert N. Brandon argue that there
exists in evolution a spontaneous tendency toward increased diversity
and complexity, one that acts whether natural selection is present or
not. They call this tendency a biological law—the Zero-Force
Evolutionary Law, or ZFEL. This law unifies the principles and data of
biology under a single framework and invites a reconceptualization of
the field of the same sort that Newton’s First Law brought to physics.



Biology’s First Law shows how the ZFEL can be applied to the study of
diversity and complexity and examines its wider implications for
biology. Intended for evolutionary biologists, paleontologists, and
other scientists studying complex systems, and written in a concise
and engaging format that speaks to students and interdisciplinary
practitioners alike, this book will also find an appreciative audience
in the philosophy of science.

About the Author

Daniel W. McShea is Associate Professor of Biology, with a secondary
appointment in Philosophy, and Robert N. Brandon is Professor of
Philosophy, with a secondary appointment in Biology, both at Duke
University.


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Grant Holland

Grant

 

You see that I am still struggling to understand your original distinction between prediction and organization.  Clearly organization affords prediction.   I was trying to read you as saying that organization is the thing that’s there and prediction is what we make of it.  We can use organizations to make predictions.  But you just blocked that interpretation. 

 

I apologize if I am not reading astutely enough.  If you or somebody else could help me out here, I would be in your debt.  What is the distinction that you see between these two things that seem so much the same to me?

 

Nick

 


Re: entropy and uncertainty, REDUX

Russ Abbott
Nick,

Did you read what I wrote?  I think that explains it.

-- Russ



On Sat, Aug 7, 2010 at 1:03 PM, Nicholas Thompson <[hidden email]> wrote:

Grant

 

You see that I am still struggling to understand your original distinction between prediction and organization.  Clearly organization affords prediction.   I was trying to read you as saying that organization is the thing that’s there and prediction is what we make of it.  We can use organizations to make predictions.  But you just blocked that interpretation. 

 

I apologize if I am not reading astutely enough.  If you or somebody else could help me out, here, I would be in your debt.  What is the distinction that you see between these two things that seem so much the same to me.

 

Nick

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 3:25 PM


To: [hidden email]; The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ, Nick,

You both make an interesting point about one of these dimensions (unpredictability) requiring an observer, while the other (organization, or structure) does not.

However, Heinz von Foerster, I believe, would disagree. I believe he would say that BOTH require an observer!

Another way to think about this is: Science is by definition an empirical enterprise. It requires dispassionate observation, refutable observations (Popper), etc. Science therefore requires an observer. The very idea of components being "related", or not, into an "organization" or "structure" is itself an abstraction on the part of an observer.

Of course this thread of thought takes us away from our main point of discussion about entropy...which I will continue on my next missive. ;-)

Grant

Russ Abbott wrote:

If you call it behavioral rather than predictable it doesn't require a predictor. It's just an arrangement in time.


-- Russ



On Sat, Aug 7, 2010 at 12:14 PM, Nicholas Thompson <[hidden email]> wrote:

Grant –

 

Glad you are on board, here.  I will read this carefully. 

 

Does this have anything to do with the Realism/Idealism thing?  Predictability requires a person to be predicting; organization is there even if there is no one there to predict one part from another.

 

N

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 2:06 PM
To: [hidden email]; The Friday Morning Applied Complexity Coffee Group


Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ - Yes.



I use the terms "organizational" and "predictable", rather than "structural" and "behavioral", because of my particular interests. They amount to the same ideas. Basically they are two orthogonal dimensions of certain state spaces as they change.

I lament the fact that the same term "entropy" is used for both meanings, however - especially since few realize that two distinct meanings are being conflated under the same word. Von Foerster actually defined the word "entropy" in two different places within the same book of essays to mean each of these two things! Often the word "disorder" is used, and people don't know whether "disorder" refers to "disorganization" or to "unpredictability". That word has only fostered further confusion.

It seems few people make the distinction that you have. This conflation causes no end of confusion. I really wish there were two distinct terms. In my work, I have come up with the acronym "DOUPBT" (Degree Of UnPredictaBiliTy) for the "unpredictable" meaning of entropy - or "behavioral", as you call it. I actually use Shannon's formula for this meaning.
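For concreteness, here is a minimal sketch of that calculation in Python - the function name and the made-up distributions below are purely illustrative, not anything from my theory:

    import math

    def doupbt(probs, base=2):
        # Shannon's formula: H = -sum(p * log(p)); zero-probability outcomes contribute nothing.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Two made-up distributions over four possible next states of some system:
    print(doupbt([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits - maximally unpredictable
    print(doupbt([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits - highly predictable

The more peaked the distribution, the lower the DOUPBT.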

This all came about because 1) Clausius invented the term entropy to mean "dissipation" (a kind of dis-organization, in my terms), and 2) Gibbs then came along and started measuring the degree of unpredictability involved in knowing the "arrangements" (positions and momenta) of molecules in an ideal gas. The linguistic problem was that Gibbs (and Boltzmann) used the same term - entropy - as had Clausius, even though Clausius emphasized a structural (dissipation) idea, whereas Gibbs emphasized an unpredictability idea (admittedly, unpredictability of "structural" change).

To confuse things even more, Shannon came along and defined entropy in purely probabilistic terms - as a direct measure of unpredictability. So, historically, the term went from a purely structural meaning, to a mixture of structure and unpredictability, to a pure unpredictability meaning. No wonder everyone is confused.

Another matter is that Clausius, Boltzmann and Gibbs were all doing Physics. But Shannon was doing Mathematics.

My theory is Mathematics. I'm not doing Physics. So I strictly need Shannon's meaning. My "social problem" is that every time I say "entropy", too many people assume I'm talking about "dissipation" when I am not; in my work I am always talking about unpredictability when I use the term. So I have gone to using the phrase "Shannon's entropy", and never the word in its naked form. (Admittedly, I eventually also combine the two in a way similar to Gibbs :-[ . But I do not refer to the combined result as "entropy".)

:-P
Grant


Russ Abbott wrote:

Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator. So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high entropy randomness behind them. 
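A toy rendering of that setup in Python (the 8-bit row size and the fair-coin signal are arbitrary choices of mine, just to make the contrast concrete):

    import random

    random.seed(0)
    N_BITS = 8  # a short, fixed row of bits - its arrangement is trivially describable

    def observe():
        # Each bit is "hooked up" to an independent fair random signal.
        return [random.randint(0, 1) for _ in range(N_BITS)]

    # Structurally: the row is just "N_BITS bits in a fixed order" - a tiny description.
    # Behaviorally: each observation is one of 2**N_BITS equally likely patterns, so the
    # per-observation Shannon entropy is N_BITS bits - all of it coming from the random
    # source behind the "window", not from the arrangement of the row itself.
    print(observe())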

This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what? Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

One other thought: Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle, one heavier than the other, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.


-- Russ

 

On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland <[hidden email]> wrote:

Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've  thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree of unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree of uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematical theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization. Thus, in the world of living systems, unpredictability and disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems example" we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. If we ask this question, then we are asking about a probability distribution over states that have varying degrees of organization associated with them. And since we now have a probability distribution, we can ask "what is its Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").
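A tiny sketch of that joint question in Python - the state labels, the "degree of organization" scores, and the transition probabilities below are entirely made up, just to show the shape of the calculation:

    import math

    def shannon(probs, base=2):
        # H = -sum(p * log(p)); zero-probability outcomes contribute nothing.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Hypothetical next-moment states, each tagged with an invented "degree of organization"
    # (0 = fully disorganized, 1 = fully organized).
    organization = {"A": 0.1, "B": 0.5, "C": 0.9}

    # Invented probabilities of advancing from some disorganized state to each of them.
    transition = {"A": 0.2, "B": 0.3, "C": 0.5}

    # The organization dimension of the joint space: how organized the next state tends to be...
    expected_org = sum(transition[s] * organization[s] for s in organization)
    # ...and the unpredictability dimension: the Shannon entropy of the same distribution.
    unpredictability = shannon(transition.values())

    print(expected_org)      # 0.62
    print(unpredictability)  # ~1.49 bits

The two numbers can move independently: sharpening the transition probabilities lowers the entropy without necessarily changing how organized the likely next states are.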

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Gibbs especially talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, or states, are defined by position and momentum). So he is interested in the probabilities of various "organizations" of molecules. And the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability"?) In fact, I believe it is reasonable to say that the Gibbs formula measures "the unpredictability of which 'arrangements' will obtain."

In fact, Gibbs' formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in Gibbs' formula. They are isomorphic! However, they speak to different domains: Gibbs is modeling a physical phenomenon, and Shannon is modeling a mathematical-statistics phenomenon. The second law applies to Gibbs' conversation - but not to Shannon's.
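Writing the two out side by side (in LaTeX notation) makes the point; k_B is Boltzmann's constant:

    S = -k_B \sum_i p_i \ln p_i          (Gibbs)
    H = -\sum_i p_i \log_2 p_i           (Shannon)

Apart from k_B - and the base of the logarithm, which is itself only another multiplicative constant - the two expressions are identical in form.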

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant



glen e. p. ropella wrote:

Nicholas Thompson wrote  circa 08/05/2010 08:30 AM:
  
All of this, it seems to me, can be accommodated by – indeed requires –
a common language between information entropy and physics entropy, the
very language which GRANT seems to argue is impossible.
    
OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that it's _because_ we use a common language to talk about the two
concepts that the concepts are erroneously conflated.  I.e., Grant not
only admits the possibility of a common language, he _laments_ the
common language because it facilitates the conflation of the two
different concepts ... unless I've misinterpreted what he's said, of
course.
 
  
I would like to apologize to everybody for these errors.  I am beginning
to think I am too old to be trusted with a distribution list.  It’s not
that I don’t go over the posts before I send them … and in fact, what I
sent represented weeks of thinking and a couple of evenings of drafting
… believe it or not!  It seems that there are SOME sorts of errors I
cannot see until they are pointed out to me, and these seem to be, of
late, the fatal ones.
    

 
We're all guilty of this.  It's why things like peer review and
criticism are benevolent gifts from those who donate their time and
effort to criticize others.  It's also why e-mail and forums are more
powerful and useful than the discredit they usually receive.  While it's
true that face-to-face conversation has higher bandwidth, e-mail,
forums, and papers force us to think deeply and seriously about what we
say ... and, therefore think.  So, as embarrassing as "errors" like this
feel, they provide the fulcrum for clear and critical thinking.  I say
let's keep making them!
 
Err with Gusto! ;-)
 
  

 

-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org