entropy and uncertainty, REDUX


Re: entropy and uncertainty, REDUX

Nick Thompson

But you agree that good prediction requires there to be structure or a process that provides the framework in which a prediction can be made.

 

Minimally, I think we assume that what we see is a feature of what is there.  Not all careful observational techniques reveal the same aspect. 

 

n

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Russ Abbott
Sent: Saturday, August 07, 2010 3:45 PM
To: Grant Holland
Cc: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

That seems to me to be a different point--and one that Glen made about entropy a while ago.  Scientific realists assume that what one sees is what there is, more or less, that structure in any dimension is presumed to be part of the universe, and that as observers we just see what is.  (I know that's oversimplified, but that's the basic idea.)  Predictability is different in that it's a matter of predicting something unknown when the prediction is made.


-- Russ




 


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: entropy and uncertainty, REDUX

Grant Holland
In reply to this post by Nick Thompson
Nick,

Lemme try to present three examples of these two orthogonal dimensions (Organization/Disorganization dimension vs Predictability/Unpredictability dimension).
It all boils down to what phenomena one chooses to be interested in.
(Even if both dimensions are arguably present in a particular phenomenon, one can choose to observationally ignore one of them in the analysis of that phenomenon.)

The first example will be exclusively interested in the Organization/Disorganization dimension.
The second example will be exclusively interested in the Predictability/Unpredictability dimension.
The third example will be jointly interested in both.

Example I: An interest in Organization/Disorganization (structure or lack of), with no interest in Predictability/Unpredictability.
Let's say we are interested in observing Hydrogen and Oxygen atoms within a small region of space.
These atoms are capable of combining into several possible bonding configurations: O2, HO, H2O, etc.
Suppose we observe this region for some finite time and make a list of any of these configurations that are observed.
This list captures an interest in Organization/Disorganization, but not in Predictability/Unpredictability.
We could go further and even develop a metric for the "degree of organization" observed. That would be a continuation of the same interest.

Example II: An interest in Unpredictability/Predictability, with no interest in Organization/Disorganization.
Let's say we are interested, as in Example I above, in observing Hydrogen and Oxygen atoms within a small region of space.
However, this time, we have no interest in whether these atoms occur in a bonded or unbonded form.
What we are emphasizing instead this time is the probability of selecting one of these two atoms at random from the region -
AND in whether or not the resulting probability distribution describes an Unpredictable situation, a Predictable situation, or somewhere in between.
(Assume that we will use the best experimental practices to arrive at an estimate of the population parameters from sample statistics.)
Let's say that we conclude that the distribution results in a .75 prob for H and .25 prob for O. (We "throw back" other atoms.)
Then, we can conclude that the Shannon entropy for this distribution is -[(.75)*(log2(.75)) + (.25)*(log2(.25))] ≈ .811 bits.
So, an interest in Unpredictability/Predictability can be measured by Shannon entropy.
(The subject of "degree of dissipation" or of disorganization never arises here.)
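
As a quick check of that number, here is a minimal Python sketch - added purely as an illustration; the helper name shannon_entropy is not from the original post:

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.75, 0.25]))  # ~0.811 bits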

Example III: A compound interest in both dimensions (Organization X Predictability) jointly.
Let's go back to Example I above, where we are interested in the various ways that H and O can bond.
Suppose that we take that interest a little further and ask...
"What is the probability distribution of the observed molecules and ions involving H and O?"
Now, we have combined our interest in the "organizations" of H and O with the relative probabilities of their occurrences.
Thus, our probability (sample) space now, by definition, has the following possible outcomes: O2, HO, H2O, etc.
And, each has its observed probability, and thus we have a joint probability distribution to which we can apply Shannon's entropy.
Depending on the probabilities of each of these "H-O compounds", the Shannon entropy may be high or it may be low.
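
For the combined view, a small sketch along the same lines (illustrative only; the observed counts below are invented for the example) - the outcome labels carry the "organization" information and their frequencies carry the "predictability" information:

import math
from collections import Counter

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented observations of bonding configurations in the region
observed = ["H2O", "H2O", "H2O", "O2", "HO", "H2O", "O2", "H2O"]
counts = Counter(observed)                                # which configurations occur (organization)
dist = {c: n / len(observed) for c, n in counts.items()}  # how likely each one is (predictability)

print(dist)                            # {'H2O': 0.625, 'O2': 0.25, 'HO': 0.125}
print(shannon_entropy(dist.values()))  # ~1.30 bits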

HTH,
Grant





Nicholas Thompson wrote:

Grant –

 

Glad you are on board, here.  I will read this carefully. 

 

Does this have anything to do with the Realism/Idealism thing?  Predictability requires a person to be predicting; organization is there even if there is no one there to predict one part from another.

 

N

 

From: [hidden email] [[hidden email]] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 2:06 PM
To: [hidden email]; The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ - Yes.

I use the terms "organizational" and "predictable", rather than "structural" and "behavioral", because of my particular interests. They amount to the same ideas. Basically they are two orthogonal dimensions of certain state spaces as they change.

I lament the fact that the same term "entropy" is used to apply to both meanings, however - especially since few realize that these two meanings are being conflated with the same word. Von Foerster actually defined the word "entropy" in two different places within the same book of essays to mean each of these two meanings! Often the word "disorder" is used, and people don't know whether "disorder" refers to "disorganization" or whether it refers to "unpredictability". This word has fostered further unfortunate confusion.

It seems few people make the distinction that you have. This conflation causes no end of confusion. I really wish there were 2 distinct terms. In my work, I have come up with the acronym "DOUPBT" for the "unpredictable" meaning of entropy (or, "behavioral", as you call it). It stands for Degree Of UnPredictaBiliTy. I actually use Shannon's formula for this meaning.

This all came about because 1) Clausius invented the term entropy to mean "dissipation" (a kind of dis-organization, in my terms). 2) But then Gibbs came along and started measuring the degree of unpredictability involved in knowing the "arrangements" (positions and momenta) of molecules in an ideal gas. The linguistic problem was that Gibbs (and Boltzmann) used the same term - entropy - as had Clausius, even though Clausius emphasized a structural (dissipation) idea, whereas Gibbs emphasized an unpredictability idea (admittedly, unpredictability of "structural" change).

To confuse things even more, Shannon came along and defined entropy in purely probabilistic terms - as a direct measure of unpredictability. So, historically, the term went from a purely structural meaning, to a mixture of structure and unpredictability, to a pure unpredictability meaning. No wonder everyone is confused.

Another matter is that Clausius, Boltzmann and Gibbs were all doing Physics. But Shannon was doing Mathematics.

My theory is Mathematics. I'm not doing Physics. So I strictly need Shannon's meaning. My "social problem" is that every time I say "entropy", too many people assume I'm talking about "dissipation" when I am not. I'm always talking about "disorganization" when I use the term in my work. So, I have gone to using the phrase "Shannon's entropy", and never the word in its naked form. (Admittedly, I eventually also combine in a way similar to Gibbs :-[ . But I do not refer to the combined result as "entropy".)

:-P
Grant


Russ Abbott wrote:

Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator. So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high entropy randomness behind them. 

This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.
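
(A tiny numerical sketch of that contrast, added here only as an illustration - the 16-bit row and the probabilities are made up: the "structural" description of the row is trivial either way, while the Shannon, or "behavioral", entropy depends entirely on the assumed randomness of the driving signal.)

import math

def bit_entropy(p):
    # Shannon entropy (bits) of a single 0/1 signal that is 1 with probability p
    return -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)

row_length = 16
print(row_length * bit_entropy(0.0))  # 0.0 bits: a fixed row, nothing to predict
print(row_length * bit_entropy(0.5))  # 16.0 bits: same simple structure, each bit driven by a fair random signal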

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what? Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

One other thought: Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle, one heavier than the other, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.


-- Russ

 

On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland <[hidden email]> wrote:

Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've  thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree of unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree of uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematical theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization. Thus, in the world of living systems, unpredictability and disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems example" we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. But if we ask this question, then we are asking about a probability distribution of states that have varying degrees of organization associated with them. But, we also have a probability distribution involved now, so we can ask "what is its Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").
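
To make the "joint space" idea concrete, here is a minimal Python sketch (an illustration added here, not Grant's code; the state labels and probabilities are invented): a transition distribution out of one "disorganized" state whose outcome labels describe the organization of the next state, and whose Shannon entropy measures the unpredictability of that change.

import math

def shannon_entropy(dist):
    # Shannon entropy in bits of a {next_state: probability} distribution
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical distribution over next states, starting from a "disorganized" state
next_state = {"highly organized": 0.6, "somewhat organized": 0.3, "disorganized": 0.1}

print(shannon_entropy(next_state))  # ~1.30 bits of unpredictability about the next state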

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Gibbs especially talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, or states, are defined by position and momentum). So he is interested in probabilities of various "organizations" of molecules. And the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability"?) In fact, I believe it is reasonable to say that Gibbs' formula measures "the unpredictability of being able to talk about which 'arrangements' will obtain."

In fact, Gibbs' formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in Gibbs' formula. They are isomorphic! However, they speak to different domains. Gibbs is modeling a physics phenomenon, and Shannon is modeling a mathematical-statistics phenomenon. The second law applies to Gibbs' conversation - but not to Shannon's.
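
(For reference, the two standard textbook forms side by side, not quoted from the thread: Gibbs' S = -k_B \sum_i p_i \ln p_i versus Shannon's H = -\sum_i p_i \log_2 p_i. They differ only by the constant k_B and the base of the logarithm, i.e. an overall factor of k_B \ln 2.)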

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant



glen e. p. ropella wrote:

Nicholas Thompson wrote  circa 08/05/2010 08:30 AM:  
    
All of this, it seems to me, can be accommodated by – indeed requires –  
a common language between information entropy and physics entropy, the  
very language which GRANT seems to argue is impossible.  
      
OK.  But that doesn't change the sense much.  Grant seemed to be arguing
that it's because we use a common language to talk about the two
concepts that the concepts are erroneously conflated.  I.e. Grant not only
admits the possibility of a common language, he _laments_ the common  
language because it facilitates the conflation of the two different  
concepts ... unless I've misinterpreted what he's said, of course.  
   
    
I would like to apologize to everybody for these errors.  I am beginning  
to think I am too old to be trusted with a distribution list.  It’s not  
that I don’t go over the posts before I send them … and in fact, what I  
sent represented weeks of thinking and a couple of evenings of drafting  
… believe it or not!  It seems that there are SOME sorts of errors I  
cannot see until they are pointed out to me, and these seem to be, of  
late, the fatal ones.  
      
We're all guilty of this.  It's why things like peer review and  
criticism are benevolent gifts from those who donate their time and  
effort to criticize others.  It's also why e-mail and forums are more  
powerful and useful than the discredit they usually receive.  While it's  
true that face-to-face conversation has higher bandwidth, e-mail,  
forums, and papers force us to think deeply and seriously about what we  
say ... and, therefore think.  So, as embarrassing as "errors" like this  
feel, they provide the fulcrum for clear and critical thinking.  I say  
let's keep making them!  
   
Err with Gusto! ;-)  
   
    

 


 

   
 
   

-- 
Grant Holland
VP, Product Development and Software Engineering
NuTech Solutions
404.427.4759


Re: entropy and uncertainty, REDUX

Grant Holland
In reply to this post by Nick Thompson
Nick,

Maybe let me explain how I use these two "dimensions" together in my "Organic Complex Systems" theory:

I am interested in 1) the Organization (structure) of organic systems, and 2) how that organization changes/evolves.

So, yes, Organization is "what is there", as you say. But "how that organization changes" is also "what is there".

But, furthermore, my theory is also very interested in something else about "organizational change" beyond just "how it changes". I am profoundly interested also in how "random" versus  how "deterministic" that change can be. I am interested in this because I suspect that, in living systems, the randomness versus determinism thing is all over the map. Living system dynamics sometimes behaves randomly and sometimes behaves deterministically, and mostly "somewhere in between". At least it looks so to me.

Therefore, when it comes to #2) above - how the organization of living systems changes, I am also profoundly interested in characterizing the "predictability/unpredictability" aspects of that change, as well as the mechanism of "how" that change occurs. I need to represent how that "degree of unpredictability of change of organization" can itself change from time to time in biology. Shannon's entropy is the perfect model for this.

To recap, the organization of living systems can change: from disorganized to disorganized, from disorganized to organized, from organized to disorganized, and from organized to organized. (All four of these are actually continua.) But - and this is the point - all 4 of those types of changes can either be predictable or unpredictable.

(Yes, in biology, it sometimes occurs that a disorganized situation transitions to an organized situation with a high degree of probability. That's what makes biology different from thermodynamics, and makes biology appear to contradict the second law sometimes.)
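
A minimal sketch of that point (illustrative only; the probabilities are invented): two transition distributions out of the same disorganized state, both weighted toward organized outcomes, one nearly deterministic and one maximally uncertain.

import math

def shannon_entropy(dist):
    # Degree of unpredictability, in bits, of a {next_state: probability} distribution
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

predictable_change   = {"organized": 0.95, "disorganized": 0.05}
unpredictable_change = {"organized": 0.50, "disorganized": 0.50}

print(shannon_entropy(predictable_change))    # ~0.29 bits: disorganized -> organized, almost certainly
print(shannon_entropy(unpredictable_change))  # 1.0 bit: same kind of change, maximally uncertain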

Consequently, you can see that I need a mathematics that lets Organization/Disorganization vary independently from Predictability/Unpredictability sometimes. Shannon entropy has a part to play in that - but thermodynamic entropy does not, because I am not doing Physics.

Grant


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Russ Abbott

Russ,

 

Ok.  I will read it again.  I am in the swirl of somebody else’s vacation house, so perhaps not focusing as well as I should.

 

Tx,

 

n

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Russ Abbott
Sent: Saturday, August 07, 2010 4:10 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Nick,

Did you read what I wrote?  I think that explains it.


-- Russ

 

On Sat, Aug 7, 2010 at 1:03 PM, Nicholas Thompson <[hidden email]> wrote:

Grant

 

You see that I am still struggling to understand your original distinction between prediction and organization.  Clearly organization affords prediction.   I was trying to read you as saying that organization is the thing that’s there and prediction is what we make of it.  We can use organizations to make predictions.  But you just blocked that interpretation. 

 

I apologize if I am not reading astutely enough.  If you or somebody else could help me out here, I would be in your debt.  What is the distinction that you see between these two things that seem so much the same to me?

 

Nick

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 3:25 PM


To: [hidden email]; The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ, Nick,

You both make an interesting point about one of these dimensions (unpredictability) requiring an observer, while the other (organization, or structure) does not.

However, Heinz von Foerster, I believe, would disagree. I believe he would say that BOTH require an observer!

Another way to think about this is: Science is by definition an empirical enterprise. It requires dispassionate observation, refutable observations (Popper), etc. Science therefore requires an observer. The very idea of components being "related", or not, into an "organization" or "structure" is itself an abstraction on the part of an observer.

Of course this thread of thought takes us away from our main point of discussion about entropy...which I will continue in my next missive. ;-)

Grant

Russ Abbott wrote:

If you call it behavioral rather than predictable it doesn't require a predictor. It's just an arrangement in time.


-- Russ

 


Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Grant Holland

Grant, Russ, Glen,

 

Ok.  I think I got it.  Paradoxically, it has to do with emergence.  You would think I would have seen it right away.  Thanks for your help.  More later when the swirl dies down.

 

N

 


Re: entropy and uncertainty, REDUX

Russ Abbott
In reply to this post by Roger Critchlow-2
I'm not convinced. Much of his complexity has to do with things breaking down, which is more like an increase in entropy than in complexity.

Besides that, it seems less like a "law" than like an observation--similar to the fact that there are power law relationships all over the place. That doesn't mean there is a "power law force" in nature.

-- Russ



On Sat, Aug 7, 2010 at 12:53 PM, Roger Critchlow <[hidden email]> wrote:
Introducing another thread, the measure of diversity used in ecology
is Shannon's entropy.
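
(For concreteness, and as an addition here rather than part of Roger's note: the Shannon diversity index is usually written H' = -\sum_i p_i \ln p_i, where p_i is the proportion of individuals belonging to species i.)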

-- rec --

---------------------
http://www.semcoop.com/book/9780226562261
---------------------
Biology's First Law: The Tendency for Diversity and Complexity to
Increase in Evolutionary Systems (Paperback)

Description

Life on earth is characterized by three striking phenomena that demand
explanation: adaptation—the marvelous fit between organism and
environment; diversity—the great variety of organisms; and
complexity—the enormous intricacy of their internal structure. Natural
selection explains adaptation. But what explains diversity and
complexity? Daniel W. McShea and Robert N. Brandon argue that there
exists in evolution a spontaneous tendency toward increased diversity
and complexity, one that acts whether natural selection is present or
not. They call this tendency a biological law—the Zero-Force
Evolutionary Law, or ZFEL. This law unifies the principles and data of
biology under a single framework and invites a reconceptualization of
the field of the same sort that Newton’s First Law brought to physics.



Biology’s First Law shows how the ZFEL can be applied to the study of
diversity and complexity and examines its wider implications for
biology. Intended for evolutionary biologists, paleontologists, and
other scientists studying complex systems, and written in a concise
and engaging format that speaks to students and interdisciplinary
practitioners alike, this book will also find an appreciative audience
in the philosophy of science.

About the Author

Daniel W. McShea is Associate Professor of Biology, with a secondary
appointment in Philosophy, and Robert N. Brandon is Professor of
Philosophy, with a secondary appointment in Biology, both at Duke
University.


Re: entropy and uncertainty, REDUX

Eric Charles
In reply to this post by Grant Holland
Sorry, the following email should be completely rewritten as a query rather than an assertion, but I think I am too tired to do it coherently, and I am hoping that an answer to my question may clarify the conversation.

Hmmm....
This email from Grant, the next one, and much of the past discussion leads me to believe that much of the confusion involves shifts in levels of analysis. Surely, degree of organization is not just correlated with degree of predictability, but the one is the other quantified in a different manner (and different measures have different properties desirable for different purposes). All the examples of situations that "vary orthogonally" seem to involve a shift of interest in which the narrator (here Grant, but elsewhere others) points out that something about a system is organized in one sense, but unpredictable in another. I have yet to detect a principled reason why it couldn't be called predictable in the first sense, but disorganized in the other.

For example, if I tell you that three married couples just entered the room, you would know that there are now six more people in the room, three men and three women (as couples are typically ORGANIZED in that manner). It is not terribly interesting (to me at least) for you to then point out that the color of their eyes is difficult to PREDICT. Note, I could just as easily say that you had a high probability of being correct if you PREDICTED they were three men and three women, and that the eye color was still mysterious because there is no clear ORGANIZATION of married vs. unmarried couples based on eye color.

As a general style of approaching a problem, my bias would be to assert that there was some underlying phenomenon of interest that has been quantified in different ways depending on the interests of those doing the quantifying.

So I guess my question is: on what basis does one declare one inquiry a problem of determining level of organization and another inquiry a problem of determining predictability?

Eric



On Sat, Aug 7, 2010 04:25 PM, Grant Holland <[hidden email]> wrote:
Nick,

Lemme try to present three examples of these two orthogonal dimensions (Organization/Disorganization dimension vs Predictability/Unpredictability dimension).
It all boils down to what phenomena one chooses to be interested in.
(Even if both dimensions are arguably present in a particular phenomenon, one can choose to observationally ignore one of them in the analysis of that phenomenon.)

The first example will be exclusively interested in the Organization/Disorganization dimension.
The second example will be exclusively interested in the Predictability/Unpredictability dimension.
The third example will be jointly interested in both.

Example I: An interest in Organization/Disorganization (structure or lack of), with no interest in Predictability/Unpredictability.
Lets say we are interested in observing Hydrogen and Oxygen atoms within a small region of space.
These atoms are capable of combining into several possible bonding configurations: O2, HO, H2O, etc.
Suppose we observe this region for some finite time and make a list of any of these configurations that are observed.
This list captures an interest in Organization/Disorganization, but not in Predictability/Unpredictability.
We could go further and even develop a metric for the "degree of organization" observed. That would be a continuation of the same interest.

Example II: An interest in Unpredictability/Predictability, with no interest in Organization/Disorganization.
Lets say we are interested, as in Example I above, in observing Hydrogen and Oxygen atoms within a small region of space.
However, this time, we have no interest in whether these atoms occur in a bonded or unbonded form.
What we are emphasizing instead this time is the probability of selecting one of these two atoms at random from the region -
AND in whether or not the resulting probability distribution describes an Unpredictable situation, a Predictable situation, or somewhere in between.
(Assume that we will use the best experimental practices to arrive at an estimate of the population parameters from sample statistics.)
Let's say that we conclude that the distribution results in a .75 prob for H and .25 prob for O. (We "throw back" other atoms.)
Then, we can conclude that the Shannon entropy for this distribution is -[(.75)*(log2(.75)) + (.25)*(log2(.25))] = .675
So, an interest in Unpredictability/Predictability can be measured by Shannon entropy.
(The subject of "degree of dissipation" or of disorganization never arises here.)

Example III: A compound interest in both dimensions (Organization X Predictability) jointly.
Let's go back to Example I above, where we are interested in the various ways that H and O can bond.
Suppose that we take that interest a little further an ask...
"What is the probability distribution of the observed molecules and ions involving H and O?"
Now, we have combined our interest in both "organizations" of H and O, as well as
the relative probabilities of their occurrences.
Thus, our probability (sample) space now, by definition, has the following possible outcomes: O2, HO, H2O, etc.
And, each has its observed probability, and thus we have a joint probability distribution that we can apply Shannon's entropy against.
Depending on the probabilities of each of these "H-O compounds", the Shannon entropy may be high or it may be low.

HTH,
Grant





Nicholas Thompson wrote:

Grant –

 

Glad you are on board, here.  I will read this carefully. 

 

Does this have anything to do with the Realism Idealism thing.  Predictibility requires a person to be predicting; organization is there even if there is no one there to predict one part from another.

 

N

 

From: friam-bounces@... [mailto:friam-bounces@...] On Behalf Of Grant Holland
Sent: Saturday, August 07, 2010 2:06 PM
To: Russ.Abbott@...; The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

Russ - Yes.

I use the terms "organizational" and "predictable", rather than "structural" and "behavioral", because of my particular interests. They amount to the same ideas. Basically they are two orthogonal dimensions of certain state spaces as they change.

I lament the fact that the same term "entropy" is used to apply to both meanings, however. Especially since few realize that these two meanings are being conflated with the same word. Von Foerster actually defined the word "entropy" in two different places within the same book of essays to mean each of these two meanings! Often the word "disorder" is used. And people don't know whether "disorder" refers to "disorganization" or whether it refers to "unpredictability". This word has fostered the further unfortunate confusion.

It seems few people make the distinction that you have. This conflation causes no end of confusion. I really wish there were 2 distinct terms. In my work, I have come up with the acronym "DOUPBT" for the "unpredictable" meaning of entropy. (Or, "behavioral", as you call it.) This stands for Degree Of UnPredictaBiliTy.) I actually use Shannon's formula for this meaning.

This all came about because 1) Clausius invented the term entropy to mean "dissipation" (a kind of dis-organization, in my terms). 2) But then Gibbs came along and started measuring the degree of unpredictability involved in knowing the "arrangements" (positions and momenta) of molecules in an ideal gas. The linguistic problem was that Gibbs (and Boltzmann) use the same term - entropy - as had Clausius, even though Clausius emphasized a structural (dissipation) idea, whereas Gibbs emphasized an unpredictability idea (admittedly, unpredictability of "structural" change).

To confuse things even more, Shannon came along and defined entropy in purely probabilistic terms - as a direct measure of unpredictability. So, historically, the term went from a purely structural meaning, to a mixture of structure and unpredictability to a pure unpredictability meaning. No wonder everyone is confused.

Another matter is that Clausius, Boltzmann and Gibbs were all doing Physics. But Shannon was doing Mathematics.

My theory is Mathematics. I'm not doing Physics. So I strictly need Shannon's meaning. My "social problem" is that every time I say "entropy", too many people assume I'm talking about "dissipation" when I am not. I'm always talking about "disorganization" when I use the term in my work. So, I have gone to using the phrase "Shannon's entropy", and never the word in its naked form. (Admittedly, I eventually also combine in a way similar to Gibbs :-[ . But I do not refer to the combined result as "entropy".)

:-P
Grant


Russ Abbott wrote:

Is it fair to say that Grant is talking about what one might call structural vs. behavioral entropy?

Let's say I have a number of bits in a row. That has very low structural entropy. It takes very few bits to describe that row of bits. But let's say each is hooked up to a random signal. So behaviorally the whole thing has high entropy. But the behavioral uncertainty of the bits is based on the assumed randomness of the signal generator. So it isn't really the bits themselves that have high behavioral entropy. They are just a "window" through which we are observing the high entropy randomness behind them. 

This is a very contrived example. Is it at all useful for a discussion of structural entropy vs. behavioral entropy? I'm asking that in all seriousness; I don't have a good sense of how to think about this.

This suggests another thought. A system may have high entropy in one dimension and low entropy in another. Then what? Most of us are very close to the ground most of the time. But we don't stay in one place in that relatively 2-dimensional world. This sounds a bit like Nick's example. If you know that an animal is female, you can predict more about how she will act than if you don't know that.

One other thought Nick talked about gradients and the tendency for them to dissipate.  Is that really so? If you put two mutually insoluble liquids in a bottle , one heavier than another, the result will be a layer cake of liquids with a very sharp gradient between them. Will that ever dissipate?

What I think is more to the point is that potential energy gradients will dissipate. Nature abhors a potential energy gradient -- but not all gradients.


-- Russ

 

On Thu, Aug 5, 2010 at 11:09 AM, Grant Holland <grant.holland.sf@...> wrote:

Glen is very close to interpreting what I mean to say. Thanks, Glen!

(But of course, I have to try one more time, since I've  thought of another - hopefully more compact - way to approach it...)

Logically speaking, "degree of unpredictability" and "degree of disorganization" are orthogonal concepts and ought to be able to vary independently - at least in certain domains. If one were to develop a theory about them (and I am), then that theory should provide for them to be able to vary independently.

Of course, for some "applications" of that theory, these "predictability/unpredictability" and "organization/disorganization" variables may be dependent on each other. For example, in Thermodynamics, it may be that the degree unpredictability and the degree of disorganization are correlated. (This is how many people seem to interpret the second law.) But this is specific to a Physics application.

However, in other applications, it could be that the degree uncertainty and the degree of disorganization vary independently. For example, I'm developing a mathematic theory of living and lifelike systems. Sometimes in that domain there is a high degree of predictability that an organo-chemical entity is organized, and sometimes there is unpredictability around that. The same statement goes for predictability or unpredictability around disorganization.  Thus, in the world of  living systems,  unpredictability and  disorganization can vary independently.

To make matters more interesting, these two variables can be joined in a joint space. For example, in the "living systems example" we could ask about the probability of advancing from a certain disorganized state in one moment to a certain organized state in the next moment. In fact, we could look at the entire probability distribution of advancing from this certain disorganized state at this moment to all possible states at the next moment - some of which are more disorganized than others. But if we ask this question, then we are asking about a probability distribution of states that have varying degrees of organization associated with them. But, we also have a probability distribution involved now, so we can ask "what is it's Shannon entropy?" That is, what is its degree of unpredictability? So we have created a joint space that asks about both disorganization and unpredictability at the same time. This is what I do in my theory ("Organic Complex Systems").

Statistical Thermodynamics (statistical mechanics) also mixes these two orthogonal variables in a similar way. This is another way of looking at what Gibbs (and Boltzmann) contributed. Especially Gibbs talks about the probability distributions of various "arrangements" (organizations) of molecules in an ideal gas (these arrangements, states, are defined by position and momentum). So he is interested in probabilities of various "organizations" of molecules. And, the Gibbs formula for entropy is a measurement of this combination of interests. I suspect that it is this combination that is confusing to so many. (Does "disorder" mean "disorganization", or does it mean "unpredictability". In fact, I believe reasonable to say that Gibbs formula measures "the unpredictability of being able to talk about which "arrangements" will obtain."

In fact, Gibbs formula for thermodynamic entropy looks exactly like Shannon's - except for the presence of a constant in Gibbs formula. They are isomorphic! However, they are speaking to different domains. Gibbs is modeling a physics phenomena, and Shannon is modeling a mathematical statistics phenomena. The second law applies to Gibbs conversation - but not to Shannon's.

In my theory, I use Shannon's - but not Gibbs'.

(Oops, I guess that wasn't any shorter than Glen's explanation. :-[ )

Grant



glen e. p. ropella wrote:

Nicholas Thompson wrote circa 08/05/2010 08:30 AM:

All of this, it seems to me, can be accommodated by – indeed requires – a common language between information entropy and physics entropy, the very language which GRANT seems to argue is impossible.

OK.  But that doesn't change the sense much.  Grant seemed to be arguing that it's because we use a common language to talk about the two concepts that the concepts are erroneously conflated.  I.e., Grant not only admits the possibility of a common language, he _laments_ the common language because it facilitates the conflation of the two different concepts ... unless I've misinterpreted what he's said, of course.

I would like to apologize to everybody for these errors.  I am beginning to think I am too old to be trusted with a distribution list.  It's not that I don't go over the posts before I send them … and in fact, what I sent represented weeks of thinking and a couple of evenings of drafting … believe it or not!  It seems that there are SOME sorts of errors I cannot see until they are pointed out to me, and these seem to be, of late, the fatal ones.

We're all guilty of this.  It's why things like peer review and criticism are benevolent gifts from those who donate their time and effort to criticize others.  It's also why e-mail and forums are more powerful and useful than the discredit they usually receive.  While it's true that face-to-face conversation has higher bandwidth, e-mail, forums, and papers force us to think deeply and seriously about what we say ... and, therefore, think.  So, as embarrassing as "errors" like this feel, they provide the fulcrum for clear and critical thinking.  I say let's keep making them!

Err with Gusto! ;-)

 

--   
Grant Holland  
VP, Product Development and Software Engineering  
NuTech Solutions  
404.427.4759  


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
Eric Charles

Professional Student and
Assistant Professor of Psychology
Penn State University
Altoona, PA 16601



============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
Reply | Threaded
Open this post in threaded view
|

Re: entropy and uncertainty, REDUX

Nick Thompson
In reply to this post by Russ Abbott

Russ,

 

Just to be irksome:  What was it that Newton said about hypotheses?  I don’t make any? 

 

On this account, what more could the fact that there are power-law relationships all over the place mean?

 

I should admit that I agree with you, and that my … um … intuition is that Newton was wrong … about himself.

 

Roger

 

Is that book well-written enough, broad enough, important enough, that a group of us might take it on in the spring?

 

Nick  

 

 

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of Russ Abbott
Sent: Saturday, August 07, 2010 4:05 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] entropy and uncertainty, REDUX

 

I'm not convinced. Much of his complexity has to do with things breaking down, which is more like an increase in entropy than an increase in complexity.

Besides that, it seems less like a "law" than like an observation--similar to the fact that there are power law relationships all over the place. That doesn't mean there is a "power law force" in nature.


-- Russ

 

On Sat, Aug 7, 2010 at 12:53 PM, Roger Critchlow <[hidden email]> wrote:

Introducing another thread, the measure of diversity used in ecology
is Shannon's entropy.
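
A toy illustration in Python - the species and counts are invented for the example:

import math

# Invented species counts for one sample plot.
counts = {"oak": 40, "pine": 25, "maple": 20, "birch": 15}

total = sum(counts.values())
proportions = [c / total for c in counts.values()]

# Shannon diversity index H' = -sum(p_i * ln(p_i)); ecologists usually
# use the natural log, but a different base only rescales the result.
H = -sum(p * math.log(p) for p in proportions if p > 0)

print(f"Shannon diversity H' = {H:.3f}")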

-- rec --

---------------------
http://www.semcoop.com/book/9780226562261
---------------------
Biology's First Law: The Tendency for Diversity and Complexity to
Increase in Evolutionary Systems (Paperback)

Description

Life on earth is characterized by three striking phenomena that demand
explanation: adaptation—the marvelous fit between organism and
environment; diversity—the great variety of organisms; and
complexity—the enormous intricacy of their internal structure. Natural
selection explains adaptation. But what explains diversity and
complexity? Daniel W. McShea and Robert N. Brandon argue that there
exists in evolution a spontaneous tendency toward increased diversity
and complexity, one that acts whether natural selection is present or
not. They call this tendency a biological law—the Zero-Force
Evolutionary Law, or ZFEL. This law unifies the principles and data of
biology under a single framework and invites a reconceptualization of
the field of the same sort that Newton’s First Law brought to physics.



Biology’s First Law shows how the ZFEL can be applied to the study of
diversity and complexity and examines its wider implications for
biology. Intended for evolutionary biologists, paleontologists, and
other scientists studying complex systems, and written in a concise
and engaging format that speaks to students and interdisciplinary
practitioners alike, this book will also find an appreciative audience
in the philosophy of science.

About the Author

Daniel W. McShea is Associate Professor of Biology, with a secondary
appointment in Philosophy, and Robert N. Brandon is Professor of
Philosophy, with a secondary appointment in Biology, both at Duke
University.


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

 


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org