Notions of entropy


Notions of entropy

Nick Thompson

At Friam today, we had our first discussion of entropy in a while.   It was like old times.  I really enjoyed it. 

 

But the following disagreement came up.  I am, I think, a bit of what philosophers call an essentialist.  In other words, I assume that when people use the same words for two things, it ain't for nothing; that there is something underlying the surface that makes those two things the same.  So, underlying all the uses of the word “entropy” is a common core, and … here's the tricky bit … that common core could be expressed mathematically.  However, I thought my fellow discussants disagreed with this naïve intuition and agreed that the physical and the information-theoretical uses of the word “entropy” were “not mathematically equivalent”, which I take to mean that no mathematical operation could be devised that would turn one into the other; that the uses of the word entropy were more like members of a family than they were like expressions of some essence.

 

I wonder what you-all think about that.

 

Nick

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com

Re: Notions of entropy

Russell Standish-2
I'd say bollocks to that. Entropy is clearly information in
disguise. That changes in it can also be related to the amount of heat
processed at a given temperature is something that comes out of
considering rearrangements of kinetic molecular motion, and it
ultimately gives rise to the well-known Landauer limit on computing.
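For concreteness, a minimal back-of-envelope sketch of the number the Landauer limit puts on bit erasure, assuming room temperature and the textbook bound E >= k_B T ln 2 (Python):

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Landauer bound: erasing one bit dissipates at least k_B * T * ln(2)
E_min = k_B * T * math.log(2)
print(f"Landauer limit at {T} K: {E_min:.3e} J per bit")
# prints roughly 2.871e-21 J per bit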


--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: Notions of entropy

Jochen Fromm-5
Nice to see the list is still alive :-) Entropy as
information in disguise. Interesting. Isn't entropy
related to disorder, that is to say a lack of information?

-J.


Re: Notions of entropy

Tom Carter
In reply to this post by Nick Thompson
All --

  Ah, "entropy" . . .

  A couple of references:

     http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf

     http://charlottewerndl.net/Entropy_Guide.pdf

  A couple of quotes from the E. T. Jaynes article (gibbs.vs.boltzmann reference above):

     "It is interesting that although this field has long been regarded as one of the most puzzling and controversial parts of physics, the difficulties have not been mathematical."

and

     "From this we see that entropy is an anthropomorphic concept, not only in the well-known sense that it measures the extent of human ignorance as to the microstate.  (\em Even at the purely phenomenological level, entropy is an anthropomorphic concept.}  For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it."


  With respect to Gibbs ("physics") entropy and Shannon ("information theory", in the discrete case) entropy: up to a constant, they have the same expression, ∑_i p_i log(1/p_i).  I'll observe that Boltzmann's constant (k), which is the constant in the Gibbs formulation, is a "bridge" between micro-states and macro-states -- which, if you think about it, reinforces the notion that "entropy" is an "anthropomorphic concept", because, for us, a "macro-state" is a "human-sized state" (see pedagogical question below . . . :-)  Note, though, that changing the base of the logarithm also introduces a constant in the "dimensionless" Shannon formulation . . .
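A small numeric sketch of that "same expression up to a constant" point (my example distribution, in Python):

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_bits(p):
    # Shannon entropy H = sum p_i log2(1/p_i), in bits
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

def gibbs(p):
    # Gibbs entropy S = k_B * sum p_i ln(1/p_i), in J/K
    return k_B * sum(pi * math.log(1.0 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
H = shannon_bits(p)                   # 1.75 bits
S = gibbs(p)                          # ~1.675e-23 J/K
print(H, S, S / (k_B * math.log(2)))  # last value reproduces H

The two sums differ only by the constant factor k_B · ln 2 (Boltzmann's constant times the base-change constant).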

  Thanks . . .

Tom Carter

p.s.  Pedagogical question:  An exercise I do in class from time to time is to ask this question:  "What would Avogadro's Number have been if the French Revolution had failed? (Justify your answer . . .)"  (Hint:  step 1:  what possible relation might those have to each other? :-)


Re: Notions of entropy

Russell Standish-2
In reply to this post by Jochen Fromm-5
On Fri, Oct 11, 2013 at 11:08:18PM +0200, Jochen Fromm wrote:
> Nice to see the list is still alive :-) Entropy as
> information in disguise. Interesting. Isn't Entropy
> related to disorder, that is to say lack of information?
>
> -J.

Something like that. The exact relationship is

S + I = S_max

where S is entropy, I is information, and S_max is the log of the total
number of states the system could be in.

Information is sometimes said to be "negentropy", because

ΔI = −ΔS

when your system size remains constant over time.
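A toy illustration of that bookkeeping (my sketch: an n-bit register where "information" counts the bits we have pinned down, entropies in bits):

import math

n = 16                       # register of n bits -> 2**n possible states
S_max = math.log2(2 ** n)    # log of the total number of states (= n)

for k in range(0, n + 1, 4):
    I = k                          # information: bits known
    S = math.log2(2 ** (n - k))    # entropy: log of states still possible
    assert S + I == S_max
    print(f"I = {I:2d} bits, S = {S:4.1f} bits, S + I = {S + I:4.1f}")

# Learning one more bit (Delta I = +1) removes exactly one bit of
# entropy (Delta S = -1) while S_max stays fixed: the "negentropy"
# relation Delta I = -Delta S.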


--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: Notions of entropy

Jochen Fromm-5
From what I remember, the entropy S should be
equal to S = k ln(W), where W is the number of
microstates. Ordered states have a lower number
of microstates, or something like that.
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

-J.
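A quick way to see "ordered states have fewer microstates" in numbers, using a toy system of 100 coins where the macrostate is the head count and W = C(100, heads) (my sketch; needs Python 3.8+ for math.comb):

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

for heads in (0, 10, 50):
    W = math.comb(100, heads)   # microstates for this macrostate
    S = k_B * math.log(W)       # Boltzmann: S = k ln(W)
    print(f"heads = {heads:3d}, W = {W:.3e}, S = {S:.3e} J/K")

# all heads (perfectly ordered): W = 1, so S = 0
# 50/50 (maximally disordered):  W ~ 1.009e29, the entropy maximum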


Re: Notions of entropy

Steve Smith
In reply to this post by Nick Thompson

He's Baaack! 

I think your question, whether naive (as you seem to suggest) or deeply astute, is good for us to consider.  To the extent that we share a common interest in "complex systems", I believe that the regimes of Shannon/information entropy, Gibbs/thermodynamic entropy, and von Neumann/QM entropy overlap in non-ergodic systems...  I think that the biggest distinction between these measures of entropy occurs in near-equilibrium systems, where the ergodic hypothesis is (most) relevant.

Our patron Saint Guerin likes to invoke spontaneous symmetry breaking, which I believe is a consequence or indication of the ergodic hypothesis being broken in macroscopic systems.

As our technologists move further and further into the nanoscale realm, designing and building systems at the atomic or molecular level, we will see a stronger practical overlap of Shannon and Gibbs (and von Neumann?) entropy.

I *do* look forward to a rousing round of discussion on the topic here.

Welcome back,
 - Steve





Re: Notions of entropy

Merle Lefkoff-2
In reply to this post by Russell Standish-2




Hi Nick,

You may be the only one interested in my TEDx talk.  Here's the link:  www.youtube.com/watch?v=A_BfrWTxA_U

Sorry to miss the entropy discussion this morning, since I'm trying to rearrange the global economic system.

Merle

Merle Lefkoff, Ph.D.
President, Center for Emergent Diplomacy
Santa Fe, New Mexico, USA
[hidden email]
mobile:  (303) 859-5609
skype:  merlelefkoff


Avogadro vs Loschmidt vs Jean Baptiste Perrin

Steve Smith
In reply to this post by Tom Carter
Tom Carter sed:
p.s. Pedagogical question: An exercise I do in class from time to time is to ask this question: "What would Avogadro's Number have been if the French Revolution had failed? (Justify your answer . . .)" (Hint: step 1: what possible relation might those have to each other? :-)

Would it be Loschmidt's number? 2.686 7774(47)×10^25 molecules per cubic meter

or would it be simply expressed in other units?
6.02214129(27)×10^23 mol^−1
2.73159734(12)×10^26 (lb-mol)^−1
1.707248434(77)×10^25 (oz-mol)^−1

I'm hazy on whether to attribute any such change to France's relinquishment of the Piedmont region (whence Avogadro hailed), Avogadro's introduction of the SI system into Italian science (influenced by the French?), or Jean Baptiste Perrin's (French Nobel prizewinner) eminence that allowed him to actually name this magic number in honor of Amedeo Avogadro?
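Those unit versions are just Avogadro's number rescaled by grams per (avoirdupois) pound or ounce; a quick sanity check of the arithmetic (my sketch, using the exact 453.59237 g/lb conversion):

N_A = 6.02214129e23       # mol^-1, the CODATA value quoted above
G_PER_LB = 453.59237      # grams per avoirdupois pound (exact)
G_PER_OZ = G_PER_LB / 16  # grams per avoirdupois ounce

print(f"per lb-mol: {N_A * G_PER_LB:.6e}")  # ~2.731597e26, matches
print(f"per oz-mol: {N_A * G_PER_OZ:.6e}")  # ~1.707248e25, matches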

I have a question for you...

   If Avogadro's number is a mole, what is an Avocado's number?

Inquiring minds want to know!
hint: it is a bad pun
- Steve






Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Russell Standish-2
Speaking of which, I have a very fond memory of "avocado's number" being
prepared at the table at a restaurant just outside Santa Fe that Andy
Wuensche took me to.

Cheers




--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Steve Smith
Russell -



> Speaking of which, I have a very fond memory of "avocado's number" being
> prepared at the table at a restaurant just outside Santa Fe that Andy
> Wuensche took me to.
That's gotta be Gabriel's (unless it was more than 20 years ago, in which
case it was likely the same location, then called Los Brazos?)...

Does anyone else remember when this dish was presented in a traditional
molcajete (a bowl carved from basalt) rather than the ones of the same
style now made of plastic?  The trick, of course, was to line the bowl
with lettuce to limit the otherwise large number of molecules of avocado
wasted.

I believe molcajetes were originally mortars (to be paired with a
pestle, the tejolote).

I also just discovered (totally unrelated) that the seed or pit of the
avocado is edible despite the high concentration of tannins... very much
like acorns.

- Steve




Re: Notions of entropy

Steve Smith
In reply to this post by Merle Lefkoff-2
Merle -

Thanks for the link to your TEDx talk...

I certainly was inspired by the sentiments expressed in general, but was hoping for a more direct connection with complexity science...  Is there anything written up making a deeper connection, perhaps?

- Steve




Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Russell Standish-2
In reply to this post by Steve Smith

Probably Gabriel's. It was 15 years ago. And still the best I've ever
tasted.

Cheers

--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Steve Smith
Russell -

> Probably Gabriel's. It was 15 years ago. And still the best I've ever
> tasted. Cheers
And it is the ultimate in "open source"... all the ingredients and the
method of preparation are presented to you completely transparently...
and somehow, nobody else seems to be able to beat it!

I grew up partially on the AZ/MX border, where the restaurants (on the MX
side) had avocados year-round (on the US side, they were very seasonal)
and the standard was to prepare them right at the table, as done at
Gabriel's today.  Of course, the economics were completely different; I
vaguely remember the going price (mid-seventies) being 10 pesos ($0.80
US) for a molcajete with 2 avocados.

I think the results were every bit as good as Gabriel's, but somehow
spending $12 US for 1.5 avocados makes it ever so much more desirable?
I suspect margaritas were roughly $1 US as well...

- Steve


Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Tom Carter
In reply to this post by Steve Smith
Steve -

  Mostly going for "other units" (I like "oz-mol"s . . . but I also like the "teaspoon-mol" -- what it might have been if the British had taken charge . . . :-) -- but going in the direction of something like "science as an embodied, social/political activity", etc., etc. . . .

  Also, distinguishing (somehow or other) between "natural constants" (e.g., Avogadro's Number is roughly the number of atoms in your pinky, which depends strongly on you being "human sized" rather than, say, "planet sized" -- viz. "Solaris" by Stanislaw Lem -- or "bacterium sized") and "constants of nature" (e.g., the mass of an electron, or the speed of light . . .).  Meters, grams, seconds, for example, are "human sized" units.  They're only "natural" in the sense that humans are part of nature.  Wikipedia on the Metric System has a paragraph starting "At the outbreak of the French Revolution in 1789, most countries and even some cities had their own system of measurement." (and thus, sort of, my question . . . :-)

  Anyway, the discussion goes on . . .  in a class I'm teaching now, we're reading Bruno Latour's "Pandora's Hope" . . .

  Also, forgot this reference in my previous post:  I've recently been carrying around "Time's Arrow: The Origins of Thermodynamic Behavior" by Michael C. Mackey . . . nice little book . . .

  Thanks . . .

tom

p.s.  More pedagogy:  Which weighs more, a pound of gold or a pound of wheat?  (And, for extra credit, which weighs more, an ounce of gold or an ounce of wheat?)  (explain your answers . . .  :-)  (I'm teaching an undergraduate "General Education" course for Juniors in our Honors program, called "Methods of Discovery" . . .)
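For anyone checking the arithmetic behind that one, a sketch assuming the conventional reading of the puzzle (precious metals weighed in troy units, wheat in avoirdupois):

TROY_LB_G = 373.2417216      # grams in a troy pound (12 troy ounces)
AVDP_LB_G = 453.59237        # grams in an avoirdupois pound (16 ounces)
TROY_OZ_G = TROY_LB_G / 12   # ~31.103 g
AVDP_OZ_G = AVDP_LB_G / 16   # ~28.350 g

print("pound of wheat outweighs pound of gold:", AVDP_LB_G > TROY_LB_G)  # True
print("ounce of gold outweighs ounce of wheat:", TROY_OZ_G > AVDP_OZ_G)  # True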


Re: Notions of entropy

lrudolph
In reply to this post by Nick Thompson
Nick writes, in relevant part:

> I am, I think, a bit of what
> philosophers call an essentialist.  In other words, I assume that when
> people use the same words for two things, it aint for nothing, that there is
> something underlying the surface that makes those two things the same.  So,
> underlying all the uses of the word "entropy" is a common core, and ....

I'm not going to go anywhere near the mathematical question here.  What
I want to do is challenge your "In other words" sentence (which, by the
way, I *hope* is not what philosophers would mean by calling you "an
essentialist").

One thing I have learned in the last three or four years, much of
which I have spent trawling through huge corpora of scholarly (and
less scholarly) writing, including Google Scholar (and just plain
Google Books), JStor, MUSE, PsycInfo, Mathematical Reviews, etc.,
is that "when people use the same words for two things",
it's distressingly common that it IS "for nothing", or nearly
nothing--either two or more different groups of scholars have
adopted a word from Common English into their own jargons, with
no (ac)knowledge(ment) of the other groups' jargon, or two or
more different groups of scholars have independently *coined*
a word (most usually from New Latin or New Greek roots that are
part of scholars' common store).

Actually, the first case of this that I really noticed was several
years before I got involved professionally.  In a social newsgroup,
a linguist of my acquaintance happened to use the word "assonance".  
And he used it WRONG.  That is, he used it entirely inconsistently
with the meaning that it has had for eons in the theory of prosody,
and that every poet learns (essentially, assonance in prosody is
vowel harmony).  When I challenged him on this, my friend said that
the word had been introduced to linguistics by the (very eminent,
now very dead) Yale linguist Dwight Bolinger.  And he implied
that the linguists weren't about to change.  Tant pis, said I.

Then I got involved in the Kitchen Seminar (FRIAMers, you can
ignore that; it's a note to Nick), and began to hear psychologists
(but not Nick!) use the phrase "dynamic system" (or occasionally
"dynamical system"). As a mathematician I knew what that phrase
meant, and they were WRONG.  

After some years in the Kitchen, I began work on my book on
mathematical modeling for psychology; eventually I saw I
needed to write a chapter clarifying the uses of those phrases.  
Three or four years of work on _The Varieties of Dynamic(al)
Experience_ later, I had accumulated *enormous* amounts of
textual evidence that there had been NO cross-pollination:
the two phrases arose entirely independently.  (Then,
unfortunately, hapless psychologists and other "human
scientists" started appropriating [what they badly understood
of] the mathematical results that can be proved about
mathematicians' "dynamical systems" to draw ENTIRELY
UNSUBSTANTIATED conclusions about psychologists' "dynamic
systems".)

Most recently, I've been going through the same exercise
(again for a chapter, now not in a book of my own) for
"recursion" and "recursive".  Again, I have accumulated
(and documented) enormous amounts of textual evidence
(from all those corpora); here is a brief outline of
the situation (with examples and all, the whole thing
is about 25 pages at the moment, interlarded with another
25 pages on "infinity" and topped off--I mean, bottomed
off--with 15 pages of references).  Before the outline,
however, I will quote four practitioners of various
human sciences who have had cause to complain of the
present mess.

==a sociologist of law:==
In the context of causal analysis, as carried on in empirical research
(e.g. path analysis) nonrecursive models are employed, to denote the
case of mutual influencing of variables. When the autopoesis literature
speaks of recursive processes, it is presumably those nonrecursive
models of causal analysis that are meant. What a tower of Babel!
(Rottleuthner, 1988, p. 119)

==a physicist turned cognitive scientist (via LOGO):==
One is led to wonder if all authors are talking about and experimenting
with the same notion and, if not, what this notion could be. As it
happens, a careful reading shows that it is not so and that, unless a
very loose and rather useless definition of the term ["recursive"] is
assumed, it could be worthwhile to separate this confusing braid into
its constituent strands [...]. (Vitale, 1989, p. 253)

==an evolutionary linguist:==
Definitions of recursion found in the linguistics and computer science
literatures suffer from inconsistency and opacity (Kinsella, 2010, p.
179)

==a political scientist:==
The term 'recursive' [...] has multiple uses in the political science
literature. [... Political scientists should address] [t]he problem of
divergent meaning [...] through a survey of potential for reconciliation
or possible substitute terminology (Towne, 2010, p. 259)
===

Now, the outline.

o  There are three distinct meanings of "recursive"/"recursion"--let me
abbreviate that to R/R from now on--in mathematics.  The oldest one
describes so-called "recurrence relations" (like the one that defines
the Fibonacci sequence: F_1 = 1, F_2 = 1, F_n = F_(n-1) + F_(n-2)).  The
next oldest, dating only from the last century, is the one used in
mathematical logic; it's derived from the oldest but it's much more
general ("recursive functions").  There's an entirely UNrelated one used
in a minor branch of dynamical systems theory (which has had no influence
outside of a very small circle), apparently named because of a connection
to "recurrence" in colloquial English (think "Poincaré section" if
that helps).

o  The oldest mathematical sense has spawned a meaning that
started in economics and then spread (it's the one that
Rottleuthner was talking about); mathematically, it corresponds
to upper-triangular matrices (coding causalities).

o  The next-oldest has spawned the present, barely coherent
(cf. Kinsella), use of R/R in linguistics and linguistics-inspired
social science. *Some* of Seymour Papert's--and, thence, the LOGO
community's--uses of R/R come from this tradition (one of his two
Ph.D.s is, after all, in mathematics).  

o  Another sense of R/R comes from Piaget (with a nod towards
Poincaré).  *The rest* of Seymour Papert's--and, thence, the LOGO
community's--uses of R/R come from this tradition (his second
Ph.D., in Psychology, was supervised by Piaget).  Piaget, I am
afraid, is responsible for a great deal of muddle on this subject.

o  Yet another sense of R/R, used in human ecology,
anthropology, political science, sociology, and educational
theory sprang--somehow--out of cybernetics and General
Systems Theory (even though none of the early cyberneticists
like von Neumann, Shannon, and Wiener, and none of the early
GS people like Bertalanffy and Rapoport, ever seem to have
used the word AT ALL, except for a couple of times in early
papers of von Neumann where he was using it in the oldest
mathematical meaning).  It really seems that Bateson pulled
the word out of the air (that is, out of his no doubt rigorous
classical education) at some point, and it spread from him,
in a (typically) incoherent fashion, and apparently mostly
by word of mouth--he didn't commit either word to print until
the year before his death, though his biographer Harries-Jones
has seen a notebook in which Bateson recorded using the word
in a lecture in 1975.  (Harries-Jones's title for the biography,
_A recursive vision: Ecological understanding and Gregory
Bateson_, is, in my opinion, irredeemably tendentious, and
a perfect example of muddle.)  Insofar as Bateson ever tries
to actually *define* R/R, it's here:

==
[T]here seem to be two species of recursiveness, of somewhat different nature, of which the
first goes back to Norbert Wiener and is well-known: the "feedback" that is perhaps the best
known feature of the whole cybernetic syndrome. The point is that self-corrective and quasi
purposive systems necessarily and always have the characteristic that causal trains within the
system are themselves circular. [...] The second type of recursiveness has been proposed by
Varela and Maturana. These theoreticians discuss the case in which some property of a whole is
fed back into the system, producing a somewhat different type of recursiveness[...]. We live in
a universe in which causal trains endure, survive through time, only if they are recursive.
(Bateson, 1977, p. 220)
===

Needless to say, Wiener never called feedback (or anything
else) "recursive", and it's a real stretch to connect the
mathematics of feedback to mathematical notions of R/R.
Nor did Varela and Maturana EVER use R/R (in print at least)
before 1977; they instead coined "autopoiesis", which again,
insofar as it can be mathematicized, is not mathematical
R/R.   (Later Maturana does use "recursive".)

o  An Australian economic geographer named Walmsley somehow came up
with a notion of R/R c. 1972; until and unless he answers my e-mail
(pending now for several months, so I'm not holding my breath), I can
only assume, from references he cites, that he somehow came up with
his idea by combining General Systems Theory (though the word doesn't
appear there) with Piaget.  Given that he states in one place that
"Shopping is a form of recursive behavior", you won't be surprised
that his idea--whatever it may be--appears entirely unrelated to
mathematical (or linguistic) R/R.  In any case, he doesn't seem to
have inspired any followers.

o  A sociologist named Scheff starts using the *words* "recursive"
and "recursion" c. 2005, for ideas (either his or others'; see
below) that were around starting in 1967.

==Scheff (2005):===
In one of my own earlier articles (Scheff 1967), I proposed a model of consensus that has a
recursive quality like the one that runs through Goffman's frame analysis. [...] As it happened,
Goffman (1969) pursued a similar idea in some parts of his book on strategic interaction. [...]
[A] similar treatment can be found in a book by the Russian mathematician Lefebvre (1977), The
Structure of Awareness. [...] I wonder whether Lefebvre came up with the idea of reflexive mutual
awareness independently of my model. He cites Laing, Phillipson, and Lee (1966), a brief work
devoted to a recursive model of mutual awareness that preceded Lefebvre's book (1977).
However, he also cites his own earliest work on recursive awareness, an article (1965) that
precedes the Laing, Phillipson, and Lee book.
        It is possible that Lefebvre's work was based on my (1967) model of recursive awareness, even
though the evidence is only circumstantial. As Laing, Phillipson, and Lee (1966) indicate,
their book developed from my presentation of the model in Laing's seminar in 1964. Since there
were some 20 persons there, Lefebvre could have heard about the seminar from one of those, or
indirectly by way of others in contact with a seminar member.
===

However, despite all the heavy lifting involved in Scheff's
name-dropping, the words "recursive" and "recursion" appear
nowhere in the cited works by Laing, Phillipson & Lee (1966),
Scheff (1967), or Goffman (1969). Lefebvre (1977, but not 1965)
does use "recursive" in the two major mathematical senses, and
even quotes Chomsky (although I think it likely--I haven't been
able to get the Russian originals of Lefebvre--that all that
was introduced by his translator, Rapoport of GS fame).
Rather, Laing, Phillipson & Lee, Scheff, and Goffman
consistently use the words "reflexive", "reflection", and
"reflexivity". These are glossed by Scheff in a variety of ways:
"recursive awareness", "mutual awareness" (harkening back to
Goffman's signature phrase, "mutual consideration"), "not
only understanding the other, but also understanding that
one is understood, and vice versa", "not only a first-level
agreement, but, when necessary, second and higher levels of
understanding that there is an agreement", etc.  Sheesh.

o  Finally (thank you for the reference, Nick), Peter Lipton
and Nick Thompson published an article in 1988 titled "Comparative
psychology and the recursive structure of filter explanations."  
It's a great article, but the sense in which it uses "recursive"
(Lipton's coinage) is unrelated to any of the other senses
(nor has it been taken up since, as far as I can tell).  

[Here endeth the outline.]

The "common core", if there is one, is nothing more than
the collocation of the morphemes "re-" and "-cur-", of which
the former is still very productive in English, while the
latter is (at most New) Latin and no longer productive at
all; semantically, this makes the meaning of that common
core approximately "RUN AGAIN", which I submit is AT BEST a
trivial commonality of the various different uses, and (as
far as I understand some of the woolier uses, which is not
that far) not a commonality AT ALL of the entire set.  
If that be essence, make the least of it!

Lee Rudolph
   



Re: Notions of entropy

Russell Standish-2
On Fri, Oct 11, 2013 at 08:49:44PM -0400, [hidden email] wrote:
>
> Most recently, I've been going through the same exercise
> (again for a chapter, now not in a book of my own) for
> "recursion" and "recursive".  Again, I have accumulated

... what a mess!

Back in the day when I was teaching computational science for a
living, I had to carefully explain the difference between two distinct
meanings of recursion.

1) A "recursive loop" is one whose iterations depend on values computed
in the previous loop. Related obviously to the "oldest" mathematical
definition you gave. It impedes vectorisation and parallelisation of
said loop.

2) A "recursive function" is one that calls itself, a term quite
familiar to people brought up in computer science.

In the good old days, when men programmed in Fortran, concept 1 was
always meant, as Fortran did not support recursion. That has all
changed now :).
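The two meanings side by side, in Python rather than Fortran (a sketch of the distinction, not of any particular course material):

# 1) A "recursive loop": each iteration reads a value computed in the
#    previous iteration (a loop-carried dependence), which is what
#    blocks naive vectorisation/parallelisation.
def running_sum(xs):
    acc = 0.0
    out = []
    for x in xs:
        acc = acc + x      # depends on the previous iteration's acc
        out.append(acc)
    return out

# 2) A "recursive function": one that calls itself.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

print(running_sum([1, 2, 3, 4]))  # [1.0, 3.0, 6.0, 10.0]
print(factorial(5))               # 120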

And there is a third meaning for recursion used by theoretical
computer scientists, where it basically means a computable
function. See page 29 of Li and Vitanyi's tome on Kolmogorov complexity.

Cheers

-

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: Notions of entropy

Owen Densmore
OK, I'll bite.  Why NOT let entropy simply be an equation that is useful in certain domains?

I rather like the lack of ambiguity.

   -- Owen



Re: Notions of entropy

Nick Thompson

OK, I’ll bite your bite.  For the same reason that the world was outraged when some experimental psychologists defined emotionality as the number of turds left in an open-field maze by a white rat.

 

N

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 


Re: Notions of entropy

Owen Densmore
Do you really feel that turds are equivalent to probability measures?

I can see that a mouse emits more turds when excited; that's fine.  It doesn't lead to measures of security on the internet.  It doesn't quantify information.

And thermodynamics and information theory have made good use of the probabilistic reduction of entropy.  And it is concrete and well defined.

So what's wrong with that?

   -- Owen

