Notions of entropy


Re: Notions of entropy

Nick Thompson

OMIGOSH, no.  You completely mistook me.  Sorry.  I am the last person to compare math to rat turds.  Turds of any kind, for that matter.  My {recently} late big brother was a mathematician.  Some of my best friends (and favorite collaborators) are mathematicians.  The point was only that the number of rat turds …”fecal boluses” was the term of art … was desirable just because it was clear and easily understood (like mathematical expressions for entropy).  Yet, just as you could never get the world to agree that emotionality was just the number of fecal boluses left by a rat in an open field maze, you will never get the world to agree that entropy is just the output of a mathematical formula.  They might say, “that is a useful measure of entropy, but that is not what it IS.”  To put the matter more technically, no matter how much reliability a definition buys you, it still does not necessarily buy you validity.  The same point might be made about f=ma.  (I fear being flamed by Bruce, at this point, but let it go.)  Hypotheses non fingo, and all that.  One could, like a good positivist, simply assert that a thing IS that which most reliably measures it, but few people outside your field will be comfortable with that, and everybody, even including your closest colleagues, will continue to use the word in some other sense at cocktail parties.  It was my position that the lab-bench meaning and the cocktail-party meaning have some common core that we have some responsibility to try to find.

 

Sorry, again.  I certainly didn’t mean to be insulting.

Nick

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 

From: Friam [mailto:[hidden email]] On Behalf Of Owen Densmore
Sent: Friday, October 11, 2013 8:46 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] Notions of entropy

 

Do you really feel that turds are equivalent to probability measures?

 

I can see that a mouse emits more turds when excited, that's fine.  It doesn't lead to measures of security on the internet.  It doesn't quantify information.

 

And thermodynamics and information theory have made good use of the probabilistic reduction of entropy.  And it is concrete and well defined.
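The "concrete and well defined" probabilistic reduction Owen refers to is Shannon's formula, H(p) = -Σ p_i log2 p_i; a minimal sketch (the coin examples are illustrative, not from the thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = shannon_entropy([0.5, 0.5])    # a fair coin: exactly 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # a loaded coin: ~0.47 bits
```

The same quantity (up to Boltzmann's constant and a change of log base) is what thermodynamics computes over microstates, which is why the two fields can share the word.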

 

So what's wrong with that?

 

   -- Owen

 

On Fri, Oct 11, 2013 at 8:38 PM, Nick Thompson <[hidden email]> wrote:

OK, I’ll bite your bite.  For the same reason that the world was outraged when some experimental psychologists defined emotionality as the number of turds left in an open field maze by a white rat. 

 

N

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 

From: Friam [mailto:[hidden email]] On Behalf Of Owen Densmore
Sent: Friday, October 11, 2013 8:28 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] Notions of entropy

 

OK, I'll bite.  Why NOT let entropy simply be an equation that is useful in certain domains?

 

I rather like the lack of ambiguity.

 

   -- Owen

 

On Fri, Oct 11, 2013 at 7:52 PM, Russell Standish <[hidden email]> wrote:

On Fri, Oct 11, 2013 at 08:49:44PM -0400, [hidden email] wrote:
>
> Most recently, I've been going through the same exercise
> (again for a chapter, now not in a book of my own) for
> "recursion" and "recursive".  Again, I have accumulated

... what a mess!

Back in the day when I was teaching computational science for a
living, I had to carefully explain the difference between two distinct
meanings of recursion.

1) A "recursive loop" is one whose iterations depend on values computed
in the previous iteration. Related, obviously, to the "oldest" mathematical
definition you gave. It impedes vectorisation and parallelisation of
said loop.

2) A "recursive function" is one that calls itself, a term quite
familiar to people brought up in computer science.
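The two senses Russell distinguishes can be put side by side; a minimal illustration (in Python rather than Fortran, and the particular functions are mine):

```python
# Sense 1: a "recursive loop" -- each iteration reads a value computed in
# the previous iteration (a loop-carried dependence), which is what blocks
# naive vectorisation/parallelisation of the loop.
def running_sum(xs):
    total = 0
    out = []
    for x in xs:
        total = total + x  # depends on the previous iteration's total
        out.append(total)
    return out

# Sense 2: a "recursive function" -- one that calls itself.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

running_sum([1, 2, 3, 4])  # [1, 3, 6, 10]
factorial(5)               # 120
```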

In the good old days, when men programmed in Fortran, concept 1 was
always meant, as Fortran did not support recursion. That has all
changed now :).

And there is a third meaning for recursion used by theoretical
computer scientists, where it basically means a computable
function. See page 29 of Li and Vitanyi's tome on Kolmogorov complexity.

Cheers


-

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      [hidden email]
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com

 



 



Re: Avogadro vs Loschmidt vs Jean Baptiste Perrin

Steve Smith
In reply to this post by Tom Carter
On 10/11/13 6:22 PM, Tom Carter wrote:
> Steve -
>
>    Mostly going for "other units" (I like "oz-mol"s . . . but I also like the "teaspoon-mol" -- what it might have been if the British had taken charge . . . :-) -- but going in the direction of something like "science as an embodied, social/political activity", etc., etc. . . .
>
>    Also, distinguishing (somehow or other) between "natural constants" (e.g., Avogadro's Number is roughly the number of atoms in your pinky, which depends strongly on you being "human sized" rather than, say, "planet sized" -- viz. "Solaris" by Stanislas Lem -- or "bacterium sized") and "constants of nature" (e.g., the mass of an electron, or the speed of light . . .).  Meters, grams, seconds, for example, are "human sized" units.  They're only "natural" in the sense that humans are part of nature.  Wikipedia on the Metric System has a paragraph starting "At the outbreak of the French Revolution in 1789, most countries and even some cities had their own system of measurement." (and thus, sort of, my question . . . :-)
So you are saying that a significant consequence (or correlated change?)
of the French Revolution was the standardization of measuring systems
across Europe?

It seems that before travel (and trade) was very extensive, local
measures were fine... A foot is a foot, even if the local ruler is a
midget or giant... all locals have the opportunity to calibrate to the
"ruler's foot" and all the traders are motivated to do the same when
they arrive at any given locale.  I assume there was even something like
arbitrage going on where the size of your ruler's foot may have given
you a minor (dis)advantage in trade?  But then there would be local
currency as well...  Guilders per cubic knuckle vs Florens per cubic
nose-length?   Tower of Babel indeed!  The thing about standards is that
we have so many to choose from!
>
>    Anyway, the discussion goes on . . .  in a class I'm teaching now, we're reading Bruno Latour's "Pandora's Hope" . . .
>
>    Also, forgot this reference in my previous post:  I've recently been carrying around "Time's Arrow: The Origins of Thermodynamic Behavior" by Michael C. Mackey . . . nice little book . . .
I remember when Time's Arrow came out... it was an excellent (highly
motivated) layman's guide to some of the more interesting unexpected
implications of thermodynamics.   I'd recommend it to Nick for his
contemplation of the deeper meaning of Entropy.
>
>    Thanks . . .
>
> tom
>
> p.s.  More pedagogy:  Which weighs more, a pound of gold or a pound of wheat?  (And, for extra credit, which weighs more, an ounce of gold or an ounce of wheat?)    (explain your answers . . .  :-)
I think we are returning to the domain of the Buttload (earlier
thread)...   and a distinction between troy and avoirdupois.   I
remember being excited at the (relative) absoluteness of the Mole as a
measure to bootstrap from.   Similarly Pi and "c".    The "intrinsic"
measure of a "grain" is of course, also regionally/circumstantially
adjusted... I'm reminded that precious metals/gems *and* gunpowder still
use the "grain" as a basic measure.
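The classic answer to Tom's puzzle turns on exactly the troy/avoirdupois distinction Steve mentions, with the grain as the common unit that lets you check it; a quick sketch (standard conversion factors, not stated in the thread):

```python
# Troy weight (gold, silver) vs avoirdupois weight (wheat, groceries),
# both expressed in grains -- the unit still used for precious metals
# and gunpowder, as noted above.
TROY_OZ_GRAINS = 480.0   # 1 troy ounce;  12 troy oz to the troy pound
AVDP_OZ_GRAINS = 437.5   # 1 avoirdupois ounce;  16 oz to the pound

gold_pound = 12 * TROY_OZ_GRAINS    # troy pound:        5760 grains
wheat_pound = 16 * AVDP_OZ_GRAINS   # avoirdupois pound: 7000 grains

# So a pound of wheat outweighs a pound of gold,
# but an ounce of gold outweighs an ounce of wheat.
print(wheat_pound > gold_pound, TROY_OZ_GRAINS > AVDP_OZ_GRAINS)  # True True
```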
>     (I'm teaching an undergraduate "General Education" course for Juniors in our Honor's program, called "Methods of Discovery" . . .)
This sounds like a very good course for many... where do you teach? Is
this part of a liberal arts or more science/engineering program?   In my
day, it was strictly the luck of the draw whether you happened upon a
teacher/professor who offered anything more than the dry, linear
interpretation of well... pretty much everything/anything.

- Steve



Re: Notions of entropy

Steve Smith
In reply to this post by lrudolph
Lee -

That was fascinating... thanks for sharing a peek into what was clearly
a lot of careful work.

- Steve

> Nick writes, in relevant part:
>
>> I am, I think, a bit of what
>> philosophers call an essentialist.  In other words, I assume that when
>> people use the same words for two things, it aint for nothing, that there is
>> something underlying the surface that makes those two things the same.  So,
>> underlying all the uses of the word "entropy" is a common core, and ....
> I'm not going to go anywhere near the mathematical question here.  What
> I want to do is challenge your "In other words" sentence (which, by the
> way, I *hope* is not what philosophers would mean by calling you "an
> essentialist").
>
> One thing I have learned in the last three or four years, much of
> which I have spent trawling through huge corpora of scholarly (and
> less scholarly) writing, including Google Scholar (and just plain
> Google Books), JStor, MUSE, PsycInfo, Mathematical Reviews, etc.,
> is that "when people use the same words for two things",
> it's distressingly common that it IS "for nothing", or nearly
> nothing--either two or more different groups of scholars have
> adopted a word from Common English into their own jargons, with
> no (ac)knowledge(ment) of the other groups' jargon, or two or
> more different groups of scholars have independently *coined*
> a word (most usually from New Latin or New Greek roots that are
> part of scholars' common store).
>
> Actually, the first case of this that I really noticed was several
> years before I got involved professionally.  In a social newsgroup,
> a linguist of my acquaintance happened to use the word "assonance".
> And he used it WRONG.  That is, he used it entirely inconsistently
> with the meaning that it has had for eons in the theory of prosody,
> and that every poet learns (essentially, assonance in prosody is
> vowel harmony).  When I challenged him on this, my friend said that
> the word had been introduced to linguistics by the (very eminent,
> now very dead) Yale linguist Dwight Bolinger.  And he implied
> that the linguists weren't about to change.  Tant pis, said I.
>
> Then I got involved in the Kitchen Seminar (FRIAMers, you can
> ignore that; it's a note to Nick), and began to hear psychologists
> (but not Nick!) use the phrase "dynamic system" (or occasionally
> "dynamical system"). As a mathematician I knew what that phrase
> meant, and they were WRONG.
>
> After some years in the Kitchen, I began work on my book on
> mathematical modeling for psychology; eventually I saw I
> needed to write a chapter clarifying the uses of those phrases.
> Three or four years of work on _The Varieties of Dynamic(al)
> Experience_ later, I had accumulated *enormous* amounts of
> textual evidence that there had been NO cross-pollination:
> the two phrases arose entirely independently.  (Then,
> unfortunately, hapless psychologists and other "human
> scientists" started appropriating [what they badly understood
> of] the mathematical results that can be proved about
> mathematicians' "dynamical systems" to draw ENTIRELY
> UNSUBSTANTIATED conclusions about psychologists' "dynamic
> systems".)
>
> Most recently, I've been going through the same exercise
> (again for a chapter, now not in a book of my own) for
> "recursion" and "recursive".  Again, I have accumulated
> (and documented) enormous amounts of textual evidence
> (from all those corpora); here is a brief outline of
> the situation (with examples and all, the whole thing
> is about 25 pages at the moment, interlarded with another
> 25 pages on "infinity" and topped off--I mean, bottomed
> off--with 15 pages of references).  Before the outline,
> however, I will quote four practitioners of various
> human sciences who have had cause to complain of the
> present mess.
>
> ==a sociologist of law:==
> In the context of causal analysis, as carried on in empirical research
> (e.g. path analysis) nonrecursive models are employed, to denote the
> case of mutual influencing of variables. When the autopoesis literature
> speaks of recursive processes, it is presumably those nonrecursive
> models of causal analysis that are meant. What a tower of Babel!
> (Rottleuthner, 1988, p. 119)
>
> ==a physicist turned cognitive scientist (via LOGO):==
> One is led to wonder if all authors are talking about and experimenting
> with the same notion and, if not, what this notion could be. As it
> happens, a careful reading shows that it is not so and that, unless a
> very loose and rather useless definition of the term ["recursive"] is
> assumed, it could be worthwhile to separate this confusing braid into
> its constituent strands [...]. (Vitale, 1989, p. 253)
>
> ==an evolutionary linguist:==
> Definitions of recursion found in the linguistics and computer science
> literatures suffer from inconsistency and opacity (Kinsella, 2010, p.
> 179)
>
> ==a political scientist:==
> The term `recursive´ [...] has multiple uses in the political science
> literature. [... Political scientists should address] [t]he problem of
> divergent meaning [...] through a survey of potential for reconciliation
> or possible substitute terminology (Towne, 2010, p. 259) ===
>
> Now, the outline.
>
> o  There are three distinct meanings of "recursive"/"recursion"--let me
> abbreviate that to R/R from now on--in mathematics.  The oldest one
> describes so-called "recurrence relations" (like the one that defines
> the Fibonnaci sequence: F1=1, F2=1, Fn = Fn-1 + Fn-2).  The next oldest,
> dating only from last century, is the one used in mathematical logic;
> it's derived from the oldest but it's much more general ("recursive
> functions").  There's an entirely UNrelated one used in a minor branch
> of dynamical systems theory (which has had no influence outside of a
> very small circle), apparently named because of a connection to
> "recurrence" in colloquial English (think "Poincare section" if
> that helps).
>
> o  The oldest mathematical sense has spawned a meaning that
> started in economics and then spread (it's the one that
> Rottleuthner was talking about); mathematically, it corresponds
> to upper-triangular matrices (coding causalities).
>
> o  The next-oldest has spawned the present, barely coherent
> (cf. Kinsella), use of R/R in linguistics and linguistics-inspired
> social science. *Some* of Seymour Papert's--and, thence, the LOGO
> community's--uses of R/R come from this tradition (one of his two
> Ph.D.s is, after all, in mathematics).
>
> o  Another sense of R/R comes from Piaget (with a nod towards
> Poincare).  *The rest* of Seymour Papert's--and, thence, the LOGO
> community's--uses of R/R come from this tradition (his second
> Ph.D., in Psychology, was supervised by Piaget).  Piaget, I am
> afraid, is responsible for a great deal of muddle on this subject.
>
> o  Yet another sense of R/R, used in human ecology,
> anthropology, political science, sociology, and educational
> theory sprang--somehow--out of cybernetics and General
> Systems Theory (even though none of the early cyberneticists
> like von Neumann, Shannon, and Wiener, and none of the early
> GS people like Bertalanffy and Rapoport, ever seem to have
> used the word AT ALL, except for a couple of times in early
> papers of von Neumann where he was using it in the oldest
> mathematical meaning).  It really seems that Bateson pulled
> the word out of the air (that is, out of his no doubt rigorous
> classical education) at some point, and it spread from him,
> in a (typically) incoherent fashion, and apparently mostly
> by word of mouth--he didn't commit either word to print until
> the year before his death, though his biographer Harries-Jones
> has seen a notebook in which Bateson recorded using the word
> in a lecture in 1975.  (Harries-Jones's title for the biography,
> _A recursive vision: Ecological understanding and Gregory
> Bateson_, is, in my opinion, irredeemably tendentious, and
> a perfect example of muddle.)  Insofar as Bateson ever tries
> to actually *define* R/R, it's here:
>
> ==
> [T]here seem to be two species of recursiveness, of somewhat different nature, of which the
> first goes back to Norbert Wiener and is well-known: the "feedback" that is perhaps the best
> known feature of the whole cybernetic syndrome. The point is that self-corrective and quasi
> purposive systems necessarily and always have the characteristic that causal trains within the
> system are themselves circular. [...] The second type of recursiveness has been proposed by
> Varela and Maturana. These theoreticians discuss the case in which some property of a whole is
> fed back into the system, producing a somewhat different type of recursiveness[...]. We live in
> a universe in which causal trains endure, survive through time, only if they are recursive.
> (Bateson, 1977, p. 220)
> ===
>
> Needless to say, Wiener never called feedback (or anything
> else) "recursive", and it's a real stretch to connect the
> mathematics of feedback to mathematical notions of R/R.
> Nor did Varela and Maturana EVER use R/R (in print at least)
> before 1977; they instead coined "autopoiesis", which again,
> insofar as it can be mathematicized, is not mathematical
> R/R.   (Later Maturana does use "recursive".)
>
> o  An Australian economic geographer named Walmsley somehow came up
> with a notion of R/R c. 1972; until and unless he answers my e-mail
> (pending now for several months, so I'm not holding my breath), I can
> only assume, from references he cites, that he somehow came up with
> his idea by combining General Systems Theory (though the word doesn't
> appear there) with Piaget.  Given that he states in one place that
> "Shopping is a form of recursive behavior", you won't be surprised
> that his idea--whatever it may be--appears entirely unrelated to
> mathematical (or linguistic) R/R.  In any case, he doesn't seem to
> have inspired any followers.
>
> o  A sociologist named Scheff starts using the *words* "recursive"
> and "recursion" c. 2005, for ideas (either his or others'; see
> below) that were around starting in 1967.
>
> ==Scheff (2005):===
> In one of my own earlier articles (Scheff 1967), I proposed a model of consensus that has a
> recursive quality like the one that runs through Goffman's frame analysis. [...] As it happened,
> Goffman (1969) pursued a similar idea in some parts of his book on strategic interaction. [...]
> [A] similar treatment can be found in a book by the Russian mathematician Lefebvre (1977), The
> Structure of Awareness. [...]I wonder whether Lefebvre came up with the idea of reflexive mutual
> awareness independently of my model. He cites Laing, Phillipson, and Lee (1966), a brief work
> devoted to a recursive model of mutual awareness that preceded Lefebvre´s book (1977).
> However, he also cites his own earliest work on recursive awareness, an article (1965) that
> precedes the Laing, Phillipson, and Lee book.
> It is possible that Lefebvre´s work was based on my (1967) model of recursive awareness, even
> though the evidence is only circumstantial. As Laing, Phillipson, and Lee (1966) indicate,
> their book developed from my presentation of the model in Laing's seminar in 1964. Since there
> were some 20 persons there, Lefebvre could have heard about the seminar from one of those, or
> indirectly by way of others in contact with a seminar member.
> ===
>
> However, despite all the heavy lifting involved in Scheff's
> name-dropping, the words "recursive" and "recursion" appear
> nowhere in the cited works by Laing, Phillipson & Lee (1966),
> Scheff (1967), or Goffman (1969). Lefebvre (1977, but not 1965)
> does use "recursive" in the two major mathematical senses, and
> even quotes Chomsky (although I think it likely--I haven't been
> able to get the Russian originals of Lefebvre--that all that
> was introduced by his translator, Rapoport of GS fame).
> Rather, Laing, Phillipson & Lee, Scheff, and Goffman
> consistently use the words "reflexive", "reflection", and
> "reflexivity". These are glossed by Scheff in a variety of ways:
> "recursive awareness", "mutual awareness" (harkening back to
> Goffman´s signature phrase, "mutual consideration"), "not
> only understanding the other, but also understanding that
> one is understood, and vice versa", "not only a first-level
> agreement, but, when necessary, second and higher levels of
> understanding that there is an agreement", etc.  Sheesh.
>
> o  Finally (thank you for the reference, Nick), Peter Lipton
> and Nick Thompson published an article in 1988 titled "Comparative
> psychology and the recursive structure of filter explanations."
> It's a great article, but the sense in which it uses "recursive"
> (Lipton's coinage) is unrelated to any of the other senses
> (nor has it been taken up since, as far as I can tell).
>
> [Here endeth the outline.]
>
> The "common core", if there is one, is nothing more than
> the collocation of the morphemes "re-" and "-cur-", of which
> the former is still very productive in English, while the
> latter is (at most New) Latin and no longer productive at
> all; semantically, this makes the meaning of that common
> core approximately "RUN AGAIN", which I submit is AT BEST a
> trivial commonality of the various different uses, and (as
> far as I understand some of the woolier uses, which is not
> that far) not a commonality AT ALL of the entire set.
> If that be essence, make the least of it!
>
> Lee Rudolph
>    
>
>



Re: Notions of entropy

Steve Smith
In reply to this post by lrudolph
On the topic of language, this hyperpolyglot's description of his
experience learning/speaking many languages is fascinating:
http://www.youtube.com/watch?v=Km9-DiFaxpU

He asserts that he "thinks differently" when he speaks different
languages, and that he begins learning a new language by listening to it
a lot (television) and emulating what he hears.

> below) that were around starting in 1967.
>
> ==Scheff (2005):===
> In one of my own earlier articles (Scheff 1967), I proposed a model of consensus that has a
> recursive quality like the one that runs through Goffman's frame analysis. [...] As it happened,
> Goffman (1969) pursued a similar idea in some parts of his book on strategic interaction. [...]
> [A] similar treatment can be found in a book by the Russian mathematician Lefebvre (1977), The
> Structure of Awareness. [...]I wonder whether Lefebvre came up with the idea of reflexive mutual
> awareness independently of my model. He cites Laing, Phillipson, and Lee (1966), a brief work
> devoted to a recursive model of mutual awareness that preceded Lefebvre´s book (1977).
> However, he also cites his own earliest work on recursive awareness, an article (1965) that
> precedes the Laing, Phillipson, and Lee book.
> It is possible that Lefebvre´s work was based on my (1967) model of recursive awareness, even
> though the evidence is only circumstantial. As Laing, Phillipson, and Lee (1966) indicate,
> their book developed from my presentation of the model in Laing's seminar in 1964. Since there
> were some 20 persons there, Lefebvre could have heard about the seminar from one of those, or
> indirectly by way of others in contact with a seminar member.
> ===
>
> However, despite all the heavy lifting involved in Scheff's
> name-dropping, the words "recursive" and "recursion" appear
> nowhere in the cited works by Laing, Phillipson & Lee (1966),
> Scheff (1967), or Goffman (1969). Lefebvre (1977, but not 1965)
> does use "recursive" in the two major mathematical senses, and
> even quotes Chomsky (although I think it likely--I haven't been
> able to get the Russian originals of Lefebvre--that all that
> was introduced by his translator, Rapaport of GS fame).
> Rather, Laing, Phillipson & Lee, Scheff, and Goffman
> consistently use the words "reflexive", "reflection", and
> "reflexivity". These are glossed by Scheff in a variety of ways:
> "recursive awareness", "mutual awareness" (harkening back to
> Goffman´s signature phrase, "mutual consideration"), "not
> only understanding the other, but also understanding that
> one is understood, and vice versa", "not only a first-level
> agreement, but, when necessary, second and higher levels of
> understanding that there is an agreement", etc.  Sheesh.
>
> o  Finally (thank you for the reference, Nick), Peter Lipton
> and Nick Thompson published an article in 1988 titled "Comparative
> psychology and the recursive structure of filter explanations."
> It's a great article, but the sense in which it uses "recursive"
> (Lipton's coinage) is unrelated to any of the other senses
> (nor has it been taken up since, as far as I can tell).
>
> [Here endeth the outline.]
>
> The "common core", if there is one, is nothing more than
> the collocation of the morphemes "re-" and "-cur-", of which
> the former is still very productive in English, while the
> latter is (at most New) Latin and no longer productive at
> all; semantically, this makes the meaning of that common
> core approximately "RUN AGAIN", which I submit is AT BEST a
> trivial commonality of the various different uses, and (as
> far as I understand some of the woolier uses, which is not
> that far) not a commonality AT ALL of the entire set.
> If that be essence, make the least of it!
>
> Lee Rudolph
>    
>
>
> ============================================================
> FRIAM Applied Complexity Group listserv
> Meets Fridays 9a-11:30 at cafe at St. John's College
> to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
>


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com

Re: Notions of entropy

Nick Thompson
In reply to this post by lrudolph

Lee,

 

I am one of those cooks who never looks up the recipe until after the dinner is in the oven ... if at all.  So, regrettably, I never looked up the term "essentialism" before I used it.

 

However, if the SEP is to be believed, I seem to have come close.  Remember, I only claimed to be "a bit of an essentialist."  Perhaps I should have said “teensy.”

 

Kind essentialism has a number of tenets. One tenet is that all and only the members of a kind have a common essence. A second tenet is that the essence of a kind is responsible for the traits typically associated with the members of that kind. For example, gold's atomic structure is responsible for gold's disposition to melt at certain temperatures. Third, knowing a kind's essence helps us explain and predict those properties typically associated with a kind. The application of any of these tenets to species is problematic. But to see the failure of essentialism we need only consider the first tenet.

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 

-----Original Message-----
From: Friam [mailto:[hidden email]] On Behalf Of [hidden email]
Sent: Friday, October 11, 2013 6:50 PM
To: Friam
Subject: Re: [FRIAM] Notions of entropy

 

Nick writes, in relevant part:

 

> I am, I think, a bit of what philosophers call an essentialist.  In other words, I assume that when people use the same words for two things, it aint for nothing, that there is something underlying the surface that makes those two things the same.  So, underlying all the uses of the word "entropy" is a common core, and ....

 

I'm not going to go anywhere near the mathematical question here.  What I want to do is challenge your "In other words" sentence (which, by the way, I *hope* is not what philosophers would mean by calling you "an essentialist").

 

One thing I have learned in the last three or four years, much of which I have spent trawling through huge corpora of scholarly (and less scholarly) writing, including Google Scholar (and just plain Google Books), JStor, MUSE, PsycInfo, Mathematical Reviews, etc., is that "when people use the same words for two things", it's distressingly common that it IS "for nothing", or nearly nothing--either two or more different groups of scholars have adopted a word from Common English into their own jargons, with no (ac)knowledge(ment) of the other groups' jargon, or two or more different groups of scholars have independently *coined* a word (most usually from New Latin or New Greek roots that are part of scholars' common store).

 

Actually, the first case of this that I really noticed was several years before I got involved professionally.  In a social newsgroup, a linguist of my acquaintance happened to use the word "assonance".  And he used it WRONG.  That is, he used it entirely inconsistently with the meaning that it has had for eons in the theory of prosody, and that every poet learns (essentially, assonance in prosody is vowel harmony).  When I challenged him on this, my friend said that the word had been introduced to linguistics by the (very eminent, now very dead) Yale linguist Dwight Bolinger.  And he implied that the linguists weren't about to change.  Tant pis, said I.

 

Then I got involved in the Kitchen Seminar (FRIAMers, you can ignore that; it's a note to Nick), and began to hear psychologists (but not Nick!) use the phrase "dynamic system" (or occasionally "dynamical system"). As a mathematician I knew what that phrase meant, and they were WRONG. 

 

After some years in the Kitchen, I began work on my book on mathematical modeling for psychology; eventually I saw I needed to write a chapter clarifying the uses of those phrases.

Three or four years of work on _The Varieties of Dynamic(al) Experience_ later, I had accumulated *enormous* amounts of textual evidence that there had been NO cross-pollination: the two phrases arose entirely independently.  (Then, unfortunately, hapless psychologists and other "human scientists" started appropriating [what they badly understood of] the mathematical results that can be proved about mathematicians' "dynamical systems" to draw ENTIRELY UNSUBSTANTIATED conclusions about psychologists' "dynamic systems".)

 

Most recently, I've been going through the same exercise (again for a chapter, now not in a book of my own) for "recursion" and "recursive".  Again, I have accumulated (and documented) enormous amounts of textual evidence (from all those corpora); here is a brief outline of the situation (with examples and all, the whole thing is about 25 pages at the moment, interlarded with another 25 pages on "infinity" and topped off--I mean, bottomed off--with 15 pages of references).  Before the outline, however, I will quote four practitioners of various human sciences who have had cause to complain of the present mess.

 

==a sociologist of law:==

In the context of causal analysis, as carried on in empirical research (e.g. path analysis) nonrecursive models are employed, to denote the case of mutual influencing of variables. When the autopoesis literature speaks of recursive processes, it is presumably those nonrecursive models of causal analysis that are meant. What a tower of Babel!

(Rottleuthner, 1988, p. 119)

 

==a physicist turned cognitive scientist (via LOGO):==
One is led to wonder if all authors are talking about and experimenting with the same notion and, if not, what this notion could be. As it happens, a careful reading shows that it is not so and that, unless a very loose and rather useless definition of the term ["recursive"] is assumed, it could be worthwhile to separate this confusing braid into its constituent strands [...]. (Vitale, 1989, p. 253)

 

==an evolutionary linguist:==

Definitions of recursion found in the linguistics and computer science literatures suffer from inconsistency and opacity (Kinsella, 2010, p. 179)

 

==a political scientist:==

The term `recursive´ [...] has multiple uses in the political science literature. [... Political scientists should address] [t]he problem of divergent meaning [...] through a survey of potential for reconciliation or possible substitute terminology (Towne, 2010, p. 259) ===

 

Now, the outline.

 

o  There are three distinct meanings of "recursive"/"recursion"--let me abbreviate that to R/R from now on--in mathematics.  The oldest one describes so-called "recurrence relations" (like the one that defines the Fibonacci sequence: F(1) = 1, F(2) = 1, F(n) = F(n-1) + F(n-2)).  The next oldest, dating only from the last century, is the one used in mathematical logic; it's derived from the oldest but it's much more general ("recursive functions").  There's an entirely UNrelated one used in a minor branch of dynamical systems theory (which has had no influence outside of a very small circle), apparently named because of a connection to "recurrence" in colloquial English (think "Poincare section" if that helps). 
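
The contrast between the two older mathematical senses can be sketched in a few lines of Python (my illustration, not Rudolph's; the function names are invented): the same Fibonacci sequence, once as a recurrence relation iterated forward, once as a function defined in terms of calls to itself.

```python
def fib_recurrence(n):
    """F(n) via the recurrence F(n) = F(n-1) + F(n-2), with F(1) = F(2) = 1,
    iterated forward from the base cases (the "recurrence relation" sense)."""
    a, b = 1, 1  # F(1), F(2)
    for _ in range(n - 1):
        a, b = b, a + b
    return a

def fib_recursive(n):
    """The same sequence as a recursive function: defined by calling itself
    on smaller arguments (the mathematical-logic sense, much generalized)."""
    return 1 if n <= 2 else fib_recursive(n - 1) + fib_recursive(n - 2)
```

Both give F(10) = 55; the point is only that "recurrence relation" and "recursive function" formalize the self-reference differently, though the second historically grew out of the first.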

 

o  The oldest mathematical sense has spawned a meaning that started in economics and then spread (it's the one that Rottleuthner was talking about); mathematically, it corresponds to upper-triangular matrices (coding causalities).
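
The econometric usage can be made concrete; here is a minimal sketch (my code; the variable names and coefficients are invented for illustration, and whether the matrix reads as upper- or lower-triangular depends only on the chosen variable ordering). In a "recursive" structural model each variable depends only on variables earlier in the causal ordering, so the coefficient matrix is triangular and the whole system is solved by simple substitution, with no simultaneity.

```python
def solve_recursive_model(B, e):
    """Solve y = B y + e by forward substitution, assuming B is triangular
    in the causal ordering (B[i][j] == 0 for j >= i): each y[i] is fully
    determined by the already-computed y[0..i-1] and its own disturbance e[i]."""
    n = len(e)
    y = [0.0] * n
    for i in range(n):
        y[i] = e[i] + sum(B[i][j] * y[j] for j in range(i))
    return y

# Illustrative three-variable causal chain:
#   y1 = e1;  y2 = 0.5*y1 + e2;  y3 = 0.2*y1 + 0.3*y2 + e3
B = [[0.0, 0.0, 0.0],
     [0.5, 0.0, 0.0],
     [0.2, 0.3, 0.0]]
e = [1.0, 1.0, 1.0]
y = solve_recursive_model(B, e)  # y = [1.0, 1.5, 1.65]
```

A "nonrecursive" model, by contrast, has mutual influence (nonzero entries on both sides of the diagonal), which is exactly the clash Rottleuthner complains about above.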

 

o  The next-oldest has spawned the present, barely coherent (cf. Kinsella), use of R/R in linguistics and linguistics-inspired social science. *Some* of Seymour Papert's--and, thence, the LOGO community's--uses of R/R come from this tradition (one of his two Ph.D.s is, after all, in mathematics). 

 

o  Another sense of R/R comes from Piaget (with a nod towards Poincare).  *The rest* of Seymour Papert's--and, thence, the LOGO community's--uses of R/R come from this tradition (his second Ph.D., in Psychology, was supervised by Piaget).  Piaget, I am afraid, is responsible for a great deal of muddle on this subject.

 

o  Yet another sense of R/R, used in human ecology, anthropology, political science, sociology, and educational theory, sprang--somehow--out of cybernetics and General Systems Theory (even though none of the early cyberneticists like von Neumann, Shannon, and Wiener, and none of the early GS people like Bertalanffy and Rapoport, ever seem to have used the word AT ALL, except for a couple of times in early papers of von Neumann where he was using it in the oldest mathematical meaning).  It really seems that Bateson pulled the word out of the air (that is, out of his no doubt rigorous classical education) at some point, and it spread from him, in a (typically) incoherent fashion, and apparently mostly by word of mouth--he didn't commit either word to print until the year before his death, though his biographer Harries-Jones has seen a notebook in which Bateson recorded using the word in a lecture in 1975.  (Harries-Jones's title for the biography, _A recursive vision: Ecological understanding and Gregory Bateson_, is, in my opinion, irredeemably tendentious, and a perfect example of muddle.)  Insofar as Bateson ever tries to actually *define* R/R, it's here:

 

==

[T]here seem to be two species of recursiveness, of somewhat different nature, of which the first goes back to Norbert Wiener and is well-known: the "feedback" that is perhaps the best known feature of the whole cybernetic syndrome. The point is that self-corrective and quasi purposive systems necessarily and always have the characteristic that causal trains within the system are themselves circular. [...] The second type of recursiveness has been proposed by Varela and Maturana. These theoreticians discuss the case in which some property of a whole is fed back into the system, producing a somewhat different type of recursiveness[...]. We live in a universe in which causal trains endure, survive through time, only if they are recursive.

(Bateson, 1977, p. 220)

===

 

Needless to say, Wiener never called feedback (or anything else) "recursive", and it's a real stretch to connect the mathematics of feedback to mathematical notions of R/R.  Nor did Varela and Maturana EVER use R/R (in print at least) before 1977; they instead coined "autopoiesis", which again, insofar as it can be mathematicized, is not mathematical R/R.  (Later Maturana does use "recursive".)

 

o  An Australian economic geographer named Walmsley somehow came up with a notion of R/R c. 1972; until and unless he answers my e-mail (pending now for several months, so I'm not holding my breath), I can only assume, from references he cites, that he somehow came up with his idea by combining General Systems Theory (though the word doesn't appear there) with Piaget.  Given that he states in one place that "Shopping is a form of recursive behavior", you won't be surprised that his idea--whatever it may be--appears entirely unrelated to mathematical (or linguistic) R/R.  In any case, he doesn't seem to have inspired any followers.

 

o  A sociologist named Scheff starts using the *words* "recursive" and "recursion" c. 2005, for ideas (either his or others'; see below) that were around starting in 1967.

 

==Scheff (2005):===

In one of my own earlier articles (Scheff 1967), I proposed a model of consensus that has a recursive quality like the one that runs through Goffman's frame analysis. [...] As it happened, Goffman (1969) pursued a similar idea in some parts of his book on strategic interaction. [...] [A] similar treatment can be found in a book by the Russian mathematician Lefebvre (1977), The Structure of Awareness. [...] I wonder whether Lefebvre came up with the idea of reflexive mutual awareness independently of my model. He cites Laing, Phillipson, and Lee (1966), a brief work devoted to a recursive model of mutual awareness that preceded Lefebvre´s book (1977). However, he also cites his own earliest work on recursive awareness, an article (1965) that precedes the Laing, Phillipson, and Lee book.
It is possible that Lefebvre´s work was based on my (1967) model of recursive awareness, even though the evidence is only circumstantial. As Laing, Phillipson, and Lee (1966) indicate, their book developed from my presentation of the model in Laing's seminar in 1964. Since there were some 20 persons there, Lefebvre could have heard about the seminar from one of those, or indirectly by way of others in contact with a seminar member.

===

 

However, despite all the heavy lifting involved in Scheff's name-dropping, the words "recursive" and "recursion" appear nowhere in the cited works by Laing, Phillipson & Lee (1966), Scheff (1967), or Goffman (1969). Lefebvre (1977, but not 1965) does use "recursive" in the two major mathematical senses, and even quotes Chomsky (although I think it likely--I haven't been able to get the Russian originals of Lefebvre--that all that was introduced by his translator, Rapoport of GS fame).  Rather, Laing, Phillipson & Lee, Scheff, and Goffman consistently use the words "reflexive", "reflection", and "reflexivity". These are glossed by Scheff in a variety of ways: "recursive awareness", "mutual awareness" (harkening back to Goffman´s signature phrase, "mutual consideration"), "not only understanding the other, but also understanding that one is understood, and vice versa", "not only a first-level agreement, but, when necessary, second and higher levels of understanding that there is an agreement", etc.  Sheesh.

 

o  Finally (thank you for the reference, Nick), Peter Lipton and Nick Thompson published an article in 1988 titled "Comparative psychology and the recursive structure of filter explanations."  It's a great article, but the sense in which it uses "recursive" (Lipton's coinage) is unrelated to any of the other senses (nor has it been taken up since, as far as I can tell).

 

[Here endeth the outline.]

 

The "common core", if there is one, is nothing more than the collocation of the morphemes "re-" and "-cur-", of which the former is still very productive in English, while the latter is (at most New) Latin and no longer productive at all; semantically, this makes the meaning of that common core approximately "RUN AGAIN", which I submit is AT BEST a trivial commonality of the various different uses, and (as far as I understand some of the woolier uses, which is not that far) not a commonality AT ALL of the entire set. 

If that be essence, make the least of it!

 

Lee Rudolph

  

 

 


Re: Notions of entropy

Nick Thompson
In reply to this post by Steve Smith

Lee,

 

I sympathize with your lament about recursion.  Cf. the use of the terms "song", "note", and "phrase" in the bird song literature.  And these people were all working in the same narrow subfield of animal behavior.  See at least the abstract of http://home.earthlink.net/~nickthompson/naturaldesigns/id60.html   An incautious click on the abstract will download the paper.  By the way, my suggested clarification and simplification of the terminology has not only not been put into practice but, so far as I know, has been read only once. 

 

Nick

 

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 

-----Original Message-----
From: Friam [mailto:[hidden email]] On Behalf Of Steve Smith
Sent: Friday, October 11, 2013 11:01 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] Notions of entropy

 

On the topic of language, this hyperpolyglot's description of his experience learning/speaking many languages is fascinating:

http://www.youtube.com/watch?v=Km9-DiFaxpU

 

He asserts that he "thinks differently" when he speaks different languages, and that he begins learning a new language by listening to it a lot and emulating what he hears (television).

 


> Goffman (1969). Lefebvre (1977, but not 1965) does use "recursive" in

> the two major mathematical senses, and even quotes Chomsky (although I

> think it likely--I haven't been able to get the Russian originals of

> Lefebvre--that all that was introduced by his translator, Rapaport of

> GS fame).

> Rather, Laing, Phillipson & Lee, Scheff, and Goffman consistently use

> the words "reflexive", "reflection", and "reflexivity". These are

> glossed by Scheff in a variety of ways:

> "recursive awareness", "mutual awareness" (harkening back to Goffman´s

> signature phrase, "mutual consideration"), "not only understanding the

> other, but also understanding that one is understood, and vice versa",

> "not only a first-level agreement, but, when necessary, second and

> higher levels of understanding that there is an agreement", etc. 

> Sheesh.

> 

> o  Finally (thank you for the reference, Nick), Peter Lipton and Nick

> Thompson published an article in 1988 titled "Comparative psychology

> and the recursive structure of filter explanations."

> It's a great article, but the sense in which it uses "recursive"

> (Lipton's coinage) is unrelated to any of the other senses (nor has it

> been taken up since, as far as I can tell).

> 

> [Here endeth the outline.]

> 

> The "common core", if there is one, is nothing more than the

> collocation of the morphemes "re-" and "-cur-", of which the former is

> still very productive in English, while the latter is (at most New)

> Latin and no longer productive at all; semantically, this makes the

> meaning of that common core approximately "RUN AGAIN", which I submit

> is AT BEST a trivial commonality of the various different uses, and

> (as far as I understand some of the woolier uses, which is not that

> far) not a commonality AT ALL of the entire set.

> If that be essence, make the least of it!

> 

> Lee Rudolph

>    

> 

> 

> ============================================================

> FRIAM Applied Complexity Group listserv Meets Fridays 9a-11:30 at cafe

> at St. John's College to unsubscribe

> http://redfish.com/mailman/listinfo/friam_redfish.com

> 

 

 


Re: Notions of entropy

lrudolph
In reply to this post by Nick Thompson
Nick to Owen:

...

> Yet, just as you could never get the world to
> agree that emotionality was just the number of fecal  boluses left by a rat
> in an open field maze, you will never get the world to agree that entropy is
> just the output of a mathematical formula.  They might say, "that is a
> useful measure of entropy, but that is not what it IS."   To put the matter
> more technically, no matter how much reliability a definition buys you, it
> still does not necessarily buy you validity.  The same point might be made
> about f=ma.  (I fear being flamed by Bruce, at this point, but let it go.)
> Non fingo hypotheses and all that.  One could, like a good positivist,
> simply assert that a thing IS that which most reliably measures it, but few
> people outside your field will be comfortable with that, and everybody, even
> including your closest colleagues, will continue to use the word in some
> other sense at cocktail parties.  It was my position that the lab bench
> meaning and the cocktail meaning have some common core that we have some
> responsibility to try to find.  

This is a very pure example of the semantic drift that drives me crazy,
in that "the lab bench meaning" was the *first* meaning: the word DID NOT
EXIST before it was coined (in its adjectival form, in German, by composing
badly-understood-by-its-coiner morphemes from Greek, by the physicist Clausius)
in 1865. Tait (an early knot theorist and somewhat of a religious nut, as well
as a thermodynamic theorist) brought it into English three years later, but
changed its sign (more or less).  By 1875 Maxwell had changed it back to what
it now is.  During this period of time the concept expressed by "entropy"
became clearer, as did the whole field of thermodynamics, and eventually a
good mathematical formalism for it developed--"good" in the sense that it
"made sense" of results from "the lab bench" by reducing downwards (if I
have your phrase right? I dunno, maybe upwards, or both ways?) so as to
(1) define "entropy" of a macroscopic system in terms of the statistical
behavior of the ensemble of microscopic entities participating in that
system, and (2) facilitate calculations (some exact, some asymptotic)
of the "entropy" (and similar thermodynamic quantities) which (3) often
agreed with "lab bench" observations.  

When Shannon came along to study signals and noise in communication
channels, he had the insight to see that *the same mathematical formalism*
could be applied.  He did *not* have the insight (or dumb luck) of Clausius,
so he overloaded the already-existing Common English word "information" with
a new, technical, mathematical meaning.  *That* rather quickly allowed
visionaries, hucksters, and cocktail partiers to talk about "information
theory" without understanding much or any of its technicalities.  It also
(I suspect; but here I am arguing ahead of what data I happen to have around,
so this may merely be my default Enraged Bloviator talking) encouraged the
same gangs of semantic vandals to appropriate the word "entropy" to their
various malign uses.  (For what it's worth, the OED doesn't have citations
of non-specialist uses of thermodynamic "entropy" until the mid 1930s--by
Freud [as translated by a pair of Stracheys, not by Jones] and a Christian
apologist; non-specialist uses of information-theoretic "entropy" appear
to hold off until the mid 1960s.)
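(An aside, to make "*the same mathematical formalism*" concrete: the Gibbs expression for a statistical ensemble and Shannon's expression for a message source are literally the same sum, -k * Σ p log p, differing only in the constant k and the base of the logarithm. A minimal Python sketch; the helper function, probabilities, and variable names below are illustrative, not drawn from any of the cited works:)

```python
import math

def entropy(probs, base=math.e, k=1.0):
    """The shared formalism: -k * sum(p * log p) over a distribution.
    With k = Boltzmann's constant and natural logs this reads as Gibbs
    entropy; with k = 1 and base-2 logs it reads as Shannon entropy."""
    return -k * sum(p * math.log(p, base) for p in probs if p > 0)

# Illustrative distribution over three microstates (or three symbols).
p = [0.5, 0.25, 0.25]

k_B = 1.380649e-23                 # Boltzmann constant, J/K
S = entropy(p, k=k_B)              # thermodynamic reading: joules/kelvin
H = entropy(p, base=2)             # information-theoretic reading: bits

print(S)
print(H)   # 1.5 bits for this distribution
```

The switch between natural and base-2 logarithms is only a change of units (nats vs. bits), which is part of why the two fields can share the formalism so cleanly.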

So, to whatever extent the vernacular ("cocktail party") meaning(s) of
entropy has or have a common core with the technical ("lab bench")
meaning(s), it is because that core REMAINS FROM THE TECHNICAL MEANING
after the semantic shift, and not because (as I *think* you mean to
imply in your sentence containing the word "positivist") there is
some ("common core") concept which BOTH the technical AND the
vernacular meanings are INDEPENDENT ATTEMPTS to "reliably measure".
If "we have a responsibility to try to find" anything, I think it
is to try to find *why* some people insist on (1) glomming onto bits
of jargon with very well-defined in-domain meanings, (2) ignoring much
or all of those meanings while re-applying the jargon (often without
ANY definition to speak of) in a new domain, while (3) refusing to
let go of some (or all) of the Impressive Consequences derived in
the original domain by derivations that (4) depend on the jettisoned
definitions (and the rest of the technical apparatus of the original
domain).

Grrrh.

 


Re: Notions of entropy

Pamela McCorduck
In reply to this post by Nick Thompson
I thank you, Nick, for adding the term "fecal boluses" to my vocabulary. I'll have plenty of opportunity to use it, I know.

Pamela

Re: Notions of entropy

Owen Densmore
Administrator
In reply to this post by Nick Thompson
Oops, I didn't mean to sound insulted, merely trying to get a grasp of the conversation.

I am very aware of how equations are easily misunderstood .. but I'm at a bit of a loss as to how they are "anthropomorphic" other than being interpreted by anthropoids.  :)

   -- Owen



Re: Notions of entropy

Nick Thompson
In reply to this post by lrudolph

Lee,

 

Grrr, yourself!

 

[NST==>larding below!<==nst]

This is a very pure example of the semantic drift that drives me crazy, in that "the lab bench meaning" was the *first* meaning: the word DID NOT EXIST before it was coined (in its adjectival form, in German, by composing badly-understood-by-its-coiner morphemes from Greek, by the physicist Clausius) in 1865. Tait (an early knot theorist and somewhat of a religious nut, as well as a thermodynamic theorist) brought it into English three years later, but changed its sign (more or less).  By 1875 Maxwell had changed it back to what it now is.  During this period of time the concept expressed by "entropy"

[NST==>OK.  In the spirit of checking the recipe after the cake is in the oven, I looked on Wikipedia.  OH BOY, are you right about this one!  If Wikipedia is correct, it turns out that a mathematical symbol for entropy predates the name.  Bathe in your rightness at http://en.wikipedia.org/wiki/History_of_entropy#1854_definition <==nst]

became clearer, as did the whole field of thermodynamics, and eventually a good mathematical formalism for it developed--"good" in the sense that it "made sense" of results from "the lab bench" by reducing downwards (if I have your phrase right? I dunno, maybe upwards, or both ways?) so as to

define "entropy" of a macroscopic system in terms of the statistical behavior of the ensemble of microscopic entities participating in that system[NST==>I am always abit uneasy about defining something we are frustrated by every day in terms of something so small and numerous that we will never see it.  <==nst]

(1)    ,

and (2) facilitate calculations (some exact, some asymptotic) of the "entropy" (and similar thermodynamic quantities) which (3) often agreed with "lab bench" observations. 

 

When Shannon came along to study signals and noise in communication channels, he had the insight to see that *the same mathematical formalism* could be applied.  He did *not* have the insight (or dumb luck) of Clausius, so he overloaded the already-existing Common English word "information" with a new, technical, mathematical meaning.

[NST==>Ok, Hoist by my own petard.  As you full well know, I HATE what people do with the concept of information.  So, perhaps I have to modify my position to assert that sometimes the search for a common core fails and results in a core divorce.  <==nst]

 *That* rather quickly allowed visionaries, hucksters, and cocktail partiers to talk about "information theory" without understanding much or any of its technicalities.  It also (I suspect; but here I am arguing ahead of what data I happen to have around, so this may merely be my default Enraged Bloviator talking) encouraged the same gangs of semantic vandals to appropriate the word "entropy" to their various malign uses.  (For what it's worth, the OED doesn't have citations of non-specialist uses of thermodynamic "entropy" until the mid 1930s--by Freud [as translated by a pair of Stracheys, not by Jones] and a Christian apologist; non-specialist uses of information-theoretic "entropy" appear to hold off until the mid 1960s.)

 

So, to whatever extent the vernacular ("cocktail party") meaning(s) of entropy has or have a common core with the technical ("lab bench") meaning(s), it is because that core REMAINS FROM THE TECHNICAL MEANING after the semantic shift, and not because (as I *think* you mean to imply in your sentence containing the word "positivist") there is some ("common core") concept which BOTH the technical AND the vernacular meanings are INDEPENDENT ATTEMPTS to "reliably measure".

If "we have a responsibility to try to find" anything, I think it is to try to find *why* some people insist on (1) glomming onto bits of jargon with very well-defined in-domain meanings, (2) ignoring much or all of those meanings while re-applying the jargon (often without ANY definition to speak of) in a new domain, while (3) refusing to let go of some (or all) of the Impressive Consequences derived in the original domain by derivations that (4) depend on the jettisoned definitions (and the rest of the technical apparatus of the original domain).

[NST==>I can both share the rage and frustration of your Bloviator and yet know, deep down, that this has something to do with the crucial role of metaphor in science.  Even Kuhn, right?, had something positive to say about having conceptual Genies escape from one scientific bottle and infect the next.  Perhaps I have to take a kind of pragmatist position here:  If we don't assume (wrongly) that all uses of a word advert to a common core, then we will never have the sort of conversation in which the different meanings get articulated and the aforementioned frauds (and Freuds) and hucksters get exposed. 

 

But I am way out of my league here, am having way too much fun, and it is way past time for me to stop.  Thanks for your patience, everybody, and, in some cases, for your Godlike forbearance.  <==nst]

 

 

 


Re: Notions of entropy

Owen Densmore
Administrator
In reply to this post by lrudolph
Spot on!

   -- Owen


On Sat, Oct 12, 2013 at 6:29 AM, <[hidden email]> wrote:
> [snip]



Re: Notions of entropy

Steve Smith
In reply to this post by lrudolph
Lee -

> This is a very pure example of the semantic drift that drives me
> crazy, in that "the lab bench meaning" was the *first* meaning: the
> word DID NOT EXIST before it was coined (in its adjectival form, in
> German, by composing badly-understood-by-its-coiner morphemes from
> Greek, by the physicist Clausius) in 1865. Tait (an early knot
> theorist and somewhat of a religious nut, as well as a thermodynamic
> theorist) brought it into English three years later, but changed its
> sign (more or less). By 1875 Maxwell had changed it back to what it
> now is. During this period of time the concept expressed by "entropy"
> became clearer, as did the whole field of thermodynamics, and
> eventually a good mathematical formalism for it developed--"good" in
> the sense that it "made sense" of results from "the lab bench" by
> reducing downwards (if I have your phrase right? I dunno, maybe
> upwards, or both ways?) so as to (1) define "entropy" of a macroscopic
> system in terms of the statistical behavior of the ensemble of
> microscopic entities participating in that system, and (2) facilitate
> calculations (some exact, some asymptotic) of the "entropy" (and
> similar thermodynamic quantities) which (3) often agreed with "lab
> bench" observations.
I think that "semantic drift" of this type is a natural consequence of
the system in which it exists.  While I'm often confounded by others'
appropriation of highly technical terms, I also find it a necessary part
of the larger experience of conceptual trickle down, if you will.   I
find, for example, that the typical vernacular use of the term "entropy"
is good enough for who it's for.  The generic idea that order decreases and chaos increases somewhat spontaneously is a pretty good understanding of the phenomenon by laymen and, when applied to the world around us (rust, rot, dissolution, etc.), captures the essence pretty well.
> When Shannon came along to study signals and noise in communication
> channels, he had the insight to see that *the same mathematical
> formalism* could be applied. He did *not* have the insight (or dumb
> luck) of Clausius, so he overloaded the already-existing Common
> English word "information" with a new, technical, mathematical meaning.
I can't help but imagine that he thought he was digging below the
already built edifice and installing a sub-foundation which would
ultimately actually support the existing use of the term (information)?  
Do others think otherwise?  That his appropriation of the term
"information" was whimsical or without specific motivation?
> *That* rather quickly allowed visionaries, hucksters, and cocktail
> partiers to talk about "information theory" without understanding much
> or any of its technicalities.
But "visionaries, hucksters, cocktail partiers and FRIAMites) will
always attempt to talk about "xxxxx" without understanding many if any
of "xxxxx"s technicalities... isn't that the nature of the beast (or the
venue?).  Perhaps this is what has driven some from our ranks (or at
least to poise their fingers over the <delete> button on their mailer at
the first sign of masturbatory charlatanism)?
> It also (I suspect; but here I am arguing ahead of what data I happen
> to have around, so this may merely be my default Enraged Bloviator
> talking) encouraged the same gangs of semantic vandals to appropriate
> the word "entropy" to their various malign uses.
Very nice allusion...  I can see the hordes sweeping down from the
steppes with a gleam in their eye as they spy such sparkling words
amongst the "civilized folk".

> (For what it's worth, the OED doesn't have citations of non-specialist
> uses of thermodynamic "entropy" until the mid 1930s--by Freud [as
> translated by a pair of Stracheys, not by Jones] and a Christian
> apologist; non-specialist uses of information-theoretic "entropy"
> appear to hold off until the mid 1960s.) So, to whatever extent the
> vernacular ("cocktail party") meaning(s) of entropy has or have a
> common core with the technical ("lab bench") meaning(s), it is because
> that core REMAINS FROM THE TECHNICAL MEANING after the semantic shift,
> and not because (as I *think* you mean to imply in your sentence
> containing the word "positivist") there is some ("common core")
> concept which BOTH the technical AND the vernacular meanings are
> INDEPENDENT ATTEMPTS to "reliably measure".
This is a very (nicely) tightly packed paragraph representing no small
amount of research that I can't hope to reproduce easily.  Are you
saying what I said further up, that the vernacular meaning in fact
represents a low-fidelity version of the technical meaning? This
suggests more of a semantic defocus or pull-back than shift/drift, no?

In response to the last sentence, my experience with a wide range of
vernacular users of the term (and they are legion) is that the original technical meaning had enough utility in its "high order bits" to be recognized and maintained in the face of appropriation by said semantic vandals.   They appropriated it *because* they appreciated its most obvious qualities, even if its many subtle details were lost on them?
> If "we have a responsibility to try to find" anything, I think it is
> to try to find *why* some people insist on (1) glomming onto bits of
> jargon with very well-defined in-domain meanings,
If you are speaking of the cocktail party appropriators, it seems that
it is surely (no data beyond personal experience as one and among them)
a combination of an eager attempt to understand something beyond one's
ken, to share that with others out of fascination and perhaps no small
amount of ego-stroking.

If you are speaking of why Claude Shannon would choose to use either
"information" or "entropy", I suppose the answer is probably more
specific and more interesting... I suspect there are yet more hints in
the literature.   I *do* think that both are in fact apt usage... perhaps only because I am very familiar with both uses of the term "entropy" myself and find it comforting that Shannon and others put so much effort into building a sub-foundation (as I apprehend the relation between "information" and "information") for a term that was previously widely used but not very well underpinned?

There is also a phenomenon of "trying something on for size" in both uses.  If one "acts as if" a certain reserved term from someone else's lexicon is appropriate in a certain context, using it over and over again may actually lead one to recognize how it actually fits, or perhaps wear off some of its inconvenient edges until it does fit.  
This *does* seem (on the face of it) like a very irresponsible and lazy
way to go about such business, but it does seem to fit what I think you
are describing?
> (2) ignoring much or all of those meanings while re-applying the
> jargon (often without ANY definition to speak of) in a new domain,
In my work related to scientific collaboration, I did find it surprising how often specialists in one field would adopt a specialty term from another without seeming either to A) learn its technical meaning and remain true to it or B) provide a good solid modifier (e.g. (information) entropy) and distinguishing definitions.    I want to
generously agree with Nick's intuition that this is part of the
mechanism where Science hoists itself around with the petards of
metaphor.  I also liked the colorful use of the image of the genies from
one bottle of science escaping to infect(?) another, though I couldn't
find the source of Nick's attribution to Kuhn?
> while (3) refusing to let go of some (or all) of the Impressive
> Consequences derived in the original domain by derivations
hmm... I certainly see this in the projection of a term or idea from a
technical domain to the domains usually bandied about on FRIAM or other
type of cocktail party.   The entire movement (mostly in the 80's?)
known collectively as "new age" (rhymes with "sewage") seemed
particularly guilty (defined entirely by?) of this, invoking ideas such
as the Laser and Spectral this-n-that and Resonance and Dissonance and
Interference Patterns without more than a tiny bit of understanding of
the original meanings of the terms/concepts.

In the cross-fertilization between stovepiped scientific domains
(bottles containing many genies), I don't know that this is as big of a
problem, but perhaps it is...  the sub-brand of sloppy-science known
commonly as "wishful thinking"?
> that (4) depend on the jettisoned definitions (and the rest of the
> technical apparatus of the original domain).
Yes, to dismantle a complex concept and fetishize some of its flashier components.  The contemporary SteamPunk fashion movement seems like a
fair example of this in pop culture.  Merely gluing a gear onto the
surface of something does not imbue it with the technological power it
is intended to imply.

I suggest that our pop-culture fetishizing of science is something similar to the cargo cults of Melanesia.   Science, Engineering, and Technology have dropped a great many wonders into the common person's lap, requiring virtually no understanding whatsoever of the inner workings (or even broad principles) to take advantage of said wonders, and in response we appropriate the trappings and decorate our everyday items with them.  In this case, we decorate our everyday language with very specific terms with very specific utility and meaning, much in the way we might disassemble a clockwork mechanism and believe that by gluing one or more of its impressive gears onto the surface of our favorite box or carton we are now "an inventor"!
> Grrrh.
indeed!
> ===
Nick alluded to there maybe being some utility to this type of
appropriation, at least across domains if not from specialized domains
to more general ones.   I am left to wonder if this isn't an artifact of
the *evolutionary* nature of ideas.   To invoke a genetic analogy...  it
is perhaps more efficient in the scheme of things for a phenotype
(scientific discipline?) to appropriate memes (terms, concepts) from
other genotypes (the scientific literature of another domain) and then
(ab)use them (let semantic drift explore the adjacent likely space of
their meaning) until they fit (well enough) to have significant
utility.   I firmly believe that this is what happened with
nonlinear/complexity science over the past 30 years or so...   it
brought disparate scientific disciplines together based on potentially
universally useful memes (mathematical and algorithmic concepts, etc.).

Fortunately, *real scientists* (tm) were involved and in many cases
applied great rigor to what we have been complaining about above... to
the casual reader, for example, of SFI white papers, it could be easy to
assume otherwise, but the point of technical papers is NOT to read them
casually, or at least NOT to jump to any significant conclusions upon a
casual reading.

mumble,
  - Steve

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com

hypothetical causes for semantic drift (was Notions of entropy)

glen ropella

Nick's "metaphor" answer is generative (even if vague).  Steve's "selection" answer is constraint-based.  So, they're in different categories.  I'll posit another generative answer: finite capacities.  As social animals, we're bred to interact, even if there's nothing to actually interact _about_.  The best interactors (idealized by gossiping over too much coffee) often seem to have no subject at all.  They wander from subject to subject, never spending enough time on any one subject to satisfy anyone, including themselves.

But when they finally tire out, they're satisfied that they interacted.

The reality of it is that every one of these interactors would _love_ to have the time, energy, IQ, databases, etc. to do a complete analysis of every subject that might come up during gossip time.  But, of course, they don't.  So, the semantic drift is purely an artifact of finite capacity ... much to the chagrin of the privileged, who have plenty of time, energy, IQ, and database access to do a more complete analysis of any issue of their choosing.

Of course, one defining feature of the geek is that, when a subject with which they're familiar comes up during gossip time, the cork is popped and out comes a gush of data ("info" is too generous a word for it).  But when the subject is not something on which they've already familiarized themselves, they shush right up.  And that self-imposed shushing is what _prevents_ them from being a good interactor.

Yes, you heard me right.  The unwillingness to yap to no end about stuff you know nothing about _prevents_ you from being a good, social, citizen ... grooming your fellow morons, ensuring them that you're part of their clan. ;-)  I, for one, go to great lengths to ensure my fellow morons that I am a member of the clan!


[hidden email] wrote at 10/12/2013 05:29 AM:
> If "we have a responsibility to try to find" anything, I think it
> is to try to find *why* some people insist on (1) glomming onto bits
> of jargon with very well-defined in-domain meanings, (2) ignoring much
> or all of those meanings while re-applying the jargon (often without
> ANY definition to speak of) in a new domain, while (3) refusing to
> let go of some (or all) of the Impressive Consequences derived in
> the original domain by derivations that (4) depend on the jettisoned
> definitions (and the rest of the technical apparatus of the original
> domain).

Nick Thompson wrote at 10/12/2013 09:52 AM:
> I [...] know, deep down, that this has something to do with the crucial role of
> metaphor in science.  Even Kuhn, right?, had something positive to say about
> having conceptual Genies escape from one scientific bottle and infect the
> next.  Perhaps I have to take a kind of pragmatist position here:  If we
> don't assume (wrongly) that all uses of a word avert to a common core, then
> we will never have the sort of conversation in which the different meanings
> get articulated and the forementioned frauds (and Freuds) and hucksters get
> exposed.


Steve Smith wrote at 10/12/2013 11:42 AM:
> I am left to wonder if this isn't an artifact of the *evolutionary* nature of ideas.   To invoke a genetic analogy...  it is perhaps more efficient in the scheme of things for a phenotype (scientific discipline?) to appropriate memes (terms, concepts) from other genotypes (the scientific literature of another domain) and then (ab)use them (let semantic drift explore the adjacent likely space of their meaning) until they fit (well enough) to have significant utility.

--
⇒⇐ glen e. p. ropella
Who cares to care when they're really scared
 


Re: hypothetical causes for semantic drift (was Notions of entropy)

Roger Critchlow-2
I would rework Steve's explanation.  Just as infants babble to learn the correct sounds for their native language by feedback, older children babble explanations to see what works.  Unfortunately, correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.  Voila, the party of tea or the birthers or the church of scientology or sociologists crafting a bespoke vocabulary for linear algebra.

I really enjoyed reading http://chronicle.com/article/Why-Cant-the-Sciencesthe/142239/ this morning.  It's all about the evidence and the reasons.

-- rec --


On Mon, Oct 14, 2013 at 10:57 AM, glen <[hidden email]> wrote:

Nick's "metaphor" answer is generative (even if vague).  Steve's "selection" answer is constraint-based.  So, they're in different categories.  I'll posit another generative answer: finite capacities.  As social animals, we're bred to interact, even if there's nothing to actually interact _about_.  The best interactors (idealized by gossiping over too much coffee) often seem to have no subject at all.  They wander from subject to subject, never spending enough time on any one subject to satisfy anyone, including themselves.

But when they finally tire out, they're satisfied that they interacted.

The reality of it is that every one of these interactors would _love_ to have the time, energy, IQ, databases, etc. to do a complete analysis of every subject that might come up during gossip time.  But, of course, they don't.  So, the semantic drift is purely an artifact of finite capacity ... much to the chagrin of the privileged, who have plenty of time, energy, IQ, and database access to do a more complete analysis of any issue of their choosing.

Of course, one defining feature of the geek is that, when a subject with which they're familiar comes up during gossip time, the cork is popped and out comes a gush of data ("info" is too generous a word for it).  But when the subject is not something on which they've already familiarized themselves, they shush right up.  And that self-imposed shushing is what _prevents_ them from being a good interactor.

Yes, you heard me right.  The unwillingness to yap to no end about stuff you know nothing about _prevents_ you from being a good, social, citizen ... grooming your fellow morons, ensuring them that you're part of their clan. ;-)  I, for one, go to great lengths to ensure my fellow morons that I am a member of the clan!


[hidden email] wrote at 10/12/2013 05:29 AM:
> If "we have a responsibility to try to find" anything, I think it
> is to try to find *why* some people insist on (1) glomming onto bits
> of jargon with very well-defined in-domain meanings, (2) ignoring much
> or all of those meanings while re-applying the jargon (often without
> ANY definition to speak of) in a new domain, while (3) refusing to
> let go of some (or all) of the Impressive Consequences derived in
> the original domain by derivations that (4) depend on the jettisoned
> definitions (and the rest of the technical apparatus of the original
> domain).

Nick Thompson wrote at 10/12/2013 09:52 AM:
> I [...] know, deep down, that this has something to do with the crucial role of
> metaphor in science.  Even Kuhn, right?, had something positive to say about
> having conceptual Genies escape from one scientific bottle and infect the
> next.  Perhaps I have to take a kind of pragmatist position here:  If we
> don't assume (wrongly) that all uses of a word avert to a common core, then
> we will never have the sort of conversation in which the different meanings
> get articulated and the forementioned frauds (and Freuds) and hucksters get
> exposed.


Steve Smith wrote at 10/12/2013 11:42 AM:
> I am left to wonder if this isn't an artifact of the *evolutionary* nature of ideas.   To invoke a genetic analogy...  it is perhaps more efficient in the scheme of things for a phenotype (scientific discipline?) to appropriate memes (terms, concepts) from other genotypes (the scientific literature of another domain) and then (ab)use them (let semantic drift explore the adjacent likely space of their meaning) until they fit (well enough) to have significant utility.

--
⇒⇐ glen e. p. ropella
Who cares to care when they're really scared
 

Re: hypothetical causes for semantic drift (was Notions of entropy)

Steve Smith
Roger -
I would rework Steve's explanation.  Just as infants babble to learn the correct sounds for their native language by feedback, older children babble explanations to see what works.  Unfortunately, correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.  Voila, the party of tea or the birthers or the church of scientology or sociologists crafting a bespoke vocabulary for linear algebra.
I do like this model of how language, even knowledge and understanding, are formed.   It is about mutation (babbling) and fitness (what works).   In children it seems (semi) obvious, as it does later in all kinds of cliques and cults of personality.   In Science, presumably, this is the scientific method: forming a hypothesis (babbling), then seeing if it works (doing an experiment, taking data, comparing it to the hypothesis).

In this light, I entertain GEPR's (Glen's) elaboration.   I have come to resonate with his description of "language as grooming", within reason.  And I use "resonate" deliberately, because I think this is the heuristic that we, the hairless apes, use as we sit about over coffee (or keyboard, or at the barber/beauty shop).  And as Glen indicated, when we have nothing to talk about, we talk on anyway, testing our current resonance... call and response...   If we fail to get a hearty "hallelujiah" from the choir, then we check our sermon, tweak it and try again, this time with more conviction.

It may be this "need to find resonance" that brought us (in part) scientific study... preaching to nature and waiting for *its* hallelujia.

I really enjoyed reading http://chronicle.com/article/Why-Cant-the-Sciencesthe/142239/ this morning.  It's all about the evidence and the reasons.

-- rec --


On Mon, Oct 14, 2013 at 10:57 AM, glen <[hidden email]> wrote:

Nick's "metaphor" answer is generative (even if vague).  Steve's "selection" answer is constraint-based.  So, they're in different categories.  I'll posit another generative answer: finite capacities.  As social animals, we're bred to interact, even if there's nothing to actually interact _about_.  The best interactors (idealized by gossiping over too much coffee) often seem to have no subject at all.  They wander from subject to subject, never spending enough time on any one subject to satisfy anyone, including themselves.

But when they finally tire out, they're satisfied that they interacted.

The reality of it is that every one of these interactors would _love_ to have the time, energy, IQ, databases, etc. to do a complete analysis of every subject that might come up during gossip time.  But, of course, they don't.  So, the semantic drift is purely an artifact of finite capacity ... much to the chagrin of the privileged, who have plenty of time, energy, IQ, and database access to do a more complete analysis of any issue of their choosing.

Of course, one defining feature of the geek is that, when a subject with which they're familiar comes up during gossip time, the cork is popped and out comes a gush of data ("info" is too generous a word for it).  But when the subject is not something on which they've already familiarized themselves, they shush right up.  And that self-imposed shushing is what _prevents_ them from being a good interactor.

Yes, you heard me right.  The unwillingness to yap to no end about stuff you know nothing about _prevents_ you from being a good, social, citizen ... grooming your fellow morons, ensuring them that you're part of their clan. ;-)  I, for one, go to great lengths to ensure my fellow morons that I am a member of the clan!


[hidden email] wrote at 10/12/2013 05:29 AM:
> If "we have a responsibility to try to find" anything, I think it
> is to try to find *why* some people insist on (1) glomming onto bits
> of jargon with very well-defined in-domain meanings, (2) ignoring much
> or all of those meanings while re-applying the jargon (often without
> ANY definition to speak of) in a new domain, while (3) refusing to
> let go of some (or all) of the Impressive Consequences derived in
> the original domain by derivations that (4) depend on the jettisoned
> definitions (and the rest of the technical apparatus of the original
> domain).

Nick Thompson wrote at 10/12/2013 09:52 AM:
> I [...] know, deep down, that this has something to do with the crucial role of
> metaphor in science.  Even Kuhn, right?, had something positive to say about
> having conceptual Genies escape from one scientific bottle and infect the
> next.  Perhaps I have to take a kind of pragmatist position here:  If we
> don't assume (wrongly) that all uses of a word avert to a common core, then
> we will never have the sort of conversation in which the different meanings
> get articulated and the forementioned frauds (and Freuds) and hucksters get
> exposed.


Steve Smith wrote at 10/12/2013 11:42 AM:
> I am left to wonder if this isn't an artifact of the *evolutionary* nature of ideas.   To invoke a genetic analogy...  it is perhaps more efficient in the scheme of things for a phenotype (scientific discipline?) to appropriate memes (terms, concepts) from other genotypes (the scientific literature of another domain) and then (ab)use them (let semantic drift explore the adjacent likely space of their meaning) until they fit (well enough) to have significant utility.

--
⇒⇐ glen e. p. ropella
Who cares to care when they're really scared
 

Re: hypothetical causes for semantic drift (was Notions of entropy)

Steve Smith
In reply to this post by Roger Critchlow-2
Roger/Glen -
I would rework Steve's explanation.  Just as infants babble to learn the correct sounds for their native language by feedback, older children babble explanations to see what works.  Unfortunately, correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.  Voila, the party of tea or the birthers or the church of scientology or sociologists crafting a bespoke vocabulary for linear algebra.
I like this description.   It is very mutation-selection and fits my experience.   Adding Glen's view of language-as-grooming (which is growing on me over time), I prefer to think in terms of resonances.

 We are (perhaps) driven to seek harmonizing notes like a barbershop quartet.   And if we have a pulpit/audience we play call-and-response.  If we don't get a consistent and confident enough round of "hallelujah" (thanks to Dean's tip about Dictionaries I found the standard spelling rather than using my own idiosyncratic choice of "hallelujia") from the crowd, we review our sermon, modify it and try again, probably with more fervor and conviction until our message (and its delivery) gets a satisfying response.  This is where it comes in handy to have your own choir to try your sermons out on (e.g. FRIAMers, teabaggers, scientologists) but as the saying in that regard implies, "too easy of an audience can be a problem".

The pursuit of Truth has an overtone of an absolute or objective rather than the mere relativism of "finding resonance with others".   Here is where I think Natural Science emerged... from the activities of humans that roughly fit the model of seeking resonance with nature, of hypothesis and experiment as call and response.   Strike one hollow tree to hear its frequencies, then strike another.

Unfortunately, capitalism and consumerism create another set of tuning forks... The "free market" (or any market, no matter how overtly or covertly manipulated or contrived) offers us resonances and those who learn to hit the right notes get (some of) its fruits.   Those who know how to manipulate its resonances get the bulk of it (to use the 1%/99% inequity argument).   So we learn to speak the "language" of the markets.  Period.

I think this is what we used to go to church for... a weekly sermon on some other counterpoint topic.  Perhaps that is why some of us come to FRIAM (in person or virtually?).

I really enjoyed reading http://chronicle.com/article/Why-Cant-the-Sciencesthe/142239/ this morning.  It's all about the evidence and the reasons.
I also read this and enjoyed it (at your recommendation here) but did not find it to be directly responsive to the topic?   It is a fascinating analysis of the "Two Cultures" discussion with the topic of "filthy lucre" thrown on the fire to fuel it yet more...

This particular vignette struck me:
When Immanuel Kant called on people to "have the courage to use their own understanding," to "dare to know," he had in mind a broad expanse of inquiries, including those in the arts and sciences, and even the testing of truth claims offered in the name of religion. Although Kant wrote before practitioners of the various inquiries distinguished themselves from one another as physicists, historians, chemists, biologists, literary scholars, economists, geologists, metaphysicians, and so on, these several Wissenschaft were nurtured significantly by the same Enlightenment imperative, by the same broad cognitive ideal.
It seems (sadly?) that there is yet another "two cultures" spread which Glen alludes to and is definitely in the air today with all of the 99% talk.   It is the haves/have-nots, the elite, the plebeians, the ignorant, the informed, the ... and the ...   .     Glen suggests that one "class" simply doesn't have the time or resources to think critically while the other does.   I think there *is* something to that, but it isn't as simple as time/$$... it is also perspective or will.

I think Roger's article speaks a little to that... the differing ideas of "whence critical thinking?".

- Steve






Re: hypothetical causes for semantic drift (was Notions of entropy)

glen ep ropella
Roger Critchlow wrote at 10/15/2013 08:24 AM:
> [...] correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.

Steve Smith wrote at 10/15/2013 10:41 AM:
> I like this description. [...] The pursuit of Truth has an overtone of an absolute or objective rather than the mere relativism of "finding resonance with others".   Here is where I think Natural Science emerged... from the activities of humans that roughly fit the model of seeking resonance with nature, of hypothesis and experiment as call and response. [...] Those who know how to manipulate it's resonances get the bulk of it (to use the 1%/99% inequity argument).

Excellent!  Roger posits a fundamental twitch at the center of the generation.  So, to sum up, we have:

1) metaphor as a source of mapping distinguishable constructs,
2) finite capacities as a source of error in such mappings,
3) a random (or mystery behind an event horizon) generator, and
4) selection for what (doesn't) work(s).

I think these fit together quite well enough to provide for some hypotheses to answer Lee's question.
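(An aside for the algorithmically inclined: the four ingredients above read like the loop of a mutation-selection model. A purely illustrative toy sketch — everything here, names and parameters alike, is invented for this post, not anything anyone in the thread proposed: each speaker holds a numeric "meaning" for a term, babbles mutated variants, and keeps a variant only if it resonates better with the crowd.)

```python
import random

def drift(generations=200, popsize=50, noise=0.2, seed=42):
    """Toy mutation-selection model of semantic drift.

    Each speaker holds a numeric 'meaning' for a term, where 0.0 stands
    for the original technical sense.  Each round a speaker babbles a
    mutated variant (the generator) and keeps it only if it lies closer
    to the current crowd average than the old meaning did (selection
    for resonance, not for truth).
    """
    rng = random.Random(seed)
    # initial meanings: noisy copies of the technical sense (finite capacity)
    meanings = [rng.gauss(0.0, noise) for _ in range(popsize)]
    for _ in range(generations):
        avg = sum(meanings) / popsize
        for i in range(popsize):
            candidate = meanings[i] + rng.gauss(0.0, noise)  # babble
            if abs(candidate - avg) < abs(meanings[i] - avg):
                meanings[i] = candidate  # the crowd said "hallelujah"
    consensus = sum(meanings) / popsize
    spread = max(meanings) - min(meanings)
    return consensus, spread

consensus, spread = drift()
```

Because selection only rewards agreement with the crowd, the population coheres (small spread) while the consensus itself wanders away from 0.0 — drift without any anchor to the original definition.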

--
glen e. p. ropella, 971-255-2847, http://tempusdictum.com
We must respect the other fellow's religion, but only in the sense and to the extent that we respect his theory that his wife is beautiful and his children smart. -- H.L. Mencken



Re: hypothetical causes for semantic drift (was Notions of entropy)

Roger Critchlow-2
In reply to this post by Steve Smith
I think the article's plea to see the liberal arts and sciences as a united front pursuing evidence and reason based explanations has something to do with Lee's rant about semantic infelicities between disciplines.  They're all doing the same thing for a fuzzy enough definition of thing.

In particular they all steal vocabulary shamelessly as they struggle to name the stuff that appears to be important, so words reappear with different meanings in different disciplines over and over again.  This makes it easy to generate interdisciplinary snark about how those barbarians murder the language and ignore the established meanings.  And it makes it hard for interdisciplinary work to proceed at all if the terminological confusion gets sufficiently messy.  

The OED's first citation for recursion is 1616.  Vector had a meaning before physicists appropriated it; it still finds technical use in a sense closer to the original Latin outside linear-algebra contexts.

And even when they use the same sub-discipline to describe the same kinds of phenomena, as when biologists use chemical thermodynamics (sometimes re-branded as "bioenergetics"), the usages can diverge because the phenomena diverge.  Though the molecules of biology are just as much molecules as the molecules of chemistry, they don't get studied in the same contexts, and the biological polypeptides, polynucleotides, and polysaccharides are pretty much left by chemists as a problem for the biologists.

I think your ability to find these sorts of semantic hiccups is only limited by your appetite to look.

-- rec --


On Tue, Oct 15, 2013 at 10:41 AM, Steve Smith <[hidden email]> wrote:
Roger/Glen -
I would rework Steve's explanation.  Just as infants babble to learn the correct sounds for their native language by feedback, older children babble explanations to see what works.  Unfortunately, correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.  Voila, the party of tea or the birthers or the church of scientology or sociologists crafting a bespoke vocabulary for linear algebra.
I like this description.   It is very mutation-selection and fits my experience.   Adding Glen's view of language-as-grooming (which is growing on me over time), I prefer to think in terms of resonances.

 We are (perhaps) driven to seek harmonizing notes like a barbershop quartet.   And if we have a pulpit/audience we play call-and-response.  If we don't get a consistent and confident enough round of "hallelujah" (thanks to Dean's tip about Dictionaries I found the standard spelling rather than using my own idiosyncratic choice of "hallelujia") from the crowd, we review our sermon, modify it and try again, probably with more fervor and conviction until our message (and its delivery) gets a satisfying response.  This is where it comes in handy to have your own choir to try your sermons out on (e.g. FRIAMers, teabaggers, scientologists) but as the saying in that regard implies, "too easy of an audience can be a problem".

The pursuit of Truth has an overtone of an absolute or objective rather than the mere relativism of "finding resonance with others".   Here is where I think Natural Science emerged... from the activities of humans that roughly fit the model of seeking resonance with nature, of hypothesis and experiment as call and response.   Strike one hollow tree to hear its frequencies, then strike another.

Unfortunately, capitalism and consumerism create another set of tuning forks... The "free market" (or any market, no matter how overtly or covertly manipulated or contrived) offers us resonances and those who learn to hit the right notes get (some of) its fruits.   Those who know how to manipulate its resonances get the bulk of it (to use the 1%/99% inequity argument).   So we learn to speak the "language" of the markets.  Period.

I think this is what we used to go to church for... a weekly sermon on some other counterpoint topic.  Perhaps that is why some of us come to FRIAM (in person or virtually?).


I really enjoyed reading http://chronicle.com/article/Why-Cant-the-Sciencesthe/142239/ this morning.  It's all about the evidence and the reasons.
I also read this and enjoyed it (at your recommendation here) but did not find it to be directly responsive to the topic?   It is a fascinating analysis of the "Two Cultures" discussion with the topic of "filthy lucre" thrown on the fire to fuel it yet more...

This particular vignette struck me:
When Immanuel Kant called on people to "have the courage to use their own understanding," to "dare to know," he had in mind a broad expanse of inquiries, including those in the arts and sciences, and even the testing of truth claims offered in the name of religion. Although Kant wrote before practitioners of the various inquiries distinguished themselves from one another as physicists, historians, chemists, biologists, literary scholars, economists, geologists, metaphysicians, and so on, these several Wissenschaft were nurtured significantly by the same Enlightenment imperative, by the same broad cognitive ideal.
It seems (sadly?) that there is yet another "two cultures" spread which Glen alludes to and is definitely in the air today with all of the 99% talk.   It is the haves/have-nots, the elite, the plebeians, the ignorant, the informed, the ... and the ...   .     Glen suggests that one "class" simply doesn't have the time or resources to think critically while the other does.   I think there *is* something to that, but it isn't as simple as time/$$... it is also perspective or will.

I think Roger's article speaks a little to that... the differing ideas of "whence critical thinking?".

- Steve






Re: hypothetical causes for semantic drift (was Notions of entropy)

Nick Thompson

Roger,

 

I have stayed out of this one, pretty much, but I want to say how much I liked this post. 

 

Hope I run into you some time.

 

Nick

 

Nicholas S. Thompson

Emeritus Professor of Psychology and Biology

Clark University

http://home.earthlink.net/~nickthompson/naturaldesigns/

 

From: Friam [mailto:[hidden email]] On Behalf Of Roger Critchlow
Sent: Wednesday, October 16, 2013 12:59 PM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] hypothetical causes for semantic drift (was Notions of entropy)

 

I think the article's plea to see the liberal arts and sciences as a united front pursuing evidence and reason based explanations has something to do with Lee's rant about semantic infelicities between disciplines.  They're all doing the same thing for a fuzzy enough definition of thing.

 

In particular they all steal vocabulary shamelessly as they struggle to name the stuff that appears to be important, so words reappear with different meanings in different disciplines over and over again.  This makes it easy to generate interdisciplinary snark about how those barbarians murder the language and ignore the established meanings.  And it makes it hard for interdisciplinary work to proceed at all if the terminological confusion gets sufficiently messy.  

 

The OED's first citation for recursion is 1616.  Vector had a meaning before physicists appropriated it; it still finds technical use in a sense closer to the original Latin outside linear-algebra contexts.

 

And even when they use the same sub-discipline to describe the same kinds of phenomena, as when biologists use chemical thermodynamics (sometimes re-branded as "bioenergetics"), the usages can diverge because the phenomena diverge.  Though the molecules of biology are just as much molecules as the molecules of chemistry, they don't get studied in the same contexts, and the biological polypeptides, polynucleotides, and polysaccharides are pretty much left by chemists as a problem for the biologists.

 

I think your ability to find these sorts of semantic hiccups is only limited by your appetite to look.

 

-- rec --

 

On Tue, Oct 15, 2013 at 10:41 AM, Steve Smith <[hidden email]> wrote:

Roger/Glen -

I would rework Steve's explanation.  Just as infants babble to learn the correct sounds for their native language by feedback, older children babble explanations to see what works.  Unfortunately, correctly formed explanations can be uninformed opinions or fallacious reasonings or imaginary evidence, and flawed as they are they can still sound true to some social population, so people get positive feedback for ridiculous explanations and build up self-consistent systems of explanations.  Voila, the party of tea or the birthers or the church of scientology or sociologists crafting a bespoke vocabulary for linear algebra.

I like this description.   It is very mutation-selection and fits my experience.   Adding Glen's view of language-as-grooming (which is growing on me over time), I prefer to think in terms of resonances.

 We are (perhaps) driven to seek harmonizing notes like a barbershop quartet.   And if we have a pulpit/audience we play call-and-response.  If we don't get a consistent and confident enough round of "hallelujah" (thanks to Dean's tip about Dictionaries I found the standard spelling rather than using my own idiosyncratic choice of "hallelujia") from the crowd, we review our sermon, modify it and try again, probably with more fervor and conviction until our message (and its delivery) gets a satisfying response.  This is where it comes in handy to have your own choir to try your sermons out on (e.g. FRIAMers, teabaggers, scientologists) but as the saying in that regard implies, "too easy of an audience can be a problem".

The pursuit of Truth has an overtone of an absolute or objective rather than the mere relativism of "finding resonance with others".   Here is where I think Natural Science emerged... from the activities of humans that roughly fit the model of seeking resonance with nature, of hypothesis and experiment as call and response.   Strike one hollow tree to hear its frequencies, then strike another.

Unfortunately, capitalism and consumerism create another set of tuning forks... The "free market" (or any market, no matter how overtly or covertly manipulated or contrived) offers us resonances, and those who learn to hit the right notes get (some of) its fruits.   Those who know how to manipulate its resonances get the bulk of it (to use the 1%/99% inequity argument).   So we learn to speak the "language" of the markets.  Period.  

I think this is what we used to go to church for... a weekly sermon on some other counterpoint topic.  Perhaps that is why some of us come to FRIAM (in person or virtually?).



 

I really enjoyed reading http://chronicle.com/article/Why-Cant-the-Sciencesthe/142239/ this morning.  It's all about the evidence and the reasons.

I also read this and enjoyed it (at your recommendation here) but did not find it to be directly responsive to the topic?   It is a fascinating analysis of the "Two Cultures" discussion with the topic of "filthy lucre" thrown on the fire to fuel it yet more...

This particular vignette struck me:

When Immanuel Kant called on people to "have the courage to use their own understanding," to "dare to know," he had in mind a broad expanse of inquiries, including those in the arts and sciences, and even the testing of truth claims offered in the name of religion. Although Kant wrote before practitioners of the various inquiries distinguished themselves from one another as physicists, historians, chemists, biologists, literary scholars, economists, geologists, metaphysicians, and so on, these several Wissenschaften were nurtured significantly by the same Enlightenment imperative, by the same broad cognitive ideal.

It seems (sadly?) that there is yet another "two cultures" spread which Glen alludes to and is definitely in the air today with all of the 99% talk.   It is the haves/have-nots, the elite, the plebeians, the ignorant, the informed, the ... and the ...   Glen suggests that one "class" simply doesn't have the time or resources to think critically while the other does.   I think there *is* something to that, but it isn't as simple as time/$$... it is also perspective or will. 

I think Roger's article speaks a little to that... the differing ideas of "whence critical thinking?".

- Steve




============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com

 



Re: hypothetical causes for semantic drift (was Notions of entropy)

Roger Critchlow-2
Nature publishes a letter:

http://www.nature.com/nature/journal/v502/n7471/full/502303d.html Communication: Metaphors advance scientific research

which references a perspective:


and another letter:

http://www.nature.com/nature/journal/v502/n7470/full/502170c.html Engineering: Biologists borrow more than words

The perspective was scolding biologists for nonsense like "selfish genes" and "books of life"; the letters protest that some metaphors carry more useful content than that sort.

So even moneymaking publishing enterprises get sucked into these discussions now and then,

-- rec --
