info and entro


info and entro

Nick Thompson
All --

Since Mike Oliker's post may have gotten lost in a midden of earlier
messages, I take the opportunity to repost in part here.

"My background is Chemical Engineering, which focusses on chemical
thermodynamics, transport, and kinetics. I've heard of Shannon et al, but
I'm not at all solid on the analogy. Perhaps there is a creative
opportunity here. What is the analogy? Is there something conserved in the
information world? In a communication system you might have a finite
bandwidth and a given amount of noise. Mostly, in complex systems,
information can grow and arise because there is a waterfall of free energy
in the background being degraded. My mind can juggle a fraction of a
trillion bits of information, while consuming trillions of trillions of
molecules of glucose. That's why information can seem to multiply and grow
in the mind without limit."

MY RESPONSE:

OH GOSH!  I never expected you to ASK ME a question.  Now we are in deep water.

Here is the little I know.  

Shannon and Whassiz (or was it  Whoosiz?) were concerned with the
degradation of messages as they passed down noisy channels.  Think of Morse
code.  There is one binary bit in every dah or dit, right?  (I am not going
to worry about "spaces" today, if you don't mind).  So before he hears the
dah or the dit, the receiver of the message has one bit of uncertainty.
Now if the channel is working perfectly the pressing of a dah by the sender
reduces the uncertainty of the receiver by one bit.  The basic idea in
information theory is that noise in the channel decreases the power of the
message sender to reduce the uncertainty of the message receiver.  And this
loss of power can be calculated.   I think uncertainty and entropy are
calculated in similar ways, but here I am WAY out of my depth.  
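
I said the loss of power can be calculated, so here is a little sketch of
the calculation, entirely my own and not anything from Shannon.  It assumes
the simplest noisy channel there is, the so-called binary symmetric channel,
in which each symbol gets flipped with some probability p:

    # A sketch (mine, for illustration) of how noise eats into the one
    # bit a dah or dit could deliver, assuming a binary symmetric channel
    # in which each symbol is flipped with probability p.
    from math import log2

    def h2(p):
        """Binary entropy in bits: the uncertainty left by flip chance p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bits_delivered(p):
        """With dahs and dits equally likely, the sender offers 1 bit;
        noise claws back h2(p) of it, so the receiver's uncertainty is
        reduced by only 1 - h2(p) bits per symbol."""
        return 1.0 - h2(p)

    for p in (0.0, 0.01, 0.1, 0.5):
        print(f"flip chance {p}: {bits_delivered(p):.3f} bits per dah/dit")

A perfectly quiet channel delivers the whole bit; a channel that flips half
the symbols delivers exactly nothing, which is the loss of power I meant.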

I think the prevailing view among the sources I have been reading is that
there has been TOO MUCH creativity in the use of the notion of entropy.
Some of this may be Rifkin Hatred,  found in the more general DSM-IV
category, Royalty Envy.  My own view is that it is a metaphor, and like all
metaphors, requires careful specification of its limitations of application
(what the philosopher of science Lakatos called its negative heuristic).

Let's see if others have anything to say.  I know there is a range of opinion
on this subject, but I also think it is the sort of subject that most
people discuss avidly for a while and then pick an opinion and go with it
for the rest of their lives.
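
One more aside, on Mike's glucose arithmetic.  The usual bridge between the
two entropies is Landauer's bound: erasing one bit costs at least kT ln 2 of
free energy.  Here is a back-of-the-envelope sketch; the body temperature
and the 2870 kJ/mol figure for glucose oxidation are standard textbook
numbers I am supplying, not anything from Mike's post:

    # Back-of-the-envelope: how many bits could one glucose molecule pay
    # for at the Landauer limit?  Constants are my assumptions.
    from math import log

    k_B = 1.380649e-23               # Boltzmann constant, J/K
    T = 310.0                        # body temperature, K
    dG = 2.87e6 / 6.022e23           # free energy per glucose molecule, J

    landauer = k_B * T * log(2)      # minimum cost to erase one bit, J
    print(f"Landauer bound at 310 K: {landauer:.2e} J per bit")
    print(f"one glucose molecule:    {dG:.2e} J")
    print(f"bits per molecule at the limit: {dG / landauer:.0f}")

That comes out to something like sixteen hundred bits per molecule in
principle, which is why, as Mike says, brains burning trillions of trillions
of molecules make information look as though it multiplies without limit.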

By the way, I should know who Whoosiz is.  It's either Bray, Wiener, or
.... or...... DAMN!


Nick


Nicholas S. Thompson
Professor of Psychology and Ethology
Clark University
[hidden email]
http://home.earthlink.net/~nickthompson/




info and entro

Roger Critchlow-2
Nicholas Thompson wrote:
>
> Shannon and Whassiz (or was it  Whoosiz?) were concerned with the
[ ... ]
>
> By the way, I should know who Whoosiz is.  It's either Bray, Wiener, or
> .... or...... DAMN!

Weaver.  Warren Weaver.

-- rec --