Quote of the week


Re: Quote of the week

Marcos
A fascinating thing for me is that the amount of surprise (i.e.
information) is like the creation of a *knowledge gradient* that
compares in an interesting way to energy gradients within
thermodynamics.  And one might suggest that *observation* can
counteract the 2nd Law of Thermodynamics by transforming an energy
gradient into an observational/informational one.  E.g., the
observation of a fire-cracker exploding confers a large amount of
information to the conscious observer/listener (especially if they
never knew of such things) whilst the physical energy in the system
has been dissipated.
This new type of gradient can't really be measured in the physical
sense as the brain has stored it as a *pattern*, so it sits orthogonal
to the physical one.  Further, this new [informational] gradient now
affects the behavior of the participant, so one might ask (again) what
is the relationship between consciousness and the evolution of the
universe?

Also, each fire-cracker explosion, whilst seemingly the same each
time, must be an exceedingly novel event at some level of perception
finer than cognition; otherwise, it seems, we wouldn't continue to
repeat it hundreds of times.  So the brain seems to be parsing an
enormous amount of information from each explosion....

There's probably a better example than a fire-cracker....

Marcos

On Sat, Jun 11, 2011 at 7:09 AM, Tom Johnson <[hidden email]> wrote:
> I certainly would be interested.  I have issues with Claude's work and what
> I think is its misconstrued application and definition, at least beyond
> physics.
>
> -tj

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: Quote of the week

Tom Johnson
Yes, but that firecracker -- as data not information -- needs to be understood in some context of space/time.  A firecracker in my backyard on a 4th of July afternoon is quite different from a firecracker of equal size thrown at cops during a riot.

Could it be that what you call an "observational/informational gradient" is what I call context?

-tj

On Sat, Jun 18, 2011 at 8:46 PM, Marcos <[hidden email]> wrote:
A fascinating thing for me is that the amount of surprise (i.e.
information) is like the creation of a *knowledge gradient* that
compares in an interesting way to energy gradients within
thermodynamics.  And one might suggest that *observation* can
counteract the 2nd Law of Thermodynamics by transforming an energy
gradient into an observational/informational one.  E.g., the
observation of a fire-cracker exploding confers a large amount of
information to the conscious observer/listener (especially if they
never knew of such things) whilst the physical energy in the system
has been dissipated.
This new type of gradient can't really be measured in the physical
sense as the brain has stored it as a *pattern*, so it sits orthogonal
to the physical one.  Further, this new [informational] gradient now
affects the behavior of the participant, so one might ask (again) what
is the relationship between consciousness and the evolution of the
universe?

Also, each fire-cracker explosion, whilst seemingly the same each
time, must be an exceedingly novel event at some level of perception
finer than cognition; otherwise, it seems, we wouldn't continue to
repeat it hundreds of times.  So the brain seems to be parsing an
enormous amount of information from each explosion....

There's probably a better example than a fire-cracker....

Marcos

On Sat, Jun 11, 2011 at 7:09 AM, Tom Johnson <[hidden email]> wrote:
> I certainly would be interested.  I have issues with Claude's work and what
> I think is its misconstrued application and definition, at least beyond
> physics.
>
> -tj

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org



--
==========================================
J. T. Johnson
Institute for Analytic Journalism   --   Santa Fe, NM USA
www.analyticjournalism.com
505.577.6482(c)                                    505.473.9646(h)
http://www.jtjohnson.com                  [hidden email]
==========================================

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: Quote of the week

Marcos
On Sat, Jun 18, 2011 at 9:53 PM, Tom Johnson <[hidden email]> wrote:
> Yes, but that firecracker -- as data not information -- needs to be
> understood in some context of space/time.  A firecracker in my backyard on a
> 4th of July afternoon is quite different from a firecracker of equal size
> thrown at cops during a riot.
>
> Could it be that what you call an "observational/informational gradient" is
> what I call context?

No, I don't think so.  The notion of "context" exists within the
domain of the cognitive, although within that domain, one might
imagine that there are domains of gradients of their own which exist
in the social sphere.

But in this case, I'm talking at the level of raw data.  In the same
way that potential and kinetic energy mirror each other (in the sense
that their total at any given time is constant), the total sum E
(energy) + H (information), similarly, always stays constant within a
closed system.

So in the given example, the actual physical, energetic vibrations are
turned into data by tickling the fine hairs of the human listener.
And, furthermore, it would seem that the brain was the universe's
answer to the "entropy problem": we seem naturally inclined to keep
repeating explosion after explosion because, at some level deeper
than the cognitive, the brain is cataloging all that data and
rewarding us (at least boys) for the novelty (in the
information-theoretic sense) that it confers with each explosion,
even though there's hardly anything new at our own cognitive level.
Consciousness was nature's way of solving the problem of "the heat
death of the universe"; or, alternately, those universes which didn't
have observers simply died out long ago and we're one that remained.

marcos

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: Quote of the week

Eric Charles
In reply to this post by Tom Johnson
I think Tom is right that the path to solving mysteries like this is often to look outward rather than inward. Part of the point of William James's somewhat mysterious "Stream of Consciousness" expositions was to point out that, at the most basic level, experience is a unified whole; that is, the experience of "the firecracker at the ball game after a win" is more basic than the experience of "firecracker". While it is useful for some purposes, it is unnatural to break up experience and consider individual "experienced things" in isolation. Thus, there is novelty to be found not just in the difficult-to-discern differences between each firecracker, but also between firecrackers-in-context.

Eric

On Sat, Jun 18, 2011 11:27 PM, Marcos <[hidden email]> wrote:
On Sat, Jun 18, 2011 at 9:53 PM, Tom Johnson <[hidden email]> wrote:
> Yes, but that firecracker -- as data not information -- needs to be
> understood in some context of space/time.  A firecracker in my backyard
> on a 4th of July afternoon is quite different from a firecracker of equal
> size thrown at cops during a riot.
>
> Could it be that what you call an "observational/informational gradient"
> is what I call context?

No, I don't think so.  The notion of "context" exists within the
domain of the cognitive, although within that domain, one might
imagine that there are domains of gradients of their own which exist
in the social sphere.

But in this case, I'm talking at the level of raw data.  In the same
way that potential and kinetic energy mirror each other (in the sense
that their total at any given time is constant), the total sum E
(energy) + H (information), similarly, always stays constant within a
closed system.

So in the given example, the actual physical, energetic vibrations are
turned into data by tickling the fine hairs of the human listener.
And, furthermore, it would seem that the brain was the universe's
answer to the "entropy problem": we seem naturally inclined to keep
repeating explosion after explosion because, at some level deeper
than the cognitive, the brain is cataloging all that data and
rewarding us (at least boys) for the novelty (in the
information-theoretic sense) that it confers with each explosion,
even though there's hardly anything new at our own cognitive level.
Consciousness was nature's way of solving the problem of "the heat
death of the universe"; or, alternately, those universes which didn't
have observers simply died out long ago and we're one that remained.

marcos

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org


Eric Charles

Professional Student and
Assistant Professor of Psychology
Penn State University
Altoona, PA 16601



============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Uncertainty vs Information - redux and resolution

Grant Holland
In reply to this post by Owen Densmore
In a thread early last month I was doing my thing of "stirring the pot" by making noise about the equivalence of 'information' and 'uncertainty' - and I was quoting Shannon to back me up.

We all know that the two concepts are ultimately semantically opposed - if for no other reason than uncertainty adds to confusion and information can help to clear it up. So, understandably, Owen - and I think also Frank - objected somewhat to my equating them. But I was able to overwhelm the thread with more Shannon quotes, so the thread kinda tapered off.

What we all were looking for, I believe, is for Information Theory to back up our common usage and support the notion that information and uncertainty are, indeed, semantically opposite; while at the same time they are both measured by the same function: Shannon's version of entropy (which is also Gibbs' formula with some constants established).

Of course, Shannon does equate them - at least mathematically so, if not semantically so. Within the span of three sentences in his famous 1948 paper, he uses the words "information", "uncertainty" and "choice" to describe what his concept of entropy measures. But he never does get into any semantic distinctions among the three - only that all three are measured by the same formula.

Even contemporary information theorists like Vlatko Vedral, Professor of Quantum Information Science at Oxford, appear to be of no help with any distinction between 'information' and 'uncertainty'. In his 2010 book Decoding Reality: The Universe as Quantum Information, he traces the notion of information back to the ancient Greeks.

"The ancient Greeks laid the foundation for its (information) development when they suggested that the information content of an event somehow depends only on how probable this event really is. Philosophers like Aristotle reasoned that the more surprised we are by an event the more information the event carries....

Following this logic, we conclude that information has to be inversely proportional to probability, i. e. events with smaller probability carry more information...." 

But it was the Russian probability theorist A. I. Khinchin who provides us with the satisfaction we seek. Seeing that the Shannon paper (bless his soul) lacked both mathematical rigor and satisfying semantic justifications, he set about to put the situation right with his slim but essential little volume entitled The Mathematical Foundations of Information Theory (1957). He manages to make the pertinent distinction between 'information' and 'uncertainty' most cleanly in this single paragraph. (By "scheme" Khinchin means "probability distribution".)

"Thus we can say that the information given us by carrying out some experiment consists of removing the uncertainty which existed before the experiment. The larger this uncertainty, the larger we consider to be the amount of information obtained by removing it. Since we agreed to measure the uncertainty of a finite scheme A by its entropy, H(A), it is natural to express the amount of information given by removing this uncertainty by an increasing function of the quantity H(A)....

Thus, in all that follows, we can consider the amount of information given by the realization of a finite scheme to be equal to the entropy of the scheme."

On 6/6/11 8:17 AM, Owen Densmore wrote:
Nick: Next time you are in town, let's read the original Shannon paper together.  Alas, it is a bit long, but I'm told it's a Good Thing To Do.

	-- Owen

On Jun 6, 2011, at 7:44 AM, Nicholas Thompson wrote:

Grant,
 
This seems backwards to me, but I got properly thrashed for my last few postings so I am putting my hat over the wall very carefully here.
 
I thought … I thought … the information in a message was the number of bits by which the arrival of the message decreased the uncertainty of the receiver.  So, let's say you are sitting awaiting the result of a coin toss, and I am on the other end of the line flipping the coin.  Before I say "heads" you have 1 bit of uncertainty; afterwards, you have none.
 
The reason I am particularly nervous about saying this is that it, of course, holds out the possibility of negative information.   Some forms of communication, appeasement gestures in animals, for instance, have the effect of increasing the range of behaviors likely to occur in the receiver.  This would seem to correspond to a negative value for the information calculation. 
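[A quick sketch of this point, assuming base-2 logs and invented behaviour labels and probabilities: a particular signal can leave the receiver with more entropy than before (a negative pointwise "information gain"), even though the expected reduction in uncertainty, averaged over all possible signals, can never be negative.]

from math import log2

def entropy(probs):
    # Shannon entropy, in bits, of a finite probability distribution
    return -sum(p * log2(p) for p in probs if p > 0)

# Receiver's distribution over its own next behaviours before the signal
# (illustrative numbers only):
prior = [0.9, 0.1]        # e.g. {flee, stay}

# After an appeasement gesture, a wider range of behaviours becomes likely:
posterior = [0.5, 0.5]

gain = entropy(prior) - entropy(posterior)
print(f"H(before) = {entropy(prior):.3f} bits")        # ~0.469
print(f"H(after)  = {entropy(posterior):.3f} bits")    # 1.000
print(f"pointwise information gain = {gain:.3f} bits") # negative here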
 
Nick
From: [hidden email] [[hidden email]] On Behalf Of Grant Holland
Sent: Sunday, June 05, 2011 11:07 PM
To: The Friday Morning Applied Complexity Coffee Group; Steve Smith
Subject: Re: [FRIAM] Quote of the week
 
Interesting note on "information" and "uncertainty"...

Information is Uncertainty. The two words are synonyms.

Shannon called it "uncertainty", contemporary Information theory calls it "information".

It is often thought that the more information there is, the less uncertainty. The opposite is the case.

In Information Theory (aka the mathematical theory of communication), the degree of information I(E) - or uncertainty U(E) - of an event is measurable as a decreasing function of its probability, as follows:

U(E) = I(E) = log( 1/Pr(E) ) = log(1) - log( Pr(E) ) = -log( Pr(E) ).

Considering I(E) as a random variable, Shannon's entropy is, in fact, the first moment (or expectation) of I(E): Shannon entropy = E[ I(E) ].
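[A minimal sketch of the two statements above, assuming base-2 logs (so the units are bits) and an invented distribution: the surprisal of each outcome is -log2 of its probability, and Shannon's entropy is the probability-weighted average (expectation) of those surprisals.]

from math import log2

def surprisal(p):
    # I(E) = log(1/Pr(E)) = -log(Pr(E)), here in bits
    return -log2(p)

def entropy(probs):
    # H = expectation of the surprisal over the scheme
    return sum(p * surprisal(p) for p in probs if p > 0)

scheme = [0.5, 0.25, 0.125, 0.125]   # an illustrative "finite scheme"
for p in scheme:
    print(f"Pr = {p:<6} I = {surprisal(p):.3f} bits")
print(f"H (expected surprisal) = {entropy(scheme):.3f} bits")  # 1.750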

Grant

On 6/5/2011 2:20 PM, Steve Smith wrote:
 

"Philosophy is to physics as pornography is to sex. It's cheaper, it's easier and some people seem to prefer it."

Modern Physics is contained in Realism, which is contained in Metaphysics, which is contained in all of Philosophy.

I'd be tempted to counter:
"Physics is to Philosophy as the Missionary Position is to the Kama Sutra"

Physics also appeals to Phenomenology and Logic (the branch of Philosophy where Mathematics is rooted), and what we can know scientifically is constrained by Epistemology (the nature of knowledge) and Phenomenology (the nature of conscious experience).

It might be fair to say that many (including many of us here) who hold Physics up in some exalted position simply dismiss or choose to ignore all the messy questions considered by  *the rest of* philosophy.   Even if we think we have clear/simple answers to the questions, I do not accept that the questions are not worthy of the asking.

The underlying point of the referenced podcast is, in fact, that Physics, or Science in general, might be rather myopic and limited by its own viewpoint by definition.

 "The more we know, the less we understand."

Philosophy is about understanding; physics is about knowledge first, and understanding only insomuch as it is a part of natural philosophy.

Or at least this is how my understanding is structured around these matters.

- Steve

On Sun, Jun 5, 2011 at 1:15 PM, Robert Holmes [hidden email] wrote:
From the BBC's science podcast "The Infinite Monkey Cage":

"Philosophy is to physics as pornography is to sex. It's cheaper, it's easier and some people seem to prefer it."
 
Not to be pedantic, but I suspect that s/he has conflated "philosophy" with "new age", as much of science owes itself to philosophy.
 
marcos
 

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Uncertainty vs Information - redux and resolution

Grant Holland
In reply to this post by Owen Densmore
In a thread early last month I was doing my thing of "stirring the pot" by making noise about the equivalence of 'information' and 'uncertainty' - and I was quoting Shannon to back me up.

We all know that the two concepts are ultimately semantically opposed - if for no other reason than uncertainty adds to confusion and information can help to clear it up. So, understandably, Owen - and I think also Frank - objected somewhat to my equating them. But I was able to overwhelm the thread with more Shannon quotes, so the thread kinda tapered off.

What we all were looking for, I believe, is for Information Theory to back up our common usage and support the notion that information and uncertainty are, in some sense, semantically opposite; while at the same time they are both measured by the same function: Shannon's version of entropy (which is also Gibbs' formula with some constants established).

Of course, Shannon does equate information and uncertainty - at least mathematically so, if not semantically so. Within the span of three sentences in his famous 1948 paper, he uses the words "information", "uncertainty" and "choice" to describe what his concept of entropy measures. But he never does get into any semantic distinctions among the three - only that all three are measured by entropy.

Even contemporary information theorists like Vlatko Vedral, Professor of Quantum Information Science at Oxford, appear to be of no help with any distinction between 'information' and 'uncertainty'. In his 2010 book Decoding Reality: The Universe as Quantum Information, he traces the notion of information back to the ancient Greeks.

"The ancient Greeks laid the foundation for its [information's] development when they suggested that the information content of an event somehow depends only on how probable this event really is. Philosophers like Aristotle reasoned that the more surprised we are by an event the more information the event carries....
Following this logic, we conclude that information has to be inversely proportional to probability, i. e. events with smaller probability carry more information...." 
But a simple inverse proportional formula like I(E) = 1/Pr(E), where E is an event, does not suffice as a measure of 'uncertainty/information', because it does not ensure the additivity of independent events. (We really like additivity in our measuring functions.) The formula needs to be tweaked to give us that.

Vedral does the tweaking for additivity and gives us the formula used by Information Theorists to measure the amount of 'uncertainty/information' in a single event. The formula is I(E) =  log (1/Pr(E)). (Any base will do.) It is interesting that if this function is treated as a random variable, then its first moment (expected value) is Shannon's formula for entropy.
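[A quick numerical check of the additivity point, with arbitrary illustrative probabilities: for independent events the log measure adds, while the naive 1/Pr measure does not.]

from math import log2, isclose

p_a, p_b = 0.5, 0.25       # two independent events (illustrative values)
p_ab = p_a * p_b           # Pr(A and B) under independence

def naive(p):
    # inversely proportional to probability, but not additive
    return 1 / p

def info(p):
    # the log measure used in Information Theory
    return log2(1 / p)

print(naive(p_ab), naive(p_a) + naive(p_b))   # 8.0 vs 6.0 -> not additive
print(info(p_ab), info(p_a) + info(p_b))      # 3.0 vs 3.0 -> additive
assert isclose(info(p_ab), info(p_a) + info(p_b))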

But it was the Russian probability theorist A. I. Khinchin who provided us with the satisfaction we seek. Seeing that the Shannon paper (bless his soul) lacked both mathematical rigor and satisfying semantic justifications, he set about to put the situation right with his slim but essential little volume entitled The Mathematical Foundations of Information Theory (1957). He manages to make the pertinent distinction between 'information' and 'uncertainty' most cleanly in this single passage. (By "scheme" Khinchin means "probability distribution".)
"Thus we can say that the information given us by carrying out some experiment consists of removing the uncertainty which existed before the experiment. The larger this uncertainty, the larger we consider to be the amount of information obtained by removing it. Since we agreed to measure the uncertainty of a finite scheme A by its entropy, H(A), it is natural to express the amount of information given by removing this uncertainty by an increasing function of the quantity H(A)....
Thus, in all that follows, we can consider the amount of information given by the realization of a finite scheme [probability distribution] to be equal to the entropy of the scheme."
So, when an experiment is "realized" (the coin is flipped or the die is rolled), the uncertainty inherent in it "becomes" information. And there seems to be a conservation principle here. The amount of "stuff" inherent in the uncertainty prior to realization is conserved after realization when it becomes information.
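[A small illustration of that bookkeeping, using Nick's coin toss (quoted earlier in the thread) and a fair die, with base-2 logs: the entropy of the scheme before realization equals the information credited to the realization, so the total is conserved.]

from math import log2

def entropy(probs):
    # Shannon entropy, in bits, of a finite scheme
    return -sum(p * log2(p) for p in probs if p > 0)

for name, scheme in [("coin toss", [0.5, 0.5]),
                     ("fair die ", [1/6] * 6)]:
    before = entropy(scheme)  # uncertainty of the scheme prior to realization
    after = 0.0               # the outcome is known; no uncertainty remains
    gained = before - after   # information given by the realization
    print(f"{name}: H before = {before:.3f} bits, "
          f"H after = {after:.3f}, information gained = {gained:.3f} bits")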

Fun.

Grant


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: Uncertainty vs Information - redux and resolution

Eric Charles
In reply to this post by Owen Densmore
That is potentially fascinating. However, it is not terribly interesting to state that we can establish a conservation principle merely by giving a name to the absence of something, and then pointing out that if we start with a set amount of that something, and take it away in chunks, then the amount that is there plus the amount that is gone always equals the amount we started with. What is the additional insight?

Eric

On Wed, Jul 20, 2011 04:27 PM, Grant Holland <[hidden email]> wrote:
In a thread early last month I was doing my thing of "stirring the pot" by making noise about the equivalence of 'information' and 'uncertainty' - and I was quoting Shannon to back me up.

We all know that the two concepts are ultimately semantically opposed - if for no other reason than uncertainty adds to confusion and information can help to clear it up. So, understandably, Owen - and I think also Frank - objected somewhat to my equating them. But I was able to overwhelm the thread with more Shannon quotes, so the thread kinda tapered off.

What we all were looking for, I believe, is for Information Theory to back up our common usage and support the notion that information and uncertainty are, in some sense, semantically opposite; while at the same time they are both measured by the same function: Shannon's version of entropy (which is also Gibbs' formula with some constants established).

Of course, Shannon does equate information and uncertainty - at least mathematically so, if not semantically so. Within the span of three sentences in his famous 1948 paper, he uses the words "information", "uncertainty" and "choice" to describe what his concept of entropy measures. But he never does get into any semantic distinctions among the three - only that all three are measured by entropy.

Even contemporary information theorists like Vlatko Vedral, Professor of Quantum Information Science at Oxford, appear to be of no help with any distinction between 'information' and 'uncertainty'. In his 2010 book Decoding Reality: The Universe as Quantum Information, he traces the notion of information back to the ancient Greeks.

"The ancient Greeks laid the foundation for its [information's] development when they suggested that the information content of an event somehow depends only on how probable this event really is. Philosophers like Aristotle reasoned that the more surprised we are by an event the more information the event carries....
Following this logic, we conclude that information has to be inversely proportional to probability, i. e. events with smaller probability carry more information...." 
But a simple inverse proportional formula like I(E) = 1/Pr(E), where E is an event, does not suffice as a measure of 'uncertainty/information', because it does not ensure the additivity of independent events. (We really like additivity in our measuring functions.) The formula needs to be tweaked to give us that.

Vedral does the tweaking for additivity and gives us the formula used by Information Theorists to measure the amount of 'uncertainty/information' in a single event. The formula is I(E) =  log (1/Pr(E)). (Any base will do.) It is interesting that if this function is treated as a random variable, then its first moment (expected value) is Shannon's formula for entropy.

But it was the Russian probability theorist A. I. Khinchin who provided us with the satisfaction we seek. Seeing that the Shannon paper (bless his soul) lacked both mathematical rigor and satisfying semantic justifications, he set about to put the situation right with his slim but essential little volume entitled The Mathematical Foundations of Information Theory (1957). He manages to make the pertinent distinction between 'information' and 'uncertainty' most cleanly in this single passage. (By "scheme" Khinchin means "probability distribution".)
"Thus we can say that the information given us by carrying out some experiment consists of removing the uncertainty which existed before the experiment. The larger this uncertainty, the larger we consider to be the amount of information obtained by removing it. Since we agreed to measure the uncertainty of a finite scheme A by its entropy, H(A), it is natural to express the amount of information given by removing this uncertainty by an increasing function of the quantity H(A)....
Thus, in all that follows, we can consider the amount of information given by the realization of a finite scheme [probability distribution] to be equal to the entropy of the scheme."
So, when an experiment is "realized" (the coin is flipped or the die is rolled), the uncertainty inherent in it "becomes" information. And there seems to be a conservation principle here. The amount of "stuff" inherent in the uncertainty prior to realization is conserved after realization when it becomes information.

Fun.

Grant

Eric Charles

Professional Student and
Assistant Professor of Psychology
Penn State University
Altoona, PA 16601



============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org

Re: Uncertainty vs Information - redux and resolution

Grant Holland
Eric,

True enough. And yet, this is what Information Theory has decided to do: treat the amount of information that gets realized by performing an experiment as the same as the amount of uncertainty from which it was "liberated". That way, they can use entropy as the measure of both.

I'm personally sympathetic to an argument that they are not equivalent. My predilection suggests that there is more value in the uncertainty that exists before the experiment than there is in the information that results afterwards. I would expect there would be others who would put more value on the "liberated" information.

But I would have to put a lot more thought than I have into formalizing this.

I like your observation. It opens up the possibility of re-doing Information Theory, and ending up with one measure for uncertainty and another for information. And we could finally depose the word "entropy"!

Grant

On 7/20/11 3:18 PM, ERIC P. CHARLES wrote:
That is potentially fascinating. However, it is not terribly interesting to state that we can establish a conservation principle merely by giving a name to the absence of something, and then pointing out that if we start with a set amount of that something, and take it away in chunks, then the amount that is there plus the amount that is gone always equals the amount we started with. What is the additional insight?

Eric

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org