Robert Rosen

not enough of Robert Rosen

glen ep ropella

Marcus G. Daniels on 01/08/2008 04:11 PM:
> It seems to me it's the language that's important, and how suitable that
> language is to the environment at hand.
> That's not to say there aren't new useful primitives to be discovered.

It's not the language.  It's not any element of the language.

What's important is the ability to form, use, and abandon languages (at
will, obviously).

And any system where the language is fixed will be fragile to ambiguity
_because_ of Gödel's result.

The only thing remaining is whether (and how much) contact and
interaction with the environment provides what's needed for forming,
using, and abandoning languages.  If, as may be the case, all
assemblages of formal systems merely amount to a more complicated formal
system, then even an assemblage won't do what we're after.  But if the
world is somehow "supra-computation", then perhaps sporadic interactions
with the environment can help a computer resolve unexpected exceptions
gracefully.

From that perspective the phrase "holarchy of formal systems" may well
be self-contradictory and only reality is capable of forming holarchies.

--
glen e. p. ropella, 971-219-3846, http://tempusdictum.com
Arms are the only true badge of liberty. The possession of arms is the
distinction of a free man from a slave. -- Andrew Fletcher




not enough of Robert Rosen

Joost Rekveld
Glen,

I missed part of this thread and please feel free to ignore my  
questions if I make you repeat things, but there's two things in your  
reply I don't get:

- what does 'fragile to ambiguity' mean ?
- what would a 'holarchy of formal systems' look like ? Isn't a
holarchy a structure where influence is not only top-down but also  
bottom-up ? And how could any such bi-directionality ever exist in  
some kind of nesting of formal systems ?

ciao,

Joost.


On Jan 9, 2008, at 2:41 AM, Glen E. P. Ropella wrote:

> And any system where the language is fixed will be fragile to  
> ambiguity
> _because_ of Gödel's result.
>
> The only thing remaining is whether (and how much) contact and
> interaction with the environment provides what's needed for forming,
> using, and abandoning languages.  If, as may be the case, all
> assemblages of formal systems merely amount to a more complicated  
> formal
> system, then even an assemblage won't do what we're after.  But if the
> world is somehow "supra-computation", then perhaps sporadic  
> interactions
> with the environment can help a computer resolve unexpected exceptions
> gracefully.
>
> From that perspective the phrase "holarchy of formal systems" may
> well
> be self-contradictory and only reality is capable of forming  
> holarchies.



-------------------------------------------

                             Joost Rekveld
-----------    http://www.lumen.nu/rekveld

-------------------------------------------

“This alone I ask you, O reader, that when you peruse the
account of these marvels that you do not set up for yourself
as a standard human intellectual pride, but rather the great
size and vastness of earth and sky; and, comparing with
that Infinity these slender shadows in which miserably and
anxiously we are enveloped, you will easily know that I have
related nothing which is beyond belief.”
(Girolamo Cardano)

-------------------------------------------








not enough of Robert Rosen

Marcus G. Daniels
In reply to this post by glen ep ropella
Glen E. P. Ropella wrote:

> What's important is the ability to form, use, and abandon languages (at
> will, obviously).
>
> And any system where the language is fixed will be fragile to ambiguity
> _because_ of Gödel's result.
>
> The only thing remaining is whether (and how much) contact and
> interaction with the environment provides what's needed for forming,
> using, and abandoning languages.  If, as may be the case, all
> assemblages of formal systems merely amount to a more complicated formal
> system, then even an assemblage won't do what we're after.  But if the
> world is somehow "supra-computation", then perhaps sporadic interactions
> with the environment can help a computer resolve unexpected exceptions
> gracefully.
>
> From that perspective the phrase "holarchy of formal systems" may well
> be self-contradictory and only reality is capable of forming holarchies.
>  
Well, I don't think requiring that a formal system be grounded in
semantics leads to a hopeless cascade of unintelligible stigmergic
relations, incomparable to others.   At least from the point of view of
building working control systems...



not enough of Robert Rosen

Phil Henshaw-2
In reply to this post by glen ep ropella

The observation that a robot, with or without unchanging primitive
elements of design, can potentially be capable of learning because it
operates in a world that is itself complex and indefinably changing
seems a close match to what's happening, and seems both theoretically and
operationally valid.  That it becomes impossible to say how, because it is
impossible to specify the interactions or fully understand the responses,
is relevant too.

At the moment, because robots are extensions of our thinking, 'their'
way of organically adapting is that we watch how they work and build new
ones.   That works perfectly well as a natural system!   'Hands off'
robot evolution is far more limited, because the way we build machines
is with invariant primitives of many kinds.

The root problem is that equations are *so* self-sufficient that they
have no environmental interaction at all, and that all natural systems
are entirely different on that count.   Natural systems actually all
directly *grow out of* the environments they will continue to interact
with.  That equations don't do that and robots don't do that, except as
the man-equation & man-robot couples they are, is the rub.  

If explaining why then seems to be an ill-posed question, might that
itself be considered highly useful information too, about what information
can explain vs. what we can only point to?



Phil Henshaw
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
680 Ft. Washington Ave
NY NY 10040                      
tel: 212-795-4844                
e-mail: pfh at synapse9.com          
explorations: www.synapse9.com    
-- "it's not finding what people say interesting, but finding what's
interesting in what they say" --


> -----Original Message-----
> From: friam-bounces at redfish.com
> [mailto:friam-bounces at redfish.com] On Behalf Of Glen E. P. Ropella
> Sent: Tuesday, January 08, 2008 8:10 PM
> To: The Friday Morning Applied Complexity Coffee Group
> Subject: Re: [FRIAM] not enough of Robert Rosen
>
> I'm going to violate the bottom-post rule because all 3 of
> the following excerpts focus on the point I made (in response
> to Günther) that there's a difference between "computation"
> as the software that runs on a machine and the machine, itself.
>
> When we talk about "computation", are we talking about a
> concrete _thing_ that exists out there in the world?  Or are
> we talking about an abstract machine that exists only in our
> minds (or software as the case may be)?
>
> Marcus' comments show that he's talking about the former...
> computers are real machines that can avail themselves of the
> full machinery of reality.  Hence, that type of "computation"
> isn't limited in the way RR suggests because that's not what
> "computability" refers to.
>
> A robot that can change itself based on sensory-motor
> interactions with the real world is not a computer in the
> same sense as a universal Turing machine.
>
> This distinction provides plenty of fodder for long arguments
> and confusion between Rosenites.  Some even say that an
> extant, concrete machine in the real world actually is
> complex_rr in the same sense that a rock or a mountain is
> (but not a tree or a cat).  Others vehemently deny that.  The
> former seem to submit to degrees of complexity_rr whereas the
> others seem to think it's bivalent.
>
> So, I already asked this; but, the conversation really needs
> a clear understanding of what we mean by "computation".  
> Perhaps we could split it into two categories:  computation_c
> would indicate the activities of a concrete machine and
> computation_a would indicate the (supposed) activities of a
> universal Turing machine.
>
> Joost Rekveld on 01/08/2008 02:13 PM:
> > isomorphism could be possible), but from what I understand from
> > Rosen, Pattee, Pask and Cariani is that novelty in a real, non-
> > platonic (let's say Aristotelic ?) world has to do with the
> > appearance of new primitives: new symbols with new meanings in a new
> >  syntax. The construction of symbols in the real world is an open-
> > ended process, which is why no isomorphism with a closed, formal
> > system is possible.
>
> Marcus G. Daniels on 01/08/2008 02:52 PM:
> > I don't see why this must be so.   One could imagine that a
> robot had
> > a field programmable gate array that could, in effect, burn
> an all new
> > processor and bring it online.  But, usually when new computer
> > architectures are being developed, the developers just write a
> > software simulator for it in initial stages (that mimics
> the intended
> > physics of the hardware design). Even the adiabatic quantum
> computer
> > people at DWave are using existing silicon process technologies to
> > design circuits..
>
> Joost Rekveld on 01/08/2008 03:24 PM:
> > I guess the crucial difference is that such a
> self-constructing robot  
> > would be grounded in the real world and not in a
> prespecified computed
> > universe. It would be able to evolve its own computed universe. I'm
> > not sure what to think of all this, but I like Cariani's
> ideas a lot
> > and so far I haven't found any basic flaw in them. But, as
> said, being
> > non-schooled in these matters that doesn't necessarily mean
> very much.
>
> --
> glen e. p. ropella, 971-219-3846, http://tempusdictum.com 
> Government never furthered any enterprise but by the alacrity
> with which it got out of its way. -- Henry David Thoreau
>





not enough of Robert Rosen

glen ep ropella
In reply to this post by Joost Rekveld

Joost Rekveld on 01/08/2008 06:14 PM:
> I missed part of this thread and please feel free to ignore my  
> questions if I make you repeat things, but there's two things in your  
> reply I don't get:

Sorry for the delay.  I had several neglected conversations I had to
catch up on.... other communities were blissfully free of my annoying
presence... can't let that happen.

> - what does 'fragile to ambiguity' mean ?

Ambiguity is a pretty large word.  But for this purpose, it means an
unanswerable question to which one needs an answer.  The typical logical
problem would be a situation where neither "true" nor "false" is the
right answer; instead the answer lies somewhere in between.  (This
generalizes to the vernacular sense of the word "ambiguity", but that's
not necessary for now.)

When a system comes to such a problem, it has to "handle" it.  There are
various ways of handling it, of course.  It might just choose randomly.
 It might be endowed with some fuzzy decision making method.  It might
have some heuristics available.  Etc.

If, however, it can't handle it, i.e. if the system _must_ have an
answer and can't find a way to coerce the answer into "false" or
"true", then it is fragile to that particular ambiguous situation.

If there exists even a single ambiguous situation to which a system is
fragile, then it is fragile to ambiguity.
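
A minimal sketch of the idea, in Python; the three-valued Answer type
and the two toy systems below are purely hypothetical illustrations of
"handling" vs. fragility, not anything anyone has actually built:

from enum import Enum

class Answer(Enum):
    TRUE = 1
    FALSE = 2
    UNDECIDED = 3   # the ambiguous case: neither "true" nor "false" fits

def fragile_system(answer: Answer) -> bool:
    # A fixed language that only admits TRUE or FALSE.  Being forced to
    # coerce UNDECIDED into one of them is exactly where it breaks.
    if answer is Answer.UNDECIDED:
        raise RuntimeError("must answer, but cannot coerce the ambiguity")
    return answer is Answer.TRUE

def handling_system(answer: Answer) -> bool:
    # One of the handling strategies mentioned above: fall back to a
    # heuristic (here just a default) rather than failing outright.
    if answer is Answer.UNDECIDED:
        return False   # a random choice or fuzzy rule could go here instead
    return answer is Answer.TRUE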

> - what would a 'holarchy of formal systems' look like ? Isn't a
> holarchy a structure where influence is not only top-down but also  
> bottom-up ? And how could any such bi-directionality ever exist in  
> some kind of nesting of formal systems ?

Yes, composition and decomposition are both complete in a holarchy.  (We
do have to allow that a holarchy might be a platonic ideal... but let's
just assume they exist in reality for now.)

As we practice math (and programming) now, it seems like we have a
holarchy already.  Whenever we come to an axiom (A_1) or an irreducible
theorem in our current systems, we (humans) can hop outside of that
formal system and build a finer-grained formal system wherein we can
derive A_1 from smaller building blocks.  There doesn't seem to be a
limit to our (human) ability to do that.  Likewise, when we reach an
undecidable proposition (P_1) in any particular formal system (F_1), we
can hop outside F_1 and either modify it by adding P_1 to the axioms to
create F_1', build a different formal system F_2 wherein we can derive
P_1, or build a different formal system F_3 that holds P_1 as an axiom.
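
To make the hopping move concrete, here is a toy sketch in Python.  The
FormalSystem class, its trivial derives() check, and the literal strings
standing in for A_1 and P_1 are hypothetical conveniences, not real
proof theory:

class FormalSystem:
    # Toy model: a "formal system" is just a named set of axioms plus a
    # deliberately trivial derivability check.
    def __init__(self, name, axioms):
        self.name = name
        self.axioms = set(axioms)

    def derives(self, proposition):
        # Stand-in for a real derivation procedure: a proposition is
        # "derivable" here only if it is literally an axiom.
        return proposition in self.axioms

    def extended_with(self, proposition, new_name):
        # The "hop": step outside this system and build a new one that
        # takes the formerly undecidable proposition as an axiom.
        return FormalSystem(new_name, self.axioms | {proposition})

F1 = FormalSystem("F_1", {"A_1", "A_2"})
P1 = "P_1"                       # undecidable in F_1 (not derivable here)
assert not F1.derives(P1)
F1_prime = F1.extended_with(P1, "F_1'")
assert F1_prime.derives(P1)      # valid in F_1', with different grounding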

Now, when we do such things, A_1 and P_1 have different _meanings_ (by
definition) because different formal systems have different semantic
grounding.  But, somehow, as humans, we can do it and, in doing it,
satisfy ourselves of the reasonableness of A_1 or P_1.  We may not be
able to say A_1 or P_1 are sound (true in reality).  But they are valid
according to their respective formal systems and if those formal systems
seem reasonable, then we can just accept it and get on with our work.


The fact that we, as humans, can engage in such system-hopping, at will,
and that there doesn't seem to be a limit on when/where/how-often we can
do it, leads one to believe that we have access to a formal systems
holarchy.  (If you're a platonist, the holarchy already exists and we
humans just explore it.  If you're a constructivist, then we're
constructing the holarchy as we go along.)

I hope that helps.  Sorry if it doesn't answer your question directly.

--
glen e. p. ropella, 971-219-3846, http://tempusdictum.com
Never ascribe to malice that which is adequately explained by
incompetence.  -- attributed to Napoleon Bonaparte




not enough of Robert Rosen

glen ep ropella
In reply to this post by Phil Henshaw-2

Phil Henshaw on 01/09/2008 08:18 AM:

> The root problem is that equations are *so* self-sufficient that they
> have no environmental interaction at all, and that all natural systems
> are entirely different on that count.   Natural systems actually all
> directly *grow out of* the environments they will continue to interact
> with.  That equations don't do that and robots don't do that, except as
> the man-equation & man-robot couples they are, is the rub.  
>
> If explaining why then seems to be an ill-posed question, might that
> itself be considered highly useful information too, about what information
> can explain vs. what we can only point to?

[grin]  Well, the way _I_ parse what you've said, I agree.  The problem
lies in the assumption of duality... the idea that we can _ever_ cleanly
separate inference from causality.

All models are always false.  The only access to truth is through
embedded interaction with the ambient context.

Hence, until/unless we build robots whose mind is directly and naturally
composed from the environment (like our minds are), the robots will be
fragile to ambiguity.

I take some exception to your use of the phrase "grow out of" and the
word "grown".  It conflates too many things.  For example, I think we
could, in principle, remove _time_ from the process and instantaneously
come up with a fully grounded robot mind.  I.e. I think it's logically
(though perhaps not physically) possible.  Other implications of
"growth" we might be able to remove are particular types of accumulation,
like iteration.  It may not be necessary that each stage be fully
dependent on the stage that immediately precedes it.  I.e. perhaps
s_n = f(s_{n-2}, s_{n-4}, ..., s_0), so that only every other stage
contributes to the "grown" thing.

In any case, the term "growth" is too vague.

Also, "growth" may not be necessary at all.  The flaw in the mind/body
duality doesn't lie with how the system came about, necessarily.  The
flaw is that there is no such thing as a complete logical abstraction
layer.  One cannot completely separate thought from reality (and vice
versa), inference from causality, chemistry from physics.

Any "computer" where the software _cannot_ puncture the logical
abstraction layer and modify the hardware will be fragile to ambiguity.
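
Purely as an illustration (in Python, with a made-up "rules" table
standing in for whatever sits beneath the abstraction layer), the
contrast is between a system that can only consult fixed rules and one
that is allowed to rewrite them when something unforeseen shows up:

class FixedSystem:
    # The rules are frozen at build time; an unexpected query is fatal.
    RULES = {"yes": True, "no": False}

    def answer(self, query: str) -> bool:
        return self.RULES[query]          # KeyError on anything unforeseen

class SelfModifyingSystem:
    # The same table, but the "software" may puncture the layer and
    # extend it when it meets a case its fixed language never named.
    def __init__(self):
        self.rules = {"yes": True, "no": False}

    def answer(self, query: str) -> bool:
        if query not in self.rules:
            self.rules[query] = False     # provisional, revisable later
        return self.rules[query]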

--
glen e. p. ropella, 971-219-3846, http://tempusdictum.com
If you disclose the solution to the mystery you are simply depriving the
other seekers of an important source of energy. -- Conchis, "The Magus"
by John Fowles


