Posted by Russell Standish
URL: http://friam.383.s1.nabble.com/Notions-of-entropy-tp7584006p7584012.html
On Fri, Oct 11, 2013 at 11:08:18PM +0200, Jochen Fromm wrote:
> Nice to see the list is still alive :-) Entropy as
> information in disguise. Interesting. Isn't Entropy
> related to disorder, that is to say lack of information?
>
> -J.
Something like that. The exact relationship is
S + I = S_max
where S is entropy, I is information, and S_max is the log of the total number
of states the system could be in.
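As a minimal numerical sketch of that identity (assuming Shannon entropy measured
in bits, i.e. log base 2, over a small made-up 8-state distribution):

    import numpy as np

    def entropy_bits(p):
        """Shannon entropy of a discrete distribution, in bits."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                  # zero-probability states contribute nothing
        return -np.sum(p * np.log2(p))

    # A system with 8 possible states, not all equally likely.
    p = np.array([0.5, 0.25, 0.125, 0.0625, 0.0625, 0.0, 0.0, 0.0])

    S_max = np.log2(len(p))           # log of total number of states = 3 bits
    S = entropy_bits(p)               # entropy of the actual distribution
    I = S_max - S                     # information, from S + I = S_max

    print(S, I, S_max)                # 1.875 + 1.125 == 3.0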
Information is sometimes said to be "negentropy", because
\Delta I = - \Delta S
when your system size (and hence S_max) remains constant over time.
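Continuing the sketch above, a quick check of that statement: hold the number of
states fixed and let the distribution relax toward fully mixed, so any change in
S is exactly cancelled by the change in I.

    # Same 8-state system, later spread out uniformly (entropy increases).
    # S_max is unchanged, so Delta I = -Delta S.
    q = np.full(8, 1.0 / 8)           # fully mixed: S = S_max, I = 0

    dS = entropy_bits(q) - entropy_bits(p)
    dI = (np.log2(8) - entropy_bits(q)) - (np.log2(8) - entropy_bits(p))

    print(dS, dI)                     # dI == -dS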
--
----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics
[hidden email]
University of New South Wales
http://www.hpcoders.com.au
----------------------------------------------------------------------------
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe
http://redfish.com/mailman/listinfo/friam_redfish.com