Re: Notions of entropy

Posted by Jochen Fromm-5
URL: http://friam.383.s1.nabble.com/Notions-of-entropy-tp7584006p7584013.html

From what I remember, the entropy S is given by
S = k ln(W), where k is Boltzmann's constant and
W is the number of microstates compatible with
the macrostate. Ordered states are compatible
with fewer microstates, and so have lower entropy.
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
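
A minimal sketch in Python to make the formula
concrete (the function name and example microstate
counts are illustrative, not tied to any particular
physical system):

    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K

    def boltzmann_entropy(W):
        # S = k ln(W), with W the number of microstates
        # compatible with the macrostate
        return k_B * math.log(W)

    # A perfectly ordered macrostate has a single
    # microstate, so its entropy is zero; more
    # microstates mean more entropy.
    print(boltzmann_entropy(1))     # 0.0
    print(boltzmann_entropy(1e23))  # ~7.3e-22 J/K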

-J.

On 10/11/2013 11:38 PM, Russell Standish wrote:

> On Fri, Oct 11, 2013 at 11:08:18PM +0200, Jochen Fromm wrote:
>> Nice to see the list is still alive :-) Entropy as
>> information in disguise. Interesting. Isn't entropy
>> related to disorder, that is to say, a lack of information?
>>
>> -J.
> Something like that. The exact relationship is
>
> S + I = S_max
>
> where S is entropy, I is information, and S_max is the log of the total
> number of states the system could be in.
>
> Information is sometimes said to be "negentropy", because
>
> \Delta I = - \Delta S
>
> when the system size, and hence S_max, remains constant over time.
>
>
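
Russell's identity above is easy to check with a toy
example. Working in base-2 logs so everything is in
bits (the n-bit register below is my own illustration,
not from the original message):

    import math

    n = 8                    # an 8-bit register: 2**n possible states
    S_max = math.log2(2**n)  # log of the total number of states = n bits

    # If we have pinned down `known` bits, the state is
    # confined to 2**(n - known) possibilities; that
    # remaining uncertainty is the entropy.
    for known in range(n + 1):
        I = known                      # information held (bits)
        S = math.log2(2**(n - known))  # remaining uncertainty (bits)
        assert S + I == S_max

    # Learning one more bit raises I by 1 and lowers S by 1,
    # i.e. \Delta I = -\Delta S while S_max stays fixed.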


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com