On Fri, Oct 11, 2013 at 11:08:18PM +0200, Jochen Fromm wrote:
> Nice to see the list is still alive :-) Entropy as
> information in disguise. Interesting. Isn't entropy
> related to disorder, that is to say, a lack of information?
>
> -J.
Something like that. The exact relationship is
S + I = S_max
where S is entropy, I is information, and S_max is the log of the total
number of states the system could be in.
Information is sometimes said to be "negentropy", because
\Delta I = - \Delta S
when the total number of possible states (and hence S_max) stays
constant over time.