Nick,

Part of the confusion is that thermodynamics deals with energy, not numbers. A random sequence of specific, unchanging numbers has no more or less entropy than a series of zeros. In thermodynamic entropy, energy is distributed randomly over a vast number of states, constantly switching from one state to another (e.g., from one specific arrangement of gas molecules to another), which limits the energy's availability and usefulness.

Thermodynamics is a precise accounting for energy. The food and the dung each have a free energy relative to being CO2 and H2O in air at a known temperature and pressure. Whether we have cows or beetles to do the conversion has no impact on the calculation. Gasoline would be poisonous to both, but it has a precise free energy relative to its combustion products. It is a fundamental law of thermodynamics that these calculations are NOT path dependent; they are functions of state.

When entropy is translated to apply to information, what takes the place of energy? What is conserved? It is worth remembering that information as we perceive it has no thermodynamic significance: a paper with useful information will burn and release exactly the same energy as a paper with gibberish written on it, as long as the amount of paper and ink is the same.

-Mike Oliker
(505) 821-3407
mad scientist

Message: 3
Date: Mon, 22 Nov 2004 19:45:54 -0500
From: "Nicholas Thompson" <[hidden email]>
Subject: [FRIAM] intentionality and entropy

All,

But allowing, for a moment, for the world of objects to house a metaphor for energy quality, consider the following model. Ask your computer to splash out points randomly along a long straight line. For kicks, allow the line to be of infinite length. Would you not say that the entropy of the points along that line is pretty much a hundred percent? OK, now rotate the line on your computer so that it is end on. Now all the points appear on top of one another. Would you not say that the entropy is pretty near zero percent?
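The rotate-the-line experiment can actually be run. The sketch below (not from the thread; binned Shannon entropy over a finite segment stands in for the informal "percent entropy", and the bin count of 50 is an arbitrary choice) splashes points uniformly along a line, then projects the line end-on so every point lands at the same coordinate:

```python
import numpy as np

rng = np.random.default_rng(0)

def binned_entropy(values, bins=50):
    """Shannon entropy (in bits) of values histogrammed into equal-width bins."""
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins: 0*log(0) treated as 0
    return float(-(p * np.log2(p)).sum())

# Points "splashed out" randomly along a line segment.
points = rng.uniform(0.0, 1.0, size=10_000)
print(binned_entropy(points))        # near the maximum, log2(50) ~ 5.64 bits

# View the line "end on": every point projects onto the same coordinate.
projected = np.zeros_like(points)
print(binned_entropy(projected))     # 0.0 bits -- only one occupied bin
```

The measured entropy depends entirely on the viewing axis, even though the set of points never changed, which is precisely the puzzle the next paragraph generalizes to higher dimensions.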
And couldn't this principle be expanded to more and more dimensions, so that we could never be sure whether, beyond the dimension we are currently looking at, there is a dimension from which all the points seem grouped together and so have zero entropy, and/or a dimension from which the points appear splayed out and so have maximal entropy? And since the number of dimensions is infinite, could we not suppose that for any set of objects there will always be one dimension from which their entropy is zero and one dimension from which their entropy is maximal?

I get to this confusion through thinking about the dung fly. All the time the cow is wandering around the field, it can be thought of as gathering high-quality resources for itself and degrading them. The degraded product is of course the dung, which is precisely ordered for the advantage of the dung fly. So even as the entropy of the stuff in the cow's gut is being increased from the point of view of the cow, it is being decreased from the point of view of the dung fly. If this way of thinking makes any sense, then entropy is an intentional construct.

Nicholas S. Thompson
Professor of Psychology and Ethology
Clark University
[hidden email]
http://home.earthlink.net/~nickthompson/