# Information from energy flow?

Discussion in 'Physics & Math' started by spandrel, Sep 13, 2015.

1. ### spandrel (Registered Member)

I'm sure I read somewhere that the flow of energy can give rise to information. Does this ring any bells?


3. ### origin, "Heading towards oblivion" (Valued Senior Member)

Well, the flow of energy from a hot body gives you information about the temperature of the body.
The flow of energy through my cell phone tells me what to pick up at the store on my way home.

Do you mean something different than this?

5. ### exchemist (Valued Senior Member)

Apart from Origin's suggestion, I can think of two other possible senses in which this might be true. One is that I was told at school that analysing the energy and how it changes is one of the most powerful techniques for solving simple problems in mechanics. The other is more nebulous, to do with the connection between statistical thermodynamics and Shannon's information theory. But in that case it is not so much about flow as about a relation between disorder, as defined by entropy, and information.


7. ### danshawen (Valued Senior Member)

In order to observe something by means of our senses, energy of some sort must be absorbed and analysed, in our case by sensory organs and neurons. So, yes, this could be one way to define "information."

The only trouble I see with this definition is that a great deal of the energy that "flows", as you put it, does not fall on anything that is capable of observation; a stone, for instance, absorbs both heat and light from the environment, so energy "flows", and the only result is a stone that is slightly warmer. Until or unless something or someone notices that it is warmer, what real difference does it make? Likewise, a long printout of billions of bits of information means nothing to anyone until or unless someone takes the trouble to read it or otherwise make use of it.

Last edited: Sep 21, 2015
8. ### spandrel (Registered Member)

It's more to do with statistical thermodynamics than with Shannon's theories: the spontaneous appearance of complex ordered systems, that sort of thing, rather than our perception or transmission of any information. The ultimate example is the evolution of life. The problem is that complexity as a discipline is so fuzzy that it has been hijacked by the business gurus, so there's a load of nonsense out there.

9. ### exchemist (Valued Senior Member)

Aha, now I begin to understand what you may be driving at. This is interesting, because I've come across (wrong-headed) arguments from creationists in the past that life cannot have arisen naturally, because the increase in order of living things implies a decrease in entropy, which would violate the 2nd Law of TD.

Yes, it is true that order can arise spontaneously in nature, provided there is a flow of energy driving it that results in an overall increase in entropy. A simple example is water freezing into ice. The order in the crystal lattice reflects a decrease in entropy of the water as it freezes. BUT, and this is the point, the freezing process releases the latent heat of fusion from the water, and this is low-temperature heat that goes into the environment. As you may be aware, dS = dQ/T, i.e. the lower the temperature at which heat flows, the greater the entropy increase associated with it. So the total entropy of ice + environment does not decrease. If the process is spontaneous, it increases; if it takes place (theoretically) reversibly, there is no change in total entropy.
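The entropy bookkeeping in that freezing example can be sketched numerically. This is my own illustration, not from the thread: the latent heat of fusion and the choice of a -10 °C environment are assumed textbook-style figures, picked only to show that the environment's entropy gain outweighs the water's entropy loss.

```python
# Entropy bookkeeping for water freezing at 0 degC and releasing its
# latent heat into a colder environment, using dS = dQ/T.
# All numerical values are illustrative assumptions.

L_FUSION = 334.0   # J/g, approximate latent heat of fusion of water
T_FREEZE = 273.15  # K, freezing point of water
T_ENV = 263.15     # K, an assumed environment at -10 degC

mass = 100.0  # g of water freezing (illustrative)

q = mass * L_FUSION           # heat released by the water, in J
dS_water = -q / T_FREEZE      # entropy LOST by the water as the lattice orders
dS_env = q / T_ENV            # entropy GAINED by the colder environment
dS_total = dS_water + dS_env  # net change; positive for this spontaneous process

print(f"dS_water = {dS_water:+.1f} J/K")
print(f"dS_env   = {dS_env:+.1f} J/K")
print(f"dS_total = {dS_total:+.1f} J/K")
```

Because T_ENV is lower than T_FREEZE, the same quantity of heat q produces a larger entropy gain in the environment than the entropy lost by the water, so dS_total comes out positive, consistent with the 2nd Law.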

In the more complex case of life, as an organism develops from a fertilised egg, its order and complexity grow, creating a zone of high order that grows in size. Nutrients are taken in and converted to ordered biochemistry, reducing entropy inside the organism. But the organism is alive: it respires and metabolises, and this releases - yes, you've guessed it - low-temperature heat into the environment. When it comes to evolution, one has thousands upon thousands of individual life-cycles of generations of organisms. Each one creates order as it grows, but increases the entropy of the environment as it does so - and then it dies, leading to destruction of its accumulated order (more entropy increase), apart from any egg or sperm cells it has released to continue the process. These egg or sperm cells may, perhaps, be infinitesimally more complex than the egg or sperm cells of the parent, due to evolutionary change, but such an increase is absolutely tiny compared to the increase in entropy of the environment that takes place over the lifetime of the organism. This entropy increase is due, as you say, to the heat flows involved in the biochemical processes of life.

Last edited: Sep 22, 2015
10. ### spandrel (Registered Member)

Thanks, exchemist, that's the sort of thing I meant. I'm sure there was a site that gave a treatment of entropy changes as changes of information, but I can't find it now. And yes, it would be useful in answering creationist claims that information cannot come from nowhere. Not that they take any notice, but others read those threads too.

11. ### exchemist (Valued Senior Member)

Yes, I do not pretend to understand the concept of "Shannon" or "information" entropy in any detail but, from what I have read, this is a different use of the term from that used in thermodynamics. There is apparently a connection, but it appears to me dangerous, and very likely inappropriate, to try to equate the two.

I also very much agree with you that the statistical view of entropy (S = k ln W, etc.) tends to catch the imagination of lay people and get distorted in popular science presentations. It can get hijacked by people trying to elevate common sense to an academic discipline, as many business gurus seem to. Understanding what W really is, in Boltzmann's formula, is not trivial (and I'm rusty on the definitions myself).
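The connection between the two uses of "entropy" mentioned above can be shown in a toy sketch (my own illustration, not from the thread). Assuming W equally likely microstates, Boltzmann's S = k ln W and Shannon's H = -Σ p log₂ p count the same multiplicity, just in different units:

```python
import math

# Toy comparison of Boltzmann entropy and Shannon entropy for a system
# with W equally likely microstates. W = 8 is an arbitrary illustration.

K_B = 1.380649e-23  # J/K, Boltzmann constant (exact SI value)

W = 8                # number of equally likely microstates
p = [1.0 / W] * W    # uniform probability distribution over the microstates

S_boltzmann = K_B * math.log(W)                    # thermodynamic entropy, J/K
H_shannon = -sum(pi * math.log2(pi) for pi in p)   # information entropy, bits

print(f"S = k ln W = {S_boltzmann:.3e} J/K")
print(f"H = {H_shannon:.1f} bits")
# For a uniform distribution the two differ only by a constant factor:
# S = k * ln(2) * H, i.e. the choice of logarithm base and of units.
```

This is exactly why equating the two wholesale is risky: the formal link holds for probability distributions over microstates, but thermodynamic entropy carries units and physical context that a bit count does not.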
