# What is entropy and what is information?

Discussion in 'Physics & Math' started by arfa brane, Jul 28, 2017.

1. ### arfa brane (call me arf), Valued Senior Member

Is entropy really just a quantification of something that isn't known about a closed system, and is information only then defined by a difference in entropy, i.e. by relative entropies?

Why is there a problem with thermodynamic entropy being defined by heat and differences in heat, when one can also say that entropy is equivalent to what isn't known about particle momentum (or, for instance in a gas, what isn't known about particle velocities)?

To even be able to say there is an entropy, doesn't there have to be a closed system, and don't you have to define probabilities? Is entropy even definable without probability?

3. ### exchemist, Valued Senior Member


According to this, there is no getting away from the fact that thermodynamic and information entropy are not quite the same concept, although it shows they are closely related ideas.

I very much liked the following sentence: " To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions needed to be answered in order to fully specify the microstate, given that we know the macrostate."

This addresses something I have always felt uncomfortable about in discussions on information and entropy, namely the faint suggestion that there may be something subjective about it, in the sense of "information" being - at least in common parlance - something that a conscious entity possesses. But the above makes clear it is an objective property, related to the number of possibilities open to the system.

Last edited: Jul 28, 2017

5. ### arfa brane (call me arf), Valued Senior Member

Let's look at that sentence:
" To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions needed to be answered in order to fully specify the microstate, given that we know the macrostate."

An optimally chosen yes/no question has a probability of 0.5 of either answer, so each answer yields at most one bit. Entropy does seem to be a thing that requires probability, because to even be able to define probabilities in a system it must be bounded, or what physicists call closed. Otherwise the entropy would be infinite, right?
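The question-counting picture can be sketched numerically. A minimal Python sketch (the function name is my own, not from any quoted source): Shannon entropy in bits, which for N equally likely microstates comes out to log2(N), the number of optimally chosen yes/no questions needed to pin down the microstate.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) over a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 8 equally likely microstates: H = log2(8) = 3 bits,
# i.e. 3 optimal yes/no questions fully specify the microstate.
print(shannon_entropy_bits([1/8] * 8))  # 3.0
```

With a certain outcome (one probability equal to 1) the entropy is zero bits: nothing is left to ask.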

Or roughly, the entropy is what you don't know (yet), given what you do know about such a system. But you have to have some information: say, there is a gas in a constant-volume container, with pressure P and temperature T. You also know that if the gas is made up of discrete particles, their number must be constant. What you don't know is their velocities, say (among other things).

The thermodynamic entropy can in principle be represented by this unknown information, in bits, because any physical system in any state can in principle be represented by a string of bits.

So yeah, entropy has to be one of those known unknowns. You know . . .

Last edited: Jul 29, 2017

7. ### arfa brane (call me arf), Valued Senior Member

But note that has nothing to do with a physical system being in one state or another.

If I have a pair of dice, I can ask if they show a pair of sixes, say; that can only be true or false, although the probability is 1/36.
Or with a coin, again I can ask if heads is showing, but the coin can be biased.
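The dice and biased-coin examples can be checked numerically. A small sketch (function name is mine): bias lowers the entropy of the coin below the fair coin's one bit.

```python
import math

def coin_entropy_bits(p_heads):
    """Entropy of a possibly biased coin, in bits."""
    terms = (p_heads, 1 - p_heads)
    return -sum(p * math.log2(p) for p in terms if p > 0)

print(1 / 36)                  # chance both dice show six: 1 outcome of 36
print(coin_entropy_bits(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(coin_entropy_bits(0.9))  # ~0.47 bits: a biased coin carries less entropy
```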

8. ### arfa brane (call me arf), Valued Senior Member

Thinking about black holes, their entropy is proportional to the surface area of the horizon. This horizon doesn't exist for someone falling through it, but it does exist for external observers, who can "see" that the entropy is 2-dimensional.
Or at least that's what the classical theory says.

And so it must be (?) that the infalling observer, interacting with the gravitational field, has fewer degrees of freedom, owing to the motion towards the singularity.

The different observers are, in a sense, computing the same thing but in different contexts, and there's some quantum algorithm we don't fully understand yet.

Last edited: Jul 30, 2017

9. ### Michael 345 (New year. PRESENT is 69 years old), Valued Senior Member

I tend to think in simple terms because that's the only way Huey, Dewey and Louie operate

Entropy is a PROCESS where, whatever the start state is, it will become more disorganised the longer it is in existence, unless energy is applied to the system to force it to become more organised

10. ### exchemist, Valued Senior Member

I don't think I would agree that "roughly, the entropy is what you don't know (yet)". This seems to be reintroducing exactly the subjective language I find objectionable. There is no suggestion that by getting values for all the variables you somehow decrease the entropy.

11. ### arfa brane (call me arf), Valued Senior Member

A physical system doesn't "have entropy", measurements define what the entropy is:
--http://rsta.royalsocietypublishing.org/content/374/2063/20150230#sec-4

12. ### exchemist, Valued Senior Member

This strikes me as unmitigated rubbish. At least it is if the "entropy" this author is speaking about is thermodynamic entropy.

It must be wrong, surely, to claim that a physical system "does not have entropy" and that it varies according to what is measured? The entropy change in a heat engine or in a chemical reaction, that is to say, thermodynamic entropy, is entirely objective in nature. dS = dQ_rev/T is not in any way a subjective definition, so far as I can see.

There are tables of standard entropy changes for chemical reactions that you can look up. These have been objectively determined: https://en.wikipedia.org/wiki/Thermodynamic_databases_for_pure_substances

and are used daily by working chemists. I have no doubt that similar tables exist in engineering.

I have a horrible feeling you are working towards a conflation of entropy and the uncertainty principle. I am convinced this is a really bad idea.

Last edited: Jul 31, 2017

13. ### exchemist, Valued Senior Member

I wouldn't call it a process. It is a property of a system that can be calculated from measurement of energy change and temperature: dS = dQ_rev/T. The process you are thinking of is that in the most common processes, namely the irreversible or "spontaneous" ones, the entropy S always increases.
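For a concrete number, here is a toy calculation of dS = dQ_rev/T for a reversible isothermal process. The figures for melting ice are rounded textbook values, used only as an illustration:

```python
# dS = dQ_rev / T for a reversible isothermal process.
# Illustration: melting 1 kg of ice at 273.15 K; latent heat ~334 kJ/kg.
q_rev = 334_000.0   # J, heat absorbed reversibly
temp = 273.15       # K, constant during the phase change
delta_s = q_rev / temp
print(round(delta_s, 1))  # about 1222.8 J/K
```

Nothing subjective enters: the heat and temperature are measured quantities, and the entropy change follows from them.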

15. ### arfa brane (call me arf), Valued Senior Member

Information is whatever you want it to be, but it has to be encoded physically; there has to be one or more 'systems of particles'.

That is, it has to be stored or remembered; if there are unknowns, these can't be information. Information has to be knowable, but knowable is just another word for stored somewhere. Entropy is a known, or at least bounded, unknown, so entropy is not information.

I can arbitrarily decide that some subset of 16-bit binary numbers is information, the rest isn't, it's random garbage. In fact that's what I do if I write a 16-bit program . . .

16. ### TheFrogger, Valued Senior Member

Entropy is without discipline and therefore is without direction. It exists in every direction and everywhere.

17. ### arfa brane (call me arf), Valued Senior Member

In every direction and everywhere in what system? Entropy is system-dependent in that it depends on how you define the system.

Entropy cannot be physical, although thermodynamic entropy has physical units; that's a consequence of defining it in terms of the exchange of heat. If information is always physical, then something unknown isn't, because it isn't information.

Last edited: Aug 1, 2017

18. ### exchemist, Valued Senior Member

I've never heard of Jaynes, but this sounds nonsensical. As there is no context provided, though, it is hard to be categorical.

Why not engage with the issues I raise yourself, instead of just producing dodgy or out of context quotes from other people?

19. ### Tonyc, Registered Member


Quite clearly, if you were to type the word entropy into Google, you would get a link displayed like this one: https://en.wikipedia.org/wiki/Entropy

Now this will tell you what entropy really is, in the context the rest of us understand. Is entropy what we all have defined it to be? Yes.

Now as for information, we all use the quoted definition. The word information does not change in meaning when discussing entropy.

20. ### Confused2, Registered Senior Member

Looking at (say) a litre of gas: on the one hand the 'information' would appear to require knowledge of the path of every molecule, and on the other hand the 'information' can be reduced to the statistics of N (where N is large) identical molecules. Is Arfa's point that the 'information' content depends on whether or not you choose to tag every molecule and treat it individually?
https://en.wikipedia.org/wiki/Maxwell–Boltzmann_distribution
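The statistical compression Confused2 describes can be illustrated: instead of tracking every molecule's path, the Maxwell–Boltzmann distribution summarises the gas with a few parameters, e.g. the most probable speed v_p = sqrt(2·k_B·T/m). A minimal sketch (the nitrogen molecular mass is a rounded standard value):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def most_probable_speed(mass_kg, temp_k):
    """Peak of the Maxwell-Boltzmann speed distribution: v_p = sqrt(2*k_B*T/m)."""
    return math.sqrt(2 * k_B * temp_k / mass_kg)

# One N2 molecule is about 4.65e-26 kg; at 300 K the peak is ~422 m/s.
print(round(most_probable_speed(4.65e-26, 300)))
```

Two numbers (mass and temperature) stand in for ~10^22 individual trajectories; that is the trade-off between the microstate description and the statistical one.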

21. ### arfa brane (call me arf), Valued Senior Member

What issues have you raised, though? The issue that thermodynamic entropy "must be objective" because entropy is determined objectively, which is to say, via measurement?

Then the claim that information theory is a theory of the relative states of measuring devices is nonsensical? I can't see how that can possibly follow.
But you haven't told us what entropy really is. You've just posted a link.

I've seen it defined as energy which isn't available to do work. But what does that mean?

Last edited: Aug 2, 2017

22. ### arfa brane (call me arf), Valued Senior Member

I think you have the concept of what the information is wrong, subjectively speaking. Information is related to the possibilities open to the system because the possibilities are the entropy. Possibility is not a physical thing; the particles in a gas are all moving around, and at equilibrium the state is the most probable of all the possible states.

Interestingly, I have a copy of a modern undergrad physics text which has a chapter called Statistical Physics. The word entropy doesn't appear in this chapter (it isn't in the index either), except in a small footnote. I guess the rationale there is to emphasise the distributions of energy in physical systems, and the statistics, or possibly to avoid what the author thinks isn't needed.

23. ### exchemist, Valued Senior Member

I would agree that the entropy represents the amount of information that would be needed to completely specify the state. That is a way of saying it is a measure of the complexity of the state, the number of degrees of freedom it has.

But what I think is wrong is any suggestion that, by acquiring some of that information through measurement, an observer can decrease the entropy as perceived by him (but implicitly not by others without access to that information). It may be that you are not claiming this, but it would be helpful to be sure.

I maintain that entropy is an objective, determinable, property of a thermodynamic system. Would you agree with that?