What is entropy and what is information?

Discussion in 'Physics & Math' started by arfa brane, Jul 28, 2017.

  1. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Is entropy really just a quantification of something that isn't known about a closed system, and is information only then defined by a difference in entropy, i.e. by relative entropies?

    Why is there a problem with thermodynamic entropy being defined by heat and differences in heat, whereas one can also say that entropy is equivalent to what isn't known about particle momentum (or for instance in a gas, what isn't known about particle velocities)?

    To even be able to say there is an entropy, doesn't there have to be a closed system, and don't you have to define probabilities? Is entropy even definable without probability?
     
  3. exchemist Valued Senior Member

    Messages:
    12,514
    This article looks to me like a good starting point: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory

    According to this, there is no getting away from the fact that thermodynamic and information entropy are not quite the same concept, although it shows they are closely related ideas.

    I very much liked the following sentence: " To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions needed to be answered in order to fully specify the microstate, given that we know the macrostate."

    This addresses something I have always felt uncomfortable about in discussions on information and entropy, namely the faint suggestion that there may be something subjective about it, in the sense of "information" being - at least in common parlance - something that a conscious entity possesses. But the above makes clear it is an objective property, related to the number of possibilities open to the system.
     
    Last edited: Jul 28, 2017
  5. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Let's look at that sentence:
    " To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes–no questions needed to be answered in order to fully specify the microstate, given that we know the macrostate."

    An optimally chosen yes/no question is true or false with probability 0.5. Entropy does seem to be a thing that requires probability, because to even be able to define probabilities in a system, it must be bounded, or what physicists call closed. Otherwise the entropy would be infinite, right?

    Or roughly, the entropy is what you don't know (yet), given what you do know about such a system. But you have to have some information: say, there is a gas in a constant-volume container, with pressure P and temperature T. You also know that if the gas is made up of discrete particles, their number must be constant. What you don't know is, say, their velocities (among other things).

    The thermodynamic entropy can in principle be represented by this unknown information, in bits, because any physical system in any state can in principle be represented by a string of bits.
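
    As a rough sketch of that (Python, with a made-up distribution), the missing information in bits is the Shannon entropy, and for equally likely possibilities it is just the number of optimally chosen yes/no questions:

    Code:
    import math

    def shannon_entropy_bits(probs):
        """Shannon entropy H = -sum(p * log2 p), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # 8 equally likely microstates: 3 bits = 3 yes/no questions
    print(shannon_entropy_bits([1/8] * 8))

    # A skewed distribution over the same 8 states carries less missing information
    print(shannon_entropy_bits([0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]))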

    So yeah, entropy has to be one of those known unknowns. You know . . .
     
    Last edited: Jul 29, 2017
  7. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    But note that has nothing to do with a physical system being in one state or another.

    If I have a pair of dice, I can ask whether they show a pair of sixes, say; that can only be true or false, although the probability is 1/36.
    Or with a coin, again I can ask whether heads is showing, but the coin can be biased.
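
    For a biased coin, the missing information per toss is the binary entropy of the bias; a quick illustrative sketch (Python, values picked arbitrarily):

    Code:
    import math

    def binary_entropy_bits(p):
        """Missing information, in bits, for a yes/no outcome with probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    print(binary_entropy_bits(0.5))    # fair coin: 1 bit
    print(binary_entropy_bits(0.9))    # biased coin: about 0.47 bits
    print(binary_entropy_bits(1/36))   # "do the dice show double six?": about 0.18 bits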
     
  8. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Thinking about black holes, their entropy is proportional to the surface area of the horizon. This horizon doesn't exist for someone falling through it, but it does exist for external observers, who can "see" that the entropy scales with a 2-dimensional area.
    Or at least that's what the classical theory says.
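
    (For reference, the classical area law is the Bekenstein-Hawking formula S = k_B c^3 A / (4 G h-bar). A rough sketch of the numbers in Python, using one solar mass purely as an illustrative input:)

    Code:
    import math
    from scipy.constants import G, c, hbar, k as k_B

    M = 1.989e30                         # about one solar mass, kg (illustrative)
    r_s = 2 * G * M / c**2               # Schwarzschild radius
    A = 4 * math.pi * r_s**2             # horizon area
    S = k_B * c**3 * A / (4 * G * hbar)  # Bekenstein-Hawking entropy, J/K
    print(S)                             # on the order of 1e54 J/K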

    And so, it must be (?) because the infalling observer, interacting with the gravitational field, has fewer degrees of freedom because of the motion towards the singularity.

    The different observers are in a sense, computing the same thing but in different contexts, and there's some quantum algorithm we don't fully understand, yet.
     
    Last edited: Jul 30, 2017
  9. Michael 345 New year. PRESENT is 72 years oldl Valued Senior Member

    Messages:
    13,077
    I tend to think in simple terms because that's the only way Huey, Dewey and Louie operate.

    Entropy is a PROCESS where, whatever the start state is, it will become more disorganised the longer it is in existence, unless energy is applied to the system to force it to become more organised.

     
  10. exchemist Valued Senior Member

    Messages:
    12,514
    I don't think I would agree that "roughly, the entropy is what you don't know (yet)". This seems to be reintroducing exactly the subjective language I find objectionable. There is no suggestion that by getting values for all the variables you somehow decrease the entropy.
     
  11. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    A physical system doesn't "have entropy"; measurements define what the entropy is:
    --http://rsta.royalsocietypublishing.org/content/374/2063/20150230#sec-4
     
  12. exchemist Valued Senior Member

    Messages:
    12,514
    This strikes me as unmitigated rubbish. At least it is if the "entropy" this author is speaking about is thermodynamic entropy.

    It must be wrong, surely, to claim that a physical system "does not have entropy" and that it varies according to what is measured? The entropy change in a heat engine or in a chemical reaction, that is to say, thermodynamic entropy, is entirely objective in nature. dS = dQ_rev/T is not in any way a subjective definition, so far as I can see.

    There are tables of standard entropy changes for chemical reactions that you can look up. These have been objectively determined: https://en.wikipedia.org/wiki/Thermodynamic_databases_for_pure_substances

    and are used daily by working chemists. I have no doubt that similar tables exist in engineering.
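
    As a worked illustration of dS = dQ_rev/T (taking the commonly quoted enthalpy of fusion of ice, roughly 6.01 kJ/mol, as the input):

    Code:
    # Entropy change for reversibly melting one mole of ice at its melting point
    dH_fus = 6010.0       # J/mol, approximate enthalpy of fusion of water
    T_melt = 273.15       # K
    dS = dH_fus / T_melt  # heat absorbed reversibly, divided by temperature
    print(dS)             # roughly 22 J/(K*mol)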

    I have a horrible feeling you are working towards a conflation of entropy and the uncertainty principle. I am convinced this is a really bad idea.
     
    Last edited: Jul 31, 2017
  13. exchemist Valued Senior Member

    Messages:
    12,514
    I wouldn't call it a process. It is a property of a system that can be calculated from measurement of energy change and temperature: dS = dQ_rev/T. The process you are thinking of is that in the most common processes, namely the irreversible or "spontaneous" ones, the entropy, S, always increases.
     
  14. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Information is whatever you want it to be, but it has to be encoded physically; there has to be one or more 'systems of particles'.

    That is, it has to be stored or remembered. If there are unknowns, these can't be information; information has to be knowable, but knowable is just another word for stored somewhere. Entropy is a known, or at least bounded, unknown, so entropy is not information.

    I can arbitrarily decide that some subset of 16-bit binary numbers is information and the rest isn't; it's random garbage. In fact that's what I do if I write a 16-bit program . . .
     
  16. TheFrogger Banned Valued Senior Member

    Messages:
    2,175
    Entropy is without discipline and therefore is without direction. It exists in every direction and everywhere.
     
  17. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    In every direction and everywhere in what system? Entropy is system-dependent in that it depends on how you define the system.

    Entropy cannot be physical, although thermodynamic entropy has physical units; that's a consequence of defining it in terms of the exchange of heat. If information is always physical, then something unknown isn't physical, because it isn't information.
     
    Last edited: Aug 1, 2017
  18. exchemist Valued Senior Member

    Messages:
    12,514
    Never heard of Jaynes, but this sounds nonsensical. As there is no context provided, though, it is hard to be categorical.

    Why not engage with the issues I raise yourself, instead of just producing dodgy or out of context quotes from other people?
     
  19. Tonyc Registered Member

    Messages:
    10
    The wording of your posts seems different to the title of the thread. In the title you ask a question, but in your opening sentence you start with a question about a speculation: "Is entropy really just".

    Quite clearly, if you were to type the word entropy into Google, you would get a link displayed like this one: https://en.wikipedia.org/wiki/Entropy

    Now this will tell you what entropy really is, in the context the rest of us understand. Is entropy what we all have defined it to be? Yes.

    Now as for information, we all use the quoted definition. The word information does not change in meaning when discussing entropy.
     
  20. Confused2 Registered Senior Member

    Messages:
    609
    Looking at (say) a litre of gas: on the one hand the 'information' would appear to require knowledge of the path of every molecule, and on the other hand the 'information' can be reduced to the statistics of N (where N is large) identical molecules. Is Arfa's point that the 'information' content depends on whether or not you choose to tag every molecule and treat it individually?
    https://en.wikipedia.org/wiki/Maxwell–Boltzmann_distribution
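
    A small sketch of the statistical description (Python, using nitrogen at room temperature purely as an illustrative case), where the per-molecule paths are replaced by a single speed distribution:

    Code:
    import math
    from scipy.constants import k as k_B
    from scipy.stats import maxwell

    T = 300.0                    # K, illustrative
    m = 28 * 1.66054e-27         # kg, approximate mass of an N2 molecule
    a = math.sqrt(k_B * T / m)   # scale parameter of the Maxwell speed distribution

    print(maxwell.mean(scale=a))         # mean molecular speed, about 475 m/s
    print(maxwell.rvs(scale=a, size=5))  # a few sampled molecular speeds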
     
  21. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    What issues have you raised though? The issue that thermodynamic entropy "must be objective" because entropy is determined objectively, which is to say, via measurement?

    Then the claim that information theory is a theory of the relative states of measuring devices is nonsensical? I can't see how that can possibly follow.
    But you haven't told us what entropy really is. You've just posted a link.

    I've seen it defined as energy which isn't available to do work, but what does that mean?
     
    Last edited: Aug 2, 2017
  22. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    I think you have the concept of what the information is subjectively wrong. Information is related to the possibilities open to the system, because the possibilities are the entropy. Possibility is not a physical thing; the particles in a gas are all moving around, and when at equilibrium the state is the most probable of all the possible states.
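
    As a toy illustration of "the most probable of all the possible states" (Python, with a made-up particle count), count the ways N particles can split between the two halves of a box:

    Code:
    from math import comb

    N = 100                       # toy particle count
    total = 2 ** N                # all equally likely microstates
    for n_left in (50, 60, 75, 100):
        W = comb(N, n_left)       # microstates with n_left particles in the left half
        print(n_left, W / total)  # the even 50/50 split is the most probable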

    Interestingly, I have a copy of a modern undergrad physics text which has a chapter called statistical physics. The word entropy doesn't appear in this chapter except in a small footnote (it isn't in the index either). I guess the rationale there is to emphasise the distributions of energy in physical systems, and the statistics, or possibly to avoid what the author thinks isn't needed.
     
  23. exchemist Valued Senior Member

    Messages:
    12,514
    I would agree that the entropy represents the amount of information that would be needed to completely specify the state. That is a way of saying it is a measure of the complexity of the state, the number of degrees of freedom it has.

    But what I think is wrong is any suggestion that, by acquiring some of that information through measurement, an observer can decrease the entropy as perceived by him (but implicitly not by others without access to that information). It may be that you are not claiming this, but it would be helpful to be sure.

    I maintain that entropy is an objective, determinable, property of a thermodynamic system. Would you agree with that?
     
