Reversing the Thermodynamic Arrow of Time:

Discussion in 'Physics & Math' started by paddoboy, Nov 27, 2017.

  1. paddoboy Valued Senior Member

    Messages:
    21,647
    https://arxiv.org/pdf/1711.03323.pdf

    Reversing the thermodynamic arrow of time using quantum correlations:
    9 Nov 2017:

    Abstract:

    The second law permits the prediction of the direction of natural processes, thus defining a thermodynamic arrow of time. However, standard thermodynamics presupposes the absence of initial correlations between interacting systems. We here experimentally demonstrate the reversal of the arrow of time for two initially quantum correlated spins-1/2, prepared in local thermal states at different temperatures, employing a Nuclear Magnetic Resonance setup. We observe a spontaneous heat flow from the cold to the hot system. This process is enabled by a trade off between correlations and entropy that we quantify with information-theoretical quantities.
    :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

    In other words your cold cup of coffee will not heat itself up, but quantum systems seemingly do not adhere to such restraints.
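The correlated two-spin setup the paper describes can be illustrated with a minimal numpy sketch (this is not the authors' code; the temperatures, energy gap, and correlation amplitude `chi` are illustrative choices). Two qubits are prepared in local thermal states at different temperatures; adding a small coherence term of the form chi|01⟩⟨10| + h.c. leaves both local (thermal) marginals untouched but makes the mutual information nonzero, which is the correlation resource that permits the reversed heat flow:

```python
import numpy as np

def thermal_qubit(beta, eps=1.0):
    """Local thermal state of a qubit with energy gap eps at inverse temperature beta."""
    p = np.exp(-beta * np.array([0.0, eps]))
    p /= p.sum()
    return np.diag(p)

def entropy(rho):
    """Von Neumann entropy -Tr(rho ln rho), in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def ptrace(rho, keep):
    """Partial trace of a two-qubit state; keep=0 returns subsystem A, keep=1 returns B."""
    r = rho.reshape(2, 2, 2, 2)
    if keep == 0:
        return np.trace(r, axis1=1, axis2=3)
    return np.trace(r, axis1=0, axis2=2)

def mutual_information(rho):
    """I(A:B) = S(A) + S(B) - S(AB)."""
    return entropy(ptrace(rho, 0)) + entropy(ptrace(rho, 1)) - entropy(rho)

# Hot and cold qubits (larger beta = colder)
rho_hot = thermal_qubit(beta=0.5)
rho_cold = thermal_qubit(beta=2.0)

# Uncorrelated product state: zero mutual information
rho_prod = np.kron(rho_hot, rho_cold)

# Add a small coherent correlation chi|01><10| + h.c.; the partial
# traces (local thermal states) are unchanged by this term.
chi = 0.05
rho_corr = rho_prod.astype(complex).copy()
rho_corr[1, 2] += chi
rho_corr[2, 1] += chi

print(mutual_information(rho_prod))   # ~0: no correlations
print(mutual_information(rho_corr))   # > 0: correlation resource present
```

With the product state the usual hot-to-cold direction is the only one allowed; with `rho_corr` the nonzero mutual information is the "budget" that can be consumed to drive heat the other way, which is the trade-off the abstract refers to.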
     

  3. exchemist Valued Senior Member

    Messages:
    6,624
    One for Arfa, I think. He's into information entropy.
     
  5. arfa brane call me arf Valued Senior Member

    Messages:
    5,365
    The only comment I can muster so far (I've only skimmed the paper) is that information is really like arbitrary coordinates: it isn't intrinsic to any system of particles, but something abstract that we (or some "observer") assign to a system, whatever it is.

    Entropy thus falls into the very same category: it doesn't exist physically. Particles and their interactions do ("of course"); beyond them there is mere abstraction . . .

    That said, maybe I don't know what the hell I'm talking about.
    However, this comment from the authors is interesting and perhaps supports (up to you) my own:
    What we call time seems to be then, a choice we make which is correlated with energy flow (in one direction), just like choosing which direction is up.
     
    Last edited: Nov 27, 2017
  7. Q-reeus Valued Senior Member

    Messages:
    2,605
    Huh? I see that kind of comment often enough elsewhere and it makes me wince. Just read the second sentence of the first line under "5.3. The Second Law of Thermodynamics" here:
    https://web.stanford.edu/~peastman/statmech/thermodynamics.html
    Then go on to "5.4. Heat and Entropy" for a formal mathematical definition and application. What is subjective or non-physical there?

    As for the OP article, while such 'micro refrigeration' is a great technical achievement, a few points worth noting:
    1: The reverse heat flow is temporary and after ~1 ms reverts to the usual hot-to-cold flow.
    2: The authors carefully state there is only an apparent violation of the 2nd Law - the local entropy drop does not hold globally, i.e. for the system as a whole.
    How is this not just a novel variation on what happens with your every-day fridge?
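The fridge comparison comes down to textbook entropy bookkeeping, which can be made concrete with a small sketch using the thermodynamic definition dS = dQ/T (the temperatures and heat value here are arbitrary example numbers, not from the paper):

```python
# Entropy bookkeeping for a heat flow Q between two reservoirs,
# using the thermodynamic definition dS = dQ/T.
T_hot, T_cold = 400.0, 300.0   # kelvin (arbitrary example values)
Q = 100.0                       # joules transferred

# Hot -> cold: the hot reservoir loses Q/T_hot of entropy, the cold
# one gains Q/T_cold; the total increases (the spontaneous direction).
dS_forward = -Q / T_hot + Q / T_cold

# Cold -> hot: the total would decrease, so an ordinary fridge must
# export at least this much entropy somewhere else (e.g. via work input).
dS_reverse = +Q / T_hot - Q / T_cold

print(dS_forward)   # positive: allowed spontaneously
print(dS_reverse)   # negative: forbidden without compensation elsewhere
```

In the experiment the "compensation" is the consumption of the initial quantum correlations rather than work input, which is what makes it more than just another fridge mechanism while still leaving the global 2nd Law intact.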
     
  8. arfa brane call me arf Valued Senior Member

    Messages:
    5,365
    Well, would you agree that heat is physical? If you do, can you use that concept to "prove" that thermodynamic entropy is also physical, and not just a concept which is given physical units because it's convenient?

    On the other hand, information has a physical basis, like coordinates do, but both are arbitrary. That's why you can talk about thermodynamic entropy as a kind of information (since it has a physical basis, namely heat). But information science says information itself "has" an entropy; note there are many ways to define entropy in information science and in quantum information science - it isn't fixed. The concept of heat energy, however, is a fixed concept.
     
  9. paddoboy Valued Senior Member

    Messages:
    21,647
    Yet worth a scientific article and paper according to the professionals.
     
  10. Q-reeus Valued Senior Member

    Messages:
    2,605
    Well, you may wish to argue entropy is just a derived quantity, but it plays a central role in many areas of engineering, chemistry, etc. I realize there are more or less exotic situations where the usual definition may need modification, but that does not relegate it to an up-for-grabs, arbitrary concept. Not IMHO.
     
  11. Q-reeus Valued Senior Member

    Messages:
    2,605
    And I do not dispute that. The question is: is it, at rock bottom, any more than a novel means to 'refrigerate' - one that does not bust the 2nd Law? Methinks not.
     
  12. paddoboy Valued Senior Member

    Messages:
    21,647
    The question is whether you read the article and paper properly: a cold cup of coffee will not heat itself up, but quantum systems seemingly do not adhere to such restraints.
     
  13. Q-reeus Valued Senior Member

    Messages:
    2,605
    And if you read my comments in #4, the distinction between local and global entropy changes should have gotten through.
     
  14. paddoboy Valued Senior Member

    Messages:
    21,647
    Read the article and paper, Q-reeus, and if you have any criticism of either, why not take it to the appropriate professional quarters with another paper? Easy peasy.
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    5,365
    Entropy is a derived quantity. I haven't said it's an arbitrary quantity. Information is arbitrary in the sense it's whatever you decide it is.

    The earth's equatorial line is "just" a derived quantity too, right? It's derived from the fact that the earth rotates, right?
     
  16. Q-reeus Valued Senior Member

    Messages:
    2,605
    Yes but given that condition, it is uniquely defined. Similarly for any given thermodynamic system, entropy is uniquely defined - at least as given by the usual thermodynamic definition.
     
  17. exchemist Valued Senior Member

    Messages:
    6,624
    I don't see any mutual exclusivity between Q-reeus being right - if he is - and the research being worth writing up - if it was. It is an interesting curiosity of a spin-correlated system and maybe one day somebody will take it further in some way we cannot yet guess.
     
  18. paddoboy Valued Senior Member

    Messages:
    21,647
    from the paper......
    Error analysis.

    The main sources of error in the experiments are small non-homogeneities of the transverse rf-field, non-idealities in its time modulation, and non-idealities in the longitudinal field gradient. In order to estimate the error propagation, we have used a Monte Carlo method to sample deviations of the quantum state tomography (QST) data with a Gaussian distribution having widths determined by the variances corresponding to such data. The standard deviation of the distribution of values for the relevant information quantities is estimated from this sampling. The variances of the tomographic data are obtained by preparing the same state one hundred times, taking the full state tomography and comparing it with the theoretical expectation. These variances include random and systematic errors in both state preparation and data acquisition by QST. The error in each element of the density matrix estimated from this analysis is about 1%. All parameters in the experimental implementation, such as pulse intensities and durations, are optimized in order to minimize errors.
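The general technique the authors describe - perturb the tomographic data with Gaussian noise and look at the spread of the derived quantities - can be sketched generically (this is not the authors' code; the measured state, noise width, and derived quantity here are stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(rho):
    """Von Neumann entropy -Tr(rho ln rho), in nats."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

# Stand-in for a tomographically reconstructed single-qubit state
rho_meas = np.array([[0.7, 0.1],
                     [0.1, 0.3]])

sigma = 0.01        # ~1% per-element error, the scale quoted in the paper
n_samples = 5000

values = []
for _ in range(n_samples):
    # Gaussian perturbation of the matrix elements, kept Hermitian
    noise = rng.normal(0.0, sigma, size=(2, 2))
    noise = (noise + noise.T) / 2
    rho = rho_meas + noise
    # Re-impose unit trace and positivity so each sample is a valid state
    rho /= np.trace(rho)
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)
    rho = (v * w) @ v.conj().T
    rho /= np.trace(rho)
    values.append(entropy(rho))

# Central value and the propagated (Monte Carlo) error bar
print(np.mean(values), np.std(values))
```

The standard deviation of `values` is then quoted as the error bar on the derived quantity, exactly in the spirit of the paper's procedure (though they propagate information-theoretic quantities of the full two-qubit state, not a single-qubit entropy).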
     
  19. exchemist Valued Senior Member

    Messages:
    6,624
    I assume the difference from a fridge is that this is said to be a "spontaneous" process, which is normally associated with entropy increase. But indeed it is a very short-lived effect.
     
  20. arfa brane call me arf Valued Senior Member

    Messages:
    5,365
    Well sure. I agree. Then again I don't see that I've posted anything that would imply otherwise.
    But what is this "thermodynamic definition"? Is it simply that entropy was first defined, historically speaking, in a thermodynamic context, well before the current understanding of information and what it "really" is?

    Bearing in mind that Maxwell's demon was postulated to be able to gain information at no cost, which is something we know today to be impossible.
     
  21. Q-reeus Valued Senior Member

    Messages:
    2,605
    There are QM subtleties involved that are over my head, and I won't pretend to fully understand them. Overall, though, it seems clear that very careful initial state preparation has primed the system for that spontaneous local entropy reversal, such that the increased disorder has been shuffled elsewhere. In keeping with the authors' admission, no net, i.e. global/system-wide, violation of the 2nd Law has occurred.
     
  22. Q-reeus Valued Senior Member

    Messages:
    2,605
    As per my reference to the Stanford article linked in #4, and of course many other standard references. Statistical mechanics - within the thermodynamic, not information-science, context - rigorously establishes its validity via quantization procedures, in contrast to the otherwise ad hoc assumptions resorted to in earlier classical approaches.
    You know far more about the information-science side than I do, so I will take your word that there is a plethora of definitions of information entropy.
     
  23. arfa brane call me arf Valued Senior Member

    Messages:
    5,365
    I could argue that thermodynamic, or Boltzmann-Gibbs entropy, is the same as information entropy, modulo the Boltzmann constant.

    Entropy is really just uncertainty; in thermodynamics it amounts to information that isn't known about the internal states. That is, since a system of particles will always tend towards maximum randomness (e.g. adding milk to a cup of coffee, or covfefe if that's your tipple), the description of the overall state at some instant amounts to one of a very large number of random strings, and you can't know which one, so you effectively have to 'write' all of them down.

    Does that make any sense?
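The "modulo the Boltzmann constant" claim can be checked numerically for any distribution over microstates: the Gibbs entropy S = -k_B Σ p ln p equals the Shannon entropy in bits times the unit-carrying factor k_B ln 2 (the example distribution below is arbitrary):

```python
import numpy as np

k_B = 1.380649e-23  # J/K (exact, 2019 SI redefinition)

# Example probability distribution over four microstates
p = np.array([0.5, 0.25, 0.125, 0.125])

H_bits = -np.sum(p * np.log2(p))          # Shannon entropy, in bits
S_gibbs = -k_B * np.sum(p * np.log(p))    # Gibbs entropy, in J/K

print(H_bits)                        # 1.75 bits
print(S_gibbs / (k_B * np.log(2)))   # 1.75 again: same number, different units
```

So the two notions coincide up to units, which is the formal content of the "same thing modulo k_B" position; the disagreement in this thread is really about interpretation, not arithmetic.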
     
